Stationarity is the new speed

The broadband industry has a lot of unhappy customers complaining about poor speed. They should instead be asking for better stationarity. Here’s why.

Did you know that the UK’s “superfast” broadband access infrastructure is unfit for 5G due to non-stationarity? You do now! That fact is demonstrated in an important new presentation, stuffed full of data that everyone working in broadband Internet and cloud application access needs to see and grasp.

The most important term in packet networking you’ve never heard of is “stationarity”. It is the statistical stability that is necessary for all broadband applications to work. After all, packet networking is, by definition, statistical multiplexing of datagrams. So don’t be too surprised if its statistical nature is central to its utility.
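
To make this concrete, here is a minimal sketch of what a stationarity check on packet delays could look like; it is my own illustration, not a method from the presentation. The idea: if the delay distribution in later time windows drifts away from an early baseline, the series is non-stationary. The window size, significance threshold, and simulated traffic are all assumptions for demonstration.

```python
# Minimal illustration (not the presentation's method): flag non-stationarity
# in a series of packet delays by comparing the delay distribution in each
# successive window against an initial baseline with a two-sample KS test.
import numpy as np
from scipy.stats import ks_2samp

def delay_distribution_is_stable(delays, window=1000, alpha=0.01):
    """Return False if any window's delay distribution differs significantly
    from the first window's: a crude non-stationarity flag."""
    baseline = delays[:window]
    for start in range(window, len(delays) - window + 1, window):
        _, p_value = ks_2samp(baseline, delays[start:start + window])
        if p_value < alpha:  # distribution has drifted
            return False
    return True

# Hypothetical usage: simulated delays with a contention regime change half-way.
rng = np.random.default_rng(0)
stable = rng.gamma(2.0, 5.0, 5000)   # steady contention
surge = rng.gamma(2.0, 9.0, 5000)    # heavier contention regime
print(delay_distribution_is_stable(np.concatenate([stable, surge])))  # expect False
print(delay_distribution_is_stable(stable))                           # expect True
```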

For several technical reasons (including bad science, wrong architecture, missing mathematics, ubiquitous design mistakes, misguided regulation, unhelpful workarounds, and inadequate mechanisms), this property is being lost in packet networks. Without stationarity, packet networks are useless.

These issues mean there is a growing “statistical noise” (i.e. non-stationarity) that causes networks and applications to fail. Enjoy today’s prototype Internet while it lasts! The reality we face is that many of the complaints of broadband users about poor “speed” are really QoE problems caused by a lack of stationarity.

I have obtained permission from the network performance science experts at Predictable Network Solutions Ltd to share data from some of their high-end consulting activities. Note that the data does not show “faults”, but rather the present unsatisfactory state of play in even the best-managed broadband networks.

I have annotated these charts with a commentary to explain their nature and significance, and turned them into a presentation for everyone to read. Any errors are to be attributed to me.

The goal of this presentation is to share exemplars of common broadband Internet access performance phenomena. It offers real-world examples of both stationarity and non-stationarity, and discusses the implications for broadband stakeholders.

These phenomena are only fully visible when using state-of-the-art high-fidelity metrics and measures that capture instantaneous flow. (If you want early demo access, do the survey before it closes.)
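
To see why fidelity matters, consider a toy example (assumed numbers, not output from any real measurement tool): a coarse average reports a healthy “speed”, while a fine-grained view of the same trace exposes a stall that would wreck the user’s experience.

```python
# Toy example (assumed numbers): a coarse average hides non-stationarity
# that a fine-grained, instantaneous view of the same trace reveals.
import numpy as np

rng = np.random.default_rng(2)
ms_rate = rng.normal(50.0, 3.0, 10_000)  # Mbit/s, sampled every millisecond
ms_rate[4000:4500] = 0.0                 # a 500 ms stall mid-transfer

print(f"10 s average:        {ms_rate.mean():.1f} Mbit/s")  # looks fine (~47.5)
worst = min(ms_rate[i:i + 100].mean() for i in range(0, 10_000, 100))
print(f"worst 100 ms window: {worst:.1f} Mbit/s")           # 0.0: the stall
```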

With respect to my attention-grabber at the start, I must emphasise that it is not a scandal that UK “superfast” broadband lacks the stationarity needed for 5G deployment, and nobody is to blame. It would not have been reasonable to expect Ofcom at the time to ask BT to deliver a “5G-ready” network. Nor is there any reason BT would have delivered a network for a technology that didn’t exist, using metrics they didn’t have.

This is a new and immature industry grappling with hard problems. However, this unexpected 5G readiness issue does signify a need for the whole industry to “up its game” in terms of quality management and performance engineering. The good news is that this is essentially a solved technical problem via the quality attenuation framework.

Using this framework and its allied tools and technologies, we can measure, model and manage (non-)stationarity. The problem is that “we” is not many people. “We” all would fit in a minivan.
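
For a flavour of what “measure, model and manage” can mean in practice, here is a toy sketch of one idea from the quality attenuation (∆Q) approach: decomposing delay into a fixed geographic component (G), a packet-size-dependent serialisation component (S), and a variable contention component (V). The fitting method and synthetic data below are my own simplifications, not Predictable Network Solutions’ actual tooling.

```python
# Toy sketch (my simplification, not PNSol's tooling): estimate a G/S/V
# decomposition of delay from per-packet (size, delay) samples. The per-size
# minimum delay approximates the contention-free floor G + S*size; anything
# above that floor is attributed to the variable (contention) component V.
import numpy as np

def decompose_delta_q(sizes, delays):
    floor = {}
    for s, d in zip(sizes, delays):
        floor[s] = min(d, floor.get(s, d))          # per-size minimum delay
    xs = np.array(sorted(floor))
    ys = np.array([floor[s] for s in xs])
    S, G = np.polyfit(xs, ys, 1)                    # slope ~ S, intercept ~ G
    V = delays - (G + S * np.asarray(sizes))        # residual above the floor
    return G, S, np.clip(V, 0.0, None)

# Hypothetical usage with synthetic samples: 5 ms propagation,
# 0.008 ms/byte serialisation, exponential queueing delay on top.
rng = np.random.default_rng(1)
sizes = rng.integers(64, 1500, 20_000)
delays = 5.0 + 0.008 * sizes + rng.exponential(2.0, sizes.size)
G, S, V = decompose_delta_q(sizes, delays)
print(f"G ≈ {G:.2f} ms, S ≈ {S:.4f} ms/byte, mean V ≈ {V.mean():.2f} ms")
```

Tracking how the variable component’s distribution evolves over time is exactly where (non-)stationarity shows up.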

As an industry collective, everyone now needs to educate themselves on stationarity, get the word out, disseminate the measurement tools, develop standard metrics and models, and drive industry adoption of new techniques.

The prize is to turn broadband from low-value non-stationary “best effort” connectivity into high-value stationary cloud application access.

Want to join us? After all, it is important.

For the latest fresh thinking on telecommunications, please sign up for the free Geddes newsletter.