The labyrinth of luck

Over the last few years I have come to appreciate how broadband networks are not widely understood, despite their ubiquity and importance. In particular, the relationship between a network’s basic operation and the resulting user experience is commonly misrepresented.

A quick recap of the “birds and bees” of broadband: a packet network is a dynamic system with many components. The state of those components is constantly evolving, and their evolution is driven by probabilistic processes. The formal term for such a system is a stochastic system.

For example, there is a “game of chance” to get into any packet buffer, and there is another one to get out of it again. Whether you get in, and if/when you get out, are both acts of pure randomness. At best we can bias the “dice” at the “network casino” towards certain desired outcomes.
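To make the “game of chance” concrete, here is a minimal sketch in Python (a toy discrete-time buffer of my own invention, not any real device’s logic). The buffer, the traffic and the rules are identical in every run; only the random seed changes, yet the loss and delay figures come out different each time.

```python
import random

def simulate_buffer(seed, slots=10000, arrival_p=0.45, service_p=0.5, capacity=20):
    """Toy discrete-time buffer: in each time slot a packet may arrive
    (probability arrival_p) and the head-of-line packet may depart
    (probability service_p). A full buffer tail-drops new arrivals."""
    random.seed(seed)
    queue, dropped, delays = [], 0, []
    for t in range(slots):
        if random.random() < arrival_p:            # the "game of chance" to get in
            if len(queue) < capacity:
                queue.append(t)                    # admitted; remember arrival slot
            else:
                dropped += 1                       # buffer full: the dice said no
        if queue and random.random() < service_p:  # the game of chance to get out
            delays.append(t - queue.pop(0))
    return dropped, sum(delays) / len(delays)

for seed in range(3):
    drops, mean_delay = simulate_buffer(seed)
    print(f"run {seed}: {drops} drops, mean queueing delay {mean_delay:.1f} slots")
```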

As these mechanisms and processes interact, they produce a collective quality of experience (QoE), which is an emergent property. This is a subtle but crucial fact. It means that there is a mathematical maze – a “labyrinth of luck” – between the local network mechanisms and the resulting global performance.

This “labyrinth of luck” is conspicuously missing from the standard academic literature on broadband policy. Yet it has many important implications.

For instance, the local traffic management rules tell you very little about what the global performance will be. The only way to relate the two is to go through the labyrinth… and the only (current) way to do that is to actually operate the network! Users have no way of knowing in advance what performance will emerge from the interaction of the network mechanisms, the traffic management rules and the underlying probabilistic processes.

The results can be quite counter-intuitive: giving an application lower “priority” might even result in better performance, if you have the “right wrong” kind of interactions between various protocols and system bottlenecks. Conversely, “neutral” packet handling can result in performance collapse! The same rules on different bearers (cable, 4G, DSL, etc.) might produce wildly different outcomes, as the sketch below illustrates.
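The priority and collapse cases are hard to compress into a few lines, but the “same rules, different bearers” point is easy to sketch. Assuming nothing more than a plain FIFO queue and two hypothetical bearers with the same average capacity – one smooth, one bursty – the same rule produces very different delay percentiles:

```python
import random

def fifo_delays(service_pattern, seed=1, slots=50000, arrival_p=0.4):
    """The same 'rule' (plain FIFO) fed by statistically identical traffic;
    only the bearer's pattern of service opportunities differs."""
    random.seed(seed)
    queue, delays = [], []
    for t in range(slots):
        if random.random() < arrival_p:
            queue.append(t)                       # packet arrives, joins the tail
        for _ in range(service_pattern(t)):       # bearer offers 0..n departures
            if queue:
                delays.append(t - queue.pop(0))
    delays.sort()
    return delays[len(delays) // 2], delays[int(len(delays) * 0.99)]

steady = lambda t: 1 if t % 2 == 0 else 0   # one departure every other slot (0.5/slot)
bursty = lambda t: 5 if t % 10 == 0 else 0  # same average rate, delivered in bursts

for name, bearer in (("steady", steady), ("bursty", bursty)):
    median, p99 = fifo_delays(bearer)
    print(f"{name} bearer: median delay {median} slots, 99th percentile {p99} slots")
```

The FIFO rule is identical in both cases; you only discover which delay tail you get by running it.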

The idea that our intuition is a poor guide to probabilistic processes is widely documented; the Monty Hall problem and quantum mechanics are famous examples. In the context of broadband, the emergent nature of network and application performance means that a widely held assumption about QoE is false.
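For the sceptical, the Monty Hall result takes only a few lines to check by simulation rather than by intuition: switching doors wins roughly two times in three.

```python
import random

def monty_hall(switch, trials=100_000):
    """Monte Carlo of the Monty Hall game: pick a door, the host opens a
    goat door you did not pick, and you either stay or switch."""
    wins = 0
    for _ in range(trials):
        car = random.randrange(3)
        pick = random.randrange(3)
        opened = next(d for d in range(3) if d != pick and d != car)
        if switch:
            pick = next(d for d in range(3) if d != pick and d != opened)
        wins += (pick == car)
    return wins / trials

print("stay:  ", monty_hall(switch=False))   # close to 1/3
print("switch:", monty_hall(switch=True))    # close to 2/3
```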

The common belief is that you can readily trace “bad” QoE outcomes to specific rules or mechanisms. This relationship does not exist. It’s not just grandma who can’t tell you why Skype or Netflix doesn’t work. I can’t either, at least not by looking at the network design and management rules alone.

This is because the intuitive “clockwork” model of a “linear” relationship between QoE and network operation is untrue. It’s especially untrue when the network is under heavy load, when it can exhibit nondeterministic behaviours. Yet this is the very situation when regulatory rules about “fair” QoE outcomes matter the most.
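A textbook illustration of how non-linear this gets: in an idealised M/M/1 queue (a deliberately simplified model, not a claim about any particular broadband network), the mean delay grows as 1/(1 − ρ) with utilisation ρ, so the last few percent of load do most of the damage.

```python
# Idealised M/M/1 queue: mean time in system W = 1 / (mu - lambda),
# i.e. 1 / (1 - rho) service times at utilisation rho. Delay does not
# grow linearly with load; it explodes as the link approaches saturation.
mu = 1.0                                   # service rate (normalised)
for rho in (0.5, 0.8, 0.9, 0.95, 0.99):
    lam = rho * mu                         # offered load
    print(f"utilisation {rho:.0%}: mean delay {1 / (mu - lam):.0f}x the service time")
```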

The “labyrinth of luck” means it’s also impossible (in general) to tell the difference between bad QoE due to “discrimination” and bad QoE due to the “wrong kind” of statistical fluke. This means any regulatory attempt to define and detect “discrimination” is doomed, due to overwhelming false positives and negatives. Emergent unintentional unfairness dominates intentional unfairness (i.e. “neutrality violations”). So worrying about the latter is at best a distraction, and at worst a fool’s errand.
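A small sketch of what “emergent unintentional unfairness” looks like: two statistically identical flows share a strictly FIFO queue (the parameters and flow names here are invented for illustration), and over a short, user-relevant timescale the delay each one experiences differs from run to run purely by chance, with no discriminating mechanism anywhere in the code.

```python
import random

def shared_fifo(seed, slots=2000, arrival_p=0.22, service_p=0.5):
    """Two statistically identical flows share one strictly FIFO queue.
    Nothing in the code treats flow A and flow B differently."""
    random.seed(seed)
    queue, delays = [], {"A": [], "B": []}
    for t in range(slots):
        for flow in ("A", "B"):
            if random.random() < arrival_p:
                queue.append((flow, t))            # join the shared queue
        if queue and random.random() < service_p:
            flow, arrived = queue.pop(0)
            delays[flow].append(t - arrived)
    return {f: sum(d) / len(d) for f, d in delays.items()}

for seed in range(4):
    m = shared_fifo(seed)
    print(f"seed {seed}: flow A waits {m['A']:.1f} slots, flow B waits {m['B']:.1f} slots")
```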

For the broadband industry to make progress, we need to anchor our engineering, economics and policy in the reality of the world. That means we need to spread the word that the stochastic “labyrinth of luck” both exists and has important consequences.
