The history and philosophy of ΔQ

This is the second in a series of articles introducing ΔQ, the breakthrough new science of network performance. For the first article, click here.

A brief history of ΔQ

The seed of the idea for ΔQ came from safety-critical systems: how does performance affect safety? There were models of how to create “functional isolation” for distributed systems (e.g. so that the states of a set of traffic lights interact in the right order). Yet there was no equivalent notion of acceptable “performance isolation”, so that the timing of those changes would be suitably constrained.

Over time it became clear that the real issue of performance over statistically shared media is what happens in saturation. ΔQ allows us to model that situation (and nothing else does!). This is of great interest, as being able to safely run networks in saturation means that the utilisation of the assets can be significantly increased.

These modelling techniques are now well proven in real-world use across many environments. They have been applied at Boeing, BT [PDF], CERN and KPSN, as well as in several mobile networks. For the technical detail, see the PhD theses from Kent (“A New Blueprint for Network QoS”) and CERN (“Analysis and predictive modeling of the performance of the ATLAS TDAQ network”) [PDF].

Why “ΔQ”?

The “Q” in “ΔQ” naturally indicates we are in the world of quality. So what is ‘quality’, at least in terms of our world of distributed computation?

Quality is something that can only ever be lost. What networks do is lose quality whilst replicating information at a distance. By that we mean that the ideal network would instantly and perfectly replicate information, with no loss of quality. All real networks fall short of that ideal, and impair quality. Since the word ‘impair’ has a derogatory overtone that repels engineers, we substitute ‘attenuate’ as a more neutral term.

With quality attenuation, we are in the world of phenomena that are called ‘privations’. For instance, noise is the privation of silence: it is silence that is being lost. What we colloquially call ‘good quality’ in packet networks is actually the absence of excessive quality attenuation. This topsy-turvy way of thinking can be a little hard to grasp at first: it’s Alice stepping through the looking-glass to the other side.

A way of thinking about quality attenuation is that it accumulates along any network path in each direction, since the amount of delay or loss only ever grows as information is copied over and over. The result of this impairment is to eventually degrade the user’s perceived application performance.
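
A concrete (if simplified) way to see this accumulation is to compose the attenuation of two hops. The sketch below is ours, not part of the formal ΔQ calculus: it treats each hop’s attenuation as a loss probability plus samples of the delay it adds, and shows that crossing both hops can only increase loss and delay.

```python
import numpy as np

def compose(hop_a, hop_b):
    """Combine the quality attenuation of two hops crossed in sequence.

    Loss accumulates: a packet survives the path only if it survives
    every hop. Delay accumulates: the path delay is the sum of the
    per-hop delays. Neither can ever shrink.
    """
    loss_a, delay_a = hop_a
    loss_b, delay_b = hop_b
    path_loss = 1 - (1 - loss_a) * (1 - loss_b)
    path_delay = delay_a + delay_b      # element-wise sum of delay samples (ms)
    return path_loss, path_delay

rng = np.random.default_rng(0)
# Hypothetical hops: (loss probability, samples of added delay in ms)
access = (0.0010, 5.0 + rng.exponential(2.0, 10_000))
core   = (0.0001, 1.0 + rng.exponential(0.5, 10_000))

loss, delay = compose(access, core)
print(f"path loss:      {loss:.3%}")
print(f"median delay:   {np.median(delay):.2f} ms")
print(f"99th pct delay: {np.percentile(delay, 99):.2f} ms")
```

Summing paired independent samples is simply a Monte Carlo way of convolving the two delay distributions; the ΔQ literature works with the distributions themselves.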

Now remember, we’re in Alice’s mirror world: networks don’t enable performance as a positive attribute, but instead place a limit on negative effects. We are interested in keeping this impairment (“attenuation”) low enough that the degradation remains acceptable.

The “Δ” in “ΔQ” indicates that we are interested in relative changes in the attenuation, as there is no absolute metric of the “quality” of any packet. This is like how we measure noise in decibels, as a relative measure.

A method of abstraction and reification

The next thing ΔQ offers is a way of relating the impairment of the packets to the impairment of the user experience.

If you think of a game of golf, you can see it at many levels: from the molecular, up through the motion of the ball on the course, to the score, which affects a handicap or tournament outcome, which in turn might relate to the popularity of golf relative to other games.

Likewise we can think of networks at many levels, each more abstract: the individual packets, probability distributions of loss and delay, application outcomes, customer experiences, and operator profitability.
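
As an illustration of the first abstraction step, from individual packets to a probability distribution, here is a minimal sketch (the observation format and the numbers are our own assumptions): lost packets are recorded as never arriving, so the fraction delivered within any given time can never quite reach 100% while there is any loss.

```python
import numpy as np

# Hypothetical per-packet observations: one-way delay in ms for packets
# that arrived, NaN for packets that were lost in transit.
observed = np.array([12.1, 14.7, 13.0, np.nan, 15.9,
                     12.8, 41.3, 13.4, np.nan, 14.1])

def delivered_within(observations_ms, t_ms):
    """Fraction of ALL sent packets that arrived within t_ms.

    Lost packets (NaN) still count in the denominator, so the curve
    this traces out tops out below 1.0 whenever there is any loss.
    """
    arrived = observations_ms[~np.isnan(observations_ms)]
    return np.count_nonzero(arrived <= t_ms) / observations_ms.size

for t in (15, 30, 60):
    print(f"delivered within {t:3d} ms: {delivered_within(observed, t):.0%}")
```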

In golf, the process of abstraction is what relates the motion of the ball to the score. Reification goes the other way. Similarly, in packet networks, we want to abstract what happens to the individual packets, and relate it to the customer experience. Conversely, we want to be able to define a customer experience, and work out what needs to happen to the packets to deliver it.
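
Going downwards, the reification step can be sketched as a packet-level test derived from a stated experience target. This is only an illustration under our own assumptions: the form “at least the Pth percentile of delay within T ms, and loss below L” and the specific figures are illustrative, not drawn from any real application requirement.

```python
import numpy as np

def meets_requirement(delay_samples_ms, loss_fraction,
                      percentile, deadline_ms, max_loss):
    """Test a packet-level delay/loss picture against a requirement of
    the form: 'the given percentile of delay is within deadline_ms and
    the fraction of packets lost is at most max_loss'."""
    on_time = np.percentile(delay_samples_ms, percentile) <= deadline_ms
    return on_time and loss_fraction <= max_loss

# Hypothetical measurements of one path, and an illustrative target.
rng = np.random.default_rng(1)
measured_delay = 20.0 + rng.gamma(shape=4.0, scale=10.0, size=10_000)  # ms
measured_loss = 0.004

print(meets_requirement(measured_delay, measured_loss,
                        percentile=99, deadline_ms=150, max_loss=0.01))
```

Working backwards in earnest means budgeting how much attenuation each element along the path may contribute so that the end-to-end test still passes; the sketch shows only the final check.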

At every level between the absolutely concrete and the completely abstract we think of the problem in the same way: to what extent has quality been “attenuated”?

Engineering is the act of (correctly) abstracting and reifying. This consistent ‘quality attenuation’ framing of ΔQ allows us to link every level in each direction. This is crucial to making precision performance engineering possible.
