This third and final article completes our introduction to ∆Q and the new science of network performance. It follows on from the first and second articles.

**Three elements of the ∆Q framework**

∆Q is about *quantitatively* capturing the relationship between the objective physical world (of packets), and the subjective world of the user experience. It comprises three key elements:

- ∆Q *metrics*, which capture the critical essence of the domains of *both* the network *and* the customer, at every level of abstraction;
- a ∆Q *calculus*, which formally relates the sets of metrics; and
- a ∆Q *algebra*, which lets us meaningfully do “what if?” and “so what?” types of calculations.

The big ‘aha!’ of the ∆Q framework lies in the metrics. ∆Q-based metrics *extend the idea of randomness* to include ‘non-termination’. For example, when you roll a die, ‘non-termination’ might include losing it behind the sofa, or it landing miraculously balanced on an edge rather than flat on a face.
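This extended notion of randomness can be sketched in a few lines of Python. (A hypothetical illustration only; the 2% loss figure is invented for the example.)

```python
import random

def roll_die(p_lost=0.02):
    """Roll a die that sometimes never terminates: with probability
    p_lost it vanishes behind the sofa and yields no outcome at all."""
    if random.random() < p_lost:
        return None                  # non-termination: no face ever shows
    return random.randint(1, 6)

random.seed(42)                      # fixed seed so the run is repeatable
rolls = [roll_die() for _ in range(100_000)]
lost = sum(r is None for r in rolls)
print(f"fraction lost: {lost / len(rolls):.3f}")  # ≈ 0.02
```

The faces 1–6 together carry only 98% of the probability: the outcome is an ‘improper’ random variable, and the ‘missing’ 2% of mass is precisely the non-termination.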

**A new branch of mathematics**

∆Q metrics are part of a new branch of mathematics that sits *underneath* probability theory. They let us reason about how long things take and, *at the same time*, whether they might not happen at all. The framework thereby takes the imperfection of the world on board, rather than treating failure as a separate case.

When probability theory was being formally defined, this idea was not developed, because no applications were foreseen for it. Unlike physical objects, packets can be erased, so their delivery simply never occurs. That means we now have a major application, one that matters to society!

We use ‘*improper* random variables’, which capture (in packet networks) both continuous delay and discrete loss in *one* probability measure. Indeed, ΔQ captures the probability of all causes of ‘non-termination’, with packet loss by erasure being merely one example. Other causes might include electrical noise on the line, or a cosmic ray damaging a packet in transit.
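A minimal sketch of such an improper random variable, assuming a shifted-exponential delay model with invented parameter values (the names `p_loss`, `min_delay` and `mean_extra` are illustrative, not standard ∆Q terminology):

```python
import numpy as np

# Hypothetical ∆Q for one network hop, as an improper CDF:
# F(t) = P(packet delivered AND delay <= t). As t grows, F(t)
# plateaus at 1 - p_loss rather than 1; the 'missing' probability
# mass is exactly the loss probability.
p_loss = 0.05       # assumed erasure probability
min_delay = 2.0     # ms: structural delay (distance, serialisation)
mean_extra = 3.0    # ms: mean of the variable (queueing) component

def improper_cdf(t):
    """Improper CDF: shifted-exponential delay scaled by delivery prob."""
    t = np.asarray(t, dtype=float)
    proper = np.where(
        t < min_delay, 0.0,
        1.0 - np.exp(-np.maximum(t - min_delay, 0.0) / mean_extra))
    return (1.0 - p_loss) * proper

print(improper_cdf(1e9))  # plateaus near 0.95: delay and loss in ONE measure
```

One curve thus answers both “how late?” and “did it arrive at all?”, instead of forcing two separate metrics with an unknown coupling between them.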

This doesn’t seem like a big deal at first, but it is. It’s a basic conceptual innovation like ‘zero’ in arithmetic, or ‘imaginary numbers’ in complex analysis.

Until you unify delay and loss, you are left with standard queuing theory. When applied to networks, this fails to model reality. Why? It inadequately models ‘non-termination’ (and other forms of ‘non-working’, like network equipment temporarily being driven too hard).

The resulting models of networks then don’t reflect their actual behaviour, and often create multiple metrics with unknown couplings. So if we try to use those models to control networks (e.g. via QoS algorithms), we don’t get the results we desire. As such, we fail to understand and contain the resulting QoE hazards.

**A conceptual innovation about constraints**

This conceptual jump is a tiny bit like how, in physics, Einstein came up with E=mc². This equation links energy and mass, which had previously been seen as wholly distinct. It was rightly seen as a huge leap in our understanding of the cosmos at large scales, even if it took a while to be accepted.

Indeed, the analogy works quite well: E=mc² comes as a consequence of the Lorentz transformation, which shows momentum being *conserved*. Likewise, ΔQ is conserved. The speed of any object has a boundary (‘c’), and it was the acceptance that there *was* a boundary that spurred on the changes. Similarly, ΔQ enables us to model networks in saturation, and reason about the absolute boundaries to performance.
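Conservation shows up concretely when ∆Qs are composed in sequence: delays accumulate and delivery probabilities multiply, and neither can be wished away. A hypothetical two-hop Monte Carlo sketch (all parameter values invented for illustration):

```python
import random

def hop_delta_q(min_d, mean_extra, p_loss):
    """Sample one hop's ∆Q outcome: a delay in ms, or None for loss."""
    if random.random() < p_loss:
        return None                      # packet erased on this hop
    return min_d + random.expovariate(1.0 / mean_extra)

def end_to_end(*hops):
    """Sequential composition: delays add; a loss anywhere loses all."""
    total = 0.0
    for params in hops:
        d = hop_delta_q(*params)
        if d is None:
            return None
        total += d
    return total

random.seed(7)
samples = [end_to_end((2.0, 3.0, 0.01), (1.0, 5.0, 0.02))
           for _ in range(100_000)]
delivered = [s for s in samples if s is not None]
# Delivery probabilities multiply: 0.99 * 0.98 = 0.9702.
print(f"delivered ≈ {len(delivered) / len(samples):.4f}")
# Mean delays add: (2 + 3) + (1 + 5) = 11 ms.
print(f"mean delay ≈ {sum(delivered) / len(delivered):.1f} ms")
```

However the path is subdivided, the end-to-end ∆Q is the same: quality attenuation, once incurred, is never given back.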

**The start of “unnatural science”**

ΔQ is different to Einstein’s relativity equations in one critical respect. Packet networks are *man-made* worlds, which run ‘games of chance’ to rules *we ourselves create*. These worlds then operate within real-world constraints of physics, technology and economics.

Hence you can think of Einstein’s relativity as being a key idea of *natural* science, and ΔQ as being a foundational idea of a new ‘*unnatural* science’. (Computer science is arguably ‘unnatural science’, too.) The newness is that we add in humans, our intentions, and our subjective evaluation of success.

Indeed, ΔQ goes one-up on Einstein: in these man-made worlds, it lets us model how the tiny ‘quantum’ random effects of packets interacting relate *all the way up* to the most ‘cosmic’ view of the customer experience. That’s something we had in the world of circuits, but lost when we went to packets. We can only achieve this feat because ΔQ adopts a paradoxical ‘attenuation’ viewpoint on the world.

To take another analogy from physics, ΔQ could be seen as a distant parallel to how we understand electromagnetism. The behaviour of light can be accurately modelled by Maxwell’s wave equations. Interpretations of light as waves or particles are simplified views, each capturing only *part* of that reality. Likewise, packet loss and delay are only facets of a unifying phenomenon.

**Applications of ∆Q go beyond packet networks**

We have only touched on a small part of the ∆Q framework here. There’s a lot more to ∆Q than one unification step of loss and delay. We have not discussed the specifics of the metrics at each level of abstraction and how to measure them. We have not touched on the refinement calculus and algebra, or their uses and benefits.

Nonetheless, my hope is that you will have caught a glimpse of the single simple omission that has sabotaged the science of data networking for several decades. Moving to a unified stochastic model that joins loss and delay is the key to opening up a new science of network performance. ∆Q is that vital missing step in our understanding.

Indeed, ∆Q is a quantitative model not just of networks, but of a class of systems which ‘attenuate quality’. The SDN/NFV re-incarnation of the distributed computing world is one of many examples.

You can think of this as being like entropy, where some form of ‘badness’ accumulates, and sometimes you get ‘total badness’. We could be modelling interacting business processes, which also take time to complete, and sometimes fail to complete at all. That can also be modelled using ∆Q.

As such, ∆Q is a philosophical advance for the whole of science, not just packet networking.

**For the latest fresh thinking on telecommunications, please sign up for the free Geddes newsletter.**