A beginner’s guide to ∆Q

I would like to offer you a different way of thinking about networks. The ∆Q framework may be hard to type, but it’s quite easy to understand. As an ‘ideal’ metric, it is also rather useful: it enables the precision engineering of performance and cost. In this three-part series, we will take a look at the key features of what I believe to be a breakthrough in our discipline.

∆Q is a fundamental new idea

Imagine for a moment a library of all human knowledge about networking, which is built as a tower block. The top floor has the most ephemeral knowledge, such as who won what deals with which clients last week. Below that are the products being put on the market for each selling season. In the upper middle floors are ideas about specific network technologies, whose time may soon pass.

As we go downward, the knowledge has more longevity. Nearer the ground are principles of engineering, developed over decades, and which stand the test of time. The ground floor is dedicated to science, with longstanding ideas about data encoding and transmission, and their relationship to physics.

The (perhaps surprising and unsettling) claim we are making is that important intellectual foundations have been missing. As a result, the conceptual edifice above is swaying and tilting. So with ∆Q we are going right down into the basement of the building to underpin the foundations, and sturdily hold up everything above.

∆Q has elements of philosophy, in that we are reconceptualising what a network is, and how to reason about its performance. ∆Q also establishes a new branch of mathematics, to enable quantified reasoning with high reliability. Philosophy and mathematics definitely belong in the cellars of networking knowledge.

To give you an analogy, it’s a bit as if we had been making mobile phones and antennas for decades, with books on them on the upper floors, but had yet to figure out the underlying physics, leaving empty shelves on the lower floors. The way electromagnetism works is described by the squiggles of Maxwell’s equations. Similarly, ∆Q is a way of describing the “equations” of network performance.

Why should you care about ∆Q?

It is common with many great innovations for theory to lag behind practice. For example, with steam engines it took time to uncover the kinetic theory of gases; with aircraft, the principles of aerodynamics. The ∆Q framework helps to fill a networking ‘theory gap’ that had previously gone unrecognised.

Theory is important, as it gives us intellectual ‘leverage’ over the world. As the famous statistician and management thinker W. Edwards Deming said, “experience without theory teaches nothing”. A craft-based approach to engineering leaves us all at risk of nasty surprises: techniques we have relied upon may reach unforeseen limits, rather like the sound barrier for aircraft. Theory (when backed by scientific reasoning) gives our processes and inventions predictability.

It is thanks to good theory, as well as applied engineering, that we have modern steam turbines and aircraft, whose performance and capability vastly exceed those of previous centuries. Likewise with networks: new theory allows us to transcend the previous limits of the state of the art, and greatly advances the state of the possible.

It is not every day that we get to witness the birth of a new science, especially in an industry as mature as networking. So it’s rather exciting, once you see the practical potential for its application. But what kind of theory are we advancing?

A new concept: information ‘translocatability’

The new science that ∆Q offers us is a theory of information ‘translocatability’. That’s a fancy word we’ve appropriated rather than invented. It refers to the logical thing data networks do, which is to replicate information at a distance, albeit with imperfection and tardiness. We are making a distinction similar to the one between the abstract idea of computation and the highly concrete one of a computer.
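To make “imperfection and tardiness” concrete, here is a small illustrative sketch (my own toy model, not the formal ∆Q mathematics): the outcome of moving a packet across a hop is either a delay, or a loss, and the outcome for a whole path composes the outcomes of its hops. All the function names and parameter values below are invented for illustration.

```python
import random

# Toy model (not the official ∆Q formalism): one hop either loses the
# packet (imperfection) or delivers it after some delay in ms (tardiness).
def hop(loss_prob, min_ms, max_ms, rng):
    if rng.random() < loss_prob:
        return None                      # lost: the data never arrives
    return rng.uniform(min_ms, max_ms)   # delivered, but late

def path(hops, rng):
    """Outcome for a path = sequential composition of its hops."""
    total = 0.0
    for loss_prob, lo, hi in hops:
        d = hop(loss_prob, lo, hi, rng)
        if d is None:
            return None                  # loss anywhere loses the attempt
        total += d
    return total

rng = random.Random(1)
# Two hypothetical hops: a fast local link and a slower wide-area link.
samples = [path([(0.01, 1, 5), (0.02, 10, 30)], rng) for _ in range(10_000)]
delivered = [s for s in samples if s is not None]
print(f"loss rate: {1 - len(delivered) / len(samples):.3f}")
print(f"mean delay of delivered: {sum(delivered) / len(delivered):.1f} ms")
```

The point of the toy is that loss and delay are not two separate metrics but one combined outcome, which is the intuition the ∆Q framework builds on.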

My big claim (but a readily substantiated one) is that ∆Q belongs alongside Church and Turing’s theory of computability from the 1930s, and Shannon’s information theory from the 1940s. If true, that’s a big deal.

In subsequent articles, we cover the essentials of ∆Q: how it re-frames the issue of “quality” in networks; how it lets us undertake performance engineering via a new resource model; and its wider significance. If you can’t wait, you are welcome to read this PhD thesis!

For the latest fresh thinking on telecommunications, please sign up for the free Geddes newsletter.