Announcing the Overture ∆Q testbed

Just as a new aircraft is tested in a wind tunnel, any new application needs to have its performance limits discovered in a network emulation testbed.

Two of the talks from last week’s Scientific Network Management for Cloud Computing workshop were recorded on video. The first one is now available for everyone to view.

In this first video, Peter Thompson provides an overview of the new Overture ∆Q testbed. One of the key examples shown is how it can reproduce the performance of any application (in this case, a speed test). This is achieved by precise control of the emulated packet flow.

In electrical engineering we measure flow in amps. With water pipes, we use litres per second. Oddly, telecoms has failed to agree a standard unit of supply and demand for information flow. When you point this out, it comes as rather a surprise. How can something so basic be missing?

The good news is that finding this “unit” is now a solved science problem. It just happens to require a new branch of mathematics called ∆Q. You can think of ∆Q as being the “complex numbers of probability”, bringing packet loss and delay into a unified probabilistic metric.
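One way to see how loss and delay can live in a single metric: treat a link's quality attenuation as an "improper" delay distribution, whose total probability mass falls short of one by exactly the packet-loss probability. The sketch below is purely illustrative (the function names, bin width and example figures are my own assumptions, not part of the ∆Q calculus as published); it shows how serial composition of two such distributions by convolution automatically combines both the delays and the losses.

```python
import numpy as np

BIN_MS = 1.0  # width of each delay histogram bin, in milliseconds

def delta_q(delays_ms, probs, loss):
    """Build an improper delay PMF: total mass = 1 - loss."""
    pmf = np.zeros(int(max(delays_ms) / BIN_MS) + 1)
    for d, p in zip(delays_ms, probs):
        pmf[int(d / BIN_MS)] += p * (1 - loss)
    return pmf

def compose(a, b):
    """Serial composition of two links: convolve the delay PMFs.
    Total mass multiplies, so survival probabilities (1 - loss)
    combine correctly along the path."""
    return np.convolve(a, b)

# Hypothetical example: a 1%-loss link and a 2%-loss link in series.
link1 = delta_q([10, 20], [0.7, 0.3], loss=0.01)
link2 = delta_q([5], [1.0], loss=0.02)
path = compose(link1, link2)

# End-to-end loss emerges as the missing mass: 1 - 0.99 * 0.98
print(1 - path.sum())
```

The point of the toy model is that once loss is encoded as missing probability mass, no separate bookkeeping for it is needed: one mathematical object carries both impairments through composition.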

I have already introduced ∆Q measurements, and the next step is to create ∆Q models. A key modelling technology is a network emulator, which allows you to characterise the performance envelope of any application when running under different deployment conditions.

Packet networks are stochastic systems, meaning the task requires us to replicate the quality variability due to randomness. Whilst there are existing emulators like NetEM, the precision of the emulation they offer is relatively weak. You can’t really be sure that the result is trustworthy.
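For context, NetEM is driven through the Linux `tc` command, applying a fixed menu of impairments (delay, jitter, loss) to an interface. The commands below are an illustrative configuration fragment only; the interface name and parameter values are examples, and running them requires root.

```shell
# Add an emulated impairment: 100ms mean delay with 10ms jitter, 0.5% loss
tc qdisc add dev eth0 root netem delay 100ms 10ms loss 0.5%

# Inspect the active queueing discipline
tc qdisc show dev eth0

# Remove the emulation again
tc qdisc del dev eth0 root
```

Note what this interface cannot express: an arbitrary joint distribution of delay and loss of the kind the ∆Q calculus specifies, which is the gap the Overture testbed addresses.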

(As an aside, imagine you worked in oil and gas, and your senior management had never heard the word “hydrocarbon”, the very thing that defines your resource. Well, the absence of awareness of “stochastic” is the broadband industry equivalent. It’s crazy, and it’s the reality we face.)

The development of the Overture testbed was part-funded by Innovate UK. The UK is arguably the world leader in network performance measurement and management, with many of the leading players headquartered here, and a number of notable scientific firsts and breakthroughs.

You can think of the testbed as being a “software-based flow regulator”, a bit like the arrestor valve you might put into your home plumbing to stop the knocking noise when you flush a toilet. (Telecoms is basically digital sewage disposal… but I didn’t tell you that, it’s an industry secret.)

What is special about this Overture testbed is that it recreates the “quality attenuation” of any network, as specified using the ∆Q calculus. It’s like having a high-fidelity stereo system hooked up to a CD player, in a world where nobody has heard anything more sophisticated than a gramophone playing 78rpm shellac records.


For the latest fresh thinking on telecommunications, please sign up for the free Geddes newsletter.