As industries mature they develop quality management systems, and a dedicated sector offering testing, inspection and certification (TIC). Given the importance of user experience and hence service quality in telecoms, why are there apparently few or no TIC organisations for network performance?
No doubt as you have gone around the world you will have noticed many labels and marks on objects that declare their fitness-for-purpose. An egg will have been tested for cracks, an elevator will have been inspected for wear, and an aircraft type certified for flying passengers.
The industrial sector of testing, inspection and certification (TIC) is a very large one, turning over more than $80bn a year. It also comes with an interesting history. See my previous article “Are quality systems sexy?” for more background on a book that covers this subject.
I cannot help but notice the general absence of TICs involved in certifying the performance of networks. So, what is “quality”, how did TICs evolve as an industry dedicated to its management, and why are they missing in the network performance sector?
The nature of quality
What the book author tells us is that quality has both objective and subjective aspects. For example, the size of an egg is an objective fact; what size range and variation is acceptable for a given use or user is a more subjective matter.
Quality also has three levels:
- safety and environmental compliance (is it legal to sell it at all?);
- market perception (how is quality influenced by NGOs, media, consumer reviews, religious law?);
- and individual needs (is it fit for purpose, and is its variability sufficiently managed?).
The development of quality systems
Quality systems have strong roots in international trade and the co-development of goods transport systems. In particular, they have developed alongside a state interest in ships and systems of power, including warfare.
There is a recurring process of quality improvement in which high failure rates and wide variation start out being seen as the expected norm. Over time this status quo of unmanaged quality is challenged, and eventually replaced by a new paradigm of managed quality. Sometimes this is achieved by industry insiders (e.g. the lightbulb), and other times by outsiders (e.g. the rifle).
Interchangeable parts and standardised systems are required to industrialise and scale, for which quality systems are a necessary co-development. Bodies to regulate quality slowly emerge in response to this need, e.g. credit rating agencies and accountancy companies. Eventually we develop quality metasystems to align standards and quality control across geographies and sectors, e.g. ISO.
TICs are a subset of the quality control industry
The category of testing, inspection and certification (TIC) is a notable one, but does not represent the whole quality sector. TICs often develop to satisfy internal needs of a specific industry. This could be for the safety of workers (e.g. with boilers), to protect investors (e.g. audit systems), or to support government taxation (e.g. with accounting).
There is a repeating process over time whereby quality standards start local, then become national and transnational. There may be a messy overlap of standards and TICs which never fully resolves itself.
Cheats never prosper (but they try)
Fraud is a constant problem in quality management, e.g. in certification processes where people want to claim the benefits of quality without the costs. This is as much a B2B issue as a B2C one. It can be the quality systems themselves that perpetrate the fraud, not just those using them.
Quality systems can also become tools of competition and market manipulation, being used to gain advantage and special favours. This can be a result of consolidation, and the growth of centres of power and influence (e.g. GMOs, TTIP).
There are global “quality centres” like Hong Kong, Shanghai and Singapore that are based on historical trade patterns. Action in quality management is now moving to the “Post-China 16” countries, including Myanmar, Peru and Kenya.
So what about telecoms?
The only thing that the telecoms industry manufactures that nobody else can is performance for distributed computing applications. Mere connectivity comes from the post, or hand-delivery of data using public and private vehicles, ships and aircraft.
Thus it seems rather odd that in my entire telecoms career, including the best part of a decade dedicated to network performance, I have never encountered a single TIC player in this space.
One reason is that the basic science has been missing. We are still at the very early “unmanaged chaos” level of quality. This has been culturally embedded by the promotion of the false doctrine of “best effort”, which supposedly liberates the telco from the awkward liability of delivering predictable performance.
The military has only recently engaged with the area, the first project to take the problem seriously being Future Combat Systems in 2003-5. Until someone loses a drone war due to bad mathematics, the demand for improvement will remain muted.
The rise of IoT and an “industrial internet” will create new demand for managed performance. It will require us to cleanly separate the world of datacomms (which is dominated by electrical engineering) from networking (which is a matter of distributed computing and hence computer science).
Anecdotally from my professional world I also see the action moving away from Europe, the US and China to dynamic emerging economies. They are willing to try new ideas, and it could well be there that we see new TICs emerge for the digital supply chain world. If so, that could also transform the power structures of global commerce in this century.
For the latest fresh thinking on telecommunications, please sign up for the free Geddes newsletter.