Are quality systems sexy?

The history of commerce is also the story of the development of quality systems and their supporting institutions and conventions. Here I summarise a book that documents this evolutionary process.

On my flight from London to Oakland yesterday I had time to read “In Search of Quality” by Roland Kolb. It’s a somewhat haphazard book, but interesting nonetheless.

My present interest is in the need to develop more capable quality management systems for telecoms. Below is a summary of the key points from the book, and my own thoughts on what this might mean for network operators and their suppliers.

Quality is both objective and subjective, in that things have defined and measurable properties that are evaluated against criteria of acceptability. This evaluation appears to happen at three levels: safety and environmental compliance (so is it legal to sell the thing at all); market perception (influenced by NGOs, media, consumer reviews, religious law); and individual need (hence fitness-for-purpose, which has high variability as individuals vary in need).

Development of quality systems has strong roots in international trade and co-development of transport systems for physical goods. There has been, over many centuries, a state interest in ships (and them not sinking), and systems of power (including warfare) that result from having navies.

In each domain of quality improvement there is a natural process of improvement where high failure and variation go from being seen as the expected and accepted norm, to that status quo being challenged and eventually replaced. This is sometimes done by insiders (e.g. the lightbulb), other times by outsiders (e.g. the musket).

The removal of unwanted variation results in interchangeable parts and standardised systems. These are required to industrialise and scale, for which quality systems are a necessary co-development. Hence quality cannot be seen as an “add on” to the attributes of a product or service, but is intrinsic to its level of development and democratisation.

Various bodies to regulate quality slowly emerge in response to need, e.g. credit rating agencies and accountancy companies for financial services. Eventually each industry develops quality metasystems to align standards and quality control across these, e.g. ISO.

There is a notable category of testing, inspection and certification (TIC) bodies. TICs often develop to satisfy the internal needs of industry, not the needs of consumers or customers. This includes the safety of workers (e.g. steam boilers), the protection of investors (audit systems), and support for government taxation (accounting).

There is a common process over time where quality standards start local, then become national and transnational. There may be a messy overlap of standards and TICs which never fully resolves itself.

Fraud is a constant problem, e.g. in certification processes, as people want to claim the benefits of quality without the costs. This is as much (if not more) a B2B issue as a B2C one. The providers of the quality systems themselves can also perpetrate fraud, not just those using them.

As they mature, quality systems become tools of competition and market manipulation, being used to gain advantage and special favours. You see a consolidation of quality approval, under the direction of centres of power and influence (e.g. how GMOs are promoted by TTIP).

There are global “quality centres” like Hong Kong, Shanghai and Singapore that are based on historical trade patterns. The action is now moving to the “Post-China 16” countries, like Myanmar.

So what for telecoms? Well, the claim that quality systems are “sexy” may be taking things a bit too far, even if the book does ponder quality control for erotic toys. This isn’t going to help you have a better orgasm (or if it does, please share the secret).

The telecoms industry faces a strange situation whereby it has highly mature quality systems for its historical circuit business, but has remarkably weak control over quality for newer broadband and cloud products whose performance is based on statistical resource sharing processes.

My own operator and application service clients are emphatic: they find the user experience too unpredictable, and want better visibility and control over it, but can’t easily relate it to the network service quality on offer. Their technology vendors aren’t yet offering them the measurement tools or management frameworks to let them achieve their goals.

Perhaps the main take-away for me from this book was that quality control should first be seen as a B2B issue. The quality system for an application provider needs to capture how the supply chain produces the present user experience. Only then can any programme of work be undertaken to change that experience. Hence the initial focus for improvement is on better visibility and control over the internal management boundaries, and how these result in the current reality.

For the latest fresh thinking on telecommunications, please sign up for the free Geddes newsletter.