Measurement and the Eureka Moment

As we reach the end of the year, my thoughts turn to the passage of time, and its measurement. Do any of us know how to measure? The question may sound facetious, but I want you to contemplate what I am asking. Measurement is a starting point for risk management and business assurance. We make management decisions based on what is measured. But how much effort is expended on measurement, and is the measurement fit for purpose? Perhaps this is a good time of year to reflect on how the human race measures those things that are not easy to measure – like the passage of time – and on some lessons from the history of measurement.

Consider the following. All of us know how to read dials. But measurement is not the same as reading a result from a measurement device. I know how to read the hands on my watch. But I have never made a watch. I can read the speedometer in my car. But I have never made a speedometer. I can understand a dashboard report, even if I have no idea about the way data was collected and manipulated to produce that dashboard report. And I can imagine making a tape measure, but I would probably make it by marking off the units of length against an existing standard – like another tape measure. So I know some things about measurement, and also some of the limits of my knowledge. Would I know how to measure a property that has never been measured before, when I have no pre-defined tools or standards to help me? This is exactly the challenge faced by business assurance and risk management. We are confronted by the need to measure things that are not easily susceptible to measurement, and for which there is no great history of developing the methods for measurement. We want to measure error rates, and their consequences. We want to measure the probability of uncertain outcomes. We want to measure the costs and benefits associated with those outcomes. Nobody disagrees with those assertions, and they are often repeated. But how often do we discuss and improve the tools and standards used for measurement? Rarely. Too rarely. And lacking the tools, we can fall into a trap, building towers of rational decision-making upon irrational foundations.

I devoted a chapter of our book to the topic of epistemology, the study of what can be known and the limits of knowledge, and how it applies to revenue assurance. At the time, I knew this would be a challenge to many. Some were bound to consider it vexatious. Others would take a more convenient approach, and just pretend there are no obstacles to overcome – just delegate measurement to somebody else, in the hope that they solve the problem, or else make up answers to appease their bosses. In hindsight, I should have put even more effort into dealing with the crux of the epistemological obstacle, which is our ability to measure. It is not enough to identify that we have problems, or even their causes. We also need to know the scale of our problems, in order to decide if it is cost-effective to address those problems.

Consider the following scenario. A consultant wants to puff out their chest and show off their expertise. They are asked about rates of revenue leakage, in order to justify the cost of the services they offer. So they give an answer. What method did they pursue to arrive at their answer? Well, they asked the opinions of others. They reported what their instincts told them to be true. They stated that somebody else found something else somewhere else in the past, and so, by analogy, the scale of problems would be similar in the here and now. Now imagine a scenario where I ask the same consultant to measure time, without the use of a timepiece. The consultant has an instinct for whether one duration was longer or shorter than another. The consultant can ask other people what they thought was the amount of time that elapsed. And, in a crude way, for long enough periods, the consultant will be able to measure time by the movement of the sun, and the passing of days. But that is the limit of what they offer. And what they offer is very crude, and hence not useful for decision-making. We cannot use the consultant’s estimations to manage our daily affairs, nor can we afford to suffer devastating losses just so we can find out that we are suffering from serious problems.

Now imagine I am on a 17th-century ship, sailing the oceans, and I want to know where I am. Once again, I ask the modern-day consultant, who has been magically transported back in time and placed upon the deck of this ship. We know that if we can measure the current time and compare this to the time at an independent reference point, we can determine our longitude. But none of us possess a clock. So the consultant does a survey of the sailors on the ship, asking them what they think is the time at Greenwich. He tells me what they said, and what his instincts say. So, in a way, he answered the question. But his answer is less than useless, because it is so likely to be wrong. If we want a proper answer, somebody needs to invent a precise clock that works at sea. And that is no trivial problem. In fact, that particular problem was so difficult that the British government offered an enormous prize to anyone who could make such a clock. As the Greenwich Museum points out, what we take for granted now was an extraordinary challenge back then:

Like squaring the circle or inventing a perpetual motion machine, the phrase ‘finding the longitude’ became a sort of catchphrase for the pursuits of fools and lunatics. Many people believed that the problem simply could not be solved.

The successful inventor needed to devise a practical solution that represented the cutting edge of science and engineering. John Harrison was the man who eventually solved the problem of how to make an accurate maritime clock, and he spent most of his life doing so. Even then, there was resistance to giving him his proper thanks and reward. Harrison had to petition the king to receive his prize!
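For anyone curious why a clock was the key, the underlying arithmetic is simple enough to sketch. The Earth turns through 360 degrees every 24 hours, so the gap between local solar time and the time at a fixed reference like Greenwich converts directly into degrees of longitude, at a rate of 15 degrees per hour. The little calculation below is only an illustration of that conversion, not a navigator's tool; the function and the example numbers are my own.

def longitude_from_times(local_hours, greenwich_hours):
    # The Earth rotates 360 degrees in 24 hours, i.e. 15 degrees per hour,
    # so the difference between local solar time and Greenwich time
    # converts directly into degrees of longitude.
    # Positive means east of Greenwich, negative means west.
    # Illustrative only: real navigation must also correct for the
    # equation of time and, crucially, the accuracy of the clock itself.
    hours_ahead = local_hours - greenwich_hours
    return 15.0 * hours_ahead

# A ship observes local noon while its chronometer reads 15:00 at Greenwich,
# so the ship lies 45 degrees west of the Greenwich meridian.
print(longitude_from_times(12.0, 15.0))  # -45.0

The same arithmetic explains why Harrison's precision mattered so much: a clock that drifted by a mere four minutes would put the ship a full degree of longitude out, which near the equator is the difference between making landfall and missing it altogether.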

And therein lies the heart of our problem too. No ship’s crew was going to solve the longitude problem on their own. The solution only emerged because a great deal of investment was poured into producing something that would benefit many. The same is true of our many metaphorical ships, our telcos, bobbing around on the endless ocean of rough guesses, lacking the tools and standards to measure properly. In other words, the solution to our problem lies in overcoming a very serious obstacle that nobody is properly incentivized to tackle. Most of the people tackling the problem have entirely the wrong skills. It takes a degree of expertise to conduct a survey, but even the best survey is woefully inadequate for this challenge. And measuring time and location accurately means resisting all temptation to give the answer we would like to be true. This challenge will not be addressed by opinion and puffed-out chests. It demands science and engineering.

“At the heart of science is an essential balance between two seemingly contradictory attitudes – an openness to new ideas, no matter how bizarre or counterintuitive they may be, and the most ruthless skeptical scrutiny of all ideas, old and new. This is how deep truths are winnowed from deep nonsense.”

Carl Sagan

Risk management and business assurance pass Sagan’s first test. Any practitioner must be open to the idea that people and their systems are, in general and in aggregate, more fallible than any specific individual believes themselves to be. That is an incredibly counterintuitive idea. Well done to us, for having apprehended the truth. But we fail the second test. Our scrutiny is far from ruthlessly sceptical. When reading through the literature on enterprise risk management, I am constantly reminded that for all the advice and theorizing on display, there is a systemic failure to address the fundamental challenge: how to measure risk. People skip ahead to outlining what to do about risk, as if they already had a decent measure of it. That is like deciding to change course on a ship, without knowing where the ship currently is. Or they spend all their time identifying risks, as if a long laundry list of worries can magically rearrange itself into a prioritized plan of action. That is like reading a map without ever plotting a course. And compared to the literature on enterprise risk management, the literature on business assurance for communications providers is greatly inferior. This should not be a surprise. The former draws upon a vast pool of global resources. The latter draws on a much smaller collective pool of knowledge and talent.

At least sailors had a bedrock of truth they could use to calibrate poor guesswork. They would spot land, and find themselves a long way from where they thought they were. Or they would starve at sea. Starving at sea is a non-trivial consequence of inaccurate map-reading and measurement. Perhaps some consultants would improve if they were made to fast for every inaccurate forecast they made, but even this would be an imperfect feedback mechanism. Some of our leakages may never be detected, if we do not look for them, and the nature of uncertainty means it is always debatable how far we can calibrate measurement of future risks based on what has happened in the past.

Many years ago, a few people liked a metaphor I came up with, which was designed to illustrate the essence of our epistemological dilemma. I described leakage as an iceberg, with some above water, and hence visible, and some below water, and hence invisible. We know our knowledge is imperfect, and incomplete. When we measure visible leakage, we are not truly measuring the whole iceberg, because we also expect some of it is hidden from view. Those were the days before I started blogging, and I would routinely include a few slides on the iceberg whenever I did a conference presentation. They say imitation is the sincerest form of flattery, which I suppose means that some people flattered me a lot, even if they lacked the good grace to mention their source. Such is the wisdom of some consultants – they claim to know a lot, but cannot say where their knowledge comes from.

The metaphor of an iceberg floating in the water also leads me to another historical analogy. Many know the story of Archimedes, who had a great insight whilst lying in his bath, exclaimed ‘eureka!’, leapt out and ran down the street naked. Archimedes had discovered a new and precise form of measurement, which worked by calculating the volume of water displaced by a submerged object. It was a moment of genius. And that is what we need, and what I have been waiting for since coining the iceberg metaphor. Guesstimates, instincts and crowd-sourced surveys will never deliver useful results. What we need is a eureka moment, where practical and precise measures will revolutionize decision-making, giving meaning to all the activities that have already been mapped out as coming after proper measurement. Or maybe we need many eureka moments, like John Harrison’s many timekeeping innovations. What we cannot do is treat the problem as solved just because it is darned inconvenient that the problem has not, in fact, been solved.
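There is even a neat bit of physics hiding inside the metaphor. Archimedes' principle tells us that a floating object displaces its own weight of water, so the fraction of an iceberg below the waterline equals the ratio of the density of ice to the density of seawater, which works out at roughly nine tenths hidden from view. The sketch below simply performs that division; the densities are textbook values, and the function is my own illustration, not a claim that leakage obeys the same ratio.

def submerged_fraction(density_object, density_fluid):
    # Archimedes' principle: a floating object displaces its own weight
    # of fluid, so the fraction of its volume below the surface equals
    # the ratio of the object's density to the fluid's density.
    return density_object / density_fluid

# Typical densities in kg per cubic metre: glacial ice ~917, seawater ~1025.
hidden = submerged_fraction(917.0, 1025.0)
print(f"Fraction of the iceberg below the waterline: {hidden:.0%}")  # roughly 89%

Icebergs obey that tidy ratio because the physics is fully understood; nobody has yet derived the equivalent constant that tells us how much leakage sits beneath the waterline of what we can see, and that is precisely the eureka moment the metaphor is still waiting for.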

Writing that, I feel like my next words should be to share a eureka moment on how to improve measurement. Sadly, words fail me. I have no bold new insight on how to radically improve measurement. Maybe our measures will develop like John Harrison’s timepieces – through many small but important improvements over the course of a lifetime. But perhaps I should forgive myself. For now, my eureka moment is much simpler: to motivate an improvement to our measures, we must begin by consciously recognizing the need to improve. And our next task also becomes plain. We need somebody to offer a great big reward to whoever invents those improvements. Perhaps somebody will step forward to motivate progress in 2013. Either way, as the calendar turns, I wish you a happy new year.

Eric Priezkalns
http://revenueprotect.com

Eric is the Editor of Commsrisk. Look here for more about the history of Commsrisk and the role played by Eric.

Eric is also the Chief Executive of the Risk & Assurance Group (RAG), an association of professionals working in risk management and business assurance for communications providers. RAG was founded in 2003 and Eric was appointed CEO in 2016.

Previously Eric was Director of Risk Management for Qatar Telecom and he has worked with Cable & Wireless, T‑Mobile, Sky, Worldcom and other telcos. He was lead author of Revenue Assurance: Expert Opinions for Communications Providers, published by CRC Press.
