Not so Clear-Cut: Measures of Success

The much-loved metrics for measuring the success of an assurance program hardly ever tell the whole story. If you are the IT security fellow in your communications service provider (CSP), how do we measure your work? The number of systems scanned for vulnerabilities according to some agreed plan? The number of vulnerabilities identified and closed? If you are in revenue assurance, what marks your good work? The amount of revenue loss identified? The amount of revenue loss prevented? The amount of revenue recovered? If you are in audit, what should we look at? The number of audits completed according to schedule? The number of accepted findings? The number of recommendations accepted and implemented by the auditee within the agreed timelines? The non-compliance exposure (in $ terms) prevented as a result of your audits? What if your job includes ensuring data privacy? Should we look at the number and reach of staff “sensitization” workshops and forums organized? The number of surveys done? The number of processes implemented, changed or discarded due to privacy concerns identified? The number of whistle-blower reports received and followed up to completion? Forget the dizzying ambiguities in some of the terms I have used; some are deliberate, but most are just the way things are. I assure you that I have seen worse!

The focus on the “quantifiable”, when taken to another level, can completely debase the work we do. A colleague of mine in a previous job used to brag about the number of RA controls he had implemented. We would be sitting in a meeting and he would announce with pomp that he had just launched another 25 brand-new controls for mobile money. Sadly, we did not have drumrolls to accompany such pronouncements. I have no proof, but I suspect those controls were deployed merely to give an illusion of progress. Every time he launched into this predictable game of vanity, I would form a mental picture of him telling his subordinates, “Folks, by the end of the week we need some 25 new controls. So, X, I want 9 from you… 7 from Y and 9 from Z.” As expected, his controls (mostly duplicates or rephrased versions of previously poorly designed ones) hardly addressed his risks. That did not stop him from regularly asking me how many controls I had for mobile money. Knowing his game, I would always meekly answer, “About 7… 3 of which are key.” Ah, the look of gratification on his face. Then he would look at me with pity. I, of course, pitied him more. I must say this was a nice symbiotic relationship – I fed his ego generously and he, in turn, provided much-needed amusement during some incredibly difficult times.

Truth be told, we must provide a way to measure our work – performance management, remuneration and incentives all require some solid numbers that determine why person/team A is better than person/team B, and the resulting differences in performance rating and reward. Unfortunately, measuring the achievements of an assurance function in numbers alone is extremely limiting. We assume that the “soft non-metric” components of our work are merely side benefits and not core to our purpose. Since nobody is talking about these “unquantifiables”, our attention is taken up by the numbers. We soon find ourselves in a position where we eat, breathe and live the metrics. Take the specific case of revenue assurance: as expected, management and risk committees will home in on the numbers. How much did we lose? How much would we have lost had it not been for the RA work? And so forth. Hardly anybody asks about the culture changes that RA drives (or should drive) in the people who interact with revenue assurance teams. These aspects are difficult (if not impossible) to measure, but it is hardly wise for us to ignore them entirely.

The same scenario applies to good old audits. The audit report will always be based on findings. It simply cannot capture the full discussions, the back-and-forth, the pushback, the consensus and, sometimes, the outright chaos that characterize audits. However, it is these very things (which are integral to the audit process) that the auditee or the “small man” in the organization uses to buy into or reject our cause as assurance professionals. Can we put a value on that buy-in, and can we measure how well we are doing in accumulating it? If we approach assurance as bullies, our metrics can even look better. We can, for example, push through more audits, faster, wider and deeper, even when auditees are swamped with other projects, and we can drive more findings, remedial actions and deadlines that make little sense, relying simply on the power of our mandate.

Forgive my convoluted arguments; the point is rather simple. Even if we cannot put a number to our effectiveness at this, we have no choice but to refocus our efforts on winning the souls of all CSP employees in the fight to embed risk management holistically. Whenever somebody mentions revenue leakage in a meeting and people instinctively turn to look at me, I know I have not yet won the battle. The battle is only won when the “small man” passionately feels that revenue assurance is not just Joseph’s problem.

I would wager that if your metrics are looking good but you are not “feeling effective” in the organization, you are probably on to something. Chances are that you have focused on the metrics, delivered great results and won many battles. However, if you have not already lost the war, you are on the way to your Waterloo. Your armour was breached because you focused on what you could count, while all along the things you could not count were chipping away at it.

It is time we lost the glow we exude from hitting target after target. Stop milking the metrics. Ultimately, the numbers of today may look good, but it is the “unquantifiables” of today that shall determine the numbers of tomorrow – and therein lies the arduous task, because it turns out that the soft non-metric components of assurance are anything but soft.

Not everything that counts can be counted, and not everything that can be counted counts.

William Bruce Cameron

Joseph Nderitu

Joseph Nderitu is a director at Integrated Risk Services Ltd and specializes in revenue assurance. He previously worked as Head of Revenue Assurance and Fraud Management at Vodacom's operation in Tanzania, having earlier served in the same role at Vodacom Mozambique.

Before his work with Vodacom, Joseph was an internal audit manager for Airtel, with responsibility covering their 17 countries in Africa. Whilst at Airtel, Joseph led reviews of the Revenue Assurance, Customer Service and Sales & Marketing functions.

Prior to his stint at Airtel, Joseph was an RA manager at Safaricom in Kenya. He holds an MSc degree in Information Systems.
