Can you predict what will happen tomorrow? Yes… but your prediction may be wrong. Can you improve the reliability of your predictions? Yes… but you will not know if you succeeded until after you see the results. Can you predict with certainty? No… but you may be ignorant enough to feel certain. How do you measure your own ignorance? You cannot… you are too ignorant. But surely you can make inferences based on the success of your previous predictions? Yes… but only in the long run, and even then, you may get caught out because of gaps in your model of cause and effect, or gaps in the data you put into your model. Should we just give up trying to predict the future? No, but we should be realistic about the chances of success. Why are you writing this? It is really hard to compellingly explain the fundamentals of risk without ending up rewriting a book by some genius like Nassim Taleb. Come on, what is the real reason for writing this? Because I found this excellent BBC article which succinctly explains how businesses tend to over-manage in response to common cause variation, leading to lower levels of performance. Excuse me? In short, statistically speaking, businesses can over-interpret data, implement too many management controls, and make bad decisions because they think they are responding to cause and effect, when really their causal model (or their data) cannot explain the random variances they observe. Errr… excuse me? Just read the article.
[…some time passes…]
Okay, I read the article. Are they saying that businesses can implement too many controls, spend too much time trying to eliminate variances in performance, and do more harm than good? Yes, that is what it says. But what about Louis Khor and that blog he wrote about implementing RA and fraud controls to improve financial forecasting? I think I already explained why fundamental limits on forecasting accuracy make a nonsense of Louis’ argument (and that he was looking at the wrong data anyway). That was a lucky coincidence, you making fun of Louis and then the BBC explaining how similar errors are made right across the business world. Yes, it was a lucky coincidence… which just goes to show. But what about Gadi Solotorevsky and the TM Forum – they just devised a new method to reduce risk by analysing lots of data? Have you actually read it? Errr… no. I just saw the presentation at a conference. It sounded great. Then go read it, before we start talking about it.
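As an aside, the statisticians’ point is easy to demonstrate. Below is a minimal simulation (a hedged sketch in the spirit of Deming’s funnel experiment; the numbers and the adjustment rule are illustrative, not taken from the BBC article) of what happens when a manager ‘corrects’ a stable process after every random deviation:

```python
import random
import statistics

random.seed(42)
N = 10_000
SIGMA = 1.0  # natural (common cause) variation of the process

# Process A: leave the stable process alone
untouched = [random.gauss(0, SIGMA) for _ in range(N)]

# Process B: "manage" the process by adjusting the setting to
# compensate for the last observed deviation from target
adjusted = []
setting = 0.0
for _ in range(N):
    result = setting + random.gauss(0, SIGMA)
    adjusted.append(result)
    setting -= result  # react to noise as if it were cause and effect

print(statistics.stdev(untouched))  # ~1.0
print(statistics.stdev(adjusted))   # ~1.41, i.e. sqrt(2) * SIGMA
```

Leaving the process alone gives the natural spread; compensating for every deviation roughly multiplies the spread by the square root of two. The controls themselves become an extra source of variation.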
[…some more time passes…]
Okay, I read it. It still sounds great! It says we need more and more and more RA in order to reduce risk. What could be wrong with that? Doing more and more of the same kind of analysis is not a method to reduce risk. It is a method to sell lots of software with the erroneous justification of reducing risk. Hold on! That is a bit strong, is it not? Let me put it this way. Suppose I spend five years training you to work as a coastguard. I train you to use all the latest technology to rescue ships and boats in distress. I train you to fly a helicopter and to pilot a great big boat. I train you to use radar. I train you to manage a team of coastguards that will report to you. You complete your training and are walking into work on your first day in charge. BAM! You are run over by a car and killed. You spent five years in coastguard training, but you did not look before you crossed the road. Now, all the resources that went into your coastguard training have been wasted. That is what Gadi’s new RA paper promises to do for risk management – devote excessive resources to one place, at huge and disproportionate cost, resulting in wastage and shortages of resources where the business really needs them. So, you are saying that I could spend huge amounts on the coastguard, but maybe I should have spent more on road safety instead? Exactly. If we care about risk, and care about the business, we want money to be spent where it is most effective, not wastefully concentrated on one kind of risk at the expense of all the others. Surely the TMF paper gives guidance on how to decide when the level of risk is tolerable, when there are enough controls in place, and when the costs of doing more outweigh the benefits gained? You clearly did not read it properly. No, it says none of those things. It just says to spend more, and more, and even more, on controls. It never says anything about how much is too much.
But those statisticians in the BBC article argue that you actually make things worse through excessive controls!? Well, I guess the ‘scientists’ behind the TMF paper are a bit backward in their understanding of statistics – which is why they always assert you should check everything rather than take the ‘risk’ of relying on a statistical sample. If they do not know how to use basic sampling techniques, then they have no hope of doing something more statistically sophisticated, like evaluating risk. Hang on, I thought that new TMF paper had your name on it? Impossible! This is the first I have heard of it. Maybe you did not read it properly [chortle]? Let me just quickly double-check…
Cripes! They do have my name on it. So you must have contributed to the paper? Yes… but only in the sense that I told them their work was utter rubbish, and why. Did they listen to your criticism? No, they ignored it. Well then, it sounds rather cheeky to put your name on a paper that you 100% disagree with. It makes me wonder how many of the people listed really did provide input. Maybe some of them are glad to get their name on the paper so they can impress their boss? Good point. You never see any of those people on the TMF’s web forum… and it makes me wonder about data quality too – why list somebody as contributing if they did not contribute? Perhaps they should have had more controls over their document [snigger]? No, they were happy to take the risk [chortle]. What do you predict for Gadi’s TMF RA team, based on this recent output? I predict they will produce more rubbish in future. Are you certain? Hmmm… pretty certain.
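Incidentally, the jibe about sampling is easy to make concrete. This is a minimal sketch (the error rate, population size, and sample size are invented for illustration) of why checking a random sample of records can be nearly as informative as checking everything:

```python
import math
import random

random.seed(7)
TRUE_ERROR_RATE = 0.03   # hypothetical: 3% of records contain an error
POPULATION = 1_000_000
SAMPLE_SIZE = 1_000

# A population of records, flagged True when the record contains an error
records = [random.random() < TRUE_ERROR_RATE for _ in range(POPULATION)]

# Check everything: the exact answer, at the cost of a million checks
exact_rate = sum(records) / POPULATION

# Check a random sample: 0.1% of the cost
sample = random.sample(records, SAMPLE_SIZE)
sample_rate = sum(sample) / SAMPLE_SIZE

# Approximate 95% margin of error for an estimated proportion
margin = 1.96 * math.sqrt(sample_rate * (1 - sample_rate) / SAMPLE_SIZE)
print(f"exact {exact_rate:.4f}, sampled {sample_rate:.4f} ± {margin:.4f}")
```

With these numbers the sample estimate typically lands within about one percentage point of the exact rate, which is why auditors rely on sampling rather than exhaustive checks.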