In the business of Emergency Management and Emergency Response, an outcome bias or an "it won't happen" (therefore I won't get caught out) approach often erodes your sense of risk and blinds you to error, a pattern that helps explain everything from fatal plane crashes to the Piper Alpha and Deepwater Horizon disasters.
I pose the question: is it the case that we often judge the quality of an emergency preparation decision or behaviour by its endpoint, while ignoring the many contributing factors that might have led to success or failure?
“We often judge the quality of a decision or behaviour by its endpoint, while ignoring the many contributing factors that might have led to success or failure.”
I recently had a conversation with a frustrated colleague who believed outcome bias had been at work when a company conducted its own internal audit of a serious mine-rescue training incident in which a trainer was burnt. It seemed to my buddy that, early in the enquiry process, the investigator and her supervisors had already pre-set their outcome. He was understandably frustrated, believing the parent company had predetermined, and deliberately found, no fault in a potentially very dangerous situation.
It seems that some of the true response professionals and trainers who worked on that site hoped the investigation would reveal the regular and constant disregard for best practice, as the problem was apparently both systemic (the company) and behavioural (the person hurt, along with most of the under-done responders doing the training).
In this scenario, it was apparent to the properly trained and experienced operators that the injured person had been given no process or guidance from their company on the "how", and had also acted with no understanding of consequence or of mandatory PPE.
However frustrated my buddy and his workmates were, their annoyance was compounded by the fact that the auditors themselves had no experience, qualifications nor interest in distinguishing industry best practice from poor industry behaviour. They therefore deduced that an unfortunate occurrence had simply happened; much to the frustration of the sound operators, the investigation was blind to the obvious errors and non-compliance, and consequently led to no change.
In this case study, I wonder what, and to what end, the company hoped to gain by entering into the investigation in this manner. Protecting an interest, a business alignment, or loyalty to supervisors or company interests may have been the case; or just plain arrogance and ego. I don't know – I didn't do the inquiry. Nonetheless, I am sadly all too familiar with this type of behaviour and the loaded investigation approach.
This creates a double-edged sword: safety-conscious and professional workers now have no confidence in the safety process, the reliability of the company, or its leadership, both local and overall.
Why ignore the apparent lack of industry knowledge required to conduct the investigation?
Ponder the Dunning–Kruger effect: a cognitive bias in which people assess their ability or knowledge as greater than it is. It is related to the cognitive bias of illusory superiority, and it stems from people's inability to recognise their lack of ability, knowledge and experience in a given space (such as emergency management, emergency response and investigation within a particular field). Emergency Management (EM) and Emergency Response (ER) are not Safety or Mining qualifications, nor is the reverse the case.
Nevertheless, sound, experienced and properly trained ER operators know their trade's safety requirements, required systems and obligatory behaviours – whereas undertrained, under-experienced and lackadaisical operatives do not.
The Dunning–Kruger effect refers to the seemingly pervasive tendency of poor or under-qualified performers to overestimate their abilities or knowledge relative to other people – and, to a lesser extent, of high performers to underestimate theirs. It is the opposite of Impostor Syndrome.
It is simply the false belief that we know more than we really do, so we do not seek counsel or opinion from the real experts. Typically, and ironically, real experts underestimate their level of expertise, while people with low ability overestimate theirs.
Outcome bias can lead to devastating results in emergency management and response, because it allows risks to be ignored during the decision-making process.
If you're a worksite specialist, then be just that – and conversely, allow us to be the experts in what we do as well.
I have no doubt you're an expert in your field of employment and qualification – provided you hold high-end, hard-won qualifications (no quick, fast-track, tick-and-flick courses) and you're backed by years of sound, professional, mentored experience. Stick to expertise in your own area, not all areas, and I won't tell you how to do your job in mining, safety or construction.
“Organisations should emphasise that spotting latent risks and poor behaviours is everyone's responsibility, then reward people for reporting them. Just as importantly, they should engage experts to ensure safety in each field, rather than lemmings.”
We can protect ourselves from outcome bias. Researchers have found, for instance, that priming people to think more carefully about the context surrounding a decision or behaviour can render them less susceptible to the outcome effect.
The aim should be to consider the particular circumstances in which a decision was made and to recognise the factors, including chance, that might have contributed to the end result.
In our case-study example, whether you are a Mine Manager, a Health and Safety representative, or a private Emergency Response provider, strategies to avoid outcome bias will help prevent a "chance success" from blinding you to dangers that were there all along, masquerading as good management.
Emergency response to incidents is dangerous; it shouldn't be taken lightly nor entered into without the right experts, and the same is true of emergency response training. But you can at least stack the odds in your favour, rather than allowing your mind to lull you into a false sense of security simply because there has been no loss of life – yet.