Much is now understood about cognitive bias and its implications for decision-making. Unfortunately, that understanding is rarely applied in the criminal justice process. The Health and Safety Executive ('HSE') is more advanced than some investigating authorities in that it recognises the issue and even provides some limited guidance on it. Its guidance 'Investigating accidents and incidents' (aimed principally at organisations but also reflective of HSE practice) provides: "The investigation should be thorough and structured to avoid bias and leaping to conclusions. Don't assume you know the answer and start finding solutions before you complete the investigation. A good investigation involves a systematic and structured approach."

But how far does a structured investigation go in combatting cognitive bias?

Health and safety investigations are particularly susceptible to bias because they are frequently more subjective and complex than other criminal investigations. The fundamental question is usually 'why' (rather than, for example, 'who'); and the 'why' concerns the behaviour of organisations, not merely individuals. As Daniel Kahneman explained in his brilliant book, Thinking, Fast and Slow, the brain is a machine for jumping to conclusions; and while difficult problems are by their nature hard to solve, the brain is no less inclined to use cognitive shortcuts to 'solve' them. Indeed, complex problems may prompt additional shortcuts (such as substituting a simpler question for the one actually being asked).

Particular problems arise when emotions are involved, such as in relation to policy preferences. Kahneman writes: "Your political preference determines the arguments that you find compelling. If you like the current health policy, you believe its benefits are substantial and its costs more manageable than the costs of alternatives...Your emotional attitude to such things as irradiated food, red meat, nuclear power, tattoos, or motorcycles drives your beliefs about their benefits and their risks. If you dislike any of these things, you probably believe that its risks are high and its benefits negligible...[A] search for information and arguments is mostly constrained to information that is consistent with existing beliefs, not with an intention to examine them."

It is therefore instructive that the HSE 'Investigating accidents and incidents' guidance provides: "Investigations that conclude that operator error was the sole cause are rarely acceptable. Underpinning the 'human error' there will be a number of underlying causes that created the environment in which human errors were inevitable. For example inadequate training and supervision, poor equipment design, lack of management commitment, poor attitude to health and safety...The root causes of adverse events are almost inevitably management, organisational or planning failures."

If the guidance should be understood to mean 'Do not assume that an individual is to blame; assume that an organisation is', that at least would be consistent with how health and safety cases are usually investigated and prosecuted. In any event, it is hardly a recipe for an open-minded and unbiased search for what went wrong.

Fortunately, health and safety cases do not often result in the dramatic miscarriages of justice in which an entirely innocent person is convicted, as might happen in a murder enquiry (when detectives pursue a suspect based on a hunch, to the exclusion of all evidence to the contrary). However, that does not mean that blame cannot be allocated unfairly. In addition to the commonplace scenario in which employees recklessly breach established safe systems of work yet only their employers are prosecuted, there are not uncommonly cases in which more than one organisation has breached its very onerous duties under the Health and Safety at Work etc. Act 1974 to a similar degree, but one organisation avoids prosecution altogether.

What causes such differing outcomes? There may be myriad reasons, not all of them irrational, and these may differ from one case to the next. A constant, however, is that once investigators have formed a view about your organisation, it is very difficult to change that view. As with the blinkered murder detective, evidence will be searched for and interpreted so as to confirm that view ('confirmation bias'). Once a decision to prosecute is made, even if it can be proved that the investigating authority's entire understanding of what happened was wrong, it may move from one theory to another without reversing that decision or even reducing the seriousness of the breaches alleged.

Investigations gain momentum, and the longer they go unchecked the less likely they are to change direction. Accordingly, organisations should defend their position from the outset. They will be required to respond to information requests. They may be required to respond to an enforcement notice. They will have opportunities to instruct expert witnesses. In due course, there will be an invitation to an interview under caution, and an opportunity to make representations before charge. In short, there will be opportunities for organisations to persuade investigators of their view of what went wrong and why, at a time when it is most likely to shape the investigators' thinking.

If there is a decision to prosecute your organisation, 'hindsight bias' poses particular difficulties. This is where knowledge of the outcome causes people to overestimate how predictable that outcome was beforehand. As Kahneman puts it with comic understatement: "When an unpredictable event occurs, we immediately adjust our view of the world to accommodate the surprise."

There are numerous studies on hindsight bias. Kahneman cited a study of students predicting the likelihood of various potential outcomes of President Nixon's visits to China and Russia in 1972: "The results were clear. If an event had actually occurred, people exaggerated the probability that they had assigned to it earlier. If the possible event had not come to pass, the participants erroneously recalled that they had always considered it unlikely." In addition, "Further experiments showed that people were driven to overstate the accuracy not only of their original predictions but also of those made by others." In summary: "The tendency to revise the history of one's beliefs in light of what actually happened produces a robust cognitive illusion."

Hindsight bias can manifest itself throughout the investigation stage and affect a charging decision; but it is at court that its impact is most felt. The sentencing guideline for health and safety offences requires an assessment of the harm risked and the likelihood of that harm, from which all else follows. The first question a judge must determine when sentencing an organisation for corporate manslaughter is 'How foreseeable was serious injury?' The potential for a "robust cognitive illusion" to adversely affect the sentencing exercise is plain. For large organisations, that impact could run to hundreds of thousands or millions of pounds.

Sentencing judges are no more immune to hindsight bias than investigators, and have been known, for example, to rationalise their way around decades of safe operation to find an 'accident waiting to happen'. Hindsight bias goes hand in hand with outcome bias, where an evaluation of pre-accident systems is coloured by the fact that they did not prevent the accident. Combine these factors with a predisposition to blame organisations rather than individuals (on the assumption that root causes are almost inevitably management, organisational or planning failures), and it becomes clear why even organisations with reasonable health and safety systems risk being aggressively prosecuted following a serious incident.

The challenge for those defending health and safety investigations is to persuade, and to counterbalance with evidence, so that investigations and prosecutions are not driven by bias. But be warned: like negotiating Brexit, in serious cases this is usually a process, not an event.
