Catastrophic error

When you go behind the label ‘human error’, you see people and organizations trying to cope with complexity, continually adapting, evolving along with the changing nature of risk in their operations.

Eileen Munro, Munro Review of Child Protection

There are two warnings before you read this post. First, it is long, as it was originally intended as a chapter in a book. Second, it begins by discussing the ‘Baby P’ child protection case, which some may find upsetting. The point of the piece is to highlight insights about catastrophic events, why they occur, and what we might learn from them, using examples from child protection, the nuclear industry and elsewhere.

Who is to blame when something goes badly wrong?

On Tuesday 9th December 2008, Sharon Shoesmith lost her job in the most public way. A statement appeared on the Haringey Council website announcing that Ms Shoesmith had been dismissed with immediate effect, following a direction by the Secretary of State that she be removed from her role as Head of Children’s Services for Haringey. The reason given for her dismissal was a ‘fundamental loss of trust and confidence’ in her ability to keep children safe in the district. This followed the publication of a Joint Area Review which labelled the management of Children’s Services as inadequate.

The events leading up to Ms Shoesmith’s dismissal were the subject of intense media attention. It became known as the ‘Baby P case’, and Ms Shoesmith had been cast as lead villain.

The Baby P Case

The following descriptions of the case draw mainly on articles here, here, and here. Skip this section if you prefer.

17-month-old Peter Connelly died in 2007 at the hands of his mother, Tracey Connelly, her violent partner, Steven Barker, and his brother, Jason Owen.

Peter suffered more than fifty injuries during his brief life. Despite being visited or examined over sixty times by social workers, police and doctors over an eight-month period, Peter was left in the care of his mother. The legal advice sought by Haringey Council stated that the ‘threshold for initiating Care Proceedings… was not met’. Just the day before Peter’s death, a paediatrician is believed to have failed to detect the existence of historic broken bones. Some medical practitioners and social workers did recognise and report signs of abuse; Tracey Connelly was arrested more than once and Peter was briefly taken into the care of a friend. However, the steps taken were insufficient. The day after Peter was examined by a doctor, an ambulance was called to the family home. Paramedics found him in his cot, lifeless, dressed only in a nappy. Despite attempts to save his life, Peter was pronounced dead at 12.20pm on 2 August 2007.

Connelly, Barker and Owen were all found guilty of ‘causing or allowing the death of a child or vulnerable person’. In 2009, Connelly and Barker also faced charges in relation to the rape of a two-year-old child. Barker was found guilty of rape, while Connelly was cleared of child cruelty charges. Connelly was sentenced to be detained indefinitely until ‘deemed no longer to be a risk to the public and in particular to small children’. Barker was sentenced to life imprisonment for the rape and a 12-year concurrent sentence for his involvement in Peter’s death. Owen was also jailed indefinitely, with a minimum term of three years, although this was changed on appeal to a fixed 6-year term.

A serious case review published in 2011 identified failings by multiple individuals and agencies involved, including care, health, the police and social services. Despite all those involved being ‘well motivated’ to protect Peter, individually and collectively their actions were ‘completely inadequate’ in keeping him safe. The report concluded that Peter Connelly’s death could and should have been prevented.

A lack of curiosity

What is striking about the criticisms made of the various agencies and individuals involved is the recurring theme of a lack of professional curiosity; a failure to ask questions – to be inquisitive. Agencies consistently failed to work out the identity (or even the existence) of Connelly’s boyfriend, Barker, despite numerous clues. Connelly named Barker as next of kin on an official form, but his identity was not questioned; a social worker was told that Connelly had a boyfriend, but did not ask who he was or request to meet him; a ‘friend’ was mentioned to hospital staff after Peter presented with bruising (deemed to be non-accidental by a consultant paediatrician), but no-one enquired who might have had access to the family home around the time the injuries occurred. How could so many people apparently fail to ask what, in retrospect, seem like such obvious questions?

Even when the right questions were asked, those involved were too ready to accept Tracey Connelly’s version of events. For example, when Peter was brought to hospital in April 2007 with significant swelling to the side of his head, hospital staff accepted Connelly’s explanation of the cause and failed to alert the police or social services. Such failure to pass on information, or to consider individual incidents as part of a bigger picture, meant that the reality of Peter’s daily experience was never pieced together. The information needed to see what was happening to Peter was fragmented, like the scattered pieces of a jigsaw puzzle. Why did no-one seek to understand what Peter’s lived experience was like?

Disorder and ambiguity

Throughout this story is a theme of disorder. Peter’s home is described in the serious case review as “disorganised, dirty and smelly”. There was a stench of dog urine mixed with the stale cigarette aroma from Connelly’s 60-a-day habit. Peter and his siblings had head lice and lived in unsanitary conditions, with discarded vodka bottles, animal faeces, fleas, Nazi paraphernalia, knives and replica guns being features of the family home. This chaotic existence is not surprising. Tracey Connelly’s childhood had also been troubled. At the age of 10, Tracey had been placed on a child protection register for neglect, having suffered serious physical and emotional abuse. Her own mother had herself suffered trauma and abuse at an early age. Barker and Owen also lived chaotic lives, with a history of crime, familial violence, arson and drug addiction. Any reasonable person who took the time to appreciate this history of chaos and misfortune would immediately have identified a child born into such dysfunction as being at serious risk.

If only the response of the services involved had been orderly and functional, there might have been a better outcome for Peter Connelly, but instead the disorder of Peter’s life was met with more dysfunction by the authorities. One senior child protection worker at Haringey at the time describes what she found when she started working at the borough shortly after a previous high-profile child death, that of Victoria Climbié. ‘Morale was extremely low. The quality of staff was very poor. People were depressed. To give you an idea of the mood, the café opposite wouldn’t serve social workers’ (quoted here). She went on to explain the shortage of staff and how social workers had been recruited from South Africa and Zimbabwe, leading to a ‘whole challenge of managing people with different cultural norms and expectations about raising children.’ Caseloads for social workers were very high, and the complexity of the work was reportedly too much for some of those struggling with exceptional workloads.

Ironically, those who lost their jobs as a result of Peter Connelly’s death were often cited by peers and their managers as being amongst the most effective. Maria Ward, the social worker allocated to the Connellys, who was dismissed and publicly criticised in the media for her incompetence, was described by one colleague as one of the best she had worked with. When asked how, if this was the case, Maria Ward had missed the signs of Peter’s abuse, she said, ‘I think she was in a fog, because it was like that sometimes. We used to call it “Nam” – “Tott Nam”. Because it was just constant. You’d come out sometimes and you’d be absolutely exhausted. You’d think, “Oh, I’ve lived another day”.’ The frequency of child protection referrals was relentless, and at times the entire team of social workers would be tied up merely responding to referrals, with little time for more routine or proactive work.

Mixed in with the disorder and complexity of the work being undertaken by children’s services in Haringey was a high degree of uncertainty and ambiguity. Each judgement made by professionals involved in child protection cases is nuanced and open to question. For example, when should a child be removed from a family home? Sometimes this might be a straightforward decision: where the parent is clearly unable to provide care and ensure the safety of their child, and is highly unlikely to be able to do so in the future. But usually it will not be so clear-cut. The anonymous child protection worker cited above explains it this way: ‘The sooner the child is removed, the better. For drug mums, it’s much clearer. Those children will come out much more quickly and be adopted. But for neglect, which is widespread for children under protection, it’s very hard to get evidence for removal. Because it’s all about people’s thresholds. What is a dirty house, what isn’t? How many appointments did they not attend? It just goes on and on.’

The ambiguity around each aspect of a child’s circumstances is made more complicated by the tendency of parents to mask potential signs that all is not well. In Tracey Connelly’s case, this masking behaviour was deliberate – she lied and manipulated those asking questions about Peter – but parents may also mask out of shame or embarrassment. Before we judge this too harshly, we should recognise our own tendencies to present our lives as more orderly and controlled than they actually are. When was the last time you tidied the house before someone came to visit? Do you tell the world through Facebook about that mistake you made at work, or just about the promotion you have just secured? Covering our inadequacies is a natural human tendency, but it presents a false picture to others of how our lives are lived.

Disentangling the threads of the Baby P case is no simple task. Everywhere you look there are serious questions about what went wrong. We might focus on the generational neglect and abuse suffered by Peter’s mother and grandmother before her, and the risk this placed him and his siblings at. What might we as a society have done differently to halt this cycle?  We may examine the parental incompetence and morally corrupt actions of those convicted for Peter’s abuse. What safeguards were there that might have prevented this heinous crime? Or we might, rightly, question the inaction of the professionals whose job it is to keep children like Peter safe. How did they so comprehensively fail to prevent this tragedy? Whichever angle we come from, we see a different cause, a new culprit, or an alternative way Peter’s death might have been prevented.

The aftermath

In the aftermath of Peter Connelly’s death, social workers were singled out by David Cameron, then leader of the opposition, who was looking to blame the Labour government for failures around child protection. The media frenzy, led by The Sun newspaper, included the naming and shaming of individual social workers. Ed Balls, then Education Secretary, ordered the removal of Sharon Shoesmith as head of children’s services live on TV; he later admitted that he had acted under considerable media pressure and threats that there would be calls for his resignation.

The effect of this publicity on those working in child protection was considerable. Child protection referrals came flooding in across the country from professionals afraid that they would miss the next ‘Baby P case’, leading to a sharp rise in the number of children being taken into care. By October 2010, the new director of children’s services in Haringey reported that 620 children were in care in the borough, compared with 460 eighteen months earlier. Services were put under pressure as they became extra vigilant; the police would insist on accompanying social workers on home visits to ensure nothing was missed. Social work became ‘defensive’ as everything was recorded, referred and assessed to ensure that there could be no accusation of a lack of rigour.

Despite the forensic, public dissection of the events around Peter’s death, the numerous high-profile dismissals, and the surge in referrals, assessments and children taken into care, did the world become a safer place for vulnerable children? The simple answer: no.

What went wrong?

It is difficult to make a sober assessment of what went wrong in the immediate aftermath of a serious child protection failure. A more distanced, holistic view of such matters is needed. In 2010, Professor Eileen Munro from the London School of Economics was asked to chair a review of child protection services in England. The key question the review sought to answer was ‘what helps professionals make the best judgements they can to protect a vulnerable child?’.

Munro found a system which had been shaped by four ‘driving forces’, including a ‘commonly held belief that the complexity and associated uncertainty of child protection work can be eradicated’ and ‘a readiness… to focus on professional error without looking deeply enough into its causes’. Munro’s criticism of previous enquiries was that they stopped at the point of identifying who was to blame, rather than continuing to question what the systemic circumstances were that led up to these mistakes.

The result of the failure to move beyond apportioning blame was that the response in each case of a child’s death was to find more ways of controlling people so that they would be less likely to make such mistakes again. This control takes the form of psychological pressure (a high-stakes culture whereby individual professionals are fearful of making a mistake), more, and more detailed, procedures and rules (which would avoid the mistakes of the past if followed systematically), and an increased level of monitoring and supervision. This creeping managerialism was a rational response to mistakes identified in each case review, but over time had created a stifling and counter-productive work environment. Munro noted:

“Each addition in isolation makes sense but the cumulative effect is to create a work environment full of obstacles to keeping a clear focus on meeting the needs of children.”

Munro’s answer to this failure of inquiry was to take what she termed a ‘systems approach’, in which human error is treated as the starting point for investigation rather than its end point.

“When you go behind the label ‘human error’, you see people and organizations trying to cope with complexity, continually adapting, evolving along with the changing nature of risk in their operations. Such coping with complexity is not easy to see when we make only brief forays into intricate worlds of practice.”

The complexity of the environment within which decisions were being made is the significant insight offered by Munro. She identified three ways in particular in which this complexity manifests itself. Firstly, those involved in protecting children have to constantly make predictions about the present and future likelihood that a child will come to harm, and consequently about whether the child should be removed from the family home. These predictions are prone to under-estimation (false negatives, where a child is left in an unsafe home) and over-estimation (false positives, where a child is removed unnecessarily). Whilst it is generally socially unpopular for the state to be seen to break up families, the high-profile cases of child deaths highlight the false-negative judgements, where children have been, in retrospect unwisely, left in danger. The social pressure on child services swings from condemning over-zealous action to, as was seen in the aftermath of the Baby P case, encouraging an overly cautious response – a trade-off sketched below.
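
To make the trade-off concrete, here is a minimal, deliberately artificial sketch in Python. Nothing in it comes from the Munro Review: the cases, risk scores and thresholds are invented purely for illustration. It simply shows that lowering the threshold for removal (as services effectively did in the vigilant aftermath of the Baby P case) catches more of the genuinely unsafe situations, but only at the price of removing more children who were in fact safe.

```python
# Illustrative only: how moving a decision threshold trades false negatives
# against false positives. All cases, scores and thresholds are invented.
from dataclasses import dataclass

@dataclass
class Case:
    risk_score: float      # a professional's subjective estimate of risk (0 to 1)
    actually_unsafe: bool  # what, tragically, only hindsight reveals

def classify(cases, threshold):
    """Recommend removal when the estimated risk meets or exceeds the threshold."""
    false_negatives = sum(1 for c in cases if c.risk_score < threshold and c.actually_unsafe)
    false_positives = sum(1 for c in cases if c.risk_score >= threshold and not c.actually_unsafe)
    return false_negatives, false_positives

# An invented caseload: most children are safe, a few are not, and the scores overlap.
caseload = [
    Case(0.20, False), Case(0.35, False), Case(0.40, True), Case(0.50, False),
    Case(0.55, True),  Case(0.60, False), Case(0.70, True), Case(0.80, True),
]

# 0.65 stands in for a 'cautious about removal' culture, 0.45 for post-tragedy vigilance.
for threshold in (0.65, 0.45):
    left_at_risk, removed_unnecessarily = classify(caseload, threshold)
    print(f"threshold {threshold}: left at risk = {left_at_risk}, "
          f"removed unnecessarily = {removed_unnecessarily}")
```

On these made-up numbers, the cautious threshold leaves more children in danger while the vigilant one removes more children unnecessarily; the point is only that no threshold eliminates both kinds of error at once.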

The second feature of complexity which Munro identified as problematic was the inherent uncertainty in two aspects of child protection work. Firstly, the signs and symptoms of abuse are often ambiguous and can be explained away as the result of benign circumstances – in other words, it is relatively easy to manufacture a credible story to cover any individual sign of abuse or neglect. This is partly because abuse and neglect are often suffered in the privacy of the family home, witnessed only by those inflicting harm or complicit in it. As a society, we place a high value on family privacy and the rights of the parent, which makes things difficult for those seeking to protect children, who rely on parents’ permission and cooperation to gain access to the child. Secondly, uncertainty manifests itself in the judgements professionals make about the risk of harm, particularly in weighing the likelihood of harm against how serious that harm might be. In any decision about removing a child from their parents, there will certainly be a negative effect on the child of being taken from the family home, and this certainty can act as a disincentive when weighed against a subjective estimate of the risk of leaving the child with their parent(s) – a comparison sketched below.
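
That weighing of a certain harm against an uncertain one can also be made concrete with an equally artificial sketch. Again, the numbers are my own invention, not Munro’s: a certain cost of removal is compared with the expected cost of leaving the child, that is, the subjective probability of harm multiplied by its severity.

```python
# A hedged, illustrative sketch only: the numbers are invented, and real
# judgements are nothing like this tidy. It shows how a certain harm can
# outweigh an uncertain but far more serious one.

harm_of_removal = 3.0   # assumed certain: disruption, loss of attachment
harm_if_abused = 10.0   # severity of harm if the worst happens
p_harm = 0.2            # the professional's subjective estimate of risk

expected_harm_of_leaving = p_harm * harm_if_abused   # 0.2 * 10 = 2.0

# On these numbers the 'certain' harm of removal (3.0) outweighs the expected
# harm of leaving the child (2.0), so leaving looks like the lesser evil...
# ...yet nudging the subjective estimate up to 0.35 flips the decision entirely.
print(expected_harm_of_leaving < harm_of_removal)    # True
```

The instability of that comparison – a small shift in a subjective probability reverses the conclusion – is part of what makes these judgements so precarious.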

The third feature of complexity Munro identified is the tendency to oversimplify, in retrospect, the situation that faced the professional: hindsight bias. Once we know the tragic outcome, it seems obvious which decisions were ‘wrong’ and critical to the subsequent events. The pertinent information, or the cause of a subsequent incident, appears obvious to us from a future vantage point, and this bias is more pronounced the more complex the circumstances were. The hindsight bias:

“oversimplifies or trivialises the situation confronting the practitioners and masks the processes affecting practitioner behaviour before-the-fact. Hindsight bias blocks our ability to see the deeper story of systematic factors that predictably shape human performance.”

In understanding the inherent complexity of the circumstances around child protection, we begin to see the causes of failure more clearly: not merely as human errors made by incompetent individuals working for inept organisations, but as almost inevitable mistakes which flow from a failure to recognise and adapt to uncertainty, unpredictability and ambiguity. Furthermore, by failing to recognise this complexity, we have implemented simplistic solutions – flow charts, rules, procedures, forms, supervision, audits, micro-management, risk aversion and control – which attempt to prevent each individual error rather than make the system as a whole more flexible, resilient and adaptable… and less prone to failure.

A borrowed insight

Eileen Munro did not reach her conclusions about the failures of the English child protection institutions overnight, or in isolation. Her inspiration came from, amongst other things, human error in anaesthesia, aviation and even nuclear power stations.

At 4am on 28 March 1979, near the small town of Middletown, Pennsylvania, a single cooling circuit on a newly installed pressurized water reactor at the Three Mile Island power plant malfunctioned. The reactor shut down immediately and a relief valve opened to allow coolant to escape, reducing the pressure and heat. However, the valve stuck in the open position and continued to release coolant. Operators were unaware of this secondary malfunction, as there was no indicator to show that the coolant was depleting; when emergency coolant was injected to top up the system, they reduced the flow, the fuel rods overheated, and radioactive material was released into the coolant. This contaminated coolant then escaped into the surrounding area as it turned to steam and was expelled from the system.

Fortunately, no-one died as a consequence of this fault, but investigations revealed above-average rates of birth defects and cancer cases in the area in subsequent years.

Decontamination of the plant took 12 years and cost nearly a billion US dollars. Millions of dollars in compensation were paid out and no new nuclear power plants were approved in the US for thirty years.

The report into the disaster singled out the poor performance of the operators, who failed to take appropriate steps to solve the fault as it emerged and whose actions allegedly made the problem worse. Prior to the incident, significant effort had gone into training engineers and preparing them for accident scenarios. Warning systems had been improved to the extent that the control room had more than 600 alarm lights covering every conceivable eventuality. All the operators had to do was correctly interpret the warning signals and follow the carefully laid-down procedures to address the specified problem.

However, when disaster struck, the operators were overwhelmed by the flashing lights – the warnings overloaded their capacity to think and act rationally, and they could not pick out the signal amongst the ‘noise’. Individually, each additional warning light improved safety, because it alerted controllers to a specific fault which could easily be remedied. In aggregate, however, these incremental ‘improvements’ to safety combined to create a bewildering wave of information which made human error almost inevitable.

As with child protection investigations, the initial analysis of the Three Mile Island disaster stopped when it found human error. Only a subsequent report went beyond this error to establish the systemic causes of the failure.

Munro found that the tendency to finish an enquiry at the point at which human error was identified was common across many complex domains. Some 70-75% of accidents in the field of anaesthesia were attributed to human error, over 70% of air crash investigations reached the same conclusion, and 75% of reviews into child abuse deaths identified professional errors as making a significant contribution. In many cases, the ‘solutions’ identified involved ways of bypassing reliance on expertise rather than enhancing it. Far from fixing the problem, such solutions compounded future errors, as humans became less equipped to cope with complexity, uncertainty and ambiguity.

This effect was particularly apparent in the airline industry, where piloting a plane has become increasingly automated. In the past, commercial pilots had a constant awareness of flight conditions as they monitored the various dials and instruments whilst flying the plane. More recently, increased automation means that the plane flies itself for most of the time, with pilots in attendance should anything go wrong. The result is that, when things do go wrong, pilots are expected to jump straight into a complex situation without their head in the game, with less direct experience of flying the craft, and in the most challenging of circumstances. When the plane crashes, the disaster is blamed on the inability of the human to respond rapidly to a highly volatile situation under extreme pressure. Fortunately for all of us who fly, a systemic approach to air crash investigation, and a refusal to cover up the reasons for air disasters, has led to improved aircraft design which keeps pilots aware of what is going on at all times, so that they are better equipped to take over when the machines inevitably fail.

The willingness to look beyond human error started in the engineering industries, where it was observed that the traditional solutions were not working as well as they should: power plants continued to melt down and planes kept crashing. Introducing more standards, rules and supervision was not reducing human error; in fact, in some cases it was increasing the number and seriousness of mistakes. A radically different approach was needed. The shift in perspective required was to see the human in the context of the complex environment they operate within: as one component subject to the emerging dynamics of the situation.

Beyond rationality

This approach broke with the classical economic view of humans as rational actors, weighing up each decision carefully, in full possession of the information needed to act in their best interest. Munro drew on research ‘in the wild’ which pointed to a different view of rationality, one in which each decision is not made in isolation, but instead as:

“part of a constant stream of activity, often spread across groups, and located within an organizational culture that limits their activities, sets up rewards and punishments, provides resources, and defines goals that are sometimes inconsistent. Human errors are, in general, not random and individual but follow predictable patterns that can be understood by seeing them in their context.”

Whilst it is tempting to apportion blame when mistakes are made, it is much more constructive to ask not just where the failure occurred, but why. From our privileged hindsight view, it is easy to conclude that better decisions could have been made, but those decisions appeared right to the people making them at the time. Events flowed up to the point of each decision, and a momentary judgement must be seen in the context of a unique and complex set of circumstances which, if appreciated, help explain why an erroneous decision was made. Munro cites the example of a social worker in the Victoria Climbié case who was sacked for her incompetence and told she could never work with children again. Yet this social worker had never before been involved in an investigation of abuse, was almost entirely unsupervised (receiving only thirty minutes of supervision over the previous 211 days, from a senior who was in the process of developing a major psychotic illness), and had a caseload far in excess of that recommended. There is, of course, no excuse for failing a vulnerable child, but if we are to identify causes so that we can act to prevent such events from happening again, we must not stop when we find an individual to blame.

In 2011, Ms Shoesmith, the disgraced head of children’s services in Haringey at the time of Peter Connelly’s death, won her case for unfair dismissal. In his summing up, Lord Justice Kay stated that Ms Shoesmith was “entitled to be treated lawfully and fairly, and not simply summarily scapegoated”, and that her sacking was “tainted by unfairness”. Identifying individuals to take the blame when tragedy happens may play well in the media and satisfy politicians, but if we want to learn from our mistakes we must understand the disorder within which events played out.

Chaotic environments

Life is rarely simple, but when everyday complications reach a tipping point, disorder results. Chaos is a breeding ground for malfunction and dysfunction, as we have seen in the tragedy of preventable child deaths, catastrophic engineering failures, and the intractable poor performance of schools in the country’s most deprived communities.

Understanding complexity is the first step to preventing its worst excesses. To do this, we must look beyond simplistic ‘human error’ conclusions and gain an insight into the complex social and organisational dynamics within which people make decisions and attempt to do the best they can in the situation they find themselves in.

We have seen how humans struggle to make sense of what is going on when they are overwhelmed by too much complexity. Linearity breaks down in such conditions, and it becomes impossible to decipher the cause or likely consequence of any particular action or event. When there is too much uncertainty, decision making becomes hit and miss. Rather than there being a lack of information, complexity can provide us with too much information for one person to process or interpret. Much of this information can be ambiguous or partial, defying any rational attempt to identify patterns and meaning. Disordered environments make predictions highly questionable as volatility increases and any sensible estimate of outcome will likely be thrown off course by random events.

A striking feature of extremely complex environments is that they appear so much less so in retrospect. Post-hoc analysis tempts us to apportion blame and therefore to suggest solutions which are sane in a merely complicated world, but in conditions of complexity may serve to generate unintended consequences and further unpredictability. Looking back, we see each mistake made in isolation, and venture that clearer guidelines and better controls would have prevented such an error. We are lured into creeping managerialism, the tentacles of command and control reaching further into the minutiae of decision making, undermining professional autonomy and trust in expertise. By failing to recognise complexity and its internal dynamics, we increase the likelihood of future error rather than diminish it.

Confronted with complexity, we attempt to simplify and impose order. This is a natural human response: our minds seek out patterns, invent plausible explanations for what is observed, and construct mental models to help us navigate the environment. We are driven to tidy up: to declutter, put similar things together and discard the irrelevant. In doing so, we can fail to take important details into account, or jump to conclusions about how we arrived at this point. It is certainly true that dysfunctional environments would benefit from less uncertainty – if only there weren’t so many factors at play – but chaos must be allowed to settle into a more predictable state: it may resist our attempts to control it.

There is also something here worth considering in relation to diversity. Intuitively, we may wish to narrow variety and difference – to deal with a more homogenous group: it would seem simpler. But diversity might instead even out the extremes; cancel out the peaks and troughs. The clustering of similarity, conversely, can amplify the noise. Homogeneity might act as the catalyst for dysfunction.

In this post, we have considered the darker side of complexity – what happens when we lose control and, in attempting to reassert it, increase the likelihood that we will fail again… and again. However, if we understand how catastrophic error arises from excessive complexity, we may hope to prevent it. There are lessons to be learnt.

Key documents

The Munro Review of Child Protection: https://assets.publishing.service.gov.uk/government/uploads/system/uploads/attachment_data/file/175391/Munro-Review.pdf

Improving practice: child protection as a systems approach: https://core.ac.uk/download/pdf/92549.pdf

Joint Area Review of Haringey Children’s Services: https://files.ofsted.gov.uk/v1/file/50002229
