Book review of Black Box Thinking by Matthew Syed. Reviewed by Steve Wills, June 2016
Very strongly recommended.
This is a beautifully written, 'story-telling-at-its-best' book, which contains compelling evidence that there’s value in being tough enough to learn from mistakes, whether as an individual or as an organisation.
In a nutshell
Matthew Syed starts with a heart-rending account of a woman dying from what should have been a minor, routine operation.
And her death was totally avoidable.
She had a reaction to the anaesthetic which closed her airway. The senior doctors involved could see the problem and followed the correct protocol, the end stage of which should have been a tracheostomy before her oxygen levels fell too far. The nurse in attendance fetched the kit and informed the anaesthetist.
However, they concentrated so hard on each stage that they lost track of time. The nurse didn't feel empowered to interrupt and remind them that she had the tracheostomy kit. By the time they realised they needed to do this procedure, it was too late and the patient had irreparable brain damage. The hospital apologised to the family and explained that this was a rare problem, but that sometimes things like this happen.
In this particular case, the patient’s husband was an airline pilot. The title of Matthew’s book refers to an aeroplane’s black box - but it’s actually about a fundamentally different attitude to failure that the airline industry has compared with most others. The industry sees any failure (sadly often a plane crashing) as being potentially preventable. But even in the case of pilot error, the pilot isn’t ‘blamed’. It’s assumed that pilots and their crew are highly trained, diligent, and responsible - so any failure must have a deeper cause. Blame is counter-productive.
The industry always investigates failures or near misses, and makes the data available to everyone. This means that if the same event happens in different places, it will be spotted and the cause found. The book describes the types of crashes that have helped the industry to learn. Even simple issues such as cockpit layout - where certain switches might look very similar or be too close together - can cause the best pilots to make a mistake when under pressure. So these types of problems are solved and as a result, the airline industry is one of the safest in the world.
The death mentioned at the start of the book was totally avoidable. Syed quotes horrifying statistics for the number of avoidable deaths or injuries that occur through medical mistakes. He shows how the culture is totally different, leading to failures being covered up, and repeated, through fear of a loss of reputation and ultimately litigation. This occurs despite the fact that if a doctor makes a mistake and admits it openly to relatives, the likelihood of litigation decreases dramatically. The healthcare culture says that if you admit to a mistake, someone has to be to blame!
In this case, the pilot didn't sue the doctors but, despite concerted efforts by those involved to defend their actions, he did ultimately get to the bottom of the problem. In fact, this case has a direct parallel in aviation, where a pilot becomes so absorbed in a problem that he or she loses track of time. This has led to planes literally dropping out of the sky after running out of fuel. However, once this issue was spotted, it led to a change in flight deck procedures that empowered junior crew to challenge the pilot, and as a result, such problems no longer happen.
In the case of the hospital operation, all the pilot could do was publicise what had happened to his wife - but in doing so, he alerted some medical professionals to the problem of ‘time slipping away’. He has since received letters from some of them saying how that knowledge has helped to save lives (he is now, in fact, Chair of the Clinical Human Factors Group, which works to make healthcare safer).
But the problem still remains in health, justice, politics, education and also in business!
Syed gives numerous examples of where organisations of all kinds have learnt to apply this way of thinking… or not. He cites the way in which Dyson learnt from multiple failures. And how Formula 1 teams use this thinking to eliminate even the most minor problems. He then discusses how we can try to change the current blame culture and improve so much of what is done in all walks of life. He particularly talks about the cognitive drivers that cause leaders to ignore the evidence when making decisions.
This book has fundamentally changed the way in which I see the world. I now see this problem of failing to learn from mistakes ‘writ large’ in story after story in the papers. Even this morning, we discovered a mistake in a piece of work we’d undertaken, but having read this book, we stopped and thought about the problem. No-one was to ‘blame’ - it was the type of mistake we've all made in the past. But there’s the rub. Do we learn from our mistakes? This time, we've realised that a simple change to our checking processes will prevent this problem from happening again.
Key insight applications
So what does this mean for Insight teams?
- Learning from mistakes. Despite all of our concentration on evidence-based decision making and on providing that evidence, we ultimately learn most from our mistakes. But most organisations don't want evidence of their mistakes. This book is written in a compelling way, and really questions how Insight teams should help to solve this problem.
- Looking at decisions. Can we change an organisation's culture so that it will accept and learn from failure? This might be hard, but at least it might make you stop and think about how many times your organisation makes decisions despite the evidence that you provide to the contrary. And even if decisions have been made by paying attention to the evidence, things can still go wrong – such as an ad campaign falling flat or a product launch going wrong.
- Reviewing the causes behind failure. How often do we carry out a ‘black box’ review to find out why something has gone wrong, doing this in a frame of mind that doesn't assume that someone is to blame? The truth is that organisations love to celebrate their successes but when something goes wrong, they typically say that they learnt the lesson. In reality, this is often little more than saying, “OK - put it down to experience”. Too often, blame does come into it, even unconsciously. Failure is rarely cited as being good for your career. But in fact, it’s those who have failed who typically learn the lessons and become better performers as a result.
So the message for Insight teams? If you truly want to be proactive and to add value to your organisation, start to use black box reviews to record evidence of failure - your own, and your organisation's. When something goes wrong, use your analytical skills to work out why it happened, despite those involved being professional and experienced.
Even if your organisational culture isn’t ready to do this on a widespread basis, at the very least your Insight team should seek to learn the lessons, so that when someone tries to do something that failed before, there is evidence to argue against it. Perhaps if we take the lead, and show that failure is in fact a valuable form of insight and isn't about blame, we might begin to change the organisational culture more widely.