January 12, 2017 – Todd Landman

What a TV mix-up can teach us about preventing fatal accidents

Sarah Sharples, University of Nottingham and Todd Landman, University of Nottingham

When Todd, a professor of political science, was recently invited to appear on the BBC Breakfast TV show, he didn’t expect to be asked to talk about mountaineering. He’d originally been invited on to the tightly scheduled and carefully planned programme to discuss Donald Trump.

But a series of simple mistakes led to an awkward on-air exchange as the presenters realised he wasn’t Leslie Binns, mountain climber and former British soldier from North Yorkshire, but an academic with an American accent – much to the delight of the rest of the press and social media.

These kinds of mistakes aren’t rare. Many will remember a similar mix-up when the BBC got the wrong Guy, infamously interviewing Congolese IT worker Guy Goma instead of British journalist Guy Kewney about a court case between Apple and the Beatles.

No individual was to blame for Todd’s mix-up. Due to a bizarre set of coincidences and misunderstandings, the BBC researchers and presenters genuinely thought Todd was Leslie Binns, despite the careful planning that had gone into the programme. So why do simple mistakes lead to major errors in such a complicated system with so many checks? Answering this question can provide vital insight into how to prevent much more serious problems from occurring, for example in medical operations or air traffic control.

The discipline of human factors can help us here. It studies how to design products and systems so that they take into account human capabilities and limitations. People frequently make decisions based on incomplete information, using rough rules known as “heuristics” to jump to the most likely decision or hypothesis. The problem is that decisions made this way are susceptible to our personal biases.

In Todd’s case, neither he nor the BBC staff and presenters expected a mix-up. This biased the way they interpreted each other during short conversations before the interview. Todd was greeted by a female staff member saying “Hello, Leslie, you’re wearing a suit”, referring to his lack of mountaineering equipment. Todd interpreted this as “Hello, [my name is] Lesley”, followed by a conversational comment about his attire. Because of these biases in perception and decision making, as well as the time pressure in the studio, neither realised a mistake had been made.

Confirmation bias

In systems where safety is critical, the consequences of such biases can be far more serious than the mild embarrassment and amusement that resulted from the BBC Breakfast mix-up. In 1989, the pilots of British Midland Flight 92, having misinterpreted their displays, believed they had received an alert of a fire in the right-hand engine. When they shut that engine down, the vibration they had previously been experiencing stopped.

This led to a confirmation bias in their decision making – their action produced the result they expected, so they believed they had solved the problem. Sadly, they had misinterpreted the information they received in the cockpit and shut down the wrong engine. The plane crashed on its approach to the airport, and 47 passengers lost their lives.

Decision making does not happen in a social vacuum. We conform to social norms and behave in ways that fit in with others in our social or work setting. Just as Todd had to work out how to confront the misunderstanding as he realised it was unfolding, moments before his interview was due to start, we can feel uncomfortable challenging or discussing decisions in settings where we feel intimidated, or where others are clearly in positions of authority.


For example, in hospitals, patients and junior staff can sometimes treat senior doctors as infallible. The case of Elaine Bromiley, who died after her airway became blocked during a routine operation, sadly demonstrated the impact that failures of communication in the operating theatre can have. Many factors contributed to this incident, but one that was highlighted was that the nurses in the operating theatre became aware of the problem before the doctors had acknowledged the seriousness of the situation. Unfortunately, the nurses felt unable to broach the issue with the doctors.

In UK hospitals, efforts are now made to ensure that medical decisions are discussed by staff and are more likely to be challenged if someone – even a more junior colleague – thinks they are wrong.

In the Flight 92 accident, passengers heard the pilot announce that there was a problem with the right-hand engine, yet could see through their windows that the problem was on the left. Survivors later said that they noticed this mismatch between what they could see and what the captain had said, but that it didn’t cross their minds that such an error could happen, or that the captain’s action should be challenged.

When something unexpected happens in a resilient system, we use our cognitive and social skills to respond in ways that try to ensure no harm is done – sometimes described as thinking clearly under pressure, or unconscious competence. The most resilient complex systems take both human and technological capabilities into account. They are designed to be efficient while anticipating the human behaviours that might occur, and they incorporate design features that try to prevent errors, such as formal checks and structured communication.

The challenge is to make sure those design features prevent errors without slowing people down or introducing social awkwardness. When we find ourselves confronted with a potential mistake, we need to feel able to pluck up the courage to say, politely but firmly: “I think you’ve got the wrong guest, sir”.

The Conversation

Sarah Sharples, Professor of Human Factors and Associate Pro-Vice-Chancellor for Research and Knowledge Exchange, Past President of the Chartered Institute of Ergonomics and Human Factors, University of Nottingham, and Todd Landman, Professor of Political Science and Pro-Vice-Chancellor of the Social Sciences, University of Nottingham

This article was originally published on The Conversation. Read the original article.
