How Boeing (and Their Reputation) Took a Nosedive

The importance of psychological safety in organisations

11 April 2022

On February 18, 2022, the documentary Downfall: The Case Against Boeing was released on Netflix. Downfall paints a picture of how America’s most valued aerospace company went down after two fatal crashes of what was then its newest model: the Boeing 737 Max. Boeing failed to take responsibility for the catastrophes and continuously blamed others. But when push came to shove, the real issue lay in a mechanism inside the aeroplanes themselves. What role did the working conditions at Boeing play? And what about their psychological safety?

In October 2018 and March 2019, five months apart, two Boeing 737 Max aeroplanes crashed shortly after take-off. An unprecedented situation in modern aviation history that cost the lives of 346 people. But given Boeing’s reputation at the time, it made sense for many to look for explanations outside of the aircraft giant. The media, but also Boeing themselves, suggested that it was probably the fault of the pilots, or that the airlines had not provided proper training.

Concealed

After the analysis of the first plane’s black box, the answer was plain and simple. The crash was caused by a mechanism called MCAS. The 737 Max was the newest in the range of 737 models, and it came with a new software system, MCAS, as an addition. The system was meant to stabilise the aircraft: it was supposed to compensate for the more fuel-efficient but heavier engines. But when the single sensor feeding input to MCAS does so faultily, the consequences can be fatal. In that case, MCAS pushes the nose of the aeroplane down with a force the pilots cannot correct. And that is exactly what happened, with fatal consequences. Twice.

Although the MCAS system plays a big role in the safety of the aeroplane, an explanation of it was nowhere to be found in the system manuals, except as an abbreviation in the appendix. So why was MCAS not communicated to pilots and airlines in the 737 Max manuals? Precisely because MCAS was such a significant addition to the 737’s operating system: communicating it clearly would have led to mandatory training for every pilot operating the 737 Max. And Boeing did not want that, because it was expensive and would hurt their stock value. So Boeing did everything in their power to conceal MCAS while presenting the new plane, crossed their fingers and hoped nothing would go wrong.

Before the second crash, Boeing had drawn up a protocol: what should a pilot do when MCAS reacts faultily? After the crash, this protocol allowed Boeing to maintain that there was no blood on their hands. There was a protocol now, right? The pilots probably had not followed it. But the pilots did follow Boeing’s protocol after MCAS malfunctioned, and it did not help them. The aeroplane crashed and, again, it cost 157 people their lives.
 

So Much For Safety

Everything fell into place when former employees of Boeing came forward. They described the contrast between the working environment before the merger with McDonnell Douglas and after it. And the difference was stark. Looking back at the years before the 1997 merger, former engineers and quality managers are full of praise about what it was like to work at Boeing. There was a sense of team and connection, and a culture of shared trust. “We were a family”, said John Barnett, Boeing quality manager between 1985 and 2017.

Cynthia Cole, Boeing engineer between 1978 and 2010, said the following: "I really loved working there because I had a say. And when something wasn't right, I could bring it up and I wasn't afraid of being fired." Boeing put safety first. No plane would take off if the professionals were not convinced of its safety. As it should be.

No plane would take off if the professionals were not convinced of its safety.

After the merger, the story changed – and behind closed doors, the company changed as well. Profit became important and Boeing went to Wall Street. This also affected the shop floor. According to ex-employees, everything had to be done as quickly as possible, with fewer resources and even fewer people. Cole notes that employees were not treated with respect. Cynthia Kitchens, a former quality manager, agrees: “If something’s not right you need to get it fixed or get it corrected. To ensure safety, finding things was what you’re supposed to do. But everything was about speed.”

The pressure from Wall Street and from the rising European competitor Airbus became too much for Boeing. Employees did not feel heard. "They would attack the messenger and ignore the message", said Barnett. It went from bad to worse. What followed were multiple photos and video fragments that clearly showed that the safety and quality of the aeroplanes were suffering under the working conditions.

Before the merger, Boeing was the textbook example of a psychologically safe organisation. Everyone felt heard, felt part of the team and was not afraid to speak up. After the merger, it seemed like nothing was left of that mentality. Employees did not feel heard, were attacked if they dared to open their mouths, and felt unsafe while all they were trying to do was guarantee the safety of their passengers.

Learning to Report, Reporting to Learn

Amy Edmondson, the mother of psychological safety, defines a psychologically safe environment as follows: employees should feel that they can ask questions, admit their mistakes and point out observations without fear of punishment. Such an environment can increase trust within a team and contributes to team members feeling safe enough to open up.

The documentary also makes it painfully clear that Boeing’s senior management tried everything to ensure that no one reported issues. This reminded me of a study by Ryan Derickson and colleagues, who researched psychological safety in relation to error reporting in hospitals. To make it concrete: they compared a hospital that was deemed psychologically unsafe with a hospital that scored positively on psychological safety. What did they find? The psychologically safe hospital reported many more errors than the other hospital.

But imagine this: you have no clue about psychological safety and you start comparing hospitals only on the number of reported errors. The more psychologically safe hospital would look worse than the unsafe one. Not necessarily because it makes more mistakes, but because it gives professionals the space to report errors, so the organisation can improve as a whole. That is why it is not a good idea to judge hospitals by the number of reported incidents and calamities.

This also explains why the truth about Boeing's culture and what happened on the shop floor only surfaced after so many people had died. There was a culture of “you should not report”, which destroyed the ability to improve and learn from one another.

In the end, Boeing did not learn that much either. As far as is known, no apologies were made to the relatives of the victims, and the company paid a sum of 2.5 billion dollars to avoid prosecution on fraud charges. And the technical pilot who was sent to trial – not senior management – was acquitted. Whether a culture change is right around the corner? We’ll see.

Something we can all learn is the importance of psychological safety within an organisation. If Boeing had taken that to heart, their planes would not have taken a fatal nosedive, and those 346 people would still be alive.

Author

Nienke Luijcks