Air Risks Uncovered: Breaking Aviation's Biggest Myth

Interviews

Sep 10, 2025

Jowanza Joseph

CEO, Parakeet Risk

Christine Negroni

Crash investigator and journalist

Interview with Christine Negroni

The following interview highlights the most revealing moments from the conversation with veteran crash investigator Christine Negroni, author of The Crash Detectives, about what really drives aviation safety—and what every organization can learn from it.


Interview


Jowanza:


Today, we're cracking open the black box of aviation risk with one of the world's foremost crash investigators, Christine Negroni. Christine has reported on aviation disasters for over two decades for the New York Times, CNN, ABC News, and many more. She's the author of the bestselling book, The Crash Detectives, where she digs into the world's most mysterious air accidents from TWA 800 to Malaysia Airlines Flight 370. 



In some of your interviews, you've mentioned you didn't pay much attention to aviation until covering the TWA Flight 800 incident for CNN in 1996. What is it about that crash that made you think, I can't walk away from this story?


Christine Negroni: 


Hmm... I like that question. I did get my start with the crash of TWA Flight 800, and I was not in any way involved in the aviation world other than as a traveler at the time that that airplane crashed. I was working for CNN, as you noted in my introduction, and was sent out to Long Island, where the wreckage of the plane, which crashed into the Atlantic, was being brought in. I spent six weeks out there in those initial days.


The thing that captivated me about TWA 800 was the uncertainty about the accident in the public's mind. Was it a bomb? Was it a missile? Was something wrong with the airplane? There was so much discussion about it and controversy. Even though it doesn't seem unusual now in 2025, it sure was uncommon in 1996. So I looked at all that was going on and thought:


I don't know of a more intellectually engaging, socially fascinating story that I've covered in 25 years of being a television correspondent. And so it really just drew me in.


I'm going to be perfectly honest with you—a lot of it was ego because I really did feel like I had an understanding of this story. Not because I was so smart, but because people were giving me information and backing it up with facts and previous events.


CNN would consistently say, ‘Whatever you report, we’re going to get the FBI’s position and balance your stories with theirs.’ Since the 1990s, we’ve come to call that ‘false equivalence.’ If you constantly bring in a contrary opinion to ‘balance’ a story, it doesn’t mean that the contrary view deserves equal weight.


In this case, the FBI kept insisting all theories were on the table, claiming they were inches away from a ‘football-sized piece of evidence’ that would prove a criminal act. But no such evidence existed. Still, that editorial insistence on false equivalence forced us to give their unsupported theory the same weight as fact-based reporting—and that’s wrong.


Jowanza:


That's a fascinating example of how media coverage can shape public perception. You eventually wrote "Deadly Departure" about TWA 800. What drove you to write that book?


Christine:


People were telling me to write a book, for one. I was probably nine months to a year into reporting on Flight 800 when I remember sitting with another reporter and saying, "No one's writing a book about this. It's odd, right? This is such a great story." The next thing that happened was I saw in the newspaper that she had pitched a book and got a publisher before I did, which was very frustrating. Don't share your secrets too broadly!


But the thing I thought about TWA 800—it sounds awful to say it, but I felt I was right. I still do. And that message should get out there. This design flaw existed on 8,000 airplanes. Everything in the Boeing fleet that began with seven—the 757, the 767, the 777, the 747, and the 737—shared the same design. They were all at potential risk of having an explosion in flight, as TWA 800 had.


Jowanza:

Eight thousand airplanes with the same flaw—that's staggering. How does something like this persist within an organization tasked with building planes safely?


Christine:

This is a huge question for industrial risk. There’s a kind of corporate arrogance at play—not necessarily intentional, but a confidence that comes from long success. If a single reporter can feel sure of her work, imagine a company like Boeing; they’ve sent rockets into space, and that well‑earned confidence can mask underlying fissures as they develop.


Preventing flaws from creeping in requires sustained attention from the top: a mandate from the CEO and board that cascades through every layer of the organization, right down to the person picking up trash in the parking lot. It has to be vertically aligned and fully integrated. And yes, this dynamic exists across industries—it isn’t unique to aviation.


Jowanza:

Let's shift to your other major work, "The Crash Detectives," which covers Malaysia Airlines Flight 370. When did you realize this would make for a compelling book?


Christine:

The interesting part about Malaysia 370 is that a book agent reached out to me saying she wanted a book about aviation, and publishers were eager—did I have a book I wanted to write? There were no disasters on the horizon that I could see, and I did have a book in mind called “Flying Lessons,” which would take the factors that make aviation safety so effective and ask how we can apply them to our personal lives, professional lives, and industrial lives.


I was still in Malaysia—probably 10 days in—when we put together that proposal and sent it out, and I had a contract to write that book before I even left. So it wasn’t that I set out to write about Malaysia 370 initially, because at first I didn’t know enough to do that. But I’m so glad I did, because it became one of those stories where the facts were one thing and the public narrative was another—and the two didn’t intersect very often.


Jowanza:

You developed a controversial theory about MH370—that it was hypoxia rather than pilot suicide. Can you walk us through that?


Christine:

People think that without the black boxes in Malaysia 370, we'll never know what happened. That's not true.


The black boxes are a source of critical information, but that's not the only information air accident investigators use.


There is tons of information on the ground: in maintenance records, in previous events with the aircraft model, and in independent sources such as radar and satellite data. How do we even know this airplane flew south instead of north? In fact, the black boxes didn't tell us that. The satellite data did.


I believe there was a depressurization event—likely rapid—that left the first officer, 27 or 28 years old, alone in the cockpit to manage the emergency after the captain stepped out, probably to the bathroom.


The first officer did what pilots are trained to do and put on his oxygen mask. The aircraft initiated a turn consistent with heading back toward Kuala Lumpur, which suggests he recognized a problem and attempted to return. Shortly thereafter, the transponder dropped from surveillance; early reports claimed it was intentionally switched off, but there's no way of knowing that. The only fact we have is that it stopped transmitting.


Jowanza:

How do you think the transponder issue happened?


Christine:

The first officer experienced rapid depressurization and put on his mask, but had trembling extremities—a classic hypoxia symptom. He reaches to dial 7700—the emergency code—to declare an emergency and inadvertently places the transponder into standby. That removes the transponder from air traffic controllers' displays. It can look like he turned the transponder off, but intent is a separate question. That's a whole different ballgame. I don't think he did.


He begins the turn back and then the flight path becomes erratic; the airplane never descends. In a depressurization, pilots are trained to do two things: put on the mask and descend to a lower altitude. He put on the mask and turned around, but he did not descend, which suggests that even with the mask on he wasn’t getting 100% oxygen under pressure. In effect, he was in a state similar to the passengers—alert enough to act, but not enough to make consistently good decisions.




Jowanza:

What evidence supports this hypoxia theory?


Christine:

Just before that flight took off, both canisters that supply emergency oxygen to the pilots were removed from the airplane, serviced, and reinstalled, which is a meaningful data point when assessing a depressurization scenario. I also spoke to pilots who told me that, in practice, pilots often do not put on their oxygen masks even though they are supposed to above 25,000 feet. Hypoxia routinely leads to actions that do not make sense in the moment.


In my book, I go into several fatal events where pilots made the wrong choices, none of which were classified as pilot suicides. They were considered pilot incapacitation due to hypoxia. Why should this case be treated differently when it follows the same pattern of evidence? The only notable difference here is that the two pilots were Muslim. I can't imagine any other reason why this has become such a murder-suicide theory.


Jowanza:

That's a troubling observation about bias in investigations. Do you see this pattern in other cases?


Christine:

There’s a persistent Western bias that assumes pilots from other countries are less competent. We see it after accidents in places like Indonesia and Ethiopia. Go back to the 737 MAX: what did you see all over the internet? The commentary often reduces complex failures to “stupid pilots” who supposedly couldn’t handle a “simple” problem that Americans would have solved.


Even after Boeing was criminally charged with deceiving the FAA about software and withholding information pilots needed—facts the company acknowledged—many still insist the pilots should have managed it. There’s an element of inherent racism in that narrative. The presumption is that Ethiopian or Indonesian pilots don’t know how to fly.


Jowanza:

Looking at your decades of investigation, how do you weigh human factors versus technological factors in aviation accidents?


Christine:

Over the years, there’s been a profound shift. In the early days, accidents were largely mechanical—they were making a machine they'd never made before and sending it up into the sky with people on board.


Now we've shifted from "the issue is the safety of the machine" to "the issue is the capabilities of the human."


People tend to frame it as “machine versus pilot.” In reality, the pilot is one of many professions that keep an airplane safely flying.

The machines themselves are made, certified, and maintained by humans. So, aviation is human-centered, end-to-end. When we talk about human factors, we’re talking about that entire ecosystem, not just what happens on the flight deck. Every handoff, checklist, and sign‑off is a human link in a chain that determines whether the flight operates safely.


Jowanza:

How does increasing automation change this dynamic?


Christine:

The airplane is increasingly computerized, so more of the repetitive, boring work once done by pilots is now handled by machines, often by design and by company policy that says, “Use the automation rather than your own piloting skills.”


As a result, pilots are progressively less engaged with the flight—not necessarily by choice, but because the system nudges them into a monitoring role. This breeds complacency and distance between human and machine when safety actually depends on the two working in concert. We’ve inserted computers in the middle, and that evolution has made the human factor the dominant aspect of aviation safety.


The challenge is training someone to be an effective monitor when human-factors science shows humans are not effective monitors; we simply don’t do that well. If the future puts humans primarily in a monitoring role, that won’t work unless we rethink how to keep humans cognitively engaged.


Jowanza:

What's the biggest lesson here for risk managers in other industries?


Christine:

Industries facing similar challenges should learn from each other instead of operating in silos. Aviation offers a mature safety culture, but other sectors are wrestling with the same human–automation interface and could contribute equally to the conversation. The core insight is this: it’s all human factors—humans design, train, operate, regulate, and respond, even when AI is involved, because humans create the AI. Until machines make themselves—and hopefully that’s not in our lifetimes—organizations must recognize humans are embedded at every layer of the system and train accordingly.


Jowanza:

Any final thoughts on what makes aviation so uniquely safe despite these challenges?


Christine:

Aviation would not even register high on a personal risk list; it is extraordinarily safe. Talking about safety doesn’t mean there’s a crisis—it’s how the industry prevents one by refusing complacency. The only way to keep aviation as safe as it is today is to keep challenging assumptions and practices.

Day to day, getting into a car or staring at a device while crossing a street is far riskier. In America, you're hundreds of times more at risk from guns. We're living more in the Wild West than we are in an aviation safety threat environment. The lesson is vigilance without complacency—that's the aviation safety paradox, and it's what every high-risk industry needs to master.


The full conversation is available on the Industrial Risk: Beyond the Blueprint podcast.

Episode 4

Industrial Risk

BEYOND THE BLUEPRINT


Copyright © 2025, All Rights Reserved.