Catching Up: Professional Ethics And The Challenger Disaster

Because of non-ethical matters in the Marshall household, I missed posting about the January 28 anniversary of the Challenger disaster, as it is labeled among the thousands of Ethics Alarms tags. I have written about and alluded to the completely avoidable explosion of the Space Shuttle in 1986 many times (you can check here), and there may be no other incident that so perfectly encapsulates the complexities of professional ethics, especially in a bureaucracy. In 2016, I offered an ethics quiz on the topic.

In 2020, Netflix presented an excellent, if extremely upsetting, docuseries on how the fiasco unfolded, “Challenger: The Final Flight.”

I have used the tragedy in my legal ethics continuing legal education courses to force attendees to consider what might make them decide to breach legal ethics and place their careers at risk when an organizational client is hell-bent on a course the lawyer knows, or thinks he or she knows, will be disastrous. Legal ethics rules are different from engineering ethics, though the latter has caught up considerably since the Space Shuttle explosion, in part because of it. However, I view the ethics conflict the same way in parallel situations in both professions, as well as in medicine, organized religion, the military, and government. When would, and should, professionals decide to do everything in their power to stop the consequences of a terrible decision when it is outside their role and authority to do so?

In my legal ethics seminars, a majority of lawyers ultimately say they would have done “whatever it took” to stop the Challenger’s launch, whatever the consequences, if they knew what the engineers knew. They said they would go to the news media, or chain themselves to the rocket if necessary. Of course, saying it and doing it are very different things.

Here is the most recent incarnation of my Challenger disaster legal ethics question, which I presented to government lawyers a year ago. What would you answer? It is called “The Launch.”


In 1986, Roger Boisjoly was a booster rocket engineer at Morton Thiokol, the NASA contractor that, infamously, manufactured the faulty O-rings installed in the Space Shuttle Challenger’s booster rockets, whose failure caused it to explode. Six months before the Challenger disaster, he wrote a memo to his bosses at Thiokol predicting “a catastrophe of the highest order” involving “loss of human life.” He had identified a flaw in the elastic seals at the joints of the segmented booster rockets: they tended to stiffen and unseal in cold weather. NASA’s shuttle launch schedule included winter lift-offs, and Boisjoly warned his company that sending the Shuttle into space at low temperatures was too risky. On January 27, 1986, the day before the scheduled launch of the Challenger, Boisjoly argued for hours with NASA officials to persuade NASA to delay the launch, only to be overruled, first by NASA, then by Thiokol, which deferred to its client. Another engineer, Bob Ebeling, joined Boisjoly in begging for the launch to be postponed, to no avail.

That night, Ebeling told his wife, Darlene, “It’s going to blow up.”

Question 1: Should one or both of the engineers have “blown the whistle”?

  1. They did.
  2. Only the engineer who was sure that it would be a disaster.
  3. No, that’s not their role, their decision, or their call.
  4. After the explosion, but not before.
  5. I have another answer.

Question 2: How are the ethical obligations in such a situation different for government lawyers than engineers?

  1. Government lawyers have to disclose when human life is threatened, engineers don’t.
  2. Engineers have to disclose when human life is involved, government lawyers don’t.
  3. Lawyers get kicked out of their profession for blowing whistles, engineers just get blackballed.
  4. There is no difference.
  5. I have another answer.

6 thoughts on “Catching Up: Professional Ethics And The Challenger Disaster”

  1. Q1: #3

    Engineers, just like everyone else, are often wrong. We don’t want engineers contacting the media whenever they feel their concerns have not been addressed adequately.

    Q2: #4

  2. Question 1: #5.
    -I believe they DID “blow the whistle” by telling the powers that be what was going on and providing plenty of notice (6 months is a substantial amount of time to schedule a new launch date in better weather to ensure the safety of the shuttle passengers).
    -That said, if my dire warnings were ignored, I would have personally gone to each individual shuttle passenger w/my data and let them have the vital, potentially life-saving info so that they could each have the opportunity to confront NASA & demand a change, or refuse to be part of the mission if those demands went unheard, or to the media if that was their only option to force NASA’s hand. I believe that would be the ethical thing to do.

    Question 2: #4.
    -I’ve always hated the rule of lawyer/client privilege under which coming forward with hard evidence that your client fully intends to commit a future murder, sexual assault, or other violent offense costs you your entire career (a unique career that took years of blood, sweat, tears and brains, not to mention hundreds of thousands of dollars in loans to obtain the advanced degree), leaves you unable to pay back your student loans or support yourself or your family, and prevents you from working in the field you were passionately invested in, all for doing something that in any other profession would be considered a moral and ethical imperative. That said, it is what it is: you make that choice and swear an oath of confidentiality.
    -Even though most engineers have less at stake than a lawyer, I believe they share the responsibility equally. So it doesn’t matter what they personally lose in coming forward.

  3. Here is something to write about.

    Click to access Montez-Lee-Sentencing-Opinion.pdf

    Mr. Lee’s motive for setting the fire is a foremost issue. Mr. Lee credibly states that he was in the streets to protest unlawful police violence against black men, and there is no basis to disbelieve this statement. Mr. Lee, appropriately, acknowledges that he “could have demonstrated in a different way,” but that he was “caught up in the fury of the mob after living as a black man watching his peers suffer at the hands of police.” (PSR ¶ 13.) As anyone watching the news world-wide knows, many other people in Minnesota were similarly caught up. There appear to have been many people in those days looking only to exploit the chaos and disorder in the interests of personal gain or random violence. There appear also to have been many people who felt angry, frustrated, and disenfranchised, and who were attempting, in many cases in an unacceptably reckless and dangerous manner, to give voice to those feelings. Mr. Lee appears to be squarely in this latter category. And even the great American advocate for non-violence and social justice, Dr. Martin Luther King, Jr., stated in an interview with CBS’s Mike Wallace in 1966 that “we’ve got to see that a riot is the language of the unheard.” Lily Rothman, What Martin Luther King Jr Really Thought About Riots, Time Magazine (2015),

  4. Q2 – I’m not a lawyer

    Q1 – Depends.

    Did the two “whistleblowers” prove beyond all doubt that, no matter what happened on that cold morning, the seals were 100% going to fail and lead to an explosion? Or did they demonstrate that there was an extremely high risk of this occurring?

    Or was the risk demonstrated at a lower level, but still one that *in hindsight* should have been concerning?

    If the answer is yes to the first or the second question, then options 2 & 4 are correct.

    If it’s the last question, then option 1 is correct.

    We live in a world of risk – of the likelihood of a bad thing happening, and the severity of that bad thing if it does – whenever we engage in any enterprise worthy of the effort. The engineers giving their educated opinions is part of their professional duty, as it is for all of us – but every mission of discovery will be especially risky. That described risk is all part of a larger goal. If they could only demonstrate an elevated risk, then that’s part of any endeavor we pursue. From the earliest explorers facing nature as a challenger to the modern astronauts bringing pride to our mythical patron, Columbia, we’ve accepted risk as part of the package.

    If the risk was not a 100% guaranteed explosion, then the fact that there was an explosion does actually boil down to some level of moral luck.

    What level of risk did the engineers mathematically and reasonably demonstrate was present?
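    The risk framing above (likelihood times severity) can be sketched numerically. The probabilities below are purely illustrative assumptions for the sketch, not the engineers’ actual figures:

```python
# Expected-loss framing of launch risk: risk = likelihood x severity.
# All numbers here are illustrative assumptions, not historical data.

def expected_loss(p_failure: float, severity: float) -> float:
    """Expected loss of a single launch attempt."""
    return p_failure * severity

# Suppose a warm-weather launch carries a 1-in-1000 chance of catastrophic
# failure, and cold weather raises that to 1-in-10 (hypothetical values).
SEVERITY = 1.0  # normalize "loss of vehicle and crew" to 1.0

warm = expected_loss(0.001, SEVERITY)
cold = expected_loss(0.1, SEVERITY)

print(f"warm-weather expected loss: {warm}")
print(f"cold-weather expected loss: {cold}")
print(f"cold launch is {cold / warm:.0f}x riskier")
```

    The severity term never changes here; the whole argument on launch eve was about how far the cold pushed the likelihood term.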

  5. I was 4 and in preschool when the Challenger exploded. We watched the launch on TV before I went to school that day, and apparently it really disturbed me, because I bit another student and then hid under a table for the rest of the day.

    Working at the refinery now, we get to revisit the Challenger explosion frequently (along with the Bhopal Union Carbide gas leak, the Texas City tanker explosion, the Texas City ISOM explosion, and a host of others) when discussing process safety. Michael West is absolutely right in that it isn’t simply a calculation of what the worst consequence is, but also the likelihood of that occurring.

    Part of the reason the engineers’ concerns were dismissed was that the problem with the O-rings had been known and discussed for quite some time, and there had been numerous launches prior to this one that had been perfectly successful. In other words, NASA had gotten away with using the faulty O-rings before, so they saw no reason to be overly concerned this time around. Furthermore, the launch had already been delayed multiple times, and they were under intense pressure to launch. Why should they listen to the doom-saying of engineers when empirical evidence said the worst-case scenario was not going to happen?

    All the pieces that lead up to an incident like the Challenger explosion seem blatantly obvious in retrospect. I’ve led dozens of investigations on minor incidents here at the refinery, and looking back at the events that led up to an incident, they seem to lead inexorably to it. Yet when you’re actually in the moment, the incident doesn’t seem likely, and when it occurs, it is often surprising. In process safety, we talk a lot about the Swiss-cheese model. We normally have a number of safeguards to prevent an incident from occurring, but every safeguard has its weaknesses, which we liken to the holes in a slice of Swiss cheese. When you stack slices on top of each other, it seems you have an impenetrable wall. And so all the safeguards together are supposed to give reasonable protection against an incident occurring. When an incident does occur, it is because it has managed to slide past all those safeguards, or as we would say, the holes in the cheese all aligned.

    Holes in the cheese align when we take shortcuts around procedures, when we use the wrong material, when we get in a hurry, or when we’re not aware of the risks. If we want to work on a piece of machinery, we want to have it offline and unpowered so that the people working on it are not in danger. But simply having the breaker flipped off is not a sufficient safeguard, because someone could see it off, think it needs to be on, and switch it on, which could then energize the equipment. So we place a lock on the breaker so that only the person who locked it can unlock it and re-energize the breaker. Not putting the lock on the breaker won’t necessarily cause an incident, but it creates a hole in the cheese.

    The Challenger investigation introduced into our vocabulary “Normalization of Deviance,” which is surprisingly common. If we engage in a risky behavior and we don’t suffer any negative consequences, we tend to engage in that behavior again. In this case, the shuttles had launched multiple times without incident, using these exact same O-rings. That’s because there were other slices of cheese providing protection. But leading up to that January morning, those protections were subtly defeated. Ultimately, the last “safeguard”, that the O-ring could have warmed up had the shuttle sat longer in the sun and launched later in the day, provided the last hole through the cheese, allowing the disaster to occur.

    There are a number of different investigative methods out there. For a time, we used the method called Latent Cause Analysis (LCA). This method holds to the philosophy that all incidents ultimately have a human cause, and there is a gap between what we expect to have happened and what actually did happen. That gap is the latent cause. In order to do better going forward, the person who was responsible for that latent cause needs to correct that behavior. Another methodology, a little less robust (I think) but easier to work with, is the 5-Why model, in which you ask why the incident occurred, and keep asking “why” of each of those answers. Then you identify one of those whys as the most appropriate to address in the aftermath, the reasoning being that if that why had been addressed, none of the subsequent consequences, including the incident, would have occurred.

    I’ll suggest that everyone should take some time working through an incident investigation, even if it seems like it is something silly. Why did this glass bowl shatter? Because it hit the counter after falling three feet. Why did it fall? Because it slipped out of my hand. Why did it slip out of my hand? Because I was carrying too much. Why was I carrying too much? I was in a hurry to get the kitchen cleaned up after dinner. I can’t really help that I was in a hurry, because oftentimes being in a hurry is outside my direct control. But I can make it a rule that I never carry too much, even if it means leaving dirty dishes on the table before running off to my next errand.
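    The glass-bowl walk-through above can be sketched as a simple chain, where each answer becomes the next “why” until you reach the cause you choose to address:

```python
# A minimal 5-Why chain, following the glass-bowl example: each entry
# answers the "why" of the entry before it.
why_chain = [
    "The glass bowl shattered.",
    "It hit the counter after falling three feet.",
    "It slipped out of my hand.",
    "I was carrying too much.",
    "I was in a hurry to get the kitchen cleaned up after dinner.",
]

def five_whys(chain):
    """Print each why/because pair and return the final answer in the chain."""
    for i in range(len(chain) - 1):
        print(f"Why? {chain[i]}  Because: {chain[i + 1]}")
    return chain[-1]

root = five_whys(why_chain)
print(f"Deepest why reached: {root}")
```

    Note that the method returns the deepest why, but as the commenter says, the why you actually address may sit earlier in the chain (“never carry too much”) if the deepest one is outside your control.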
