04 Feb 2016
The Space Shuttle Columbia’s Final Mission

No organization wants to fail. But even for the best and the brightest, failure is inevitable, and occasionally that failure can be catastrophic. Professor Amy Edmondson describes her experience writing and teaching a case on the space shuttle Columbia's final mission, including the organizational challenges within NASA that contributed to the disaster, and the lessons that can be taken from the tragedy.

Brian Kenny: The space shuttle Columbia launched for the first time on April 12th, 1981, the first flight of the space shuttle program. Over the next 22 years it completed 27 missions. On February 1, 2003, as its 28th mission neared an end, the shuttle disintegrated upon re-entry into the earth's atmosphere, killing all seven crew members. Today we'll hear from Professor Amy Edmondson about her case entitled "Columbia's Final Mission." I'm your host Brian Kenny and you're listening to Cold Call.

Professor Edmondson teaches in the MBA and doctoral programs here, as well as the executive education program. Her areas of expertise include leadership, teams, innovation and organizational learning. And perhaps you could add rocket science to that list after having written this case. Amy, welcome.

Amy Edmondson: Thank you, glad to be here.

BK: I’ll start by asking you to set up the case. This opens in a pretty dramatic fashion due to the nature of the subject.

AE: It certainly does. My colleague, Mike Roberto, came to me right after this terrible accident and said, "Let's write a case on it." Mike and I had done several projects together, and we shared an interest in crisis and failure. So I said, "Yes, let's do it." Of course, we then had to wait about six months for the official report from the Columbia Accident Investigation Board. We needed those data before we could do our work.

BK: What inspired you to take on this particular topic?

AE: We expected that it would be a very rich story, that the causes of the accident would be multiple, that they would not be simple, that there would be a rich organizational story behind it, and we were right. We were particularly interested in the fact that NASA had experienced a prior catastrophic failure in its shuttle program back in 1986 with the Challenger launch disaster. So we were interested in whether this was different or the same. Clearly, we expected (and we were right) that it would not be a purely technical malfunction, that it would be an organizational malfunction, and that's what we wanted to understand.

BK: How is this case different from others in terms of how students prepare? I know you have this as a paper case. There's a multimedia version of this and you teach it in both ways.

AE: First we wrote the paper case. We wanted to make sure it stood up to the test of teaching, and it did. We still use the paper case, especially when we're teaching abroad and it's easier than the complexity of the multimedia case. But the multimedia case has all of the same data and more, and its distinctive value is that there are six different perspectives from which it can be read. Each student is assigned to just one of them. Of the six perspectives, three are relatively senior managers at NASA and three are working engineers. Each has access to their own emails from that time and to the conversations they were a part of, but, of course, not to the conversations they were not a part of. So when the students come into the classroom, each has about 80 percent of the same data as everyone else, which means about 20 percent is unique to their role. That is much more like real life than an ordinary case discussion.

So when I teach the case I remind people of that reality. I tell them they will need to ask each other questions when they hear something puzzling that they haven't read before.

BK: Can you talk a little bit about the evolution of the culture at NASA? Because you go back in some historical detail about the Apollo missions and it seemed like there was a different culture at that time and it changed over time.

AE: All organizations, when they're new, go through a period of great energy, excitement, innovation, and openness as they work to figure out what will succeed, to find their replicable formula for success. Once that's established, complacency can set in. Let's call this Phase II. In Phase II there's a growing sense of confidence: we know what we're doing. There is less openness to dissenting views, perhaps less humility, less of an innovation mindset. Often that period of relative success, but also complacency, comes to an end with a failure.

BK: So in the Apollo years it sounded like they were able to do some rapid problem solving when they got into issues and you talk about the fire in the capsule and how they were able to resolve that on the fly. What changed between that time and when the shuttle program came into being?

AE: Well, I think the one-word answer is leadership. I've already said things about the culture and greater complacency and greater confidence, and it's leadership's role to combat that very natural tendency. It's leadership's role to do as Gene Kranz did in the Apollo 13 mission and say, "Failure is not an option, and I'm absolutely confident that between your engineering training and our collaborative abilities we'll solve this problem." That spirit was no longer there in the mission leadership at the time of Columbia.

BK: Is there a difference between the way engineers would approach these kinds of organizational challenges and, say, managers that have other kinds of backgrounds?

AE: I wish I could say simply that the answer is yes. The reason it's more complicated is that most of these managers at NASA, very senior managers, had engineering backgrounds. In fact, there's a very famous line from the Challenger launch disaster where one manager says to another, "Bob, take off your engineer's hat and put on your management hat." The subtext of that comment seems to be: realize that we've got a very serious contract at stake here and we don't want to upset our customer, so please be supportive of what our customer wants. Clearly not the best advice ever given to anybody. But I think it's a real syndrome, where managers think their job is different from the problem solving that engineers take for granted.

BK: There was a very pronounced hierarchical structure at NASA, and I'm curious whether you feel the learning that came out of the Challenger disaster somehow didn't stick. It didn't penetrate the organization deeply enough for them to have a different outcome with Columbia.

AE: I think that's an accurate statement. I believe one of the things they may have inadvertently learned from Challenger was that launch is dangerous. I suspect that every time there was a successful launch in the aftermath of Challenger, engineers and managers throughout the organization breathed a deep sigh of relief: "We survived the launch," forgetting that re-entry carries every bit as much risk. So they learned the technical lessons well. I don't think the organizational lessons were learned as well until later.

BK: The incident that happened when the Columbia took off was a piece of foam. I'm going to get the science wrong here but a piece of foam broke off one part of the ship and damaged another part of the ship.

AE: It actually broke off the external fuel tank, the large tank that the shuttle vehicle is attached to during launch, which falls away while the Columbia keeps on going. So a big piece of insulating foam came off the external tank and hit the leading edge of Columbia's left wing.

BK: And there was a lot of disagreement among the team as the days unfolded about how significant the damage was and whether or not it was going to cause any problems. As you saw the case unfold, as you were doing the research, what became clear to you about what was happening within the team?

AE: What we do in the classroom is we try to unpack the causes of the failure. We unpack them at organizational, group, and individual levels of analysis and it's multi-causal, indeed as Mike Roberto suspected when we went into it.

BK: I'm curious about when you do the role playing version of this versus the paper case, how do students react in that situation? What surprises you about things?

AE: It's wonderful to see the students actually take on the role. Many people find themselves appalled to be in one role or another; they don't like their character, and that's fine. Other times students have more empathy for a character than they would if they were just reading an objective document, if they hadn't been asked to step into that person's shoes.

BK: Because there are clearly bad guys and good guys. There's a hero character.

AE: Not necessarily. I think there are only humans who are up against challenges that are quite ordinary. I don't mean the technical side of things is ordinary but the human and organizational side of things is very ordinary. So they do what most ordinary humans, in the absence of really superb leadership, would do with the ambiguity they face and the various pressures that they face in their different roles.

BK: If we ratchet this down and take it out of that scenario and say it's not a life or death situation—if I'm a manager listening to you describe this case right now, are there lessons that can be applied across other types of industries?

AE: Absolutely. To be very literal, any time you're going to launch something, meaning a product launch, an initiative launch, a culture-change launch, you want to be thinking very carefully about it. You want to be hearing the quiet voices. You want to be thinking about it from all angles. So one obvious application is how easily similar dynamics play out in organizations that are quite far removed from the space program. Even separate from a product launch or a new initiative, every organization faces the kinds of hierarchical, group, and cognitive issues that we see here. I would describe it as an organizational learning dysfunction that is quite common.

BK: When you wrote the case, or after you wrote the case, did you face any kind of fallout from NASA? Was there any public reaction to the case?

AE: Initially I don't think NASA would have known about it, because we did this with public data. We wrote it. We were teaching it in our wonderful classrooms here at HBS. It's not exactly front-page news. But many years in, maybe six or eight, I'm not sure exactly, I did get a call from NASA. I picked up the phone in my office, and this wonderful gentleman gave his name, said he was a senior manager at NASA, and said, "We know what you're doing." And I thought, "Uh oh." And he said, "And we think it's great."

BK: Really? So maybe they're learning something from this too.

AE: Absolutely. I think they were learning even without my help. But he asked me to come and talk there and I did and it was really quite a terrific experience.

BK: Have they ever come to the class when you've discussed it with the class?

AE: Once.

BK: How was that?

AE: It was fabulous. It was fabulous. It was very, very humbling and moving. Rodney Rocha came to class, and someone else who was not part of the story.

BK: Great. Amy Edmondson thank you for joining us today.

AE: My pleasure.

BK: You can find this case, along with thousands of others, in the Harvard Business School case collection at HBR.org. I'm Brian Kenny, and thanks for listening to Cold Call, the official podcast of Harvard Business School.
