Thinkers50 in collaboration with Deloitte presents:

The Provocateurs:

podcast series

EPISODE 7

ABOUT THIS EPISODE

Charlie Camarda: Space Odyssey

It isn’t every day you get to meet a real-life astronaut! Dr. Charles Camarda (Charlie to his friends) was selected as an Astronaut Candidate in 1996 and flew as a Mission Specialist on STS-114, NASA’s Return-to-Flight mission immediately following the Columbia disaster of 2003, which claimed the lives of its seven crew. He was responsible for initiating several teams to successfully diagnose the cause of the Columbia tragedy and to develop an in-orbit repair capability, which was used on successive Shuttle missions until the retirement of the Space Shuttle Program in 2011.

As well as being an astronaut, Charlie is a research engineer, inventor, author, educator, and internationally recognized expert and speaker on engineering design, safety, organizational behavior, and education. He has over 60 technical publications, holds 9 patents, and has received more than 20 national and international awards. He was inducted into the Air and Space Cradle of Aviation Museum’s Hall of Fame in 2017.

#TheProvocateurs

This podcast is part of an ongoing series of interviews with executives. The executives’ participation in this podcast is solely for educational purposes based on their knowledge of the subject, and the views expressed by them are solely their own. This podcast should not be deemed or construed to be for the purpose of soliciting business for any of the companies mentioned, nor does Deloitte advocate or endorse the services or products provided by these companies.


Charles Camarda

Founder/CEO of the Epic Education Foundation

Biography

About Charles Camarda

Dr. Camarda is an astronaut, research engineer, inventor, author, educator, and internationally recognized expert and invited speaker on subjects related to engineering, engineering design, innovation, safety, organizational behavior, and education. He has over 60 technical publications, holds 9 patents, and has received more than 20 national and international awards, including an IR-100 Award for one of the top 100 technical innovations, the NASA Spaceflight Medal, a NASA Exceptional Service Medal, and the American Astronautical Society 2006 Flight Achievement Award. He was inducted into the Air and Space Cradle of Aviation Museum’s Hall of Fame in 2017.

He was selected as an Astronaut Candidate in 1996 and flew as a Mission Specialist on STS-114, NASA’s Return-to-Flight (RTF) mission immediately following the Columbia disaster. He was responsible for initiating several teams to successfully diagnose the cause of the Columbia tragedy and to develop an on-orbit wing leading edge repair capability, which was flown on his RTF mission and all successive Shuttle missions until the retirement of the Space Shuttle Program in 2011.

Dr. Charles Camarda retired from NASA in May 2019, after 45 years of continuous service as a research engineer and technical manager at Langley Research Center (LaRC), as an astronaut and Senior Executive (Director of Engineering) at Johnson Space Center (JSC), and as the Senior Advisor for Innovation and Engineering Development at LaRC.

Dr. Camarda is the Founder/CEO of the Epic Education Foundation, a 501(c)(3) corporation seeking to democratize education for learners at all levels. He is also the President of Leading Edge Enterprises LLC, an aerospace engineering and education consultancy.

Hosts:


Des Dearlove

Co-founder, Thinkers50


Geoff Tuff

Principal, Deloitte Consulting


Inspired by the book Provoke: How Leaders Shape the Future by Overcoming Fatal Human Flaws (Wiley, 2021).


EPISODE 7

Podcast Transcript

Des Dearlove:

Hello, I’m Des Dearlove. I’m the co-founder of Thinkers50. I’d like to welcome you to another episode in our podcast series Provocateurs, in which we explore the experiences, insights and perspectives of inspiring leaders. Our aim is to provoke you to think and act differently through conversations with insightful leaders who offer new perspectives on traditional business thinking. Now this is a collaboration between Thinkers50 and Deloitte. So my co-host today is Geoff Tuff. Geoff is a principal at Deloitte [Consulting] where he holds various leadership roles across the firm’s sustainability, innovation, and strategy practices. And he’s also the co-author – along with Steve Goldbach – of Provoke: How Leaders Shape the Future by Overcoming Fatal Human Flaws. Geoff, welcome.

Geoff Tuff:

Thanks, Des. Great to be here. And so this is actually pretty cool. We have a real-life astronaut with us today, so we’re really looking forward to digging into this conversation. Dr. Charles Camarda was selected as an astronaut candidate in 1996 and flew as a mission specialist on STS-114, which was NASA’s return to flight or RTF mission immediately following the Columbia disaster of 2003, which sadly claimed the lives of its seven crew. He was responsible for initiating several teams to successfully diagnose the cause of the Columbia tragedy, and in addition, to develop an on-orbit leading edge repair capability, which was flown on his RTF mission and successive shuttle missions until the retirement of the space shuttle program in 2011. So Charlie, great to have you here.

Charlie Camarda:

Thank you very much, Des and Geoff.

Des Dearlove:

Well, as well as being an astronaut, Charlie is a research engineer, inventor, author, educator, and internationally recognized expert and speaker on subjects related to engineering, engineering design, safety, organizational behavior, and education. He has over 60 technical publications, holds nine patents and over 20 national and international awards. And he was inducted into the Air and Space Cradle of Aviation Museum’s Hall of Fame in 2017. Charlie retired from NASA in 2019 after 45 years of continuous service as a research engineer and technical manager at Langley Research Center. He is the founder and CEO of the Epic Education Foundation, a nonprofit organization seeking to democratize education for learners at all levels. He’s also president of Leading Edge Enterprises, an aerospace engineering and education consultancy. And if that wasn’t enough to keep him busy, he’s also working on a new book, which he may tell us a bit more about later. Charlie, welcome.

Charlie Camarda:

Good morning. Nice to meet you guys. Nice to see you.

Des Dearlove:

Now, before we get on to talking about what is an extraordinary life and work, can you start by telling us a little bit about your journey? I mean, where did all this start for you? What provoked you to want to become an astronaut? What was, if you’ll forgive the pun, your ignition point?

Charlie Camarda:

Yeah, I can tell you, it was similar for me and most kids my age that grew up during the space race. We were watching the Mercury Seven astronauts, those phenomenal heroes; it was us against the Russians. It was a national thing, it was a technical thing. And for a geeky kid in Queens, New York, who was trying to launch rockets just like my good buddy Homer Hickam in Coalwood, West Virginia, it was every kid’s dream. And so, yeah, I knew when I was six, seven years old, I wanted to be an astronaut.

Geoff Tuff:

So that eventually led you to what was in many ways, a seminal moment in space flight and the return to flight mission. Can you tell us a little bit about just the journey from the beginning of your career to what led you to that point and how you ended up in that position at that point in time?

Charlie Camarda:

Well, I graduated from Brooklyn Polytech, and I did an internship at NASA and that opened the door for me. And it really made me realize that what I really loved was research. I got to work with these amazing mentors at NASA Langley Research Center, and I knew I wanted to do that. And luckily, I was hired by NASA Langley. I waited three years, applied when they first had a call out for non-pilot astronauts, and wasn’t selected; of course, I only had a BS degree. Then I waited 18 years and reapplied, after I was a single dad and could take my daughter to Houston, Texas. And the rest is amazing. I was just very, very fortunate. I was selected, my daughter and I got to travel to Houston, Texas, where I trained for nine years before my first flight.

The way I got on the STS-114 flight – I’m not really sure how I was selected, but I know it must have had to do with the fact that my expertise for the 22 years I was a research engineer was in hypersonic thermal structures and thermal protection systems. And it was really the thermal protection system that got hit: the foam hit the wing leading edge and caused the accident.

Des Dearlove:

Thanks, Charlie. I mean, people obviously are familiar with what happened with the accident, but perhaps you could quickly give us your version of what happened, because some people listening to this may not be as familiar as you think.

Charlie Camarda:

I was training in Russia – we were the backup crew for Expedition 8. We had been training in Russia for a month when we got the news that we had lost Columbia, we had lost the crew, and we immediately went inside to the cottages there in Star City, Russia, and we watched the television coverage. I was just amazed that this very, very large piece of foam hit a very fragile, sensitive area – the underside of the wing leading edge, the underside of the wing, a very fragile thermal protection system there. I was amazed that the ground team did not recognize that this was a critical problem and thought the crew was safe to come home rather than providing a rescue mission.

Geoff Tuff:

So was that an option at the time, Charlie – could there, in theory, have been a rescue mission? Tell us a little bit about that. And maybe, if you could – I think we all know NASA as a very careful, well-researched organization – tell us a little bit about what you think led to the conditions for this happening?

Charlie Camarda:

The conditions that led to the Columbia tragedy were exactly the same cultural, behavioral conditions that caused the Challenger accident. And so NASA learned nothing. They did nothing to correct the culture. And so Diane Vaughan and the rest of the Columbia Accident Investigation Board decided that the real primary cause of the accident was the culture. So things did not change, unfortunately.

Des Dearlove:

It must have been a really difficult time to be part of NASA at that point. You were part of the teams that actually tried to investigate the cause of the accident and to change processes and how things happen. Can you talk us through that a little bit?

Charlie Camarda:

Absolutely. The team that they had in place that was actually studying impacts to shuttle tiles – really, it was not the right team. They did not have experts in ballistic impact testing. We had these experts at some of the research centers, but unfortunately, the team at Johnson Space Center was working with teams from Southwest Research Institute, and they really were not able to predict the level of damage that would happen if a large piece of foam hit a fragile thermal protection system. And they didn’t reach out for help, which was really troubling. And they had spent probably about 30 years studying this phenomenon. They mistakenly thought they knew, that they understood the problem and that it wasn’t a problem, when really they did not. So it was a real failure of the team.

It was a real failure of a culture which was not psychologically safe. So people could not question, could not interrogate the members of this team to see what they really knew. If they had, they would’ve known that the team was seriously lacking the expertise and the crew was in danger. And what came out of the Columbia Accident Investigation Board review – many people on the mission management team, I believe Linda Ham, said, well, ‘there was nothing we could have done’, which was a terrible, terrible statement for her to make after the accident, when really the Columbia Accident Investigation Board said, you know what? There was something that could have been done. And most assuredly, mission control would’ve sent up a rescue mission; they would’ve rushed it, they would’ve tried to get it up there in time to save the crew.

Geoff Tuff:

So, Charlie, Des and I consider you a provocateur, that’s obviously why you’re on our podcast here. And one of the premises behind provocateurs is that they have an ability to overcome what we call fatal human flaws, the biases that prevent us from seeing the reality, looking at data in new ways, spotting trends maybe earlier than others. So it sounds like part of what happened that led to the disaster was a lack of expertise, but tell us a little bit about some of the biases that were inherent to NASA’s culture and to what you know about the team that may have contributed to the accident, beyond the lack of expertise.

Charlie Camarda:

There were probably about 40 or so different terms that sociologists, psychologists, behavioral scientists and cognitive scientists basically list – and NASA’s Johnson Space Center, the mission management team, and the program office displayed almost every single one of those. The first one was a lack of psychological safety, an environment where people feel free to take interpersonal risk, to ask tough questions, and not be silenced. Another cognitive bias is this idea that we all have to work together as a team, we all speak with one voice, and there are no disruptors or outliers out there with a different voice. And so that tends to lead to groupthink. We may think it’s a good thing to have a very cohesive group, but what was really lacking was what I call a research culture. Researchers in a research culture are really driving for the root causes when they see a problem.

And they take suggestions from people from all different disciplines to understand what their ideas are, and they come together, they even argue the facts, they argue the data and they challenge one another. This was totally unheard of in that environment. You had a group of people that got to work together as a group – in the case of Columbia it was called the leading edge structural subsystem problem resolution team, much like the O-ring team during Challenger – and the way these teams collectively think and construct knowledge and create rules and procedures by which they categorize how risky an event is, it’s all standardized and, as Diane Vaughan calls it, normalized. They normalize deviance, deviant behavior in some cases, in order to provide what they call flight rationale; they’re driven by that production culture to meet schedule and budget.

And when people raise their hand and try to put on the brakes, they’re looked down upon. And so you saw people like Rodney Rocha being vilified, being likened to Chicken Little, when he said, well, maybe we should take a picture on orbit to see if there was any damage to the wing leading edge. That person was vilified. And so it was this culture, and this lack of what I call a research culture, that led to this bad decision.

Des Dearlove:

Try and clarify for us – I mean, you’re talking about research engineers as opposed to your ordinary, routine engineers, if you like. Can you try to explain the difference in thinking? It sounds like we need more of these research engineer types.

Charlie Camarda:

Right. And researchers are more like scientists. So they do the analysis and they conduct very intelligent, smart tests. The tests they do, the experiments they do, are very well thought out and planned. And the analysis they do has to correlate with those tests. If it does not, if the analysis and the assumptions you use in your analysis are not correlating with what you see in the laboratory, you go back, you refine your analysis. And so these are researchers that are expert in developing the new analytical techniques, the new numerical techniques, to get a more accurate response and to make sure that all the assumptions being made are the right assumptions, or maybe that the assumptions have to be expanded and not minimized. And so this is what researchers do. And they do this in a very rigorous way, starting with small experiments, understanding the basic physics of the problem and testing to failure.

Once they really understand the phenomenon they’re trying to describe and understand, they test things to failure. And if you can predict how something is going to fail, you really understand that phenomenon. And they’re constantly scaling up that level of experiment and analysis as they’re building up, so that you approach what the full-scale system looks like. So for instance, for a wing leading edge, we might be testing coupons, we might be hitting them with small pieces of foam. And what you’re doing is you’re analyzing the failure theories. Are we able to predict how the composite structure fails and when it’s going to fail? Then you start building up, you start adding attachment pieces, till you get the complete or full-scale picture of the wing and can model it and analyze it.

Geoff Tuff:

I’m now starting to listen to a lot of what you’re saying and trying to apply it to the average corporation out there, where you may not have the benefit of a lot of people who have a research background like you do. And I’d love to dig in a little bit, first of all, on how you think non-research oriented people, good old fashioned MBAs, like myself, might be able to apply some of the thinking that you’re describing here. And in particular, how to think about this whole topic of failure. Because on the one hand, what I hear you saying is absolutely integral to having a research mindset is testing to failure. But, ultimately, we had a large-scale failure here of the entire system that led to a disaster. How do you think about that concept of failure with some of those ideas?

Charlie Camarda:

And really, it’s not just technical people that are researchers. You do research in every domain, in every business as well, right? How do you predict when a Black Swan is going to happen? When a researcher sees a very tiny anomaly, it disturbs them. They have this quest, this thirst for knowledge, to understand everything they’re seeing. If they see something that doesn’t look right, it drives them crazy. They have to understand it – they’ll research anyone else that has ever had a condition like this happen to them: did they experience this phenomenon? If so, how does it occur? Why does it occur? And can I develop the analysis to predict it’s going to occur if I see it in the laboratory? It’s that unquenchable thirst. You don’t just do an analysis and it looks like it compares with your experiment and you’re good to go.

You’re testing it. You’re testing various parameter changes, you’re testing various boundary conditions, initial condition changes, to make sure that you fully understand the environment and the system that you’re trying to understand and test and analyze. And so every domain has those types of thinkers. They’re the scientists, the ones that are constantly looking at just the small anomalies and trying to make some sense of it. Rather than go to the quick idiosyncratic explanation of why this happened, you are not satisfied until you can recreate it in the laboratory to your satisfaction and your level of accuracy.

Geoff Tuff:

That totally makes sense. And I’m sure a lot of our listeners are thinking: analysis paralysis. How do you know when it’s just too much analysis, too much research, too much testing, it’s just time to move on?

Charlie Camarda:

You know, it doesn’t take a whole lot of time to identify these critical problems. And especially in this day and age, with all this technology, you should be able to reach out and get just the right person who has encountered this phenomenon, has analyzed it to the nth degree, and has tested it in the laboratory. You bring them in to look at these anomalies, and they will be able to point you in the right direction. Why was I able to see problems that hundreds of other engineers – and I led the engineers at Johnson Space Center – could not? Why could they not see the anomalies in my wing leading edge? They might be experts in a particular discipline, but most of these problems are what we call multidisciplinary; they’re interdisciplinary in nature, they interact at the boundaries of these disciplines, and that’s where the problems happen.

And so when you develop a team to understand these interdisciplinary problems, they have to be trained to analyze these problems and all their coupled behaviors, so that they can recognize the effect that each one of these phenomena has on the others. For instance, in thermal and structural analysis, or aerothermal analysis, you look at a particular place on a wing leading edge: because of a bump-out on the wing leading edge, or because it’s butting up against another panel and might be sticking out a little bit into the flow field, you get a very sharp increase in heating, which causes a very sharp increase in temperature and also thermal stresses. This is what caused these very local phenomena, these anomalies, to happen on my wing leading edge – anomalies the other engineers could not understand, because they typically work in one particular discipline. And so we take a team-of-teams kind of approach; that’s the way we solve problems in my branch at NASA Langley.

Des Dearlove:

Obviously, you’re talking about highly technical stuff, some of that, but if I can just build on Geoff’s point about failure – because when you and I have talked before, Charlie, you’ve talked about how people don’t learn to fail. You know, if you’re going to test to failure, you’ve got to know how to fail well, and this goes right back to school. Talk us through that a little bit.

Charlie Camarda:

That’s right. NASA forgot how to fail. Many companies forget how to fail. They forget about the importance of failure. Failure is critical. Researchers understand this; that’s why this isn’t a problem for researchers. Researchers go into the laboratory, they test to failure – that’s the whole point of testing, they test to failure. And so we teach our engineers and our students to fail smart, fast, small, cheap, early and often. When you look at the big test that they designed to shoot a large piece of foam, an almost two-pound piece of foam, at an almost full-scale wing leading edge, initially they were only going to put six strain gauges on that entire massive wing leading edge. So they wouldn’t have been able to get accurate data to understand the physics of what was happening, because it would happen very locally. And they were also not planning on doing any analysis before they ran the test, which was totally insane.

And so we stopped them from doing that. We had the researchers come in, they built up the analytical model before they ran the test, and before they ran the test, the analytical model predicted exactly the full-scale hole that they saw during the test – about the size of a pizza box, about 14 inches square. So they launched a piece of foam the actual size of the one that came off the vehicle on STS-107, hit the wing leading edge at approximately the same relative velocity, 545 miles an hour, and what happened was it made a hole 14 inches square in the wing leading edge. And the people at mission control, the engineers on the ground, had said there wouldn’t be any damage, it was not critical, and we probably don’t even have to tell the crew.

Geoff Tuff:

So there are fascinating parallels between what you’ve lived through and what I think many corporations do. And actually, as I hear you talk about failure and how to design the right test, to me, it actually sounds like learning as opposed to failing, but you’re failing in a lab environment. Is that fair to say?

Charlie Camarda:

That is exactly what it is. So when you read Amy Edmondson‘s books on psychological safety and teaming, what you’re looking for is a learning organization, a learning team. You’re constantly inquisitive. You’re constantly sharing information. It has to be totally transparent. It has to be as accurate as possible. You’re sharing information and you’re learning together as a team. And that collective learning, that synergy that happens in a cohesive team, is what makes the magic happen. When I put together this team, I picked just the right people from the right research centers, and they accomplished in three months what NASA, Boeing, and Southwest Research Institute couldn’t accomplish in 30 years.

Des Dearlove:

That was fascinating.

Charlie Camarda:

And it’s really not magic if you have the right people.

Geoff Tuff:

Right.

Des Dearlove:

We talked earlier a little bit about how you are now getting more and more involved and interested in the whole education space. Tell us a little bit about some of your work there.

Charlie Camarda:

Well, Des, I got into education at really about the same time, because what puzzled me was how these engineers could not solve this problem. And so I developed a course called the Epic Challenge Program using innovative conceptual engineering design; with four professors, we put together this pedagogy and we basically taught it to young NASA engineers. And we did it at a workshop, and it was only a one-week workshop. We picked a challenge that NASA couldn’t solve: the land landing of a capsule. I think Starliner, Boeing’s capsule, is trying to land on land; all our capsules land in the ocean, right? And we were able to solve that problem. We came up with dozens of ideas in one week, and we solved that problem with a student from MIT and several students from Penn State in less than two years. They came up with an innovative solution using airbags inside the Orion capsule.

And it saved a tremendous amount of weight and it increased the volume of the capsule, because you could collapse the astronaut seats. So we proved that this methodology works. We proved that this idea works. But really, the whole idea for teaching this to students around the world was to minimize the loss of students in the STEM pipeline. We lose about 80% by the seventh grade; we lose about 95% by freshman, sophomore year in college. And the reason, I believe, is that a lot of these students fail to see the connection with their passion. I was passionate about engineering. I had to take tremendously difficult math courses, biology, chemistry, physics, and you would lose sight of how you apply this to your vision, to your real challenge, your passion. And so the Epic Education Foundation and this methodology is all wrapped around solving these unbelievably difficult challenges, which kids of all ages love to solve.

Teaching them how to do it in teams, how to form teams, how to learn as a team, how to share information as a team, and how to ideate as a team and come up with innovative solutions. So we infuse creativity and design thinking into the engineering teams that we develop. And then we put them through this rapid concept development process – the process that we use at the research centers, the process NASA used to use during the days of Apollo. So NASA was a lot like SpaceX back in the Apollo days and the early shuttle days, right? So we have to regain that lost ideology that we used to have.

Geoff Tuff:

So, as we think about extending that education to the broader world, really, to all sorts of organizations – it sounds like a lot of the work you’re doing right now is actually catching people when they’re young, keeping them interested in STEM, keeping them interested in design-oriented approaches. How can we take that same level of knowledge into organizations writ large, to, say, people who are midway through their career? They may have a formal education, maybe they have no STEM background – are the ideas transferable?

Charlie Camarda:

Absolutely. As a matter of fact, it’s more important to do this after you’re an established engineer. You’ve forgotten how to be creative. You start following these rules, processes, and procedures and a standard product development life cycle. And it’s time for you to start rethinking, to reach back and reconnect with the creative right side of your brain and relearn how to fail, how to have fun, how to try things, fail and learn rapidly. And so we teach this in workshops to companies like Adidas, Boeing, NASA, and other organizations. And what we like to do is pick a challenge that’s relatable to that particular organization, but we put it in an “EPIC” framework. We put it in space, we put it in an extreme environment, so it really forces you to be creative in throwing out old assumptions and coming up with new ideas, innovative solutions to problems.

Des Dearlove:

I mean, obviously, we are all about provoking in this series, and you are a provocative guy, I know you are. We also hugely admire Amy Edmondson’s work on psychological safety; that’s why she’s number one in the Thinkers50 ranking at the moment. It’s hugely important, but people will inevitably find themselves in situations where they sometimes have to go up against the machine and stand up and be counted. What kind of advice and counsel can you offer to people for when it inevitably happens – and sometimes it can be a matter of life and death?

Charlie Camarda:

Yeah. And it’s almost impossible for it not to happen. You have these large organizations, you have hundreds of teams in this hierarchical, tiered organization, all working together to build an airplane, to build a spacecraft. How do you know when one of those 10-, 15-person teams is dysfunctional? And this is where I believe technology comes in – and we’re looking at using technology – to basically monitor communication, to use artificial intelligence and machine learning to identify when teams do not have the right knowledge and they’re making critical decisions. Are they experimenting? Are they going outside, external to their team, to bring in ideas from other experts when they’re in trouble? Are they communicating properly – energy, engagement, and exploration? You know, Sandy Pentland uses those to identify high-performing teams, successful teams. But I think we could do a heck of a lot more using artificial intelligence and machine learning and really diving into what the teams are talking about.

Are they exhibiting psychological safety? I don’t believe surveys will do the job, will weed out the teams that are really struggling with psychological safety, because they’re not going to report it. And so I believe we could take advantage of the internet and these networks of hundreds of teams working together. And when we see these weak signals of either technical dysfunction or behavioral dysfunction, the alarm bells go off and you can drill down in there and start taking a better look. If I had not been training in Russia, if I had been in Houston and I saw what this team was recommending, I would’ve been jumping up and down. Charlie Camarda would’ve been a force to be reckoned with, instead of being contained and not speaking out.

Des Dearlove:

And have you ever been in that situation where you had to really stand your ground?

Charlie Camarda:

Yes, I have. As a matter of fact, when I was director of engineering, on the first flight right after our flight, after my center director said we were safe to fly, I stood up at the microphone at the flight readiness review and I said, no, we’re not. And then the chief engineer at NASA backed me up and said, we’re not. And the head of safety and mission assurance backed me up and said, we’re not ready to fly. And the administrator, Michael Griffin, said, yeah, that’s great. I’m the administrator, we’re going to fly, because ‘we have this safety net, we could use the International Space Station as a safe haven’. Well, it turns out a large piece of foam came off the external tank during our launch and almost hit our wing leading edge. NASA immediately grounded the shuttle fleet, which meant that the shuttle that was going to come up to rescue us was not coming up.

And when you think about it, if you see a problem on the next shuttle that’s similar to the past problem, it’s a systemic problem. Are you going to risk the next crew? So it was really not a good backup plan, right? And so I got reassigned three days later, much like Allan McDonald at Thiokol, only he ended up keeping his job. And then I found the problem with my very own wing leading edge. And they tried to silence my voice, and it took me a year of fighting with my own organization, the safety organization at NASA, and NASA Headquarters to explain to them and to show them why their team of experts could not see this interdisciplinary problem, this anomaly, and why it was a systemic problem that could cause another accident.

Geoff Tuff:

So, Charlie, it sounds like you have, at least in part, made a career out of standing up to the playbooks and the processes that large organizations like NASA, in some cases, necessarily have to have in place in order to succeed at what they’re attempting to do. That seems to be the case for any successful legacy organization that has an established way of doing things. Can you tell us a little bit about the investigation into the disaster and what you discovered, what the team discovered, about the tensions between delivering on the mission, delivering on the mandate that an organization like NASA has, and the types of things that led to the disaster? How do you manage that tension? Let’s take it into a corporate context: sometimes it is time to close the books, we need to actually go and report our earnings, we have to stop testing. But sometimes you don’t necessarily have to do things exactly as they’ve always been done.

Charlie Camarda:

You’re absolutely right, Geoff. Hey, we signed up for this. If the odds were one in a hundred, we’d get on the vehicle. I mean, we knew what the risks were, and every astronaut, I’m pretty sure, knew what the risks were and knew that those were the risks. But when you have unnecessary risk, when you know the simple solution is to just change out the defective panels, just like changing out a bad tire before you fly, it’s a no-brainer. It really doesn’t incur much cost, much schedule delay. It’s just the arrogance of the people in charge. And really, if you could boil it down, all 40 terms that sociologists and psychologists use – it’s arrogance, a tremendous amount of arrogance and hubris, that the commander, the head of the astronaut office, the head of NASA said, ‘you know what? I say we should fly.’

And there was really no reason for it. I mean, we almost took a hit from a large piece of foam on our flight. The next flight, we told them these ice frost ramps would come off. They came off, they almost hit the wing, and it could’ve very well been another disaster, but we lucked out. And these same managers on the mission management team were high-fiving themselves when the astronauts called down and said they did an inspection and there was no damage. So they really didn’t get it, and the culture didn’t change. Right? And I think you’re going to see the same thing at some of these other organizations where you have had tragedies, and I really don’t believe the people at the top really get it.

And so you have to have very strong leaders at the top. You have to have it trickle down, but you also have to monitor every one of those teams, which is why I believe we need to do it in an effective way using artificial intelligence and machine learning. So that a person’s job isn’t on the line; it’s the machine telling you, you know what? This team is not performing well, you need to take a look here, you need to take a look at this technical problem.

Geoff Tuff:

So it sounds then – and it’s interesting, because when we talk about provoking, people assume that we’re talking about a certain amount of aggression and maybe arrogance to be able to stand up and be provocative. It actually sounds like the best provocateurs approach situations with humility and a learning mindset that is the opposite of arrogance.

Charlie Camarda:

Well, Geoff, it’s nice if you can do that, and believe me, I tried that. I tried being humble. I tried being a team player, and I was getting beat up. I was being threatened physically and my career was being threatened. Believe it or not – not that I was really worried – but they threatened me. And I basically had to put my job on the line, my career on the line; my family definitely took the brunt of this. And I had to say, you know what? If you don’t change out these wing leading edge panels, I’m going to go to the New York Times. We can play this out in the court of public opinion. I didn’t want to be a whistleblower. I tried my best to work with them within the organization.

And eventually some senior researchers just saw that what I was saying was true. They finally read the hundreds of pages of technical documents I wrote trying to explain what this anomaly was and why it could be critical. And even my own organization, the NASA Engineering and Safety Center, finally relented and said, ‘yeah, we should change out the panels’.

Des Dearlove:

That’s amazing. Let’s just change the tone of this a little bit. Let’s go somewhere a little bit brighter. What’s it like up there in space? What’s it like to look down? I mean, you are one of a handful of people who have had that experience. I’m sure everybody asks you, but we have a duty to ask you.

Charlie Camarda:

I know you’re duty bound, and you did. And I’m not one of those poetic, spiritual types that had this life-changing experience. I was up there to do a job, I was only there for 14 days, I was a workaholic. We were working like crazy. Andy Thomas used to grab me and drag me up at the end of the day and put my face in the window so I would look out the front window: look at the Southern Lights, look at the way we see these amazing things on Earth. You really do. We got to see a sandstorm moving across the Saudi peninsula. It’s just beautiful. But to me, the amazing thing was working on that team and making the magic happen with the small team we had in space, working with the larger team we had on the ground, because we really were connected with the Friends of Charlie network on the ground.

I had their phone numbers in my little handbook that I took up with me, and I could call them on my computer and I could call them on their phone. And if we saw something that we didn’t like, I got it straight from the horse’s mouth; I didn’t have to go through mission control. We also, on our mission, had an agreement with the flight director that every night before we went to sleep, the astronauts would just talk to the head of the astronaut office. No flight directors were allowed on the line, no one else, and that was totally unheard of, because we actually didn’t trust what the people on the ground were telling us, because of what happened during Columbia – which is really amazing when you think about it, right?

Geoff Tuff:

There really are just so many amazing analogies between the experience you had and what the average corporation experiences – and I spend most of my life serving large corporations. I’m imagining someone dragging someone by the collar out of their cubicle and making them look out the window to see that there’s real life out there and stop working so hard. So, Charlie, we mentioned at the beginning that you’re working on a book, and I’m guessing that some of what we’re talking about is going to find its way into it. Can you tell us a little bit about what you’re working on?

Charlie Camarda:

So if I had written this book 17 years ago, when all this bad stuff happened to me, it would’ve been a totally different book. Yeah, there’s going to be some of that – look how bad NASA was; how could an organization as prestigious as NASA do these terrible things? And you have to do that to some extent to make people realize this happens in every corporation, just like you said, Geoff. And the second half of the book is really going to talk about how you fix NASA. How do you fix other corporations that lost their core ideology, lost their research culture, their intelligence, and lost this connection? How do you create these networks, these team-of-teams networks, to get just the right people interacting and collaborating in a cohesive, synergistic way to make the magic happen and understand the true root causes of these anomalies?

Because once you understand the true root causes, you can fix them. It’s when you only think you understand the root causes that you don’t really fix them. So if people had just changed the rubber on the O-rings, we would’ve had another failure. It really wasn’t an O-ring failure, it was a structural joint failure, and they had to redesign the entire joint. But if you just read the headlines, it plays out like it was just the cold temperature and that material, that rubber material.

Geoff Tuff:

Do you have a title?

Des Dearlove:

What I’m hearing with the book – knowing you, too – is that this isn’t about throwing stones or trying to get one over on anybody; this really is about solving problems. That’s what’s driving you to write this book.

Charlie Camarda:

Absolutely. And so I want to teach other organizations how you do this. Because what I show with these case studies is the simple teams that I was able to form. And I didn’t lead some of these teams – I created some of them and I handed them off, but I picked the right people, who had the right chemistry, and just followed them at a distance to make sure they did what they needed to do. The repair team, I had a hand in, working in my friend’s garage and coming up with some of these ideas, but I still had to hand that over because I had to fly in space. But it’s really about trying to teach corporations how to fix these problems.

Geoff Tuff:

So I asked before – I spoke over you, Des, I apologize – but do you have a title, Charlie? There are all sorts of really cool things you could call it.

Charlie Camarda:

I’m still kicking it around, and I’m fighting with some of my coaches and my managers, but I really believe in one of the titles we’re kicking around: Houston, You Have a Problem: Why Large Corporations Fail and Continue to Have Recurring Disasters, and How to Fix the Problem.

Des Dearlove:

That’s a great title. Hey, listen, we’re out of time. It’s been an absolute pleasure, but that’s all we have time for, I’m afraid. Huge thanks to our guest, Charlie Camarda, and to you for listening. This is the Provocateurs podcast, brought to you by Thinkers50 and Deloitte. And we’re Des Dearlove and Geoff Tuff. Please join us again soon for another episode. Thank you.

Charlie Camarda:

Great job guys. Thank you.

Geoff Tuff:

Thanks, Charlie.

