
Colloquy Podcast: How to Succeed in Business by Failing—Intelligently

Fail fast, fail frequently, and learn from it. That's the mantra adopted by many Silicon Valley firms in recent years. Fine. But would you tell that to your emergency room doctor? Or someone who's managing your retirement funds? Or the pilot of your next flight? 

Harvard Business School Professor Amy Edmondson says that the key to squaring this circle is failure-proofing critical areas where best practices are well known while encouraging experimentation in new fields where useful and productive knowledge can be gathered. That means building a sense of psychological safety among colleagues and coworkers that fosters trust, open communication, and new ideas.  

This transcript has been edited for clarity and style. 

So before we get into your research, I'd like to talk just a bit about how you came to it. You mentioned in your book that you had your own stomach-churning experience of failure as a PhD student at Harvard Griffin GSAS, which led you to develop the concept of psychological safety. Could you share more about that experience and how it shaped your research interest? 

Sure. Well, I was a second-year PhD student. And I had said yes to joining a team of medical researchers, some at the Harvard Chan School, some at the Harvard Medical School. I was asked to join that team to look at medication errors. My small part of the study was to ask the question—do better teams, with better teamwork, which I could measure with a previously validated survey instrument called the Team Diagnostic Survey, have fewer errors compared to less good teams? 

Amy C. Edmondson, PhD ’96, is the Novartis Professor of Leadership and Management at the Harvard Business School.

So that seemed like a pretty straightforward research question and one that had a high likelihood of working. I was collecting the survey data, as I said, and trained medical investigators were going team to team in the hospitals to gather data on error rates. Now, let's fast forward six months. I finally have six months of error data, having collected my survey data in month one. And all I have to do is put the two data sets together and run the correlations. 

And lo and behold, first the good news: There is a statistically significant correlation between the two data sets. So that was good. And then, suddenly, I looked more closely and realized that the correlation was in the wrong direction. The data suggested that better teams, according to the team diagnostic survey, had higher—not lower—error rates. So that really threw me. And, yes, it was a stomach-churning moment. I had failed desperately to support my hypothesis. 

In fact, not only did I not have the data to support the hypothesis, but the data suggested I was 180 degrees wrong. And my brain went spiraling outward to, "I'll never make it as a researcher. What am I going to do after I leave graduate school—drop out of graduate school?" None of that thinking was very healthy or very productive. But it didn't take too long for me to pause and think, OK, what's really going on here? Why might better teams have higher, not lower, error rates? 

And it suddenly occurred to me that maybe the better teams, better leadership, better relationships in the teams were more able and willing to talk about mistakes than the other teams. Now, I was able to do some secondary analysis that at least supported the plausibility of that argument—that better teams are, by the very nature of teamwork, more candid, more open, more willing to speak up. Much later, in a second research project where I was studying this notion of interpersonal climate on purpose, I called that phenomenon "psychological safety." I developed a measure of psychological safety, which is roughly translated as the belief that your environment, your team, is safe for speaking up. And that measure of psychological safety turns out to vary dramatically across teams, even in the same organization, and to be very predictive of team learning and team performance. 

What you're describing there is so fascinating. Something that presented itself to you as a terrible failure actually inspired you to reframe the way that you were looking at the problem. Is that what you mean by the term "intelligent failure"? 

Yes. And my disappointment at not being able to support my hypothesis, in other words, my failure, is, in fact, part of the research journey. You're not a researcher if you don't have any failures. If all of your hypotheses are supported by the data, you're just not venturing into new territory. 

So my pausing to think about it wasn't so much reframing it as trying to figure out what's really going on here. So I wasn't able to get the data to show this simple relationship I thought must exist. But instead, I was able to get data that made me think deeply about what really goes on in these work environments. What does it mean to be working in a team, in an environment with high stakes? And what does it feel like when you can't speak up honestly about a mistake or about not knowing what to do? 

I mean, it began to seem to me that that was a pretty important phenomenon. It was not a phenomenon I'd ever thought about before. But I began to think, in an uncertain, complex world, people need to keep learning. And if they feel reluctant to speak up and ask for help and admit a mistake, they're not learning. And their teams aren't learning. And this is a problem. So I got interested in a different problem than the one I had failed to get the right data for. And that new problem actually had legs. That new problem ended up giving rise to a research literature in organizational behavior that has been very influential and is really quite large. 

Many people have used my measure. There are well over 1,000 peer-reviewed studies on this topic. So if I hadn't failed, I wouldn't have had this subsequent success. And I think that's true in all research. Scientists are people who withstand the disappointment of failure because they've chosen to work in fields of discovery where you just don't know enough a priori to go straight for the target. There will be failures along the way. And, yes, those are largely intelligent failures—an undesired result of a thoughtful experiment in new territory. And by thoughtful, I mean it's no bigger than it has to be and it's driven by a hypothesis. 

So researchers, by definition, if they're any good, are experiencing intelligent failures. They're also experiencing some spectacular successes and discoveries. But the nature of becoming a scientist is that you accept that there will be failures along the way. And most of them are intelligent. And I will say, you can be a scientist and accidentally mix up the chemicals in your experiment and get a failure. But that one wasn't intelligent, because it was born of a mistake. So mistakes are, as you said in your introduction, mistakes. When we're at our very best, we can prevent them. And we can catch and correct the ones that we don't prevent quickly enough to not cause harm. 

You've used a couple of different terms. You've used "failure." And you've used "mistake." Are they the same? 

No. No. In fact, it's one of my pet peeves that people often treat them interchangeably. So a mistake is a deviation from a known practice or knowledge base. So you can't make a mistake in new territory. It's just nonsensical. A failure is a bigger category. Some failures are caused by mistakes. But other failures are simply the undesired result of an experiment. 

You mentioned that, out of your experience of failure when you were a graduate student, you developed this idea of psychological safety. What does your research show about the benefits of creating a psychologically safe environment? How does that contribute to intelligent failure and organizational success? 

Well, the benefits of a psychologically safe environment are roughly in two categories. One is the freedom to experiment. You feel more willing to take risks—the necessary risks of science or the risks of learning. If you're in the classroom, you might raise your hand and ask a question that someone might believe is a stupid question. But you don't worry about that because it's a safe environment. And you know that you need an answer. And it's worth asking. 

So you might be willing to try a very bold experiment. So psychological safety in an organization gives room for experimentation, which ultimately can produce innovation and wonderful new products and services that contribute to the success of the organization over time. The other bucket of factors relating psychological safety to performance is people's willingness to speak up quickly when they see someone else making a mistake or when they're not sure what to do. And the performance benefits are more expressed in terms of excellence. They're about people being able to do a great job at things we already know how to do. They might be people in a manufacturing setting, or in a customer service setting, or, indeed, in a patient care setting who, because they feel psychologically safe enough to ask, avert a consequential medication error. 

Who does this well? What are some companies, some organizations, that have successfully implemented these principles? And can you talk about some of the outcomes? 

IDEO is a company that I love to study and enjoy talking about. It's an innovation consultancy. They work with clients to help them develop highly innovative, attractive products and services. And this is an organization that recognizes—remember, one of the benefits of psychological safety is that ability to experiment and speak up—that you can't innovate if people are tied up in knots, reluctant to take any risks. You just will never come up with anything new and useful if people aren't willing to take risks. 

So the company and its leaders have worked hard to build that kind of playful willingness to take risks, willingness to speak up, willingness to share wild ideas, ask for help, and all the rest. And they've been incredibly successful for the last 25 years or so. Pixar is another company that I love to highlight with respect to this question. It's a company that has been consistently successful in developing profoundly compelling, innovative, wonderful movies with great messages that are truly entertaining and truly smart. And again, in order to do that, they've recognized they must be willing to critique themselves along the way, that people can't hold back their opinions when something seems boring or doesn't look compelling. 

They've got to create the kind of candor they need to be honest with each other behind closed doors so that when the products are finally released to the public, they're great. So both of those companies are very obviously creative industry companies. They're organizations that do nothing but create new things—so let me go to a very different context to give you a third one, which is Toyota. Toyota has had some wonderfully innovative products over the years, but the bulk of what they do is the production of high-quality vehicles day in and day out, in a highly routine way. And they have worked hard over decades to create psychologically safe environments where every single team member, every front-line associate knows that they are not only allowed—they are encouraged, they are expected to speak up if they see some deviation or even potential deviation from perfect. They are invited, celebrated, and thanked for speaking up, for taking those interpersonal risks. 

In describing your experience when you were a graduate student, you described your initial reaction, the stomach churn, the horror. You said that you briefly considered dropping out of graduate school. You describe that as unhealthy and unproductive. But it also seems to me to be pretty normal. No one wakes up in the morning and says, "Boy, how can I fail today?" So how can individuals and organizations manage that, I think, very natural human response to failure so that we're willing to go through with it? 

Well, in Right Kind of Wrong, my recent book, I describe three competencies that are partial answers to this question. And they are self-awareness, situational awareness, and system awareness. And self-awareness, I think, is probably the most important in answering your question because it's the self-awareness to pause, to disrupt the automatic thinking. We all have automatic thinking. Something goes wrong, or you're late for a meeting. And your brain just starts spiraling out. Your self-talk starts spiraling out in unhealthy ways. 

And the job of adults is to learn how to slow it down and say, "Hold on. No, this is inconvenient, not terrifying or terrible." So it's learning to pause and insert a healthy dose of productive analysis or rational thinking around the situation. "OK, this didn't turn out the way I wanted. What do we learn from it?" And I know that sounds cliché and trite. But it's a powerful reorientation to just remind yourself, "OK, this wasn't my first choice. But it sure is interesting. What does it teach me? What's going on here?" 

And situational awareness may be even more important because if you're in new territory, if you're 5 years old and learning to ride a bicycle for the first time, it won't work the first time. When you're learning to do advanced linear algebra, you're not going to get everything right; if you did, it wouldn't be a very interesting sport, would it? So there's something about just pausing to say, "This is new territory. My job here is to take risks, to get it wrong on the way to getting it right." 

That's a great segue to my next question, which is about the Stanford psychologist Carol Dweck, who I'm sure you know. She's done a lot of research on fixed versus growth mindsets. And in her book, Mindset: The New Psychology of Success, she writes, "In the fixed mindset everything is about the outcome. If you fail or if you're not the best, it's all been wasted. The growth mindset allows people to value what they're doing, regardless of the outcome. They're tackling problems, charting new courses, working on important issues. Maybe they haven't found the cure for cancer. But the search was deeply meaningful." 

So in some sense, I think Dweck might say that there really are no failures, particularly if we learn from them. I wonder how this framework intersects with the notion of intelligent failure. And what strategies can individuals use to build resilience and learn from the times they fall short in their personal lives and careers? 

Well, I do write about growth mindset in Right Kind of Wrong. And it's absolutely tightly connected to this topic. In fact, my son Nick, when he was about 8 years old, maybe 7 years old—we were skiing at Wachusett near here—asked me to watch him come down the hill. "Would you go to the bottom, Mom, and watch me come down?" And I said, "Sure." So I did that. He came skiing down. And he said, "How did I do?" I said, "You did great." 

And he looked at me. I'll never forget it because he literally looked disappointed in me. And he said, "Can't you tell me what I did wrong, so I can get better?" And I just thought, "Oh my goodness, I've got a kid with a growth mindset. This is incredible." I mean, I've done as much as I could over the years to emphasize process and not outcome, as Carol recommends. But I realized, it's real. He really is thinking that way, to his credit. 

And he was disappointed in me. And I think my answer was one that I've heard a lot of parents give. And I think a lot of kids might not have been disappointed. They might have been pleased. But Nick knew he wasn't an Olympic-level skier. He was just a little kid trying to learn. So he was like, "Gosh, Mom, I thought you'd be more helpful than that." And a growth mindset literally believes: I am getting smarter, better, stronger, when I stretch, when I try things. 

And also, again, if everything's working out, I'm probably not taking enough risks. And a fixed mindset says, yeah, the outcome is what matters. And if I don't get an A, that means I'm not smart. The growth mindset says, if I don't get an A, it may be because I took a wicked hard course, and it's still a little bigger than I am, this body of knowledge. I think it's almost more helpful to talk about a growth mindset as a learning mindset. It's a mindset that truly values learning and appreciates the role of learning, not just for children, but for all of us, because we're living and operating in a world that keeps changing. So we have to keep learning. 

What you're describing there is the hardest thing for organizations to do. It's a culture change. And particularly in a society that values success and achievement, there may be dynamics in there where people are scared. "Oh, I'm going to lose my job if I take a risk and it fails, if it doesn't work out." How do you change the culture? 

Well, this is where context matters, right? Because you alluded at the very beginning to the operating room or to passenger air travel. Those are contexts where, fortunately, our knowledge and our protocols are very mature, very deep. And the best advice is follow the protocols, pay close attention, try to produce as close to an error-free performance as humanly possible with the help of your teammates. That's the culture you want in the operating room, in the cockpit. 

You do want a culture of psychological safety, where people are speaking up quickly when they're not quite sure about something. Or they see you doing something wrong. Call it out, immediately. So the culture is one of learning, whether you're in passenger air travel or in a scientific laboratory. The way I think about context is that it has two dimensions that we just need to be ever mindful of. One dimension is, how much uncertainty is there? If you're in a scientific laboratory trying to cure cancer . . . huge uncertainty. It is just unlikely that today you will wake up and have the answer. It just isn’t—it’s not going to happen. It’s going to take lots and lots of false starts along the way. And if you are on the automotive assembly line, the odds of you being able to produce a perfect car today are nearly 100 percent. So uncertainty is one dimension. And the other is, what are the stakes here? What’s the risk of things going wrong? 

If you’re in passenger air travel, the stakes are high. If you’re in the operating room, the stakes are high. So you’re thoughtful about what risks we can take and what risks we cannot take. I’m a big fan of taking interpersonal risks, of speaking up when you’re not quite sure about something—but not taking wild-eyed experimentation risks mid-flight. You don’t do that. You do that in the simulator. So it’s not that we want a culture where everybody’s willing to have intelligent failures all day long. We want a culture where people are very thoughtful about context. And then because of the necessity of intelligent failures in progress in new territory, they’re willing to endure the temporary setback of intelligent failures in, say, scientific and innovation realms. 

Finally, for leaders who are looking to act on these concepts, what would be your top three pieces of advice for fostering a culture that supports intelligent failure and innovation? 

Well, the most important advice is, early and often, call attention to the nature of the challenge that lies ahead. Because I think we—people in companies or in organizations of all kinds—are at risk of assuming that we already know how to do things. We have a kind of industrial-era mindset that says, "I'm supposed to just execute. I'm supposed to do my job, do it well, and go home at the end of the day." And people need to be reminded that, no, we're striving to do something new, exciting, and uncertain that we haven't done before, and that I need their help. 

So it’s really helping people understand both the challenge and the excitement of the opportunity that lies ahead. And then leaning into inquiry: "What do you see? What ideas do you have?" I think when leaders show up in a way that clarifies that they are really eager to listen to, learn from, and collect ideas from others, that’s the biggest contributor I can think of to a learning environment. And finally, monitor your response. Be very careful not to get angry, upset, or in any way humiliate or punish someone for saying something that you didn’t want to hear. 
