Scott Young 00:01
I do think we can build things into meetings and agendas and processes that not only allow space for curiosity and discussion, but actively try to foster it.
Gary Klein 00:14
One of the most important contributions of a premortem is to help create a culture of candor in a team, where people aren't afraid to say things that might be unpopular.
Wieke Scholten 00:26
What I also try to do in organizations is to normalize talking about things going wrong, because things do go wrong. And that's fine, it's realistic. I love the word realism: at work, things go wrong, we look at it, we learn from it.
Michael Hallsworth 00:41
If you just assume that you're in a meeting, and the point of the meeting is to come up with the best solution and then deploy it, that is a different mentality from saying, we don't fully know, and we know that details matter, so we're going to try a few different things and test them. That requires some openness to the idea that you don't know.
Preston Cline 00:59
The bottom line here is what really matters. It's not the data that matters. It's the narrative that matters. As a leader, it actually doesn't matter so much what you think went wrong and went right. What matters is the story that your people are telling themselves tomorrow, about the team, themselves, and you.
Jeremy Brisiel 01:18
This is Bank Notes, Banking Culture Reform. The views expressed in this podcast do not represent those of the New York Fed or the Federal Reserve System.
Toni Dechario 01:28
Hello, and welcome to season three of the banking culture podcast, part of the New York Fed's initiative to drive awareness and change in financial services culture. This season we'll speak with experts on how organizations can build curiosity and learning mindsets into their cultures. We'll explore tools that are immediately useful for the information they uncover, but whose repeated use can, at a deeper level, create a culture that treats mistakes as opportunities for improvement rather than moments that instill fear. My name is Toni Dechario, and I'm with the New York Fed's culture team.
Toni Dechario 02:01
So welcome, Gary, thanks for joining us today. For those in our audience who aren't familiar with your work, could you briefly introduce yourself?
Gary Klein 02:11
Yes, I'm Gary Klein. I'm a cognitive psychologist. I've been working in the field of naturalistic decision-making for several decades trying to help people understand how decision-makers actually function.
Toni Dechario 02:27
Can you describe briefly what naturalistic decision-making is?
Gary Klein 02:31
Right, so naturalistic decision-making is the study of how people make decisions in real-world environments, as firefighters, nurses, physicians, pilots, military officers. The naturalistic decision-making community studies them, tries to understand where they get in trouble, what their skills are, how they develop expertise. You can see it as a contrast to studying decision-making in a laboratory, where you give college sophomores tasks they've never seen before, but then you don't understand the effect of experience, and you don't see how it plays into it. The naturalistic decision-making community, there's several hundred members around the world, doesn't limit itself to the laboratory, but goes out and learns different domains in order to get in sync with the experienced people who are making really difficult decisions under time pressure and uncertainty. Naturalistic decision-making was started in roughly 1989, so it's been going over 30 years. A number of books have been published, and anybody who's really interested can join the naturalistic decision-making association and attend its conferences. The next conference is going to be held in New Zealand at the beginning of July 2024.
Toni Dechario 04:16
Sign me up, I want to go. So, in this season of the podcast, one of the things that we want to delve into is how to improve decision-making by building cultures of curiosity and learning and encouraging curiosity and learning in an organizational culture context. Can you maybe start off by describing why it's important to intentionally build cultures of curiosity and learning?
Gary Klein 04:48
Basically, the situation we face keeps changing, technology keeps changing, challenges keep changing. The routines that we've learned, don't apply the way they used to. And so, if you're not continuing to learn, then you're going to get trapped. And one way to learn is to notice things that aren't working the way they used to, to wonder about what you could do better. And that's the essence of curiosity. So, curiosity is an essential ingredient for organizations to move forward.
Toni Dechario 05:30
What are some obstacles to building cultures of curiosity? What, what gets in the way?
Gary Klein 05:35
So, the problem is there's just lots of barriers to curiosity. It's very easy to sit back and say, yes, we want organizations to promote curiosity. But the fact is, they're organizations. And curiosity leads to insights, which are disorganizing. So, there's a natural tension between encouraging curiosity and a smooth-running organization, especially for the managers in the organization, who want to carry out the tasks within a timeframe, within a budget. And now you've got people on your team who are curious, who are coming to you and saying, "Hey, you know what I just noticed? If we do this, we can really improve our product. We can make all kinds of great changes." And you're thinking, "Oh my gosh, how much time is that going to add? What if it doesn't work? What are the risks? Do we just stick with the game plan and not make any of these kinds of digressions?" So, there's all kinds of pressures within an organization not to be curious, but to just hunker down and perform the tasks as they were originally described. If you're working with a vendor, there's a schedule of payments. And now you're going to go to your vendor and say, "By the way, we're changing our plan, and we're moving the whole timeframe." And the vendor is thinking, "Well, how am I going to get paid? How am I going to pay my employees? Because everything seems to be up for grabs." If you want to train people in various kinds of environments, you'd think we would encourage curiosity. But the way instructors work, they're thinking, "I'm an instructor, I'm supposed to give this platform lecture, I've got an hour to give the lecture, I've got all these PowerPoints to get through. And now you want me to encourage curiosity on the part of the people, you want them to ask questions, and I'm looking at my watch: am I going to get through the PowerPoint? Can we just hunker down and follow the script?" You see the barriers.
Toni Dechario 07:52
Presumably, if you're in a large organization, senior management would love for people to be offering up things they've noticed, suggestions for improvement that are inconvenient, perhaps, for the people that they're reporting directly to, but ultimately are going to improve the bottom line, right, are going to improve how things work going forward. So how do you overcome that obstacle of the person who is immediately inconvenienced by that curiosity, by that speaking up?
Gary Klein 08:24
I'm not sure it's that easy to overcome that inconvenience, because organizations always say we want to encourage innovation, and we want to encourage creativity, we want to encourage insights. But for the reasons I described, insights are inconvenient. And discovery is inconvenient. And people just want to do their job. And you'd think that they would be oriented towards the bottom line. But the bottom line is kind of distant; that's something that the finance people have to worry about. I've got to worry about getting my projects done within this schedule, within the resources. And yet, there may be a better way to do it. But that's risky. I don't want to incur that risk. I mean, I'm just sort of putting my –
Toni Dechario 09:20
Yeah, "this is how it's always worked."
Gary Klein 09:22
This is the way it works. And that's why organizations are just very risk averse. And, you know, they may have a policy, let's have a consensus decision-making framework, where everybody has a chance to weigh in, which always sounds harmonious, but it's really terrible. Because if you use consensus, that means everybody on your team gets a veto. And I guarantee there's going to be at least one person on your team who's going to be unnerved by what the new direction might entail and is going to resist. So, you're going to have lots of barriers to curiosity and innovation.
Toni Dechario 10:13
So, you either end up with the lowest common denominator…
Gary Klein 10:14
Yes.
Toni Dechario 10:15
…or worse than what you described, people kind of don't veto, because they go along to get along.
Gary Klein 10:24
Right.
Toni Dechario 10:26
So, you're known for a number of specific tools in behavioral science that help actually build more curiosity and learning into organizational cultures. One of them is the premortem, and I want to make sure we talk about the premortem. Can you describe what that is?
Gary Klein 10:47
The premortem is a technique for evaluating plans, even as you're beginning to carry them out, as a form of risk management. And I have to be honest, I never expected the premortem to become any kind of popular technique. We just started using it in my company, and then I published a short article on it in the Harvard Business Review, and my friend Danny Kahneman talked about it at Davos, and now people are doing premortems all over the place. So, here's how we started: in my company at the time, this would have been in the 1980s, most of our projects went very well. But not all of them. Some of them failed. Not terribly, but disappointingly. After the failures, we said, let's do an after-action review, let's see what went wrong. And then one day, I thought, why are we doing this at the end of the projects? Why don't we do it at the beginning of the project and imagine what might go wrong? So, we started this premortem routine in our kickoff meetings for projects. And the way it works is, you have your team assembled around a table, and you'll have been going for an hour or two. This is your kickoff meeting, that's usually where we do a premortem. Everybody knows what the plan is, they know what their role is in the plan, that's all nicely nailed down, and you say, now we're going to take the next 20 minutes or so, maybe half an hour, usually no more than that, and we're going to do this premortem. And in the premortem, I tell everybody: now, relax, everybody lean back in your chairs, you've got a piece of paper in front of you. And I am looking into an imaginary crystal ball; I actually do have a crystal ball someplace, but we just have people imagine that I'm looking at the ball.
And it's now six months from now, or a year, whatever is the appropriate time frame, and I'm looking in the crystal ball, and oh no, this project has failed. It's been a disaster, it has been a fiasco. The people on the team, when they pass each other in a hall, they avoid eye contact, because it's so painful. So, I want all of you to understand that this crystal ball is infallible, the project has failed. Now, I want each of you to take two minutes and write down all the reasons why it's failed, starting now. And then they start writing, and I hold them to the two minutes, and they're writing like crazy about what caused the project to fail. And then time's up. And now I go around the room to see what people have written, and I record it; if it's virtual, I'll do it on a virtual whiteboard, and if it's in person, I'll do it on a real whiteboard. We go around the room, and each person offers what's at the top of their list that hasn't been covered. And we start with the project leader, to indicate what's at the top of his or her list, and then we go from there. And that's how a premortem works. And we've done research on it. The premortem technique seems to significantly reduce overconfidence, and seems to do a better job of that than other comparable techniques, like pluses and minuses or just general planning. Here's why the premortem works so well. It's because the crystal ball is infallible. So that changes your whole mindset. If you just ask people at the end of a planning session, let's take a step back, does anybody see any problems? All the pressure is on people not to see problems, because you don't want to disrupt harmony. And in fact, you may not see problems because you've spent all this time getting ready. You're all enthusiastic. So, the premortem gets you out of that mindset. It changes the mindset. And you say, okay, it's failed, why could that be? And it becomes a competition.
The people writing things down, they want to come up with items that other people haven't thought of, that are reasonable and worth worrying about. Before, if you came with criticisms, you were getting in the way of the harmony of the team. Now, people are competing with each other to come up with plausible problems that haven't been discussed before. That's how you show you're smart and experienced: by the kinds of criticisms you can come up with. Now, the premortem works awfully well. And some people have complained to me that it works too well, because you have a team that's all ready to get started, and now you're doing this premortem to reduce their overconfidence. So, we added another step at the end, where we say: now that we've done this premortem, let's do a backup exercise. Look at all the problems that we've listed. I want each of you to think about what you can do individually to try to keep many of these from happening. What can you do? I'm giving you another two minutes to come up with them.
Toni Dechario 17:03
Why two minutes? Why is it two minutes each time? How'd you land on that?
Gary Klein 17:08
Yeah, one minute was really frustrating for people, because they'd always be writing when I'd say time is up. And if it's more than two minutes, a lot of people would have run out of steam, and they're just sitting there. And I want the premortem to be filled with energy. So, two minutes seems to work. In just about all the teams that I've seen, a few people are finishing before two minutes, but most people are still writing, though they're about done. So, two minutes empirically seems to be the right amount.
Toni Dechario 17:43
Because the way I understand it, what you described is very specific, and these are some very specific steps that you want to follow: you want to make sure that you say the crystal ball is all-knowing, and the project has failed. This is not a hypothetical, it's happened. In my understanding of what you've described, that kind of frees people up to think in that way. It gives them kind of a free pass.
Gary Klein 18:12
Right.
Toni Dechario 18:14
I wanted to ask you about the order that you described, in that the project leader speaks first. Why is that?
Gary Klein 18:23
One of the most important contributions of a premortem is to help create a culture of candor in a team, where people aren't afraid to say things that might be unpopular, and the project leader has to set the tone for this in the premortem by coming up with a good criticism right off the bat. If the project leader hasn't done that, it reduces his or her credibility. So, they've got to show that they're willing to take this risk, and that will free up the rest of the participants to take that risk. So, the project leader really has to set the tone. And in all the premortems I've run, they always have. Because there's a part of us, even though we love the plan, or we want it to succeed, there's a part of us that's kind of excited to think about how it failed. Intellectually, it's like fishing, and this is just this wonderful bait. And everybody goes for it. Because it's a liberating experience.
Toni Dechario 19:41
Yeah, it's kind of fun.
Gary Klein 19:42
Yes, it is.
Toni Dechario 19:43
To imagine what's happened.
Gary Klein 19:45
Right.
Toni Dechario 19:46
Yeah. Yeah. How long? How long does a premortem take?
Gary Klein 19:53
Twenty minutes or so, and the reason is, we respect people's time. We know that in a kickoff meeting, people are taking time out of their schedules. We don't want to overdo it. I have a friend, Bryce Hoffman, who runs these as part of his red team sessions that go for two days. He has clients who are willing to spend that amount of time, but for trying to insert this into the very ordinary routine of organizations, twenty minutes seems bearable.
Toni Dechario 20:39
You said that you generally do this at a kickoff meeting. So, does that mean kind of after you've laid out the bones of a plan, before you start executing the plan?
Gary Klein 20:51
The reason for that is that you can do the premortem most effectively when everybody knows what the plan is. So, you wait until the plan is described, and the roles are assigned, and everybody understands it. Now you're in a much more knowledgeable place for doing an effective premortem.
Toni Dechario 21:13
And what kinds of projects lend themselves best to this tool?
Gary Klein 21:19
I think this tool would work for any kind of project, especially one where there's a schedule, where something is supposed to be delivered at a certain time. You can do it with a team of people who know each other, but I've also done it with teams of people who were just meeting for the first time. I've had friends do it with organizations that were about to be reviewed by government agencies, and they wanted the review to go well. So, they had a plan for how they were going to describe their work, and they said, this is too important, let's do a premortem, just to make sure we're more prepared. And so, they use it there.
Toni Dechario 22:15
And you mentioned that one way in which a premortem is different from red teaming, for instance, is that it tends to be much shorter. Are there other distinctions between running a premortem versus other types of forward-looking analyses, like red teaming, or red hatting, playing devil's advocate?
Gary Klein 22:42
So, I believe that the premortem is included in some of the red team activities; I think the army has included the premortem, and Bryce Hoffman includes the premortem in his red team activities. The premortem has caught on because people resonate to it. They don't feel their time is being wasted, quite the reverse. Organizations get to a point where people become nervous if they're under too much time pressure to get started and they haven't done a premortem. It's like driving, when you realize you forgot to put your seatbelt on, and all of a sudden you're feeling unsafe, because you're now in a riskier space than you wanted. The devil's advocate is a technique that I like, but I'm not completely up on the research, and I may be wrong about this, but my impression from some of the research I read is that the devil's advocate technique, where you appoint somebody as the official critic, didn't have much of an effect. As the article I read speculated, it didn't have much of an effect because, if I appoint this other person to be the critic, that means I don't have to be the critic. And when the person raises a criticism, I don't know if I necessarily need to believe it. Are these real criticisms that the person experiences? Or are they just playing a role? I don't know if they're sincere. So, I don't necessarily take their criticism all that seriously. The premortem works because the people coming up with the problems are the ones who are going to be carrying out the plan. If they have criticisms, they're genuine ones, coming from their own concerns. They're not just making things up because that's the role they're playing.
Toni Dechario 24:51
Right, and you get the ideas of everybody in the room, as opposed to one person who's been assigned this role of poking holes.
Gary Klein 24:59
Right? Everybody's in it together, right?
Toni Dechario 25:02
Yeah. You mentioned different types of teams that conduct premortems. Some of them know one another; some of them don't know one another that well, necessarily. And you also mentioned that people kind of get used to this, start to come to expect it, and feel uncomfortable if they don't do it. Those all sound like markers of culture, and of how premortems impact culture. And I was hoping you could talk a little bit about what you think the broader cultural impact is of having a practice of conducting premortems.
Gary Klein 25:41
You are promoting a culture of discovery. You're promoting a culture where people expect to hear things that they hadn't thought of before. As I mentioned before, you're promoting a culture of candor, where people are saying things that ordinarily might have been unpopular. Now they're given permission to say it, and they're given respect if they come up with ideas that hadn't been heard before. You're creating a culture where people are listening to each other, sitting back and saying, "Well, I wish I had thought of that!" And they're appreciating the intelligence and the experience of their colleagues based on what they're hearing, because too often we don't get a chance to see our colleagues' minds at work. The premortem showcases that. So those are the cultural changes that I think the premortem has achieved.
Toni Dechario 26:55
Yeah, that's interesting. Do you have any examples that you can share of times that you've seen any of those things happening?
Gary Klein 27:06
The only examples are from my own companies, where people become used to the premortem technique, and I'm for the opportunity it offers them to learn from each other. So, I feel it's had a positive impact on the companies where we've been using it and where I get to interact with people on a daily basis. I haven't studied follow-ups of other organizations, so I really don't have any evidence of that. I hope it would help them, but I can't claim that it does.
Toni Dechario 27:58
Because what I find interesting from a cultural perspective is how, because everybody in the room has to say something, you might be surprised when someone you didn't expect, someone you didn't have preconceived high expectations of, comes up with something that actually changes, potentially, the outcome of your project or whatever it is you're working on. I think that's a really interesting cultural implication, that it opens people's minds up to the fact that there are things they haven't thought about, in a way that I don't think we have that many opportunities to do sometimes. Okay, well, I want to talk a little bit more about the premortem before moving on to a couple of other tools that you've pioneered. Importantly, I want to ask about the conditions that you think need to be present in order for the premortem to be successful. So, we've talked about the specifics of how to conduct a premortem: you want to make sure that you've established that the crystal ball has said the project has failed, there's no question that it's failed. Everybody gets a chance to talk. We take two minutes to write down our thoughts, the leader goes first. And importantly, to build confidence back up at the end, we talk about how we can mitigate some of the problems that we identified. But all of that assumes a particular environment that will allow for those conversations to take place. So, can you describe some of the conditions that you want to make sure are present before conducting a premortem?
Gary Klein 29:44
I confess, we don't worry about it that much. I've never seen a premortem fail. It's just too engaging. People get into the exercise. I once was doing a workshop at Columbia Business School, and people were making sure that nobody was looking at what they had written, they were really eager. And I think that's one of the reasons it's become so widespread in so many communities, including Wall Street.
Toni Dechario 30:22
Hmm, that's so interesting, because I was expecting you to say, you know, hierarchy can get in the way, or a lack of psychological safety can get in the way, but you've never seen one fail.
Gary Klein 30:32
A lot of people say, don't be afraid to speak your mind. But you'd be an idiot to take that seriously, because you know that there can be repercussions. So, you're going to be very guarded. But with a premortem, with the leader going first and voicing a possible problem that nobody had mentioned before, that sets the stage. Instead of saying, I want you to feel psychological safety, you're now demonstrating it, you're living it.
Toni Dechario 31:09
And in fact, the thing that people are afraid of is not coming up with a problem, right? As opposed to normal day-to-day life, where they're afraid of raising a problem. Yeah. Really interesting. So, I wanted to talk to you about a couple of other tools that you've worked with. I know that you used to work with the military quite a bit, and you work with simulators, which, like premortems, are kind of a form of practice, thinking about issues that do come up, issues that could come up. You also use a very specific tool in your business called shadowboxing, another form of practice. Maybe you could describe shadowboxing, how it works. What are the benefits? What are the potential drawbacks?
Gary Klein 32:09
Back in the mid-80s, my research team discovered how people actually make decisions under time pressure and uncertainty. Our model is called the recognition-primed decision model, and we've written books about it and many articles, and our research with firefighters was extremely convincing. A little over ten years ago, I was having lunch with a group of firefighters in New York. And one of them, a guy named Neil Heinz, who's now retired, described a method that he was starting, that he had done his Master's on. And I said, that's the method that I've been looking for for decades. And that's the shadowbox method. What happens with shadowbox is you put people in tough situations where they have to make decisions; these are simulated situations. And shadowbox is a way for people to see the world through the eyes of experts, without the experts having to be there. So, I might put you through a situation, and I stop the action, and I say, at this point, you've got these four options of what you could do. Rank order which option you would choose first, second, third, or fourth, and write down your reasons. But it's not just about courses of action, because we might continue the scenario and stop it again. And I might say, at this point in the scenario, you've got three goals to pursue; rank order which are the most important, first, second, and third, and write down your reasons. And then we continue the scenario and stop it again. This time, it might be about information: here are five pieces of information, which would be the most valuable for you to get? Now, we've also had a small group of experts, three to five people who have lots of experience and are widely respected, go through the same scenario as you and I. And they've done their ranking of the options and the goals and the items of information. They've ranked them themselves, and they've written down their reasons for why they ranked them. Now we've taken what they've done, and we've combined it.
So, we have an overall ranking from the experts, and we synthesize the rationale, what they wrote down, and we've combined that. So, when you say, here's my ranking for these four courses of action, and here are my reasons, right away I can give you feedback and say, here's what the experts ranked, and you want your ranking to match the experts'. That's the game part: ranking to match the experts. The real training is when I show you the reasons that the experts came up with. You look at what they saw in the same scenario you have just been reading, you look at what they came up with, and you compare it to what you came up with, and you say, "I never noticed that, I never thought about that. I never made that inference. I never worried about that." And so, you're expanding your mental model by seeing the world through the eyes of these experts. But the experts don't have to be in the room; they've already done their work. That's always a bottleneck for training: when do we get access to experts? Here, you do it upfront, in one short session, and they're done. And now I'm providing their expertise to you, and you're getting to become smarter and more sophisticated. We've done research, and we find that after a half-day of this training, people's matches with the experts are about 25% higher than when they started. So, in a short period of time, it's having an impact on their decision-making skills.
Toni Dechario 36:26
What is it that's changing, such that they can so quickly make better decisions after having had this experience?
Gary Klein 36:37
The major thing that's changing is their mental model; they're getting smarter. They're appreciating factors that they might have otherwise ignored. They're appreciating relationships that they otherwise might not have noticed. And they're doing it in a training environment, so they're not being evaluated. We're not claiming the experts are perfect. We're saying, you may disagree with the experts, but they're respected for a certain reason; you at least need to see what they put down, what they were noticing. Another thing it's changing is their mindset, in that they might have a certain mindset about what they need to do. I'll give you an example. I work with police. One of the mindsets we were asked to use shadowbox to change was for police officers who believed that they needed to get compliance from criminals, and even from the public, and that the way to do that is through intimidation. I'm a police officer, you'll do what I say. That's the kind of immediate respect that I demand. And I have weapons to back that up. Well, yes, you can get cooperation through intimidation. But that has negative consequences. And we found that, in the police and military, there were people who are extremely skilled at getting voluntary cooperation. And I remember one police officer telling me he used to try to use intimidation, but he didn't think it was working the way he wanted it to. And now, every time he deals with a civilian or a criminal, he has a mindset that he wants that person to trust him more at the end of the encounter than at the beginning. That moves him out of an intimidation mindset into a trust-building mindset. So, we can create shadowbox exercises where people get a chance to see how the experts would handle it, and what they might do to try to build that kind of trust.
Those are the kinds of changes that we think are possible with this kind of shadowbox training.
Toni Dechario 39:16
That's so interesting. So, you're able to kind of mimic the mindset and the thinking of whoever the expert was that created the expert responses in the shadowbox. Who should the experts be? And I'm asking this specifically because much of our audience is in financial services, and so presumably the CEO, perhaps, or a member of senior management, would want to set the tone for how people are thinking. So, it does matter who the expert is. How much do people have to kind of buy in to the expertise of the expert?
Gary Klein 40:09
They don't have to buy in, but they have to at least listen and be open. If your CEO wants to be the expert, well, maybe the CEO isn't that good. But there's an advantage to the CEO taking the role of the expert: it lets the rest of the team calibrate with the CEO and know how the CEO thinks about things. And that's different from the CEO giving lectures and saying, here are my priorities. That's different from seeing how the CEO ranks options when they've got to make tough tradeoffs.
Toni Dechario 40:51
Yeah, I think that would be very helpful. Okay, we just have a couple of minutes left. I have two last questions; I'll combine them. Hopefully that won't be too much. The first is, do you have thoughts on other suggestions for creating cultures of curiosity and learning? And also, do you have recommendations for further reading, or listening, or watching, that people can do on these topics?
Gary Klein 41:24
Great question. So, what can you do to create a culture of curiosity? There are a number of things. One is shadowbox exercises that break people out of their existing way of thinking, put them in environments they hadn't encountered before, and try to move from there. I was very inspired by a trainer I met once who said that early in his training career, he thought the idea of training was to wait for the people he was training to make a mistake and then slam them. That's how he'd been trained; that's what he thought he was supposed to do. And after a decade or so, he realized that wasn't making anybody better. Now, if somebody makes a mistake, he wonders why they made that mistake. He asks them, and it becomes a joint activity. So he, through his own orientation to be more effective, changed his mindset. You can use training techniques like that for trainers. That's one thing you can do. I have others, but I don't want to miss your last question, which is about books that people can look at, because there are books. And at the risk of a shameless, self-serving bias, I would be negligent if I didn't mention my own. I just came out with a book, Snapshots of the Mind, published by MIT Press in 2022. And the first book that I authored was Sources of Power: How People Make Decisions, back in 1998. I have a few other books, but for people who are interested in my work, those are two good places to get started.
Toni Dechario 43:26
That's wonderful. Thank you. I think that that wraps it up.
Gary Klein 43:30
Thank you very much for the opportunity. I really enjoyed the conversation.
Toni Dechario 43:35
For more conversations like this, as well as publications and other resources related to banking culture reform, please visit our website at newyorkfed.org/governance-and-culture-reform.
Toni Dechario 02:00
Today we're joined by Wieke Scholten from BR Insights. Welcome Wieke.
Wieke Scholten 02:06
Hi, Toni.
Toni Dechario 02:08
Wieke, could you start off by briefly introducing yourself for those that aren't already familiar with your work?
Wieke Scholten 02:15
Yeah, of course. Thanks. I'm Wieke Scholten. I'm a social and organizational psychologist, and I work in behavioral risk management in financial services, which is a bit of a new, still upcoming field, where you look at current behaviors in an organization to prevent future problems. And I have my own practice in behavioral risk called BR Insights, based in the Netherlands but working globally.
Toni Dechario 02:47
Wonderful, thanks, Wieke. This season, we're talking about how to use intertemporal thinking to build curiosity about our own behaviors and decisions. My first question for you is, why should we care? Why should organizational leaders focus on creating a learning and curiosity-based culture at all?
Wieke Scholten 03:06
Well, I would say there are at least two reasons. One is that we know that if you create a learning culture, that improves your performance, which I guess is a really good reason to look into it: innovation, the agility of your organization. But it also prevents problems and future issues. And that's really the work that I'm often concerned with, in risk management or behavioral risk management. There, we know that a learning culture helps you to prevent future problems. And who wouldn't want that, right?
Toni Dechario 03:48
Right. Specifically, you've done a lot of work on error management: how organizations handle errors, and what perceptions about errors are like within organizations. Can you talk a little bit about why it's important to have an open approach to errors specifically, and how attitudes towards errors impact outcomes?
Wieke Scholten 04:12
Yeah, so I'm an organizational social psychologist, and from a behavioral science perspective it's very clear: there's lots of experimental research showing that if you have an open response to errors, that helps your performance and prevents those issues. What I encounter sometimes in organizations is the thinking that it's simply more comfortable, that we create an open environment because we feel better when we can talk about problems openly, without having to worry about consequences or shame. And yes, it's more comfortable. But more importantly, it actually helps your performance and helps prevent those problems. And for that, you need to have a structured look at what went wrong.
Toni Dechario 05:11
I want to talk to you a little bit about terminology. I think that you're pretty specific about the terminology that you use. Can you describe what terms you use and why you use those terms?
Wieke Scholten 05:25
Actually, when you look at the psychological theory I just referred to, the term error management is often used, referring to learning from things that went wrong. So that's also the academic literature I use in my work in financial services. However, when I use the word error in our industry, it's often associated with operational errors, so it's more about operational risk. Well, I think it's really important and interesting to discuss things going wrong more widely. This could be about somebody making a mistake, yes; it could be an operational error; but it could also be a decision you made that turned out differently than you thought up front. So, to really make that a wide definition, I always make sure that I explain a little bit, when I talk about things going wrong, that it entails all of that. You know, Professor Amy Edmondson, who is very well known for her psychological safety work, just published a book about this as well, about how to learn from mistakes. And she actually distinguishes mistakes from discoveries, which I think is very interesting. She says, when it's unknown territory, where there's no knowledge yet, we'd better call them discoveries and not so much mistakes. Because if there's no knowledge yet on what to do, how can it then be a mistake? So, I think that's where she tries to, let's say, experiment with different terminologies to make it more comfortable for people to talk about things that go wrong at work. Terminology, in that sense, is really important to make it more comfortable and safer for people to discuss it.
Toni Dechario 07:24
That idea of treating something as a discovery also seems to reduce stigma, I would think, around something that's gone wrong. I guess not only reduce stigma but turn it on its head and turn it into a positive.
Wieke Scholten 07:39
Exactly, yeah. Turning it into a positive, because you want to look at the opportunity that the thing that went wrong also offers, in terms of looking forward. But what I also try to do in organizations is to almost normalize talking about things going wrong, because things do go wrong. And that's fine. It's realistic. I love the word realism at work: things go wrong, we look at it, we learn from it. But we actively learn from it, instead of just fixing it and moving on. And another thing on terminology that I think is important is that how we deal with things going wrong at work is rarely the same across an organization. So, when we look at learning culture, I think it's important to acknowledge that learning culture differs per subculture. That has to do with one of the basic principles of social psychology: we're social creatures. Even though we think we are individual and rational thinkers, we often think and act primarily as group members. And those groups at work, the teams we work in, the areas, the departments, the groups we identify with, are often very local. What we know from research and practice is that how we deal with things going wrong at work also differs per area. That means it's really difficult to say, we have a learning culture in our organization. It is really about finding out what helps and hinders us to learn from things going wrong in those pockets, those local climates, or subcultures, as we like to say. And that matters, because if you want to create a learning culture, you first have to have a bit of a local view to know how to do that.
Toni Dechario 09:52
And how would you describe kind of different subcultural approaches to mistakes?
Wieke Scholten 10:02
So, there I strongly rely on what we know from experimental research as well. In the Netherlands, there's a group of researchers at the Vrije Universiteit, for instance, who have done quite some research on this. They really think in terms of four approaches that you can distinguish when you look at how a group of people deals with things going wrong. How do they respond to that? Those four approaches can be plotted on two axes. One is the extent to which things that go wrong are acknowledged, or accepted, as part of professional reality. So, is it acknowledged that things go wrong at work, or is it forbidden? The other axis is the extent to which these things going wrong are responded to actively. Do you sit back, or do you take action? Going to those four approaches in which teams differ: three of them, we know, do not lead to better organizational outcomes in terms of performance or mitigating the risk of poor outcomes occurring. Of those three that are not effective, one is ignore and deny, sometimes called denial. There is low acknowledgement: we ignore that it's there, we don't discuss it, it's not part of what we talk about, it's not part of our reality at work. Hence the adverse implication in terms of conduct: if you don't discuss things that go wrong, you can't learn from them. So the denial approach is quite easy to understand. Second, you can also have a team where there is, again, low acceptance, but paired with taking action instead of sitting back. That's not denial, but more of a blame-and-punish approach, you would say.
And for that, we also know the conduct implications, which are, by the way, universal; this happens globally, and the behavioral impact is very well researched. When you respond to something that goes wrong at work with blame, for instance as a leader or a colleague, people tend to cover up their actions as a consequence. So, there's less visibility of things going wrong. From a learning perspective, that is obviously not good news: you have a higher risk of poor outcomes, but you can't see them, you can't pick up on them. And also, from an innovation and performance perspective, people will be more reticent to try new things. The third approach to watch out for is where high acceptance is paired with sitting back. So yes, there's acknowledgement that things go wrong at work, which is a good thing, because the denial and blame-and-punish approaches are not characterized by that, but it's paired with not doing anything. We sometimes call that an 'accept and justify' type of approach. In terms of conduct implications, you could almost get what in psychology is called a slippery slope effect: when you are exposed to things that go wrong but they are not actively dealt with or learned from, we get used to those incidents, and it can spiral out of control. Especially with unethical conduct, that could definitely be a problem. So, to go to the most effective way to deal with things going wrong at work: you may have guessed it, you combine a high acknowledgement that things go wrong at work, because it happens every day, I've probably messed something up already two times in this podcast, right, that happens, we're people, with an active response. So, not sitting back, but learning from it. 'Identify and learn' it's sometimes called, or high error management, if you take the academic term.
And there's actually so much science showing that this leads to great organizational outcomes. So, it's really the smart thing to do. Next to the fact that high acknowledgement is more comfortable, it's also smart. But if you want that in your routine, it means you need to let go of the idea that a zero-tolerance approach is effective. And I find that, especially in financial services, or in the markets area where I do my work, fear is sometimes still used to try to create the better outcome. So zero tolerance, and really pushing towards that zero tolerance, is still common. I think there's nothing wrong with aiming for excellence, as long as there's a realistic acknowledgement that things sometimes do not go as planned, and that when that happens, you don't respond harshly, because that does not drive performance; it actually has many adverse effects. Too much is known about the health effects and negative impact of fear. That was also what a great academic guest spoke about at the last conference on culture that the New York Fed organized, last June: Professor Giselle Antoine of Washington University talks about her research on fear and shame, for instance. Those emotions are part of our professional life too, and you have to manage them in a smart way, I think. And it also means, of course, actively learning from mistakes. So you could say, as a leader for instance, that making the same mistake three times is not accepted. We have a high acknowledgement that things can go wrong, but that doesn't mean everything is now allowed and we can do whatever.
Toni Dechario 16:21
I love a good two-variable graph where I can locate myself on a scatterplot. I think that, more often than not, that combination of low acceptance that mistakes are going to happen, and low action on the mistakes that do happen, is a pretty common place to be. So, if I'm leading an organization and have identified a subculture sitting there, which axis do you think is more important to focus on first?
Wieke Scholten 16:58
So, I would start with the acknowledgement that mistakes do happen. And even if you're then still passive about it, let's say sitting back, then turn to how you learn from them. Yeah, I read some research yesterday on what happens if employees do not see that you actually take action. You see that a lot in Speak Up research, for instance: for people to keep raising issues, or to keep documenting incidents, they need to believe that something is done with them. If that isn't the case, it drops off very rapidly.
Toni Dechario 17:35
Yeah, yeah. Actually, in our first season of the culture podcast, we spoke to Mikael Down of the former Financial Services Culture Board, and he talked about how their research had surfaced the idea of futility: how much it drives behavior, and how much it stops people from talking about things that have gone wrong. Why do firms fail to learn from things going wrong? What are some of the common hindering mechanisms that you've seen?
Wieke Scholten 18:12
So as a psychologist, I can't help answering this question in a very psychological way, probably. But I think the first point to make is that there are two basic psychological mechanisms that we all have: the need to belong to a group, and the need to be seen as good and successful, or to do good. These are part of all of us. And that means, I believe, that owning up to something that you've done, or have not done, resulting in something undesirable at work, could hurt the group you belong to, or hurt your reputation in that group. And that is threatening; it's one of the scariest things for us as people. It could lead to exclusion, which is a very primary fear we have; we don't want to be excluded, right? So, we are naturally wired to want to avoid exclusion or blame. And we also do not want to see ourselves as not good. Those two make it so difficult to own up to your mistakes. Self-justification is also very well known; we just talked a bit about shame and defensiveness. Self-justification is a very strong tendency we all have. You hear people say this in your personal life too: when something goes wrong, we often say, I did not intend to do it. There's also a great book, I feel, called Mistakes Were Made (But Not by Me); I catch myself quoting it time and time again. It's by Tavris and Aronson, with lots of juicy examples, also from politics. It really goes into that tendency to self-justify, and all the reasons why we do it. But to counter that basic human tendency to self-justify and not own up, we have to create, in organizations, professional contexts that encourage people to do so anyway, to sort of overcome that tendency. And I think that is quite difficult in organizational contexts where, for instance, there's a legal paradigm.
So, for example, I heard a Dutch lawyer say, a long time ago by the way, after the financial crisis of 2007-2008, he said in a newspaper: look, as a lawyer, what you want is to contain a problem, to make it small. So, you want to say, this person messed up, and we fired him or her. It was a bad apple; we got rid of the person; we fixed the problem, and we continue. We have nothing to learn. That was really the legal response to things going wrong, and it makes it more difficult to really start talking about mistakes.
Toni Dechario 21:21
I think that's a huge obstacle. And if you're just blaming it on a person, then you can move forward without ever fixing the thing that was really underneath it at all. So actually, since we're there, I want to talk to you about root cause analysis. I think root cause analysis can be very difficult for organizations to get right, or to get as much out of as they could. Can you talk a little bit about root cause analysis, and what common practices characterize effective root cause analysis as a way to examine things that have gone wrong, learn from them, and get better?
Wieke Scholten 22:09
I think, practically, the first thing you need to do is decide which event you are going to do an RCA on. A root cause analysis; apologies, I call it an RCA. Because I don't believe you need to do an RCA on everything that went wrong. If you chose the wrong color of carpet, you don't need to do an RCA. And please don't do RCAs just because you said you were going to do RCAs; I think that is not a good reason, and probably also a waste of your time as an organization. So, you need to decide, well thought through, which events justify an RCA for you. And I would deploy this risk-based: what is the risk of us doing something like that again? What is the risk of it occurring again? Will it hurt us as an organization, or our customers? What is the societal impact? In that sense, it's almost an impact-and-likelihood assessment that can guide your decision on whether to do an RCA, yes or no, and if so, what type. So, that is step one. Step two is really to decide which root causes to explore. What I see in practice is that lots of organizations do not include a behavioral perspective. I was talking yesterday to somebody who is doing work in a retail bank, and it was about, let's say, the quality of decisions that relationship managers made. And he said: we have it covered, Wieke, we don't need a behavioral perspective on this root cause analysis, because it's really the process and the system. So, we changed the process, and we changed the system, and we're fine. Or I came into a business once where they said: oh, you don't need to look here, because we had bonuses, but we took them away, so there's no incentive anymore to do something that is not right. Whilst we know that bonuses are only one element of what could drive certain behaviors; they could be one of the root causes.
Sure. But I think that behavioral perspective is often a blind spot. So that is really a piece of advice I would like to give in this podcast as well: include a behavioral perspective, because everything that goes wrong in an organization also has a behavioral root cause.
Toni Dechario 24:42
At the end of the day, I imagine there might be a systemic, or a systems, I should say, reason for a particular behavior. So, at the end of the day, you may end up needing to change a system. But the behavioral analysis helps you see where those behaviors are stemming from, and which systems might be driving them, as opposed to going straight to some of the systems you described, going straight to comp and thinking that's going to resolve everything.
Wieke Scholten 25:14
Exactly. I think the step from 'we change the process or the system' to 'the outcome will be better' always goes through people's actions. So, you have to have a view of what to change, in the process or the system, but also of the elements behind that. We often call those drivers of behaviors, and you can distinguish several kinds. There are more formal drivers: anything on paper, anything you can almost touch that may drive behavior, such as systems, processes, procedures, incentives. Sometimes these are called organizational drivers of behaviors. But you also have social drivers: collective beliefs, social norms, psychological safety, the dynamics between teams, the assumptions people have. You also have individual drivers: think of fatigue, stress, people's experience. And contextual drivers: market developments, the regulator. I've also worked at a regulator, so I know how regulatory behavior can also result in behaviors in organizations. All of that, I think, is what effective root cause analysis tries to cover: scope wide in terms of the types of root causes you can think of, and include behavior. So, not just what happened, but how do we do things here, and which aspects of how we do things here were causing the event, but are also present today and could lead to future events? Because I do think that's where the learning comes in. That is really what that deep dive often entails: these are the behavioral patterns and drivers that were underlying the event, and that you can address to prevent those future issues.
And that session I refer to, the root cause analysis itself, is really where you go and sit in a room. What I see in the industry sometimes is that it's done by people who weren't really involved in the incident that you want to conduct the root cause analysis on. So, you don't have the right people in the room.
Toni Dechario 27:54
Why don't you have the right people in the room? Is it that the senior managers are all invited, as opposed to the people who were on the ground? Why do you think that happens?
Wieke Scholten 28:04
Sometimes that's the case: the senior leaders who were involved are now too busy to spend two hours of their day really investing in learning from this. This is also where you see leaders differ greatly. There are leaders out there who really understand the value of this and go in there and are there themselves. But there are also leaders who say, you know, it's something that needs to be done, so do it for me, almost, or who send people who represent areas but weren't really involved. So, my advice would be to really involve those who were involved in that event, and ensure you have all of them actually in the room, preferably physically, in my experience. And then don't take too long. Like I said, don't dwell on it; make it active, don't sit back. Step one, I think, is to gain a mutual understanding of the event: not so much 'I think this happened,' but what happened, factually. Who was involved? What happened, almost as a timeline. Step two is: what kinds of root causes can we distinguish, taking in those different categories that we discussed? So diverge, scope wide. Step three is to converge: of all those root causes, which ones had the most impact, and which carry the highest risk of causing recurrence? Those are two questions, right? Which ones had the highest impact, and which ones are still here and could cause recurrence of such an event? That is almost your risk assessment of which root causes are still current today. And then step four is: what do we need to do to address these root causes so that recurrence is prevented? And then take those actions. Really decide who's going to own these actions, and follow up. That, I think, is really important.
Toni Dechario 30:15
One of the things you touched on was the importance of leaders really taking part in this and being active. And I'm wondering if you can expand upon that a little more, from the perspective of the work you've done on accountability regimes. How has the introduction of accountability regimes, in your mind, influenced the ability of leaders to take ownership of things that happen under their watch? And how has it changed behaviors?
Wieke Scholten 30:51
Yeah, so I think there's really something to gain here in the way we implement these regimes. And I think it definitely impacts the way leaders keep engaging in conversations about things going wrong. What we know about behavior is currently not sufficiently taken into account when implementing these regimes. The risk is that leaders could start avoiding liability by not wanting to know about what is going wrong in their organizations. And that, I think, is a real threat to performance and, again, to risk events happening. You want senior leaders to be curious about what is happening in that operational reality, at the coal face of the organization: not what I say at the top, but what happens in reality. That realism again, right? So, you want them to be curious about that. And therefore, I think, when you implement such a regime, it is important to aim for a process-based application. Regulators have a role to play here, but also firms themselves; senior leaders in firms can almost choose to take it that way, and to have that conversation with the authorities. So instead of these regimes being more outcome-based, meaning you ask organizations and senior leaders to evidence that nothing went wrong, you ask them to evidence their efforts to prevent problems. Because evidencing that nothing went wrong is not realistic; there will be things that go wrong, and it will drive defensive behavior, or the motivation of indeed not wanting to know that reality. What senior leaders need to have is a certain degree of trust that the steps they undertake to prevent problems in the organization will be fairly assessed by the authorities. Because that is often what these regimes say: you need to take sufficient steps to prevent problems. But will the steps that you have taken be assessed fairly?
That is really what you need to be able to trust as a senior leader, so that you don't feel bad, don't almost freeze up. So that expected fairness, or expected unfairness, is something you really need to manage in the implementation. And what we do know from behavioral science is that expected unfairness has adverse effects on learning, sure, but it can also drive rule-breaking behavior, even non-compliant behavior, if you don't manage it.
Toni Dechario 33:42
We're almost out of time, so, I just want to ask you one last question, which is, you've mentioned a couple of resources during the course of this conversation. And I'm hoping that you can give us the names of those resources again, and any other resources, whether it's books, or lectures, or podcasts or things that are out there that people can refer to if they want to know more about this kind of thing.
Wieke Scholten 34:07
Yeah, great. Yeah, I think, well, I mentioned indeed, that book Mistakes Were Made, But Not By Me.
Toni Dechario 34:13
I love that title.
Wieke Scholten 34:14
Yeah. And the subtitle is actually Why We Justify Foolish Beliefs, Bad Decisions and Hurtful Acts. What I love about that book, by Tavris and Aronson, is that it's on self-justification, but also really on that intention piece. Right? People have the best intentions, but for learning it's almost irrelevant that those intentions were great. How can we say, "I just did something wrong. I messed up. And it doesn't really matter whether that was my intention or not, it happened"? Owning up to that, how we do that as people, I think is utterly fascinating. Then Right Kind of Wrong: The Science of Failing Well is the new book by Amy Edmondson, which I have to admit I haven't fully read yet, just partly. But I love it already, so I'm like, yes, get that one. And then lastly, the book I love most of all, I think it's called The Moral Organization. That book is written by Naomi Ellemers and Dick de Gilder. It was also published this year, like Amy's book, and it actually perfectly explains all of the mechanisms that we've been talking about today, from error management approaches to moral climates and concrete ways to improve them. So that would be my tip.
Toni Dechario 35:40
Thank you, Wieke, thank you so much for all of your tips today.
Wieke Scholten 35:45
Thank you, Toni.
Toni Dechario 35:47
For more conversations like this, as well as publications and other resources related to banking culture reform, please visit our website at newyorkfed.org/governance-and-culture-reform.
Toni Dechario 02:00
So, I'm here today in New York with Michael Hallsworth, and we are joined by phone by Scott Young. Both are with Behavioral Insights Team Americas, and I'd ask each of you to briefly introduce yourself. Michael, maybe you can start.
Michael Hallsworth 02:15
Sure. My name is Michael Hallsworth. I'm the managing director of the Americas for the Behavioral Insights Team.
Scott Young 02:21
And I'm Scott Young, I also work with Michael at the Behavioral Insights Team, and I lead our engagement with private sector clients.
Toni Dechario 02:27
Thank you. And again, welcome. So maybe we could start off: Michael, can you describe for those who don't know what the Behavioral Insights Team is and what you all do?
Michael Hallsworth 02:37
Sure. We're an organization that takes findings from behavioral science, evidence about how people actually behave in the real world, and applies them to real-world problems. We're practical: we develop interventions to say, how could you do something differently in your organization? And then, generally speaking, we try and test it wherever possible, rather than assuming it works. We've been around for about 15 years, and we have this interesting history, because originally we were created inside the UK government back in 2010, and people called us the Nudge Unit, in relation to the book Nudge that had come out. We do the same kind of thing now, but around the world and with a range of different organizations.
Toni Dechario 03:17
Thank you. And, Michael, I'm going to stay with you for a minute, because you've recently responded to some of the current challenges in behavioral science by publishing a manifesto for the industry. Can you describe briefly what you're hoping to achieve with the manifesto?
Michael Hallsworth 03:35
Sure. So, I've been in the U.S. five years now, and I still hadn't realized the word manifesto has kind of weird connotations here. Where I'm from, it's just a program of what you set out to do in the future. But putting that to one side, what we tried to do here was say, as I just mentioned, this current wave of applied behavioral science has been around for 10 to 15 years. And to be fair, it's attracted a lot of attention. Organizations all around the world have invested in it and created behavioral science teams. We're talking about some big corporate players: Google, for example, Walmart. We're talking also about governments, the United Nations, the World Bank, a whole range of different institutions. And that's great, and a lot has been achieved. But I think people looking at this development, this success, would say: could we have done more, are there missed opportunities? Arguably, a lot of what's been done has been looking at changing details, the way in which something is done, and saying, "Oh, it had this much effect," which is great when we need those kinds of improvements. But I think behavioral science can do more. It can also look at bigger questions about how we set up our organizations, how we set up our regulatory systems, all those kinds of things. With the manifesto, I tried to take the criticisms that have come along with the success and say, "Here are 10 proposals for how we could do more with behavioral science in the future," these exciting things that can make a difference, and then try to make them real, try to say what you'd actually need to do.
Toni Dechario 05:18
One of the things that you talk about is applying a behavioral lens. What's the difference between a lens and a behavioral science tool?
Michael Hallsworth 05:27
Yeah, sure. So, where this is coming from is that, quite often in the past, people have approached behavioral science like it's a tool you can pick up to fix a problem, and indeed a particular kind of problem. Oh, this is a behavioral problem: get the behavioral insights people, get the behavioral science tool, and we'll fix things in this particular area. And yes, there are things you can do with that tool. But behavioral science can also be a lens through which you see all kinds of problems. It's a way of understanding the nature of the problem. I think that has some practical consequences, because it means you're not just going around making changes all the time. You may say, well, actually, we don't need to change anything, or we're trying to sustain a behavior. We're trying to understand the problem in more depth, step back and say, what's our strategy, what's our approach here? Rather than, let's go and nudge someone with a particular kind of tool. It's about deeper reassessment, I would say.
Toni Dechario 06:32
So one might argue that it requires a certain amount of curiosity, which is handy, because that is our theme this season on the podcast. We're thinking through how you build cultures of learning and curiosity. But maybe before we get there, we should take a step back and think about why we would even want to build cultures of learning and curiosity. Scott, I might go to you first with this question.
Scott Young 07:03
Sure. Well, I think perhaps the most obvious answer to why we would want to promote learning- and curiosity-based cultures is that they're associated with continual growth and improvement; these are cultures that are consistently improving and not remaining stagnant. But another important aspect is that these kinds of cultures are also pretty closely associated with environments where people are willing and able to admit and recognize mistakes, as opposed to feeling compelled to cover them up or hide them. One term that we use in behavioral science that some may be familiar with is psychological safety in the workplace: the idea that folks are comfortable that they won't be punished or humiliated for speaking up, or for admitting when something isn't right. And when we think about conduct, I think there's a pretty close connection there. In so many cases, people don't have negative intent, but something goes wrong, or they do something wrong somewhat inadvertently. And then the question becomes: is that something they can admit to, speak up about, and learn from? Or does it potentially become something they feel a need to cover up, and then perhaps we have a snowball effect, which leads to worse behavior? So, I think those two ideas are pretty correlated. And a learning culture can certainly help us in preventing small negative behaviors from growing and becoming much bigger issues.
Toni Dechario 08:53
And why do you think it is that most organizational cultures aren't naturally this way, aren't naturally kind of curiosity and learning-based?
Scott Young 09:02
Well, I think some of it is certainly inherent, rooted somewhat in human nature; there are issues of insecurity and some fear that all of us have as humans. But with that said, I think there's a lot of positive intent out there. Most leaders, most people, would like to have environments of this nature. But unfortunately, they often default into more familiar patterns of behavior, the way things have been done previously, because that's often the easier path. What I'm getting at is that to create these kinds of environments, and to foster and nurture them, really requires some real thought and work: thinking about everything from the mundane details of how meetings are run, and where and when and how people speak, to, most importantly perhaps, how challenges and problems are discussed and handled. Are we reinforcing fears people may have? Or are we doing the harder work of creating some of that safety that may contribute to a more positive environment?
Toni Dechario 10:25
So, I guess the natural next question then is, what can behavioral science tell us about how to overcome or mitigate some of these dynamics? Michael, I'll start with you.
Michael Hallsworth 10:44
Yeah, sure. So, I think one thing to remember from behavioral science is that context matters and that people respond to what we call cues in their environment. This is how habits are built up: you repeatedly encounter something and develop a kind of automatic response. So, the first thing is that telling people what they should do is never enough, or rarely enough. What you've got to look at is: what are the repeated things that people see and feel in their organization that will shape their behavior? And so, you need to look at processes. Some of these things can be quite mundane. But I know there has been work done, for example, on the order in which people speak in Federal Reserve meetings. This can massively influence the course of a conversation. These apparently small things, in terms of ordering, can lead conversations one way or another, can lead to self-reinforcing patterns of agreement, which lead to a different decision. And so, you can look at things like how you set up meetings, as Scott mentioned, to ensure there aren't those self-reinforcing patterns of everyone agreeing with the person who speaks first, or just be more mindful of that as a factor. There are various other things. For example, when you're hiring, how do you compare between candidates? We know that looking at candidates on their own can be less effective for comparing between people with similar experiences, and can actually lead to greater bias and potentially discrimination. Whereas you can do it by comparing different parts of people's resumes or experience together, so that, without seeing who has the most experience overall, you can say, this candidate seems better in this way. You build up the judgment bit by bit, comparing each facet with each facet. That means you get a better result when you hire someone.
Another thing might be around how you buy things for your organization. These may sound like mundane matters, but they are really important. Do you get the best deal for your organization when you're procuring things? There are simple things you can do to get a better deal, by building comparisons and prompts into the moment of purchase. So, I'm not saying you have to stand up as a leader, motivate everyone, and give an inspiring speech. That may be part of your job. But you need to look at the prompts in people's everyday environments in organizations, and rethink those. How do people make budgets? How do people hire? And if you can change those, you may start to get the smaller changes that lead to bigger cultural changes.
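The facet-by-facet hiring comparison Michael describes can be sketched in a few lines of code. This is a minimal illustration, not anything from the Behavioral Insights Team's actual tooling; the candidate names, attributes, and scores are hypothetical, and real structured-evaluation processes are more involved.

```python
# Sketch of comparative, attribute-by-attribute candidate evaluation:
# instead of judging each candidate holistically, compare candidates
# pairwise on one attribute at a time and build the judgment up from
# those comparisons. All names and scores below are hypothetical.
def compare_by_facet(candidates, attributes):
    """candidates: dict of name -> dict of attribute -> score.
    Returns a win count per candidate across pairwise,
    per-attribute comparisons."""
    wins = {name: 0 for name in candidates}
    names = list(candidates)
    for attr in attributes:
        for i, a in enumerate(names):
            for b in names[i + 1:]:
                if candidates[a][attr] > candidates[b][attr]:
                    wins[a] += 1
                elif candidates[b][attr] > candidates[a][attr]:
                    wins[b] += 1
    return wins

wins = compare_by_facet(
    {"cand_1": {"writing": 4, "analysis": 2},
     "cand_2": {"writing": 3, "analysis": 5}},
    ["writing", "analysis"],
)
# cand_1 wins on writing, cand_2 on analysis: {"cand_1": 1, "cand_2": 1}
```

The point of the structure is that no single overall impression (such as total years of experience) dominates: each facet contributes one comparison, and the aggregate emerges from the parts.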
Toni Dechario 13:32
Are there prompts that help people exercise their curiosity muscle, their willingness to gather other perspectives, to wonder about other perspectives?
Michael Hallsworth 13:45
Well, one very obvious one is time pressure and resource constraints. If you're goal-directed and you have a particular thing to do, that's normal, but there's not much space for curiosity there. The other thing is licensing by leaders. If your whole setup for a meeting is, let's get this done, efficiency, and you've memo-ed everything and put everything in the agenda, that doesn't leave any room for curiosity. Because curiosity, as I understand it, involves non-goal-directed behavior to a large extent: your brain not focusing on something, but instead opening up to connections. And that associative mode of thinking is really different from a fully agenda-driven meeting. If you don't make space for it, it's not going to happen.
Toni Dechario 14:46
So, what kinds of things should you do to make space for that when the reality is, the people who are listening to this podcast are working in financial services firms, they have goals, they need to meet those goals, they have time constraints. What are some thoughts that you have on kind of how to create that sort of space, so that people can develop a habit of getting into a different mindset sometimes as they think about making decisions?
Scott Young 15:12
I do think we can build things into meetings and agendas and processes that not only allow space for curiosity and discussion, but actively try to foster it, whether through certain prompts or time allocation and so forth. That means building into the way we run our meetings, or go through our design briefs, or whatever the vehicle may be, a specific indication that this is what we're looking for, promoting it and actively fostering it, rather than the idea being to get through the agenda as quickly as possible, which is obviously understandable and can be the default unless we consciously think otherwise.
Toni Dechario 16:04
And kind of normalize that process. Yeah, Michael?
Michael Hallsworth 16:07
I mean, one thing we do as an organization is test stuff, experiment, prototype. If you just assume that you're in a meeting, and the point of the meeting is to come up with the best solution and then deploy it, that is a different mentality from saying, we don't fully know, and we know the details matter, so we're going to try a few different things and test them. That requires some openness to the idea that you don't know. Because you may feel, as a professional, that you're paid to know the answer, rather than to find it out through tweaking things or experimenting. And to make a bigger point here, I think that mentality is quite often how we make policy as well. We tend to consider all the options on the list and then say, this is the best one, we're going to deploy it. But actually, when things get into the real world, there are unexpected events or details that you haven't fully thought through. So that certainty, that lack of curiosity about how this is actually going to play out on contact with the real world, is potentially damaging. So, we use experimentation, we use prototyping, to try and get some of that curiosity.
Toni Dechario 17:26
Yeah, we're also speaking with Gary Klein, who pioneered naturalistic decision-making, which is exactly that: the real world is different from what's happening in a simulation, and therefore you need to think about how people really behave in real life.
Michael Hallsworth 17:42
Yeah, the real world is different from our expectations. We're often over-optimistic; just look at major projects that always go over budget. We've done quite a bit of work on reducing overconfidence in government, because a lot of the incentives are to think the happy thoughts, to reinforce that this is a great idea, boss, of course it's going to work. And there are things you can do about it. I love the examples from infrastructure: if you're running the Olympics, history shows the Olympics go 20 percent over budget, so you take your budget and you add 20 percent on as a matter of course. It's called reference class forecasting, and there are techniques like that to combat overconfidence. It's a major, major task, because overconfidence leads to these problems when we reach the real world.
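The reference class forecasting adjustment Michael describes amounts to simple arithmetic, which can be sketched as follows. The overrun figures are illustrative, not real Olympics data, and the function name is my own.

```python
# Sketch of reference class forecasting: instead of trusting your own
# (optimistic) inside-view estimate, uplift it by the cost overruns
# observed in a reference class of comparable past projects.
from statistics import median

def reference_class_forecast(inside_estimate, historical_overruns):
    """Uplift an inside-view budget by the median overrun seen in
    similar past projects (0.20 means 20% over budget)."""
    uplift = median(historical_overruns)
    return inside_estimate * (1 + uplift)

# Illustrative: if similar projects historically ran ~20% over budget,
# a $100m inside estimate becomes a $120m forecast.
budget = reference_class_forecast(100_000_000, [0.15, 0.20, 0.25])
```

The design choice is that the adjustment comes from the outside view (what actually happened to similar projects), not from asking the planners how confident they feel.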
Scott Young 18:33
Just to pick up on that, I think another enormous contribution of behavioral science is making us aware of some of the biases or heuristics that can get in the way as we either look backward at past events or look forward and try to project. Michael spoke about overconfidence bias. There are also things like hindsight bias, our tendency to look back at something and assume it was easily predictable, that of course it worked out that way. And confirmation bias, the tendency to essentially find what you're looking for, to reinforce the things you were looking for as you look backwards. If we're more aware of these tendencies that we have as humans, we can potentially build into our processes and our thinking ways to mitigate them. And one thing that we try to do as a matter of course is premortems: looking forward to try to anticipate what could go wrong, not only as a matter of thinking and planning ahead, but also, over time, to show ourselves that we don't always predict outcomes properly. That reinforces the need to be open to more ideas and not fall into the trap of assuming we know everything. I know, Michael, you've spoken a little bit about premortems and the value of doing them, in the manifesto and elsewhere. I don't know if you wanted to speak more about that.
Michael Hallsworth 20:19
I mean, I'll leave Gary Klein to talk about premortems, because he created them. I think you've got him, haven't you?
Toni Dechario 20:26
Yes, yeah. Actually, I did want to ask you about something on the borders of premortems and after-action reviews, which we'll also be talking about this season. What is it about using different time periods, looking forward and looking backward as opposed to looking at today, that might be useful in gathering different perspectives and being able to suss out potential issues? Is there something in the mindset of imagining a different time?
Michael Hallsworth 21:01
This is where I think about scenario planning. Scenario planning, as practiced by some companies like Shell, has been around for quite a long time now. But my understanding is that the point is that scenarios are not predictions. They are ways of imagining the world. People often treat them like predictions, as in, this is what's going to happen. But actually, one of the big benefits is the process of imagining a different world, of seeing how the same driver, say population growth or whatever, could lead to different outcomes. And that, I think, is the main benefit: opening up ways of thought that can otherwise be quite path-dependent, where we just say, oh, it's going to turn out this particular way. That way of thinking can be exacerbated by looking at the past, because what you're seeing in the past is the way it turned out. Scott referred to this with hindsight bias: you're seeing one version, and you don't see all the things that needed to have occurred, the chance events and so on, that led to this particular single outcome. Scenario planning tries to open that up and pick apart the things that may have led there. A final thing I find quite helpful is to take that one step further and think about how we're often living in a complex system, like a city, for example, where you have all these different parts interacting in ways that we don't fully understand, and quite often it kind of works. What I find incredibly interesting is to understand the rules of the game that mean that it works, the unspoken ways that people interact. In a city like New York, for example, collectively we generate outcomes like not crashing into each other in Times Square. That, I think, is also about curiosity: understanding the unseen rules that we are using all the time in these kinds of systems.
And if you can get there, that is, for me, a really key part of good policymaking as well. You may have it in your organization too: what are the tacit things people do in your organization that make it work?
Toni Dechario 23:28
So these really intentional conversations about what happened in the past, and what specific elements were there that not everybody saw, as well as what could potentially happen in the future, help to crystallize that.
Michael Hallsworth 23:47
Memory is biased quite often. There are many studies of jurors, for example, about what they remember and don't remember, and of witnesses in criminal trials. You can run experiments where people see a particular event and then you look at what they remember; you can change different aspects, and you can see patterns in what people remember. So, there are distortions of memory that mean two people can have very different understandings of the same event. That's not a particularly new insight. But I think you need to bear it in mind: if you're talking about the past, you need to leave space so people can match it to their memory as well. Rather than just saying, this is how it was, you also need to give people space to buy into your story by matching it to what they remember.
Toni Dechario 24:52
That's interesting. A lot of what we think about is ethical decision-making, and I wonder what the ethical implications of that are, like whether memory is more or less distorted depending upon how someone feels about their own behavior in a given situation.
Michael Hallsworth 25:16
Okay, so there are self-serving biases. One of the most powerful findings from psychology is that we act to preserve a positive self-image, to feel good about ourselves. If we are aware that we did something, and yet we also have an image of ourselves as someone who doesn't do that kind of thing, that's unpleasant. It can provoke what we call cognitive dissonance, which is actually feeling bad. And the famous idea is that you try to reduce that bad feeling. You either do that by acting differently, changing your behavior to bring it in line with your self-image, or you change your attitudes and your beliefs: you re-examine the thing you did, re-present it to yourself, and say, "Well, actually, was it really a conflict with my values, because of this, this, and this?" This is a very powerful post-rationalization we do to alleviate that unpleasant feeling.
Toni Dechario 26:25
So, an after-action review, or a post-mortem could be uncomfortable.
Michael Hallsworth 26:31
Oh, I think it definitely will be. So then the question is, how do you anticipate that? How do you give people an outlet for it? How do you prevent them from being backed into a corner and defending their self-image by rewriting history? You need to give an outlet and a constructive way forward rather than that defensive corner.
Scott Young 26:52
I'd just add that it's a real challenge, right? Because, again, people will rationalize away negative behaviors and still hold on to the perception of themselves as fundamentally good people. If you're looking to prevent negative behaviors, you may want to try to link those behaviors to self-image, because that has a stronger impact on limiting people's tendency to do something. So it's an interesting dilemma. On the one hand, how do we allow people to make mistakes, recover from them, and be open about them, so they don't fall down a snowballing path of covering up and getting worse? But on the other hand, how do we link negative behavior to negative self-perception, so that there's a stronger impact in hopefully reducing people's likelihood to commit even small behaviors that would reflect negatively on their self-image?
Toni Dechario 27:57
Part of the reason that I'm so curious about the forward-looking element of this, and you each spoke a little to it, is that in the financial services sector we've certainly seen, over the course of the last decade and a half, a couple of examples of a failure of imagination leading to some problems. Who could imagine that house prices would go down? Right. And so the kind of experimentation, Michael, that you were talking about could potentially be helpful in imagining previously unseen scenarios.
Michael Hallsworth 28:43
One thing that comes to mind, which may be interesting for your listeners, is that increasingly people use agent-based modeling, which is basically where you simulate different people interacting with different entities, and you work out the different futures where the collective decisions go one way or another. You build it so each individual person is making decisions, and then you see how they interact. The thing about agent-based modeling is that it's very interesting and powerful: you can change different aspects and say, well, what happens if the weather is slightly different, does that affect how this collective behavior plays out? And this is relevant for the economy, right? You can simulate different economic outcomes, like house price changes. What I think is really interesting is that a lot of these models have been based on rational choice theories of behavior, where people act in their best interest, weigh the costs and benefits, and make a decision accordingly. What we're seeing at the cutting edge is models that are more behaviorally accurate, based on some of the things we look at in terms of mental shortcuts: the fact that you copy people, that you don't weigh up the costs and benefits but look at what others are doing. That's behavioral science infusing into agent-based modeling. And it can offer a much more accurate set of future-thinking scenarios. You're not just going on what we think will happen in the future and trying to force creativity; you're letting the model do the work, more accurately, we hope, based on what people actually do.
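The behavioral twist Michael describes, agents that copy others rather than weighing costs and benefits in isolation, can be sketched in a toy agent-based model. This is a minimal illustration of the mechanism, not any real economic model; the parameters and the buy/not-buy framing are my own assumptions.

```python
# Toy agent-based model: each agent decides each step whether to
# "buy" (True) or not. A behaviorally informed agent mostly imitates
# the share of others who bought last step (social copying); the rest
# of the time it makes an independent 50/50 private judgment.
import random

def simulate(n_agents=1000, steps=50, copy_weight=0.8, seed=42):
    """Return the share of agents buying at each step."""
    rng = random.Random(seed)
    decisions = [rng.random() < 0.5 for _ in range(n_agents)]
    history = []
    for _ in range(steps):
        share_buying = sum(decisions) / n_agents
        new_decisions = []
        for _ in range(n_agents):
            if rng.random() < copy_weight:
                # social channel: imitate the prevailing behavior
                new_decisions.append(rng.random() < share_buying)
            else:
                # private channel: independent judgment
                new_decisions.append(rng.random() < 0.5)
        decisions = new_decisions
        history.append(sum(decisions) / n_agents)
    return history

history = simulate()
```

Varying `copy_weight` is the interesting experiment: with heavy copying, small early fluctuations can get amplified into herding, which is exactly the kind of collective dynamic a rational-choice model with independent agents tends to miss.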
Toni Dechario 30:32
So, I want to move to a little bit of a meta question, Michael, which is we're thinking a lot about using forward and backward thinking, to improve decision-making. And I'm curious about what you learned from your own experience in writing that manifesto, which was a bit of an exercise in forward and backward thinking, you know, that might help this conversation.
Michael Hallsworth 30:57
Yep. So, some quick thoughts. First of all, there's the hindsight bias thing that Scott mentioned. Looking back, it's very easy to construct a narrative that it was all inevitable, that people would be interested in this kind of thing and we'd be sitting here 10 years later. But I remember at the time it wasn't like that at all. We were very uncertain. It felt like a risk, and we didn't really know if it would pay off; maybe people just wouldn't be interested. It's very easy to construct that narrative afterwards. At the same time, I felt that it was only from looking back that I could sometimes see what had happened. There were some choices we made in our early days that influenced the way the field developed in a long-term way: focusing on testing, focusing on changing specific aspects of an organization or a service. So that was really helpful in terms of looking back. Looking forward, I was struck again by the fact that I've spent a lot of my career writing stuff as well as doing stuff; we do a lot of practical projects, but when I write, it's: here's what we should do, here are some recommendations. I've been doing this for a while now. And particularly here, I felt that knowing what to do is hard work, but I think we've got there; that's solvable. Actually doing this stuff, making it a reality, living up to the ambitions here, is just really hard, because you publish the report, you go back to your day job, and all the same incentives are still there. All the same things that make it really hard for someone to be the first person to do things differently are still there. And that gap between what I know we should be doing and the difficulty of making the first move was just really apparent to me. It's really ironic, because that's what we focus on.
Right, and even knowing what I should do, and knowing the barriers, I find it really hard to make some of these changes myself. And it's a humbling experience.
Toni Dechario 33:04
Because you're human.
Michael Hallsworth 33:05
Because I'm human, and because, as people who run organizations will know, you don't have complete freedom. Some people who work for you may think you can do everything, but you can't. And that's really hard. It's a really hard trade-off to make. It's really hard to reconcile that in your mind, but it's true.
Toni Dechario 33:27
And if doing it yourself is that difficult, getting a whole organization to move in a particular direction is nigh impossible. So, I'll ask the impossible. Scott, I'll direct this to you. What advice would you give to leaders who are trying to fundamentally change the culture of their teams or organizations or, let's say, the financial services sector?
Scott Young 33:54
I think maybe two main themes come to mind for me as I'm thinking about that. The first message might be that, as leaders, the common default or tendency is to think about changing culture from the top down: if I put forth the right core ideas, and I repeat them, and I make them visible, and hopefully, of course, I embody them and model them myself, then everything will flow positively downward from there. But I'd encourage them to also keep in mind that we can change culture from the bottom up as well. We can complement those top-down efforts by thinking very specifically about the behaviors that constitute an ethical culture and environment, and by focusing on moving more people towards those behaviors, whether it's moving from 50 percent to 60, or 80 to 90, but getting more people to consistently do those specific actions that we're looking for. And I think what we find, and we mentioned rationalization earlier, is that behavior can change attitudes and culture as well. When people are doing something, they tend to adjust their own thinking to align with their behaviors. So again, I think that's another path to changing a culture internally. And then one maybe broader theme is just that a lot of energy inevitably goes towards policing behavior, focusing on what I believe is mostly a minority of folks who have active negative intent, and preventing wrongdoers from succeeding. And, of course, that's critical. But I think there's a larger opportunity around prevention. And that's speaking to the vast majority of employees in most organizations, most cultures, that want to do the right thing. And I think we often fall into the trap of thinking we just need to persuade them more and give them more information, and that will make the difference.
And sometimes that has the opposite effect of confusing and overwhelming people. If we take more of a mindset of let's make it easier for people, let's help them do the right thing and make that the easier path, that's going to be more likely to move the culture and move the majority of folks to doing the things that ultimately constitute the culture and the environment that we want.
Toni Dechario 36:47
So, we just have a few minutes left, and I have a couple more questions. First, where do you see the most promising opportunities for applying behavioral science to the financial services world? Scott, I'll start with you on this one also.
Scott Young 37:04
Sure. Well, I tend to think of things in terms of what we call Win-Win-Win opportunities, which are the opportunities that have definite value to the organization, but are also positive for the people impacted, whether that's customers or employees, and ultimately benefit society. And tied to that, I think we look for areas where we have what we think of as intent-to-action gaps, where most people do indeed want to do the right thing, but they aren't necessarily doing it, whether habits are getting in the way, or confusion, or complexity. I think those are the areas where we can apply behavioral science both ethically and effectively. When I think of which areas fit that description for me, the first that comes to mind is compliance, an area where, again, we tend to sometimes overwhelm people with information and complexity. And I think there's an opportunity to simplify in many cases, to apply social norms, and to get more people ultimately to the place they want to be, which is doing the right thing consistently.
Toni Dechario 38:18
Sorry, Scott, to interrupt you, but you said to apply social norms. What do you mean by that?
Scott Young 38:24
Well, that can be a couple of different things. But first off, in many cases, applying social norms can mean showing people that the majority are indeed complying and doing the right thing. Sometimes it's easy to rationalize that nobody else is doing it, anecdotally, because you know of one colleague who also isn't, but in reality, 80 or 90 percent of the people are doing the right thing. Another dimension of social norms that I think of a lot in organizations is tapping into the power of teams or colleagues. So often, the communication and the efforts are all aimed at the individual as an individual. But we all know that our behavior within an organization is usually influenced by the group of people we work with. And sometimes there are opportunities to leverage that, you know, having people do something collectively, jointly, or even through gentle forms of competition, our team versus others. I think there's a lot of power there that we can tap into. And we've seen it, in fact, in work that we've done in organizations around things like employee health and wellness, as well as other areas. Another area in financial services that really comes to mind would be anti-fraud and cybersecurity-related efforts. Those are ultimately human issues. In most cases, it's a question of people following through on procedures; it's not just technology. And again, I think there's positive intent in those cases, and often it's helping people think before they act. We've had success in integrating small interventions to help people become aware of things like phishing attempts, and to stop them, or at least mitigate the likelihood of sharing information that they shouldn't, and things of that nature. So, I think there's a lot of potential tied to fraud and cybersecurity as well.
And then, of course, there's this huge host of people issues, kind of the third bucket that comes to mind, you know, everything from employee engagement and retention, and health and wellness and safety, to issues around diversity and inclusion and gender equality and equity as well. You know, some of the things we've been talking about today in terms of process changes and so forth I think can have enormous impacts, and indeed, have been shown to have enormous impacts on success rates and behavior tied to those issues as well.
Toni Dechario 41:01
It strikes me that a lot of these things perhaps suffer from a similar issue. You talked about how allowing space to exercise curiosity can sometimes be seen as time-consuming: this is the way things have always worked, so it's a little bit of an annoyance that we have to stop and listen to these things that could go wrong; it's never gone wrong before, why is it gonna go wrong this time? And compliance is often seen the same way: let's just get through it so we can get to the stuff that matters. So that emphasis on why these processes are actually important, and on the ways in which they can lend themselves to the bottom line, is a perspective I think behavioral science can offer people. So, we just have a couple of minutes left. I'd like to be able to ask more, but I'll wrap up with a final question, which is: what recommendations would you have for those listening who'd like to learn more about behavioral science generally?
Michael Hallsworth 42:07
Well, one website that I read myself, one kind of magazine, is Behavioral Scientist. It's really high quality, with articles that are not just about behavioral science as something interesting in itself, but also relate it to current events and offer new perspectives on things. So, I would recommend people look at that.
Toni Dechario 42:34
Scott?
Scott Young 42:35
I'd agree that Behavioral Scientist would probably be the first place I would start. One other one that comes to mind is Habit Weekly.
Toni Dechario 42:44
Did you say Habit Weekly?
Scott Young 42:45
Yes, it's a site and a newsletter called Habit Weekly, with probably a little bit more of a practitioner or applied angle to it. But it also does a good job, I think, of summarizing a lot of what's going on in the field, from research to case studies and so forth. So that's another place I would look if I were looking for one or two resources to keep me informed, show me a range of things going on in this world, and maybe point me towards areas to dive deeper.
Toni Dechario 43:20
Thank you both very much for being with us today. Appreciate it.
Michael Hallsworth 43:23
Thanks so much.
Scott Young 43:24
Thank you.
Toni Dechario 43:26
For more conversations like this, as well as publications and other resources related to banking culture reform, please visit our website at newyorkfed.org/governance-and-culture-reform.
Toni Dechario 02:01
Today I'm joined by Preston Cline of the Mission Critical Team Institute. Thanks for joining us today. Preston, welcome. Can you start out by briefly introducing yourself and the Mission Critical Team Institute?
Preston Cline 02:14
Sure. I'm Dr. Preston Cline. All my degrees are in education: Rutgers, Harvard and Penn. I run an applied research and education lab. Think about DARPA (Defense Advanced Research Projects Agency) or RAND, but instead of looking 20 years out, we look at Monday. We work specifically with what are called mission critical teams. These are teams that work in decision-making environments of about 300 seconds or less, where the consequence of failure can be critical or catastrophic. So, think NASA, think firefighting, think Special Operations, tactical law enforcement. And our role comes from the fact that just because you can do a thing doesn't mean you can teach a thing. So, we look specifically at what's called the tacit knowledge transfer problem, which is: you know how to ride a bike, but you can't explain it to me. And while that's interesting, it becomes very interesting if you hand a resident a scalpel on their first day in surgery. So that's who I am, and that's what we do.
Toni Dechario 03:07
That clarifies that a bit. So, one of the things you've done is you've found some extraordinary similarities between mission critical teams, irrespective of what industry they're in or what they're doing, whether they're surgeons or astronauts or in the military. And there are probably quite a few differences between financial services teams and mission critical teams; certainly the kind of immediate, catastrophic consequences are probably different, but there can be pretty severe fallout from mistakes in financial services that may be less immediate and less direct. I can imagine some similarities: there's quite a bit of pressure, in many cases there can be high adrenaline, and there can be some competitive personalities who are used to being successful, to name a few. Would you be able to draw some parallels that you might see, and share some thoughts about what financial services firms can learn from mission critical teams?
Preston Cline 04:14
Sure. So, let's just give some nomenclature and some structure to this. One, we're talking about teams, right, not individuals, and not groups, and it's important to note the difference. We're not talking about a group of individuals; teams suggest and require some interdependence. A group of people that simply works together is not necessarily a team. So, now that we're talking about teams, let's talk about some of the differences, and then I'll talk about what's the same. The difference is, the teams I work with regularly work, and are trained and educated to work, in immersive or all-consuming contexts, evolutions, we call them. So, a surgery, right? You can't step out of surgery and have a cup of coffee. A hostage rescue, you can't press pause. Running into a burning building, you've got to complete it, right? And these are things they're doing every day. In financial services, this might happen a few times in your career: suddenly there's a crisis, a disaster, and you're in the midst of it, and you're not done until there's a resolution one way or the other. And so, I want to sort of talk about that side of it. Keep in mind that many of the things I'll talk about aren't going to be useful to you in your everyday lives. They are meant to help you sustainably and successfully navigate immersive and critical environments. But here are some things that are similar. One, we always start with the problem set. When any team is looking at a problem set—and I'm going to be super generalized here—we often reference the Cynefin model by Snowden; many people will be familiar with that. And I'm not gonna get into the details other than to say he suggests that there are ordered and unordered problems. Ordered problems you can have contingencies for, and you can plan for, you can train for.
And then there are unordered problems where, instead of building the contingencies, you're building the capacity of the team to resolve whatever shows up: alien landings, zombie apocalypse, whatever it is. We've got somebody on our team that does aliens, right? And so, we have that capacity. And then once you do that, you have to start thinking about the human factors. And I'm doing this in order, as you'll see, because they're all related. For ordered problems, you can have an intact, homogeneous team, not a problem, because they're very fast. They're very agile. They can finish each other's sentences. But those strengths work against them when it comes to unordered problems, because they lack what we call cognitive diversity. Basically, you want as many tools in the toolbox as possible, right? And so, you want to be thinking about that. And then the next thing is technological change. We're all familiar with technological change; it's happening all the time. Where it impacts us directly, where the rubber meets the road, is the thing we call rate of learning. Technological change is only relevant in our world in terms of how it impacts the human that's interfacing with it. How can those humans adapt, evolve and leverage that new technology in a fast and efficient way? And then lastly, the sort of cultural, structural view is on information management. This is a term I know gets used in your world a lot, and we use it in a very specific way, which is to say: there are 24 hours in the day, and that hasn't changed. But the amount of data and information that we're all receiving on a regular basis has increased dramatically. And this is actually working against our natural work ethic. The problem is, you're slowly creating a situation where it would be impossible, either now or shortly, to actually do all of the work. And so, we have to restructure, and we have to think about the cultural implications of that.
Toni Dechario 07:58
I want to follow up on the last thing you mentioned, which is kind of information management and the fact that we're overwhelmed with information now. And so, no matter how hard we try, how hard we want to do our jobs properly, we kind of can't, because we're just overwhelmed. What are some solutions to that problem? What are some ways in which you recommend people handle, teams handle that problem?
Preston Cline 08:20
Well, let's talk about the symptoms first, right? What ends up happening is, you get somebody with this incredible work ethic, and they're getting overwhelmed by work. So, what's the first thing they do? They give up on breaks. The second thing they do is they give up on meals. The third thing they do is they give up on sleep. They come in earlier, they leave later. That has an impact, meaning they're paying less attention to their team and their family. So, what ends up happening is, they burn through all of their mechanisms for resilience and sustainability. And by the time they raise their hand to say, "I'm in trouble," man, we are on a cliff. I'm saying that because I want the people listening to know that if you're guilty of any of that stuff, you have to pause and just ask yourself whether you are telling yourself the lie: "Oh, it's just busy right now. But once this is over, it'll calm down." That world doesn't exist anymore. It's always busy. So, you've got to figure out a way to navigate the busy. Specifically, to your question, what we're seeing on some of the tech teams and other cutting-edge organizations—you might be better versed in this than me; this is anecdotal stuff that I'm telling you about—we're seeing really strict rules about when data can be exchanged. So, hours in the day when emails and texts can be exchanged, or when platforms such as Slack can be used. We're seeing breaks on weekends, giving people data-free weekends. And then we're also seeing some really innovative things happening with some teams where they're actually saying no data on Fridays. And with some teams, what they do is, let's say you're on a floor: everybody will put up a whiteboard or a post-it note in front of their cubicle or their office saying, here's the problem I'm working on.
And you just spend Friday walking around reading and talking to other humans and there are teams that say they get more out of that Friday than they do the other four days combined.
Toni Dechario 10:12
Okay, so, you know, I take it that we should just assume these days, because of some of the circumstances that you already described, that everyone is stressed out and burned out, and that people aren't necessarily raising their hands as early or as often as we'd like them to be. And so more mistakes are probably getting made, because people aren't sleeping, people aren't eating, people aren't taking care of themselves and their needs. And actually, the theme that we're exploring this season in the podcast is how to build cultures of learning and curiosity, but also cultures that have a tolerance for mistakes, expect mistakes, and understand how to manage through mistakes and learn from them. So, it sounds like the environment for mistakes is hotter than ever. And I wanted to talk to you about some tools for how to practice acknowledging mistakes, raising your hand. And if you're open to it, I'd like to discuss a range of tools, if you have them to share. The initial one, though, that I wanted to talk to you about is the after-action review. You've written a lot about this; I know that you're very well versed in the after-action review. Can you just describe what it is?
Preston Cline 11:34
Sure. So really briefly, the way that humans make meaning of experience is both independently, through reflection, and collectively, through dialogue. That's how your brain works. You need to actually hear other people's perspective of an event that you went through. We've all had that experience where you went to a party, or you had an event, or a crisis happened, and you're telling the story, and somebody goes, "Man, that is not at all what happened for me," even though they were right next to you, right? And this is why a socially constructed mechanism for making meaning of experiences really matters. And the bottom line here, strangely enough, what we've found—and we've just released a paper in the Harvard Business Review on after-action reviews, if you're curious—is that it's not the data that matters. It's the narrative that matters. And so specifically, and I'll go back and break down your question, but specifically, what I want people to understand is: as a leader, it actually doesn't matter so much what you think went wrong and went right. What matters is the story that your people are telling themselves tomorrow, about the team, themselves, and you. And if the narrative is "I suck, we suck, you suck," man, there's not a lot of places to go from there, right? There's not a lot of learning. There's just shame, embarrassment and recrimination. And we know from the data that means you're losing connection and belonging, which means you're going to have higher attrition; the research on that is really straightforward, right? We know that teams will stay together longer if there's a sense of connection, belonging, purpose, and learning. You get that little formula correct, people will bleed for you; you get it wrong, people will leave.
And so, one of the ways to do that, to get to your specific point, is after an event happens, especially where some emotions are provoked, right, either some feelings of injustice or failure, or you didn't meet the target, or, you know, favoritism, whatever it is, we all have these things at work, but there's some discrepancy in how people feel about the experience afterwards. There are two kinds of after-action reviews I'm going to talk to you about, and they're used for different purposes. The first is the technical, historical after-action review, or debrief, or hot wash. In medicine, they call them mortality and morbidity meetings, M&Ms. Every elite team has some mechanism to do this. And the formula that's been going on since, oh, I don't know how long, is basically this: you get a group of people together, and you take off all rank, so everybody speaks. The people that do it well have the most junior person speak first. And there are a bunch of reasons for this that seem pretty obvious. First, if you don't do that, they won't speak, because the risk is just too much. Second, you're demonstrating that you value every voice. And third, it gives you a chance as a leader to correct any misperceptions or mal-educative sorts of things that are happening; by hearing their voice, you hear where their perspective is. And the way it works is you get everybody together, and you just do three things. You get somebody to say, "What was the plan going into this?" So we all agree, "Yep, that was the plan." Part two of the three-part process, excuse me, is, "Okay, what actually happened?" And this is where narrative really matters. Tell me the story of what actually happened. And what you're going to find, and what we find, is that if you do it well, and you can take ego out of it, the collective narrative is more holistic than the individual narrative.
So, if it's only the boss telling the story, that's only the perspective from the boss. You really want the people on the ground telling the story, because it really happened to them. Now, they may not see some things. That's why everyone has to be involved. And then the last one, the traditional one, is, "So, what's the delta between our plan and what actually happened? What was the gap between what we intended to do and what actually happened?" And the universe gets a vote; uncertainty, chaos, all those things play a role. But so does error, so does mistake, to your point, right. And one point I want to bring up, and this was taught very profoundly to me early in my days as a wilderness guide leading expeditions: I had a senior instructor, a rather famous person named Chris Horner, who, by the way, is just the second person ever to have summited all seven summits. So, congratulations to Chris. Chris sat us all down and he said, "Hey everybody, let me ask you a question. How many people here will make a mistake today, some mistake, spilling a coffee, misplacing something?" And we all raised our hands. And he says, "Okay, so let's just think about that. If we're a group of eight, that means that our team at a minimum is making eight mistakes today, at a minimum, just because we're human. So, our choice is that we can either deny that and lie to ourselves, or have a conversation about why those mistakes are happening. That's our choice." And it's just this very practical, pragmatic, factual sort of math problem: you can either tell the truth or not; it doesn't change the fact that humans make mistakes. And granted, there are different kinds of mistakes. There's a whole taxonomy of mistakes and errors and lapses, and oh, gosh, it goes on forever. But the simple thing is, does your culture allow for a conversation about error, or doesn't it?
And I'll stop here by saying that in the teams I work with, the argument often comes down to this: I work with what are called no-fail organizations, right? Organizations where you need the surgeon to save your daughter's life; this is a no-fail environment, you cannot fail at this job. However, that surgeon also has to train, and in order to train, in order to learn, you have to accept that a function of learning is failure. That's literally how we learn: we make a mistake, and we adjust. Zab might have talked about this earlier, in the neuroscience of learning. But that's literally how we learn. And so, I'll just end by saying that a really healthy learning organization has a low tolerance for errors, meaning that they don't like them, and there are consequences for the wrong kind of error, but they have a robust dialogue about it.
Toni Dechario 17:51
And that was Zab Johnson, who we interviewed in the first season from, from Wharton. I was struck by a distinction, which was, when you were talking about after-action reviews, you emphasized that the most junior person in the room should go first, to make sure that everyone's being included, to kind of amplify the importance of everyone's perspective and for the other reasons that you described. I'm struck by the distinction between that and another practice that we've been looking at this year, which is the practice of premortems. And you described a little bit of a premortem, when you talked about how your wilderness team was planning an excursion, and the leader said, we're all going to make mistakes, and how important it was for the leader to speak first and kind of create that safety for people to acknowledge that mistakes will be made. And I'm just kind of reflecting on that and wondering if you have any thoughts on who speaks when and what we should keep in mind and thinking about that as we practice these tools.
Preston Cline 19:07
Sure, and I'll just point out that the legendary Gary Klein, Dr. Gary Klein, has done a lot of work on this. So, if you're curious about this, I suggest all the listeners go listen to Gary; he's just an absolute legend.
Toni Dechario 19:18
And indeed, we're interviewing him this season.
Preston Cline 19:19
Brilliant, brilliant. So, I will say two things. There is the premortem, which really comes out of the automotive industry. The idea is that a carmaker builds a car, they take the first test version of that car, they put it in the middle of a huge warehouse, they bring in every single human who was involved in the creation of this car, and they hand them all post-it notes and say, "Tell me where this car will break. You've worked on it; tell me, when you were making it, where were the shortcuts that you necessarily had to make? The budget cuts, the choices, the decisions. Now, in reflection, tell me where things will break." And then what they would do is take all the post-it notes and, where possible, go back and preemptively fix those issues. Now, that premortem is one way to do it. Another way to do it is what's called red teaming. There, you don't have that team do it, but rather another team, an independent team, whose job it is to just look for your weaknesses, look for your vulnerabilities, and exploit them as much as possible. In some contexts, certainly in the cyber world, that can be really efficient and effective. In both cases, however, just know that, as they always say, don't go after perfect and sacrifice good, right? Good enough is often what's required. There isn't a perfect car; there isn't perfect software. And so too often you get people that are like, "Well, we're not done." Well, you're done enough, right? The leader really needs to know when the premortem or the red team has accomplished the goal. And those goals need to be really clear; otherwise, man, it can turn into just an endless sort of thing.
Toni Dechario 21:05
Yeah, it's an interesting balance, thinking about when a leader exercises his or her authority, and when a leader tries to erase his or her authority to get that cognitive diversity, to get the views, and to get people to speak up. So, one of the things you described with respect to an after-action review is that there's no rank in the room. But you're talking about the military, and there is a major hierarchy in the military that, for good reason, needs to be respected most of the time. How do you actually effectuate that? How do you make no rank in the room actually happen in people's minds when they're so used to respecting the hierarchy and respecting authority in something like the military?
Preston Cline 21:56
So, there's a technical answer to your question, and if I go too deep on this, just bring me back. Basically, we look at homogeneous teams, or intact teams, versus heterogeneous teams, or non-intact teams, what we call tactical swarms. And I want you to think about trauma resuscitation, where a group of people who have maybe never met before come together and, within 300 seconds, need to resuscitate somebody, right? So, in that case, there is a doctor who literally, legally owns that patient; that doesn’t change. However, if the doctor is one of those doctors who has to tell everybody what to do, they’ll run out of time. And so what we talk about, at the most elite levels, is swarming, which is to say everyone owns the problem. There's still authority, you're still deferential to the person who owns the legal part of this, but you know your role and, importantly, you know when to step forward and when to step back, much like in a play on stage.
Toni Dechario 22:57
One quick follow-up on this hierarchy question. You said people kind of understand when to step in and when to step out, and that everyone has a say, that those are the rules that have been established. But if you're operating not in a swarm team, not in one of these unordered teams that you talked about earlier, but rather in a team that's been working together for a long time, that has an established set of relationships and expectations, there could be more well-worn grooves of fear of reprisal, or a sense of futility, like “there's no point.” How do you use tools like the after-action review to try and fight those well-worn fears or grooves?
Preston Cline 24:04
So, there's this really interesting conversation I had with the Philadelphia Eagles, right before they won the Super Bowl. We were working with the players, and we were asking the following question: “Is it possible to win the Super Bowl in the absence of peer leadership?” Which is to say, could you win the Super Bowl just by having the right talent and the right coaches? Or is it required that in the locker room you have peer leaders who are motivating, challenging, growing, teaching? And every single one of them, and we talked to other teams too, said, “Oh, no, peer leadership is actually required.” Here's the problem. Professional sports incentivizes you, financially, not to actually help the person next to you, because they're gonna take your job; that's a real thing. And so there are players that have to independently take financial and career risks by committing to develop the people around them, even potentially at their own cost. Which is to say that the well-worn paths and cultural norms you're talking about are absolutely there and cannot be denied, especially in financial services, where we're talking about money, other people's money, your money, promotions, et cetera. However, if you look at the data, and this is not universally true, but on many teams, if you actually run the numbers on what happens to you financially if your team is successful, as opposed to just you, it usually ends up that you do better if everybody does better, financially and otherwise. But that takes a certain amount of courage, a certain amount of cultural friction, a certain amount of vulnerability. And a lot of folks are not prepared to do that. In the words of Dr. Mike Useem here at the Wharton School, that's the leadership moment, right, that moment where you have to decide: what are you here for? And I mean holistically, existentially, what are you here for?
Toni Dechario 26:03
That's so interesting. So maybe the advice is: run those numbers. Help people understand that if their team does well, they're gonna do well.
Preston Cline 26:10
Yeah. And not always true, right? I don't mean to be Pollyanna about this. Sometimes, independent people can do much better than the team. But there are many cases where that's not true.
Toni Dechario 26:22
So, I have a couple of other specific questions about after-action reviews. How do people get them wrong? What sorts of mistakes are made when conducting an after-action review?
Preston Cline 26:33
Yeah, so things you've all heard about. Groupthink is a big problem, the “this is the way we've always done it” kind of thing. Egos can get out of hand: people who have a particular soapbox which they can't get off of, and they'll railroad the conversation; people that talk, people that don't. And so, one of the things you really have to do is set some really clear principles, and one of those is you’ve got to share airspace. You should be listening as much as you're talking, probably listening more than you're talking, especially if you're in a large group. Also, teach people to be concise in their communication. People will often get anxious speaking in public, and so they'll end up using a lot of time to get to a simple point. So, we have strategies where, when we are in an AAR, we give people pads of paper, and we say: before you speak, write down what you're going to say, look at it, and make sure you're comfortable with it. And then, at the right moment, say it. People will say, “Well, you know, the conversation’s moved on.” In an after-action review, if a point is relevant and important, the conversation has never moved on; it will always be important and relevant.
Toni Dechario 27:47
Mm hmm. That actually brings me to another question, which is generally how long would an after-action review be?
Preston Cline 27:52
Oh, yeah, not only how long but when. So, in my world, let's say you were a federal tactical law enforcement team, think Secret Service or FBI, and you've just come off an all-night, 24-hour operation, you haven't slept, and there's been a serious incident, someone got injured or something like that. Could you do an after-action review right then? Sure. But everyone's braindead because they're so tired. And so, there's a debate here, right? Because if there's trauma, they may not sleep, and that's actually really bad for them. So, there has to be a balance: are you doing an after-action review to make meaning of an event, or are you doing an after-action review to process an extreme experience, so that people can make meaning of their own emotions, mourning, loss, everything else? Those are two different things, and they'll determine when you do it. But to answer the question, you want to do it within 24 to 40 hours of the event; right after is usually optimal, as I said, if people are okay. And then how long? The short answer is: however long it takes for everyone to speak, have their voice be heard, and be able to say what they need to say, in an efficient and effective way. In other words, really great teams can do these in 20 minutes. Medical teams can do these really effectively in 20 minutes, because they have a lot of reps. They know what patterns look like, they know what dissonance looks like, they know what they need to pay attention to.
Toni Dechario 29:26
So, you mentioned medical teams. Medical teams have very distinct things that they do: there's a surgery, or there's an emergency event; or a special ops team has an operation, and then it's over, and they perform an after-action review. In the case of financial services, the discrete action probably isn't going to be as dramatic, and maybe the lines around it aren't as clear. Do you have any recommendations about when it makes sense to conduct an after-action review?
Preston Cline 29:59
So, I’m not a person who believes in meetings for meetings' sake. In fact, I think we do meetings way too much. However, where culturally appropriate, and this is difficult, especially with remote workers, have a regular session. And this is where I’m gonna dodge your question, because I would need to know the context and know the iteration. Whether it's weekly, monthly, quarterly, whatever it is, really get together and say: hey, we've just gone through this period, this week, this quarter, this month. Let me tell you, as a leader, here are some things we pulled off and did great, and here are some ways we didn't meet expectations. Let's just go around and talk about both. How did we pull off what we pulled off? What did we do right? And then, what can we improve on, so next month is even better?
Toni Dechario 30:47
What are some of the biggest misperceptions about after-action reviews?
Preston Cline 30:53
Yeah, or when they're done wrong, right? And that is when competition becomes more important than substance, when there's this idea that the strong man should be able to take a critique, so what we're gonna do is just go into a room and hammer one another, so we can show that we did it and be prideful of the fact that, man, that really sucked. That's a waste of everyone's time. It really is. Go join a football league or a rugby league; it'll be a better use of that energy. You don't want to go in and just have a beat down. There's no gain in that. Because, again, the narrative afterward is usually: man, what kind of team am I working on? Man, this is really discouraging. And we see this at the most elite teams. Typically, when this starts happening and feeding on itself, this kind of circular firing squad where every AAR is just a mechanism for people to scream at one another, we usually start to see high attrition. People will start to leave. And who leaves first, of course, is your best people.
Toni Dechario 31:53
You're actually segueing perfectly into the next thing I wanted to talk to you about, which is the fact that research shows we hold on to the bad much more than the good. You talked about that a little earlier: when you're always hearing criticism, you begin to only hear that. And one of the things we are most motivated by is appreciation and gratitude, the things that, as you said, can make people feel uncomfortable. In a sense, constructive criticism can be seen as a gift, a form of gratitude and appreciation, in that you're willing to take the time to share with somebody how something went wrong and how they could improve. But I think, as humans, as you described with affect, it's really hard to take that in as something positive. So, do you have any recommendations for the best ways to do that, and to use these as positive tools, even though perhaps we're talking about things that have gone wrong?
Preston Cline 33:05
Yeah, there is. I'll use my grandmother's technique, which I've always thought was absolutely brilliant. My grandmother and I were very close; she adored me. And so, when I blew it, as young people do, she'd walk up to me, put her arm around me, and say, "You know, I think you're just amazing." And I was like, "Yeah," and she'd say, "Can you walk me through why an amazing person would do that?" Right. And what was amazing about that was, as a young person, I found I wasn't coming from a place of shame or guilt. She was genuinely trying to understand why, with the expectation she had of me, I would do that behavior. And I was in the position of trying to justify it, rationalize it, explain it. And usually I couldn't, right? Usually it was just youth and inexperience and whatever it was, growing up. But it forced me to make meaning of the event, rather than to react to her judgment of the event. Now, this wasn't in isolation. She would then say, "You understand the standards in this family, right? Moving forward, that's not what we do. You understand that?" So, she made clear what the standards were. But she made it my problem, not her problem. And I think for people who are passionate learners, what we call weaponizing their curiosity, people that really want to do better, when you do it that way, when you say, "Look, I hired you. I think you’re extraordinary. I’m now having to defend you or explain this. Can you explain it to me? I just need to understand: what principles were you using? Was it your training? Do you have a different opinion about this? I actually need to know, because we do have standards, and we have to move forward. And I need to have trust in both you and me, and me and you. So, how do we move forward?"
And I find that, from a place of advocacy and curiosity, well-intended people will get it and move forward. Now, that's the key thing: well-intended people. We're not talking about toxic personalities. And just for the record, I know this has been gone over a lot, but I think the research on this is really clear. The last I read, holding on to one toxic person basically cancels out your top three performers. And so, if you as a leader are not courageous enough to address or, in fact, remove a toxic employee, you will, in fact, suffer the results.
Toni Dechario 35:31
That's so interesting, especially from a profitability perspective: the highest performer may bring in a certain amount of profit, but if you look at the next three highest performers, as a group they're probably more valuable to you than the highest performer.
Preston Cline 35:45
100 percent.
Toni Dechario 35:46
So, we've talked a lot about the individual implications of this kind of backward-looking thinking about how things have gone. Can you talk a little bit about the cultural impact that practicing after-action reviews, or other types of reflective thinking, can have on a larger organization, beyond just a small team?
Preston Cline 36:12
Yes. So, anytime you're in dialogue about behavior and process, you immediately trigger a defense mechanism in the people who created that culture, those processes, or endorsed those behaviors. Because they've worked very hard to establish their expertise, they finally got the trains running on time, and now you're in there talking about the upholstery or the design of the trains, and they're understandably like, "Whoa, what are you doing? It took me forever to get this right; why are you trying to break it?" And there has to be an honest conversation about the impact on the multiple generations of people within an organization who have historical context about why certain things were developed. That happens, and you can't just dismantle or disregard it; it actually really matters, and you have to have those hard conversations. And the truth is, the older you get, the harder that is. So, there are huge cultural implications, because you're messing with people's domains, you're messing with their worldview, you're messing with their narrative and their identity as experts, as the people in authority who can't be questioned. And those organizations can be very fragile.
Toni Dechario 37:31
Yeah, and I would think that there would also be normative implications of just engaging in this practice on a regular basis, and building that habit. You talked about how uncomfortable people are with expressing gratitude, for instance, but the more you do it, the easier it becomes, right? So, I think there would be normative implications of, like, this is just what we do. There's nothing personal; we do it all the time.
Preston Cline 38:02
Yeah, yep. That's absolutely right. And to your point about the feedback loop, what people find, as you probably well know, is that this happens when you start getting into a practice of gratitude. As an instructor, part of my job is to tell people when they're not doing well and help them improve. But when they're doing well, I take great pride in, as often as possible, going, "You are crushing it right now." And you just see people that are under the gun, or under the hammer, or whatever it is, suddenly light up, like, oh, man, somebody sees me. And that simple thing of "I see you, I see you working hard. Keep going, we got you." It's amazing what that can do for people.
Toni Dechario 38:39
So, we've been talking about after-action reviews, but I also want to talk about some other ways of reflecting on past experiences, on how things have gone. Can you describe the differences between an after-action review and a post-mortem or lessons-learned exercise, or other forms of group reflection?
Preston Cline 39:07
Yeah, absolutely. So, let's think about roles. Think about the fact that if you're at Boeing, let's say, as an organization, and there's a mistake made with an engine, that's a potential catastrophe. So, you need to do a factual investigation that literally has to do with: how did that happen? And that's unemotional. Maybe blame will be found, but really the intention is to find what used to be called the root cause, an old term that's no longer accurate, or rather the multiple, multivariable factors that were involved. That's one important thing. But that is not the same thing as making meaning of an experience, or of a particular event. And then the third one is to influence the narrative. So, there's the factual one I just talked about, the investigation into why it happened. Then there's the after-action review, and that's, like I said, helping the team make meaning of the event. And then, part of or separate from an after-action review, is what we call a narrative inquiry, which is literally just asking people: tell me the story of what just happened. And you go around to everyone: just tell me the story. Don't embellish. Don't speak for others, just for yourself. Walk me through it. And what will happen in those, in my world, which is often, is people will say, "Oh, that's why you moved from there to there." And they're like, "Yeah, what did you think?" "Well, I thought this other thing." Right? And it's this way to calibrate people's expectations of each other, build trust and cohesion, and build that narrative.
So that next week, when everyone's sitting around at the pub or lunch or whatever, and the incident or the issue comes up, the narratives in people's heads aren't ones of groaning and shame and embarrassment, but rather ones of curiosity: hey, yeah, I remember the story of that event, and I know that our team is working hard on making sure it doesn't happen again. That's a very different narrative than, oh, I'm so embarrassed or ashamed that happened.
Toni Dechario 41:20
Something similar to that: in an earlier season of our podcast, we talked to Holly Ridings at NASA, who you know, and she talked about how this type of debrief thinking has become so embedded in NASA’s culture that they can’t help themselves; they debrief everything. She said they’ll go to a conference and find themselves in a huddle afterwards, debriefing the conference. How does that happen? How does that process happen, where it becomes embedded in everything you do and becomes a reflexive part of your organization?
Preston Cline 41:55
So, I’ve been working with teams in various settings. I’ve led expeditions on all seven continents, I’ve trained children, I’ve trained everybody. And I have this Preston’s Principle, which is ‘distance to puppy pile’, and I know people listening are laughing. But I encourage you, if you have children, or you’re on sports teams, or whatever, and you’re meeting them for the first time, just watch this phenomenon. When kids or people are meeting for the first time, there’s a certain amount of proxemics, a certain amount of personal space. But as immersive events happen, as life happens, as crises are overcome, that distance closes. And if you watch kids between their first day of summer camp and their last, it’s distance to puppy pile. At the end of it, if there's a room with 50 chairs and one couch, all 20 kids are sitting on the couch together, on each other, because they just don’t want to be physically separate. And good teams, you can actually watch this: they just want to be in proximity to one another, because there’s trust and cohesion and support. It’s part of their identity. They know those other people get them; they can finish each other’s sentences, all these things. And then, they also trust them to help make meaning of events. So, here at Wharton, I have a team that I love working with. And because this can be a foreign environment for me from time to time, I will literally do that exact thing: I'll say, “Hey, guys, walk me through what just happened. I am not tracking; just help me understand." And they'll teach me: here's what you don't know. Oh, thanks very much. And so, it's very heartening to me, and it's very confidence-building for me, to have those moments. But it takes time to build, and you have to overcome some adversity together as a group for that puppy pile to really start happening.
Toni Dechario 43:37
I love that. I’m going to think about puppy piles all the time now. I want to talk to you about training. A lot of what you do at MCTI is training related. Training can sometimes feel very separate from real life, and so this is consistent with what we were just talking about, that question of embeddedness. How should our listeners think about embedding training, whether it's on speaking up, acknowledging mistakes, or addressing mistakes? How do you make that leap from training into everyday life?
Preston Cline 44:22
So, interestingly enough, MCTI does less and less training these days, and we do more and more education and management of experiences. And let me break down why that matters. Training is for certainty, education is for uncertainty, and experience is for reality. In training, we train for skills, which are based on rules: this is how you do this, this is how you'll always do this. Education is for uncertainty, and that's about principles: when this happens, here are some principles to guide your decision-making. And then experience is: this is what the world actually looks like and feels like; here's how to think about that. And what you need is a feedback loop between your experiences and your training and education, to ask, am I updating them? Because you constantly need to be updating them. We all live in a world where we need to learn new software, we need to learn new techniques. We're more aware of multiple voices and the needs of those voices, and how they add to or detract from what we're trying to do. And that's a constant learning process. Again, from a neurobiological point of view, learning only happens through mistakes, because if it's going right, there's no incentive for you to change your behavior. So, you don't learn by success, because you're like, yeah, good, it went the way I wanted it to. However, if you're doing something that is important to you and you screw up, there's a huge motivation and incentive to change your behavior. And so, mistakes drive learning, but only if you’re honest about them, and you can have honest conversations with your team about them.
Toni Dechario 45:56
That’s great. We only have a few minutes left, and I want to cover a couple of things. One is jumping back to something you said when we first started talking, which your description of training versus education, certainty versus uncertainty, reminded me of. You said that you can have ordered versus unordered problems, and that for ordered problems, you are probably best off with a team that knows each other well, that can finish each other's sentences, that perhaps lacks cognitive diversity. And for unordered problems, you don't want that. And you described swarms as being groups of people who don't necessarily know each other. Why are groups that don't necessarily know each other better at solving unordered problems than ordered ones, or vice versa?
Preston Cline 46:52
So, thanks for asking that. Really, in that particular case, it's not that they don't know each other; it's that they're homogeneous versus heterogeneous. So, a homogeneous team is all female Guatemalan banjo players, right? They all grew up together, they know each other, they know the rules, everything else. However, because of their limited shared life experiences, they just don't have a lot of other options they can investigate. A heterogeneous team is made up of, you know, a dude from Nigeria, a woman from Japan, a guy from North America, et cetera, across gender, race, religion, sexuality. And the reason these matter is that they're proxies for cognitive diversity, which is to say, they literally think differently than I do. To say the obvious, and this is well researched: Toni, if we're walking down a city street at night, you're actually navigating that space as a woman differently than I am as a man. Right? And you add gender, race, religion in there, and there's a complexity of thought patterns that go into the same exact experience but, because of life experiences, because of who we are, necessarily has to get navigated differently. And so, when it comes to an unordered problem set, I will really want to turn to you and go, “Hey, Toni, tell me what I don't know, from your perspective, growing up, living your life. How are you seeing this in a way that I’m probably not? Educate me on what I need to see.” That's why it matters.
Toni Dechario 48:19
Do you have any other suggestions for creating cultures of curiosity and learning? And then finally, could you provide some recommendations for further reading or listening? You mentioned your HBR paper; if you could give us the title of that while you're describing any other recommendations you have.
Preston Cline 48:43
Off the top of my head: it's at the Harvard Business Review; just look up after-action reviews and Cline. I did it with Angus Fletcher and a good friend of ours from the FBI named Matt Hoffman. And we also wrote a paper for the Journal of Orthopedic Trauma on routine versus critical communication that’s been out recently. And then, if you go to our website, we have papers that we’ve released: papers from the Joint Special Operations Command that we work with, which we’ve scrubbed and started to release to the civilian world. Some of them will be relevant to you, some of them won't be relevant at all. So, there are some resources there. Can you just remind me of your first question? I'm sorry.
Toni Dechario 49:22
Yeah. But before I do that, I'm also going to mention that you have a podcast that I've been listening to, the MCTI podcast, which is fascinating.
Preston Cline 49:28
Yeah. And so, the TeamCast was created after COVID, from requests from medical personnel who were really struggling and just wanted to know some lessons learned. And so we interview astronauts, special operations, tactical law enforcement, fire, and it's all about lived experiences and what we can do better on Monday. That's really our focus.
Toni Dechario 49:49
Okay. And then the other question was other suggestions for creating cultures of curiosity, and learning from mistakes?
Preston Cline 49:56
Yeah, this is where leadership really matters. You have to incentivize it; you can't punish it. You really want to seek out those people who are kind of annoying because they always have a question. You want to hold them tight and really encourage them. I mean, give them some boundaries, right? But it's really this idea of: are you incentivizing it or not? And I will say, I don't know if this happens in financial services, but if you have a culture where the new person comes in and you're like, yeah, it's best just to stay quiet for the first year, not say anything, and learn the ropes, you're missing huge opportunities. Because they're closer to the emergent problem set than you are; just factually, they are. They've been having to navigate it through their own experience, and you're missing a huge opportunity by not including them in the conversation.
Toni Dechario 50:45
And they're seeing stuff that you've become totally blind to over time.
Preston Cline 50:49
That's right. That's right. And the last thing I'll say is just my advice to any leaders out there: if you're wondering how this all comes to a point, it's this. Next time you're leading, if your instinct when you're approached with a question is to lead with an answer, I really encourage you to pause and reply first with a question. Just do that one thing: when your people come up to you and say, "Hey, boss, I need this," or "This is happening. What do we do?" First thing out of the gate, say, "Let me ask you a question: here's my understanding of what you just said," or "Let me ask you a question: what other things haven’t you thought about?" Right? By doing that, you're making them own the problem together with you. And by just doing that one thing, you're starting to foster that curiosity.
Toni Dechario 51:38
It's kind of like your grandmother.
Preston Cline 51:40
Yeah, yeah. Just literally lead with inquiry, that's the bottom line. And most great leaders, if you watch them, lead with inquiry.
Toni Dechario 51:50
Well, I will end on that. Thank you so much, Preston.
Preston Cline 51:53
Thank you.
Toni Dechario 51:54
For more conversations like this, as well as publications and other resources related to banking culture reform, please visit our website at newyorkfed.org/governance-and-culture-reform.