John C. Williams 00:03
Hi, I'm John Williams. I'm the president and CEO of the Federal Reserve Bank of New York.
John C. Williams 00:09
So this is the latest season in the New York Fed's Bank Notes podcast called, “Banking Culture Reform: Norms, Mindsets, and Decision-Making.” We're going to have eight episodes featuring eight interviewees representing a broad range of experiences – from a banker and a regulator, as well as a neuroscientist and chief flight director at NASA. We really want to understand what drives behavior and what drives decision-making within organizations.
Toni Dechario 00:37
You've talked about this a little bit before. Mark Mortensen, one of our guests, told a story that I think you've actually heard him tell, about an executive who put a banana peel in an elevator and didn't make an express rule about picking up banana peels in elevators. And it took half a day before anybody picked up the banana peel. People would get on the elevator and be like, "Ugh, God, someone left a banana peel, that's disgusting," and then walk off without picking it up. And he said, that's culture. Getting people to take accountability and say, "This is my elevator, I'm going to pick up the banana peel" is culture. I'm curious about your reaction to that story.
John C. Williams 01:18
I think it's a great example of whether people are empowered to take action even when they're not sure what to do: “I'm not sure what I'm supposed to do here, but I'm going to use my judgment and do what I think makes sense.” You know, I go back to this basic observation that we often think about the surface things, like, “How did we get this outcome?” or “Why did we make this mistake?” or “How did this happen?” When really, when you peel away the layers, what you realize is that there's a basic approach to doing something that is very embedded in the organization, and that's what's leading to those outcomes, rather than someone consciously making that decision.
John C. Williams 02:02
We started this whole process really focused on lapses in behavior and misbehavior, and thinking about, “Well, what are the cultural aspects that lead to those types of actions, and how do you change those cultures so that you avoid those kinds of things?” And thinking about that leads you down the road of realizing that culture is throughout an organization. It affects all of the decision-making. You really need to understand what's driving the culture and what kinds of cultures leaders design and build in their organizations, rather than just looking at the outcomes in the organization.
Toni Dechario 02:43
So, what kinds of things do you hope that people will take away from this series and from other work that the New York Fed's doing?
John C. Williams 02:50
Well, I think first and foremost it's that we learn from each other. And, you know, the past events that we've had – especially during the pandemic – have taught us that different organizations face similar issues. And so, seeing how organizations not only build culture, but how that culture affects their decisions and their ability to adapt to changing circumstances, is something I personally found very informative and interesting.
Mikael Down 03:20
We say culture is important, culture matters. Well, how?
Zab Johnson 03:24
We're powered by our brains. We're not always rational. We are influenced by others.
David Grosse 03:30
I think I'd start with looking at what's called the “Better-than-average Effect.”
Holly Ridings 03:35
So, one of the things we look for in picking a flight director is what we call command presence, right? You kind of just look at people like, “You better get it together,” you know, “We’ve got stuff to do,” right?
Mark Roe 03:44
Good practice would be for a leader to say, “Well, this is a mistake I made recently, and this is what I did about it.”
Betsy Paluck 03:49
I can't emphasize enough the disempowerment of the lonely decision maker.
Taya Cohen 03:54
So, the opposite of what we think of as an ethical decision frame, we can think of as a game frame.
Mark Mortensen 04:00
It doesn't make you unethical to wrestle with this. The smart money is, we help you to try to figure this out.
JB 04:07
This is “Bank Notes: Banking Culture Reform.” The views expressed in this podcast do not represent those of the New York Fed or the Federal Reserve System.
Toni Dechario 00:57
Hi, and welcome to the Banking Culture Reform podcast, part of the New York Fed’s initiative to drive awareness and change in financial services culture. This series, “Norms, Mindsets, and Decision-Making,” explores questions like: Why do ethical people do unethical things? How can organizations encourage staff to speak up and executives to listen up? And what role does context play in shaping people's behavior? My name is Toni Dechario, and I'm a member of the New York Fed’s culture team. Today we'll hear from Taya Cohen. Taya is an associate professor of organizational behavior and theory at Carnegie Mellon's Tepper School of Business. She studies moral character in the workplace, including the predictive power of guilt proneness in individuals. In this episode, we'll learn why highly guilt-prone individuals may have a moral advantage, and Taya will share her perspective on individuals’ proclivities toward ethical behavior and honesty, as well as how to hire for these traits. So welcome, Taya, and thanks for speaking with us today. I'd love to start by asking what motivated you to work on organizational behavior. And more specifically, why do you care, and why should others?
Taya Cohen 02:04
Thank you. My original interest in studying organizational behavior came from my interest in psychology, in trying to understand why people behave the way they do. Through psychology, I became increasingly interested in how people behave at work, because we spend most of our lives at work, and how we behave at work has implications both inside and outside the workplace. So, the field of organizational behavior studies people in organizations, and takes insights from psychology, sociology, and related fields to try to understand human behavior, for the most part while people are at work, or the implications of being at work.
Toni Dechario 02:49
One of the things that you are known for is studying moral character, specifically moral character in the workplace. Could you please describe for us what you mean by moral character in the workplace? What is that?
Taya Cohen 03:03
So, moral character is a person's disposition to think, feel, and behave in an ethical manner. There's a variety of different character traits that we can think of as moral character traits. In thinking about this, I've organized them into personality traits that might be associated with a person's moral motivation, their motivation to do good and avoid doing bad. We can also think of character traits that have to do with a person's moral ability, their ability to do good and avoid doing bad. And then other moral character traits we can think of as related to identity: does a person care about being a good person? Is morality central to how they see themselves? So, I use the term moral character broadly, to refer to a variety of different character traits, all of which relate to people thinking, feeling, and behaving ethically.
Toni Dechario 03:58
Specifically, within your work on moral character, you look at the concept of guilt proneness. Can you describe a little bit how you think about guilt proneness and what it is?
Taya Cohen 04:10
So, guilt proneness is a moral character trait that captures the extent to which a person would feel bad about their behavior if they did something wrong, even if no one knew about what happened. Highly guilt-prone people feel bad, they feel guilty, about mistakes and transgressions, especially when those actions negatively impact others. I think of guilt proneness as an aspect of a person's personality. Some people are more disposed than others to anticipate guilty feelings before they behave badly, and that helps them act more ethically and more responsibly, and makes them more trustworthy. People with high levels of this character trait feel a great sense of interpersonal responsibility, and that helps them be better employees and better leaders, because they would feel bad for letting others down or for doing the wrong thing.
Toni Dechario 05:03
So, is this something that's just innate in people? Or can it be learned, guilt proneness?
Taya Cohen 05:08
I think it's a little bit of both. You know, there's nature, there's nurture, there's some combination. There does seem to be something stable, but not definitive or determinant, because it certainly changes. Guilt proneness in general increases over time: as we get older and mature, our guilt proneness increases, but it may increase or even decrease at different rates for different people. Some people might be more predisposed than others from a young age, early in childhood, maybe even something they're born with, although we don't fully know. And then a person's experiences over the course of their life, in childhood and adulthood, shape who they are. There's lots of debate about to what extent personality is a result of genetic or biological influences, how much of it is from the environment, and what the combination is. I think some of the best research in this area says that roughly a third of the variance, the variability in who we are across a variety of personality traits, we might link to something genetic or biological, something that maybe we're born with. Another third might be environmental. And the remaining third might be some combination of the two. Those are very rough estimates. So, I think of guilt proneness much like other personality traits: some nature, some nurture, and of course the combination.
Toni Dechario 06:38
So, Taya, how do you measure guilt proneness?
Taya Cohen 06:42
To measure guilt proneness, I've found the most efficient way is to ask people a few questions. My colleagues and I devised what we call the five-item guilt proneness scale. We ask people to imagine different situations that they might find themselves in, and each situation describes how they might behave. Then we ask them the likelihood that they would feel bad about their behavior: would you feel bad about what you did? For example, one question: after realizing you have received too much change at a store, you decide to keep it because the sales clerk doesn't notice. What's the likelihood that you would feel uncomfortable about keeping the money? Another one that I like: at a coworker’s housewarming party, you spill red wine on their new cream-colored carpet, and you cover the stain with a chair so that nobody notices your mess. What's the likelihood that you would feel that the way you acted was pathetic? What all these have in common is that you've done something wrong, nobody knows about what you did, but would you feel bad about it? We just ask people to imagine these kinds of situations. And it turns out that these questions, about what you might think are somewhat strange situations, predict real, consequential behaviors. That's what has kept me fascinated for so many years: just asking people these questions can predict their behavior, say, three months later at their jobs. They do seem to tap into something about a person's character that is enduring. No one knows about what you did, but it's your own conscience. I think of guilt proneness as the strength of your conscience.
Toni Dechario 08:18
To what extent, in your experience, do individual motivations drive misconduct or unethical behavior, versus how much do context and environment drive that behavior?
Taya Cohen 08:36
If we think of misconduct broadly: I know from my own work that I can ask people the five self-report questions about guilt proneness, and that predicts their behavior, their workplace deviance, their counterproductive behavior, many months later. So, there's something we're capturing that's really about the individual. But at the same time, it's not 100% perfect prediction, and who people are affects how they interpret situations, and affects what kinds of situations they enter into and what experiences they have. It's really hard to pit the person against the situation, because people create the situations they're in. It's really a joint influence, and it's hard to separate the two, because who a person is, how they interpret situations, and how they treat others then affects the situations that they're in. One example is a paper I have under review right now that I wrote with a former doctoral student. What we found is that people who did more bad behaviors at work were, a week later, treated worse, experiencing more mistreatment from their colleagues. And then, experiencing more mistreatment led them to behave even worse, right? So, you get into this cycle where negative, stressful experiences, being mistreated, abusive supervision, ostracism, discrimination, can cause people to behave badly. And yet we also know that if people behave badly, they're more likely to be treated negatively by others. You get into these patterns where people create the situations they're in, to some extent, but of course not exclusively.
Toni Dechario 10:19
So, it's just a vicious cycle?
Taya Cohen 10:21
It’s a vicious cycle. And one thing that we found interesting related to moral character is that people who had a reputation for being highly moral, people whose coworkers identified them as relatively more moral than their peers, weren't mistreated as much in the future when they behaved badly. They were perhaps forgiven more. Whereas people who had lower moral character, when they behaved badly, when they enacted counterproductive work behaviors, were subsequently mistreated more by their coworkers. So again, these joint influences of the person and the situation, and of people creating the situation, make it hard to separate one from the other.
Toni Dechario 11:02
I'm going to take a step back a little bit. What, in your opinion, makes otherwise ethical or moral people do immoral things or make unethical decisions?
Taya Cohen 11:13
Recently, I've been thinking a lot about what we call moral awareness, or moral recognition. Some people are more likely than others to have moral blind spots; maybe all of us have some. And in certain situations, situational pressures, or the way the situation is structured, could cause us to have moral blind spots, to have low moral awareness or not much moral recognition. One thing that I found really interesting in my recent work is that people who have higher levels of moral character, in general, have greater moral awareness of the decisions they're faced with. They're more likely to recognize the moral implications. Other people might be in the same situation and see the decision as just a strategic decision, or maybe a financial decision, a decision not particularly related to morality and ethical concerns. So, part of the answer, I think, is: do people even recognize the moral issues that could be at play?
Toni Dechario 12:20
Many of our listeners are bankers who have an interest in having their staff make ethical decisions, make moral decisions, and who therefore want to increase moral awareness.
Taya Cohen 12:34
So, I think it's not just about what the person brings in and the lens they use, but how decisions are framed. I've been thinking lately about what the opposite of moral recognition is, and the idea that my colleague Eric Helzer, of the Naval Postgraduate School, and I have come up with is the idea of game framing. If you think of something as a game, there are two elements that might be related to having low moral awareness. First, games are often adversarial and competitive, with winners and losers. That could lead to a competitive motivation to achieve one's goals and beat opponents. The other aspect of a game frame is that in games, the starting and end points, the rules of play, and the scoring methods could all easily be different. What's important here is that when it's a game, there's a sense that it's arbitrary, it's artificial, and the rules don't carry over outside the specific game. If you're a banker and you see banking as an adversarial context, with winners and losers, where the goal is to win, and you believe that how you behave in that setting doesn't say anything about who you are ("it's just me playing the game; it doesn't say anything about my character or my values"), that combination can lead to a lot of dishonesty. Some people, by who they are, might be more or less naturally likely to adopt a game frame. But the workplace culture can also amplify or attenuate that framing of decisions. If there's a workplace culture that frames things as a very competitive, adversarial environment, with the message that how a person behaves in this workplace doesn't say anything about who they are or their true character, or that the standards in this setting are not real, that they're more like temporary agreements than moral absolutes, that culture could allow for a lot of dishonesty and unethical behavior. Conversely, the decisions and the culture in the workplace can suggest one more of cooperation and value creation, not a win-lose context. And perhaps most importantly, they can say, “How you behave here is important. It does say something about who you are.” It's not just a game that you're playing with no implications for anyone outside of this context.
Toni Dechario 15:00
Do you think there might be something about financial services that would lend itself more or less to this game framing or to potential ethical behavior?
Taya Cohen 15:10
I don't know if we can paint all of financial services with a broad brush, but certainly elements of it seem like they could lend themselves to this very win-lose mindset, where being cooperative is seen as a liability. If the financial services industry and the work are structured so that people become very detached from the implications of their behavior for other people or for themselves, if it's just numbers in a spreadsheet, it can be easy not to recognize the implications, including the ethical implications. To the extent that the culture and the work remove people from, or make it hard for people to see, the implications of their decisions for themselves or others, that could allow a person to believe, "Well, what I'm doing here doesn't really say anything about who I am," and that would be a potential problem for the industry.
Toni Dechario 16:02
Are you familiar with any interventions to counter either that detachment from the outcomes of your behaviors and decisions, or the sense that "I'm just a player in this game, this isn't me"?
Taya Cohen 16:18
I think there's been work in a variety of different areas that suggests various kinds of reminders. Early work in psychology, not necessarily in the domain of unethical behavior, found that putting mirrors up and making people see themselves can certainly have an impact. I don't know if mirrors are the answer per se, but things that make people see themselves and their behavior, and recognize that how I'm behaving here has implications for how I might see myself, can help. So, interventions that make people understand the consequences of their behavior for themselves or others. Earlier in my career, I studied intergroup conflict, competition, and cooperation between groups. One intervention that we found worked very well there, which I think could have relevance for ethical behavior as well, is considering the long-term implications, the consideration of future consequences, we call it. If I do this now, how is that going to affect the subsequent decisions other people might make? What are the effects of that? A lot of unethical behavior comes about when people are short-sighted, when they're focused narrowly on short-term gains, especially if they're focused narrowly on themselves. So, interventions that encourage people to think more expansively, about the long term in addition to the short term, and about other people in addition to themselves. Those interventions may look different in different workplaces. There's the motivation to do good or avoid doing bad, but there's also the person's capability. Interventions that give people the information and tools they need to behave more ethically can be effective as well. And there's a framework that Professor Mary Gentile at the University of Virginia has developed called Giving Voice to Values.
And the idea behind that framework of ethical leadership is that most people just want to do the right thing. Sometimes we don't notice an issue, and that's the moral awareness piece. But sometimes we do notice something that we're not comfortable with, and we don't speak up, often because we don't feel prepared or confident. If you give people the tools, the information, and the confidence to be able to voice their values, to speak up when they have concerns, and to figure out how they can do that effectively, that can be a way to make people more willing to speak up when they do notice issues.
Toni Dechario 18:47
What might some of those tools be?
Taya Cohen 18:49
One thing I teach is negotiation. And in negotiation, we know people are more likely to lie when they don't feel prepared. So, simple tools help, like having a planning document where you identify the things that you want to reveal and the things that you don't. Similarly, with this idea of how a person can give voice to their values or speak up: Do you have a script? Have you identified the ways the audience may be more receptive, so you don't make them feel threatened? There's not a magic bullet or one particular thing you might say, but the idea is to sit down and think: What if I were to voice my values? What if I were to speak up when I noticed something I wasn't comfortable with? What are the implications of that? And what I have found in my work on honesty is that people believe honest conversations will be much worse than they actually are when they have them. Recognizing that, preparing for those conversations, and figuring out what information you need and how the other side might be more receptive to it, that's the process that I think can be helpful.
Toni Dechario 19:54
What is the connection between guilt proneness and outcomes? Could you give a description of the work that you've done linking high levels of guilt proneness to particular outcomes in terms of decision-making?
Taya Cohen 20:11
When I originally started studying guilt proneness, it was to better understand the questions "What is guilt? What is shame?" and to differentiate the two. Guilt proneness is feeling bad about your behavior when you've done something wrong, or anticipating that you would feel bad. Shame proneness is feeling bad about yourself: when you do something wrong, you may feel bad about both what you did and who you are. But shame is not as helpful as guilt, because when you think, "Oh, I'm a terrible person," it's much harder to change who you are, and that doesn't necessarily lead to as many repair-oriented actions. Once I had a better sense of what guilt proneness and shame proneness are, people's responses to the guilt proneness questionnaire predicted, for example, whether they were more likely to lie to their peers in negotiation role-play exercises several weeks later. That was really intriguing: just asking people four or five questions could predict, with some degree of accuracy, who was likely to lie to their classmates. I thought that was fascinating, because it was capturing something that would relate to their ethical behavior. I had those initial findings back in 2011, and in the many years since, I've become interested in what other things guilt proneness might predict. Things like trustworthiness, which we've looked at in laboratory studies and online experiments. Say you send me $10, and it triples to $30, and I have the choice: do I send any of that back to share with you, or do I keep all the money? It turns out that people who are low in guilt proneness are more likely to keep all the money. People who are higher in guilt proneness are going to share it back with the person who entrusted them with it.
So, we've looked at things like trustworthiness, and we find that it's motivated by this sense of responsibility: you trusted me, and now I have a responsibility and accountability to act in a trustworthy way. We've also looked at what we can think of as, in a way, the opposite: helpful behaviors in the workplace, what we call organizational citizenship, taking time to advise others, mentor others, rearrange your shifts or vacation schedules. People who are higher in guilt proneness feel this sense of responsibility to others, and that can both motivate more positive behaviors and make them less likely to do negative behaviors. And then finally, one other set of findings, initially identified by Professor Rebecca Schaumberg at the University of Pennsylvania and Professor Frank Flynn at Stanford University, is the relationship between guilt proneness and leadership potential, which I've subsequently replicated in my own work: people who are high in guilt proneness are more likely to be judged as effective leaders, by their peers, by people who report to them, or by supervisors. This has been looked at in a few different ways, with the idea that if a person is high in guilt proneness, they have more of a sense of responsibility to others, and having that sense of responsibility makes the person a more effective leader.
Toni Dechario 23:12
So, I want to turn to a practical conversation. Putting myself in the shoes of somebody who's running a bank and wanting to build and hire a highly guilt-prone, well-practiced, aware staff, one that isn't framing things as games but rather living their own moral compass at work, it sounds like those are the elements that, from your perspective, I would really want to imbue.
Taya Cohen 23:48
We need to hire people with high levels of moral character, but we also need to create organizational cultures that are going to bring out the best in the people in the workforce. I often think of personality and character like a habit: once something becomes a habit, it can be hard to change. So, from the hiring perspective, if you're hiring people who already have established ways of behaving, whether that's something about who they are, or something they learned at a prior organization that maybe had a negative culture, or some other prior experience, it can be hard to change that habit once they bring it with them. Organizations can start by hiring people who have the character traits they think are important. But those might be hard to judge; some things may only come out in certain high-pressure situations, and you might not realize that. And even if you hire good people, there can still be environments where the culture doesn't allow them to be their best selves. So that's where you have to ask: What is the organizational culture? What are the policies? How are decisions framed? Are they framed very narrowly, with a short-term focus, or are they framed a little more broadly? Are the relevant stakeholders highlighted, or is the focus narrowly on one particular set of people? Even the most moral people can have blind spots, and there can be situations that hinder them from having much moral recognition or moral awareness. Or even if they do recognize a moral issue, there could be cultures where they don't feel safe speaking up to voice those values. So, you need both the culture and the character.
Toni Dechario 25:25
So how do you hire for moral character?
Taya Cohen 25:28
We've come up with two interview questions that can help. With hiring, first you have to identify what you want to look for, what's important. Three questions, I think, capture the essence of this: Would this person feel guilty about committing a transgression or making a mistake, even if no one knew about what they did? That captures what guilt proneness is. Would they feel bad about letting others down or harming them, even if it was not done intentionally? And does the person have a strong sense of responsibility for others? So, one interview question we call the difficult dilemma question. We ask people to describe an experience in which they were faced with a difficult dilemma at their job, a situation where they found it hard to decide what to do. We ask them: What factors did you consider? What did you do? What, if anything, did you learn from the experience? The idea here is that people who have higher levels of moral character are more likely to mention some consideration of others, maybe consideration of long-term consequences. They would have more moral awareness of the implications of their decisions. When you ask them to describe a difficult dilemma and what they were thinking about, does their response indicate that they're focused narrowly on themselves or on the very short term, or are they thinking more broadly? That question is interesting because it seems to provide information from which people can make a holistic judgment of whether they think the person is relatively high or low in moral character, just from reading responses. And then we have a second question we've devised that also seems to provide helpful information. We call it the mistake question. Please tell us about a time when you made a mistake at work. How did you feel when this occurred? What did you do? What, if anything, did you learn from the experience?
And here the idea is, you know, when people make a mistake, do they feel bad about it? Does it motivate them to correct their behavior, do something different in the future? How people talk about mistakes they've made in the past can provide some information about their character. We found that people can sort of read responses to these questions and make accurate predictions about how a person is likely to behave, how much workplace deviance they would engage in at their jobs, you know, decisions in different economic games. What's not yet clear, what I think we still need to learn, is what exactly are the judges, the people reading the responses, cueing into that allows them to make accurate judgments? Because I think it's more intuitive; people just get an impression from those questions and the information revealed by them. And it turns out those impressions have some level of accuracy. But of course, not 100%, because why would we think an answer to one question would be 100% predictive? The fact that it provides any useful predictive information I find fascinating.
Toni Dechario 28:09
It's interesting that people have an impression, even though I believe you said that they're reading written responses versus reading someone's body language or, you know, some other kind of in person way that you have intuition about somebody.
Taya Cohen 28:24
So, the way we've done this work is with written responses. We haven't yet tested this face to face, or online, or even over a phone call. There's earlier work on lie detection that suggests sometimes providing people with more information leads them to focus on the wrong things. And I think there's early work by Dr. Paul Ekman that found people are better at detecting lies over the phone than on video, for example, because people focused on the wrong things. It would be interesting to test, and we haven't compared how people do with these sorts of questions written versus verbal, you know, audio, visual. But I think there is some reason to think that reading the written responses could be just as revealing, if not more revealing, because when it is face-to-face, people may tune into the wrong things, by assuming, say, that if a person's more attractive then they must be more ethical, right? Or some other way in which a person communicates, their tone, you know, their vocal cues. That information may not be diagnostic at all, yet people could use it to form an inaccurate judgment.
Toni Dechario 29:28
Now that we've you know, hired people with high moral character using your interview questions, they've come into the context of whatever our culture is. And so, what do I do to make sure that I'm creating an environment in which people are able to have high levels of moral awareness, recognize moral situations, and kind of avoid a game mindset?
Taya Cohen 29:57
One of the findings from my work is that people assume having honest conversations with colleagues or friends – we have all sorts of data on this – is going to be much worse than it actually is. So, people withhold information, they stay silent. So, the research on things like psychological safety, or voice and silence in the workplace, that is focused on how you create a culture where people feel comfortable speaking up, where they feel safe to speak up, I think that has implications for honesty and conversation about ethics. And I mentioned that idea of giving voice to values. So, making it clear that people can disagree; create an environment, from the leadership in the organization, from the policies in the organization, that allows for disagreement, that allows people to raise concerns without fear of retribution, or feeling, you know, that they'll be humiliated. When you think of ethical decision making, we can think of what the organization can do to bring out more awareness. Framing the goals, so, goal setting – there's some work on “Goals Gone Wild,” which is a great title of a paper – with the idea that when an organization has goals that are too extreme, then people feel like they have to meet them, and they can lose sight of a lot of other things, right? It makes it more of a competitive environment; people might feel like they have to cross ethical boundaries to reach those goals. And I think it also reduces moral awareness, too, because people are so focused on achieving that goal, it can lead to that game mindset, that game framing. Thinking, you know, what are the appropriate goals your organization can set? If you want a culture that promotes teamwork, how are people being incentivized, especially on their individual performance? Does it create an environment where people feel like they may need to behave, you know, in ways that could harm their colleagues to get there? 
So, thinking about what are the incentives? How are the rewards set in the organization, all of that can be helpful, I think, for creating an ethical culture.
Toni Dechario 31:58
So, I'm so curious, as I listened to you talk about all of this, as I listen to you talk about moral awareness and the importance of practice, also, in moral recognition, how has your work influenced your own behavior and your own awareness of your moral character and your moral behavior?
Taya Cohen 32:18
I mean, certainly, I probably think about guilt and shame more than most people, right? And honesty. Actually, most recently, the work I've been doing is on honesty. And so, I've been very cognizant of situations where I and the people I'm around could be more or less honest. I have two young children. And so, being really mindful in my conversations with them, you know, when they ask a question, because sometimes the easier answer is not to be fully honest, or thinking, how do you be honest in an age-appropriate way? So, I've been very mindful of that. And also, with guilt and shame, you know, when I think of my own personal life and raising children, it ties into the questions earlier about how we develop this: if a person does something wrong and they feel bad about their behavior, that's very different than feeling bad about themselves, right? That's the difference between guilt and shame. Right? So, you made a mistake here, but it doesn't mean you're a bad person. And so, I think about that, as I'm sure many parents do. How do we frame the messages to our children? How do we, you know, when fighting with a spouse or a friend, I think it can be easy to say you're this or you're that, right? That would be a much more global way of saying it, to put negative labels on people, as opposed to, here's this negative thing that you did, or I did, and to think specifically about behavior, which would be more about guilt, as opposed to a more global negative attribution about yourself or another person.
Toni Dechario 33:43
So, I was thinking about my daughter as well, when you were talking about shame versus guilt, and how do you create guilt proneness and not shame?
Taya Cohen 33:53
Yeah, I think it's a great question; with parenting, it comes up a lot. And I think the key thing that I take away from the research, that I keep in mind, is that it's really hard to change yourself as a whole person. It can feel overwhelming. It can make people want to hide. It can make them angry if they feel like they are flawed in some way. That sense of shame is a really negative thing to feel, and it's hard to know what to do with that. It's painful when you know you've made a mistake or done something wrong, and maybe you can't fix it. And I think it's critically important when we think about children, and also managers giving feedback, right? I mean, what are the workplace implications? There are times at work when we have to give negative feedback to others, so what are the best practices there? Can you give feedback about specific behaviors and what could be done differently, as opposed to a more global negative evaluation of the person, or something where the person doesn't really understand how they could behave differently? So, that difference between a behavior versus a person's whole self, I think, is one of the things that sticks with me. I'm not sure how it influences my day-to-day, but I'm sure it does.
Toni Dechario 35:02
Can honesty be induced through kind of good examples and sharing positive examples?
Taya Cohen 35:10
It's a fundamental truth in psychology that we look to others to try and figure out the appropriate ways to behave. We do this sometimes consciously, sometimes unconsciously. So, the examples that are around us matter, I think, especially people in very visible positions, leaders in organizations; how they behave sort of sets the tone, sends a message, and people look to that. So, if you have high-profile people, people who are very visible in their organization, with a lot of influence, power, and status, and they are perceived by others as being more honest, as more ethical in whatever way, I think that can set an example, because it lets people know what the expected behaviors are, what the organization values. So, I do think having what we can think of as moral beacons in the organization can help influence the culture in a positive way.
Toni Dechario 35:58
Taya, thank you so much. This has been fascinating and really fun.
Taya Cohen 36:02
Sure.
Toni Dechario 36:04
For more conversations like this, as well as publications and other resources related to banking cultural reform, please visit our website at newyorkfed.org/governance-and-culture-reform.
Mikael Down 00:01
We say culture is important, culture matters. Well, how?
Zab Johnson 00:05
We're powered by our brains. We're not always rational. We are influenced by others.
David Grosse 00:10
I think I'd start with looking at the what's called the “Better-than-average Effect.”
Holly Ridings 00:16
So, one of the things we look for in picking flight directors is what we call command presence, right? You kind of just look at people like, “You better get it together,” you know, “We’ve got stuff to do,” right?
Mark Roe 00:25
Good practice would be for a leader to say, “Well, this is a mistake I made recently, and this is what I did about it.”
Betsy Paluck 00:30
I can't emphasize enough the disempowerment of the lonely decision maker.
Taya Cohen 00:35
So, the opposite of what we think of as an ethical decision frame, we can think of as a game frame.
Mark Mortensen 00:40
It doesn't make you unethical to wrestle with this. The smart money is we help you to try to figure this out.
JB 00:48
This is “Bank Notes: Banking Culture Reform.” The views expressed in this podcast do not represent those of the New York Fed, or the Federal Reserve System.
Toni Dechario 00:57
Hi, and welcome to the Banking Culture Reform podcast, part of the New York Fed’s initiative to drive awareness and change in financial services culture. This series, “Norms, Mindsets, and Decision-Making,” explores questions like, “Why do ethical people do unethical things? How can organizations encourage staff to speak up and executives to listen up? And what role does context play in shaping people's behavior?” My name is Toni Dechario, and I'm a member of the New York Fed’s culture team. Our guest today is Mark Mortensen. Mark is an associate professor of organizational behavior at INSEAD. He studies collaboration, team dynamics, and communication. We first featured Mark as a panelist in our webinar on “Culture in a Post-Pandemic Workplace.” In this episode, Mark tells us what drew him from engineering to behavioral science, why we can't rely on rules to drive good decisions, and the critical importance of psychological safety in building strong cultures. Thanks for joining us, Mark, and welcome!
Mark Mortensen 01:52
Thanks, Toni. I'm really glad to be here.
Toni Dechario 01:54
So, what drew you to study organizational behavior in the first place, and specifically to be interested in how teams function and team dynamics?
Mark Mortensen 02:04
It's the classic story, right? It's navel gazing. So, I have to start by saying, I'm an organizational behavior guy, but I'm actually a recovering engineer. So, I started off my career doing computer science, trying to build different technologies. And honestly, what happened was, we put out technology, and of course, of course, our technology was perfect, we never made mistakes, but things went wrong. And you know, I'd hand something over to you and you'd start doing stuff that after a while made me say, “What?” And I started to realize that more often than not, the problems weren't, at least not entirely, the technology. It was more about the interactions, the interpersonal interactions, and the interactions between people and technology. When we introduced a new technology, we introduced a new way of solving problems. That's kind of what got me into it. And in particular, when it came to teams, honestly, it was my own experience working in teams, working in collaborations, trying to get stuff done, and then realizing that the dynamics of the team, the dynamics of the people, and the way in which they think about things actually played a really big role in determining whether or not this stuff was successful in the first place. So, I sort of retooled myself to go more into the psychology side of things and try to understand the original equipment, as opposed to the technological stuff we were trying to build.
Toni Dechario 03:18
How do you think your background in engineering actually gave you a different perspective on organizational behavior than the people that you were studying with?
Mark Mortensen 03:27
To be fair, a lot of people, when they see, you know, “Hey, it's the organizational behavior guy coming,” have this sort of knee-jerk, allergic, slightly annoyed reaction of, you know, “Kumbaya, let's hug and talk about feelings and this sort of stuff. But hey, we've got stuff to do.” You know, engineering is about solving problems, fundamentally. And that still is the core of what I love to do and what I want to do. So, it gives me a perspective of thinking about people not just as, you know, squishy things that are kind of arbitrary, but recognizing people follow rules in the same way as computer systems, social systems, you name it. And so, I think it gives me an angle that is honestly sort of an engineering angle to understanding social dynamics, and internally, sort of the wiring that we've got.
Toni Dechario 04:17
Okay, thanks. I'm going to jump right into kind of our central question, which is, why is it that otherwise ethical people do unethical things?
Mark Mortensen 04:28
So, I think you're asking exactly the right question when you frame it as “otherwise ethical people.” You know, I hear a lot of people with this overly simplistic, overly black-and-white view of, “Look, the reason bad stuff happened was there were bad people, and bad people do bad stuff, and thus we have bad.” Yes, that does happen. And look, we can all point fingers at obvious, massively egregious failures and people who are doing really bad stuff, but more often than not, it's the smaller things, right? “It's bad people” is sort of an easy and convenient scapegoat. What we have to remember is it tends to be things more like exogenous pressures. Every one of us in our jobs has pressures: pressures to deliver to given deadlines, pressures to succeed in our work, the pressures we put on ourselves about wanting to be good at our job and have the reputation, this sort of thing. Now, you take that, and you add onto that, “Oh, by the way, this is also how you earn your living.” So, if you have a family that you're supporting, if you have any concerns out there in the world, other things that feed in, we operate under tremendous competing pressures, many of which are not perfectly aligned with ethical behavior. And let me be clear, I'm not saying they are necessarily unethical pressures. But if the pressure is you've got to deliver, and you know, look, I can be sneaky and do something that's, well, not really sneaky, but, “Look, everything's got gray areas, what if I push a little bit further to the left, just a little bit? It's what everybody's doing.” That's where we start getting into the situation. And more often than not, it's about those pressures and the other pieces. It's almost always about small steps and escalation of commitment. 
It's very rare that a breach is something like, you know, zero to 60: “We were completely ethical, and everything was great, and then we decided to go off the deep end and do something bad.” More often than not, it's, “Look, we're behaving well, we push the bounds a little bit on this. And then, guess what? Nothing happened, nobody got hurt, it was really fine. Okay, maybe we can push a little bit further on the next one, because we're still under the same pressure.” And not only that, but, “Hey, we delivered last time.” So, everyone's going, “Hey, deliver, deliver some more, we want to see more.” It's that situation that we really have to, I think, protect much more against than looking for the, you know, the real, real bad people who are doing terrible stuff in extreme cases.
Toni Dechario 06:55
I've never heard that; is it “escalation of commitment?” I've heard kind of a related discussion of normalization of deviance, that idea that, well, “It was fine last time and nobody got hurt. And we'll just do it this way again, even though it's technically against the rules. It's fine. Until it's not.” So, how do you counteract that? If you're an organization looking to kind of encourage people to resist the pressure to deviate, what do you do?
Mark Mortensen 07:29
So, two different things. One, I think we have to think about what are the right tools that we need. And here, obviously, part of what we're here to talk about is things like organizational culture, and I happen to believe that culture is one of the most robust ways to tackle this. But the other thing is, you have to put yourself more in the mindset, at least from my perspective, of helping people. You have to put yourself in the mindset of, how do we help people to maintain their ethical compass and to maintain that ethical behavior? And that means, when they're feeling those pressures: are you giving them a release valve? Are you giving them a place where they can talk about that? Are you normalizing the conversation about the gray areas, right? Sometimes we actually don't exactly know where the line is. And if you don't know where the line is, you run into a new, a different problem, which is, you may be unethical accidentally. Now, one of the problems is, when you know the organization is going, “We're watching you, yeah, you, Toni, right there, we're watching everything,” the last thing you're going to do is say, “You know, I was thinking about this thing that I might be doing, and I'm not sure, it might be unethical.” No way, right? This is about psychological safety, about creating an environment. What you need to do is create an environment where you say, “Look, we're going to talk about this, we're gonna bring it to the surface and talk about how it's sometimes tough. And that's okay, it doesn't make you unethical to wrestle with this. The smart money is, we help you to try to figure this out.” And so, I'm a big believer in process, because look, we have to outsource what we're bad at. Just relying on, you know, “I'll probably be ethical, I'm generally a good person,” you're giving yourself an unfair disadvantage. Outsource to a process. 
Say, “Look, once a week, we're going to have a roundtable where we talk about this stuff, we bring it to the surface. We put in place a process that's there for anything where I'm the one person making this call, where maybe I might make the wrong call; we have a check and a balance: I have to run it past one other person who's unconnected to the project,” right? These are simple mechanical things, but one is normalizing the conversation. The other is giving people help to make the right ethical decisions.
Toni Dechario 09:37
One of the other things that you touched was surveillance and autonomy, or lack of autonomy, and that feeling that you're accountable for and responsible for your decisions. Can you talk a little bit about what you think the impact of being closely watched may be, and kind of the counterpoint of what is it to have autonomy? And how does that impact your behavior?
Mark Mortensen 10:04
Look, this is where it gets tricky, because some of it feels a little bit counterintuitive, right? Our logic would be, “Hey, when things are getting concerning, what do we do? Let's watch more, so we make sure that we catch it.” There are a couple of different things. One, you do run the risk that people, in effect, outsource their moral compass, or they outsource their ethical awareness. “Look, I don't have to worry about it, because I know if I screw up, Toni is gonna catch me. So, I don't have to worry about it myself.” That works great, as long as Toni's always there to catch you. And this is part of the problem also. And this is what I was getting at around the idea of culture. One of my favorite stories was a CEO who ran a little experiment. There was an elevator that went up to his part of the building – not only to him, but you know, when you got to the top floor, that was his place. He had somebody put a banana peel in the elevator, and they just left it there. And then he stationed a couple of interns to watch and say, “Okay, so what's going to happen? Who's going to pick up the banana peel?” The banana peel sat there for a while, a good chunk of the day, with people walking in and going – and this is a very substantial firm, you know, revenues in the billions – they'd be walking in, “Oh, it's gross, a banana peel, who would do that? Well, that's my floor,” right? And they would get out of the elevator. And, to me, it's a really illustrative example of the power of culture versus rules. If you have a culture of “We own this place, we care for this place,” the first person who walked into that elevator should go, “Oh, that's gross. I don't know who did that. 
But I'm going to fix that, because I feel a responsibility.” If you rely on rules to do this – and surveillance is the rules approach – if you rely on those rules, you'd better make sure that you have a rule that says, “When there is a banana peel in the elevator, thou shalt pick it up, including a rider: if it's an orange peel, also pick it up. If it's a tangerine peel, also pick it up. Clementines, those are okay.” You need to have some rules for every possibility. That's the problem when you rely, in effect, overly on the rules: you give people the freedom, or the right, to not have to worry about it if it doesn't fit into one of those categories. And that's the danger when it comes especially to challenges around ethics. How many times is the ethical breach a new thing? Very, very often, it's not the same breach that we've seen. It's, “Oh, well, now there's a new type of derivative that we have to think about in a slightly different way, and there's something that maybe nobody even realized was a concern. But yet, that's bad.” That's why culture, in my mind, is the best way to approach it. Because if you instill in people a sense that they need to have ownership, and they are responsible for maintaining the ethical standards of the organization, you also trust them to make the judgment call to say, “You know what, we've never talked about this, this is making me feel a little bit weird.” And again, if you've created a culture where you talk about it, that's what comes up in the next meeting: “Hey, guys, have you ever thought about the fact that somebody could do this?” And suddenly red flags go up, and you catch something before it's ever happened. And that's why I think you need to have this interplay between the broader culture and the rules that you're putting in place. And you have to think about the way in which those pieces fit together.
Toni Dechario 13:18
So, I have a couple of follow ups to that. If you're able to create a culture in which someone is willing to say, “Hey, I haven't seen this before, and I'm not sure about this.” How can you make sure that that's heard? And that's incorporated into decision making?
Mark Mortensen 13:37
You're hitting on exactly the right challenges with those questions, right? One is, do they raise it in the first place? Is it vocalized, so that there's the possibility it gets picked up by people who can do something about it? The second is, when it's vocalized, is it actually heard? Do people then take that and say, “Whoa, we've got to be concerned, we need to do something with it”? On the vocalization front, you need, again, to think about some more cultural things, psychological safety, things like that. Have you created an environment where people feel free to share? One of the critical drivers of that is you need to model that behavior at the top. You need to model vulnerability. And this is some of the work that I did with Amy Edmondson around psychological safety – obviously, she is the source of, you know, the great thinking around psychological safety – but you need to think about what it is that will make people feel comfortable and able to share that stuff, and then we'll get into the conversation. Now, whether it gets heard, you've got a new set of problems, right? One, again, the same competing motivations I talked about in the beginning, those are there as well. Right? “Look, I've got an employee who is going on and on and on about this thing, and they're really worried about it, and I don't know, maybe it's a concern, maybe it's not. I've got competing pressures, I've got…” Managers face the same ethical challenges that everybody else faces in terms of the pressures they're getting from their bosses, laterally, and from their employees. So again, you need to put in place some structures to, in effect, force the conversations to happen, to force the openness and the discussion around it so that people can take action. You know, the other piece of the puzzle is that all of these things require a certain amount of – policing sounds bad. 
And when I say policing, I don't mean I'm going to watch you like a hawk and see what's happening. I mean, they require vigilance, so that when these things come up, they are dealt with. Everybody listening knows how toxic it is, how devastating it is to any culture – whether it's whistleblowing, ethical awareness, whatever it might be – when a complaint is raised and people feel like they aren't heard. I bring this up: “Hey, guys, I've got a big warning flag,” and people go, “Yeah, yeah, yeah.” That's pretty damaging, because I risked myself by, potentially, forgoing something that could have been, you know, maybe very lucrative, or whatever else for me. Maybe I opened myself up, certainly when it comes to whistleblowing, to retaliation, to all this other stuff. When you hear something, you have to show that it is heard and understood, and it has to be acted on. That doesn't mean you have to accept every accusation. And so, there's a balance there; you need to show that it's heard and acted on. And it needs to be acted on in a fair way, a fair process: “Look, as soon as we heard it, we took it seriously, we've documented it, we've started a process.” Now, the process may go through and, in the end, say, “Look, it's actually okay,” or, “This person did do something wrong.” You need to make sure also that there's no retaliation going toward the person, you know. They did what they thought was right. It was in everybody's best interest that we vocalized this, that we talked about it, and then we take it from there. But they need to feel like they've been heard, and that's what I mean by the policing: you need to take it seriously. You need to make sure that people feel that when they do follow the rules, when they do what you want them to do, that behavior is rewarded through recognition, through whatever else. I'm not saying monetary reward, but they feel validated for having taken that action.
Toni Dechario 17:11
Yeah. How about in cases where the issue isn't raised through formal channels? It's not like a whistleblowing incident; you know, it's kind of just a person raising their hand in a meeting. And let's say the issue is not… let's say they're wrong. How do you handle that in this world in which you want to make sure that they're not afraid to raise their hand again?
Mark Mortensen 17:33
Totally fair. And I'm really glad you're saying that, because again, people say, “Look, but we can't grind everything to a halt,” and you have to try to deal with every one of these different situations. Again, what is important is that people feel heard, and that hearing is validated. Now, what that means is you need to have a way to explain to whoever is the person, whatever is the entity that's bringing it up: we've listened, we've heard it, we've evaluated it fairly, and we've reached this conclusion. Now, look, they may disagree with you. You may say, “We've looked, and this is actually okay; we've looked, and we don't think this is malfeasance, you know, bias, whatever it might be.” They need to feel that that is clear. One of the ways that you can move in that direction is to remember that the last thing you want to do is set policy when there's an issue on the table. So, one of the ways to deal with this better is to have the conversation before there's a potential breach or a concern, or somebody has brought something up. Have the conversation now and say, “Look, here are the steps that we will take. You know, if Toni brings something to the surface, we're going to do this, this, this, and this,” and you get everybody to agree: if we do these four steps, do we agree that that is good due process for how to handle these sorts of issues? Now, of course, if they say, “I refuse, you're wrong,” you can't please everybody. Not everybody will always be happy. But having that conversation up front is way, way better than waiting until you, Toni, raise something, and then we go through and at some point in the process decide, “You know what, it's actually not a concern.” You're never going to feel as validated. You get people to commit to the process beforehand, without knowing what the outcomes will be. Then, when they run through the process, it's a lot harder for them to push back beyond saying, “Did you do the process? You did? Then we're good.”
Toni Dechario 19:26
I think you've kind of answered this question already. I think I know what your answer is, I'll say, but I'm curious as to what your thoughts are on the kind of relative weight or importance of individual motivations versus group norms in decision-making.
Mark Mortensen 19:41
It's a tough one. In a certain sense, you're asking me to pit psychology versus sociology, right? You know, psychologists are gonna say, “Look, it's about the original wiring and that stuff.” Then the sociologists will say, “But look, the system affects how we behave.” And my perspective is, it is impossible to disentangle those. Go back to the examples I gave earlier on about motivations, right? What motivates you? What motivates me to do anything? Well, some of that is my own personal psychology, very much about personal motivations, what it means to be me. Where do you categorize family pressure? Well, my family wants certain things from me, and is that not personal? It's kind of personal, but it's kind of not; it's also my family, it comes from other sources. So, then we start getting a little bit in the middle. Then at the same time, there are broader societal pressures, and where I sit in the community, in my role. I think it is impossible to differentiate, and it is extremely idiosyncratic. There's a lot of research, for example, on identity: how do we think of ourselves? One of the very consistent findings is everybody has about as many identities as things that they look at in the day. I mean, you have so many. I have an identity of me as an academic, of me as a researcher, of me as a father, of me as a son, maybe as a sibling, or as whatever, all of these identities. And in any moment, one of them is triggered more or less than another one, and they're constantly moving around. That's why I think it's really hard when people say, “Well, which one is it?” It isn't one, it's the mix of all of them. There may be a time, and this comes back to what we're here to talk about. 
When we talk about ethics, when we talk about breaches, these sorts of things, at any point in time, you know, “Hey, COVID hit, and by the way, you're now feeling really, really at risk for your job, because the organization is going through a whole bunch of stuff.” Maybe you feel a whole lot more pressure to stand out and to excel, and that might be a really strong motivator to toe the line a bit more than you naturally would have. At a different point in time, you may be much more motivated by your own personal drivers: I'm passionate about this. We all ebb and flow over time; it's impossible to pull those pieces apart. But I think what is useful is to give a little bit of thought to what are the different things pulling at you at any point in time. And I think that's a really good exercise for anybody to do. You know, sit down with a piece of paper and ask yourself, “So, what's motivating me now?” It's kind of fun. You can do it now, and you can do it in six months. And you may not find the same list.
Toni Dechario 22:09
That actually reminds me of this famous study that was done with bankers, testing their ethics, basically. The first time they took the test, they weren't primed with anything; they were just asked to take the test as a human being, right? And then the next time they took the test, they were primed as bankers, and they were shown to be far less ethical the second time around. And so, I wonder what your perspective is on that. Is there something about financial services, about banking, that impacts the extent to which ethics are an issue in decision-making?
Mark Mortensen 22:50
You're trying to get me in trouble with everybody listening right now? This is not good. Do I think there's something intrinsic? I think that there are, look, there are structural components to this. You're dealing with an entire industry in which pretty much everything you do feels, in a lot of respects, very black and white. Look, they're numbers on a piece of paper: it's in one column, or it's in the other one. Which I think for a lot of people obscures how gray this stuff can be. And look, I have colleagues; at INSEAD, we have an accounting area, we have a finance area, and I have colleagues who have spent their entire careers researching nuances of things that are way beyond me and my comprehension. The point being that black and white, it's not so black and white. But I do think that there's a little bit of that perspective of, you know, “Oh, no, but it's clear.” It's not clear. And I think sometimes we underestimate how much gray area there is. On top of that, look, because of the success of the industry in terms of the financial rewards it has paid over the years, you have that Wolf of Wall Street, all of those sorts of examples. There is a rhetoric out there about the times of excess and the high flying. But in a certain respect, look, you also had some of those stories in other professions as well, back when doctors were making loads of money, with the doctors in the fast cars, in a way that never happened with professors. But you know, when you're in a profession that is viewed as a high-flying one, that others don't necessarily understand, with high barriers to entry to those high levels, all of those things can create an environment where there's also hyper-competition. You know, look, we've got league tables, we know exactly who's ahead of whom, and that does breed some more competition.
Now, does it breed more unethical behavior? I don't have the data. I mean, I'm an empiricist. You can also look at other things. I mean, you look at bad behavior among professional athletes. You look at bad behavior among anybody who is at the pinnacle of their career. I think what you see is, as you go higher in a given career, you have both more opportunities and also more pressure to continue to deliver and deliver and deliver. And I think that's what pushes more of the unethical behavior. I don't think that, in general, somebody who's a teller in a bank, helping people to get loans or do whatever else on a day-to-day basis, I don't think that they feel significantly more pressure to be unethical than does somebody at a comparable level in another role, another industry.
Toni Dechario 25:33
You've talked about pressure a lot. Can you describe a little bit more the relationship between pressure and decision-making?
Mark Mortensen 25:41
A number of years ago, we recognized that rational decision-making is a beautiful ideal. It's not a reality in any way, shape, or form. Now, in what ways are we irrational? So, all of Dan Ariely's work, Predictably Irrational, all that great work that shows that we have biases. So, all of our great ideas on how to make decisions were broken. We have psychology, and psychology doesn't fall into the nice buckets and always do the right thing. On top of that, you're always dealing with incomplete information, imperfect information, and you've again got the biases. And now you add in all the pressures and everything else; decision-making is nowhere near as nice and objective a category as we like to think it is. So, you know, in terms of what's the relationship between pressure and decision-making? Given that you've got a lot of different things in play, the more pressure you put on the system, the more likely it is that you end up taking a shortcut, either intentionally or otherwise, and you don't do the step-by-step testing of everything. And that's where we start to get to the suboptimal decisions. One of my favorite books: Pfeffer and Sutton wrote The Knowing-Doing Gap. And they're basically arguing, hyper-simplified, that most of the errors we make are errors we wouldn't make if we actually took the time and stopped and reflected and thought through this stuff.
Toni Dechario 27:01
So, my last question on this is: if you're an organization looking to improve decision-making, it sounds like you'd be well-served to figure out where there are points of high pressure, and where your staff is really feeling under pressure, for whatever reason. You know, it sounds like pressure in one area can translate into behaviors in a different area. How would you recommend figuring that out?
Mark Mortensen 27:30
Conversations. Now I'm betraying the organizational behavior piece, and I said to you, I'm an engineer, it's not all the warm and fuzzy stuff. I'm sorry, the best tool you still have is a conversation. It's not just, “Let's get the CEO to say, ‘Hey, everybody, tell us where you're under pressure.’” That doesn't work. This has to be at the level of the operating unit, right? The team level, where there are real relationships. If this is something that you're really thinking about, if you're worried about that pressure, managers need to have that conversation, say, “Look, let's talk about this.” And again, model the behavior you want to see. If you want people to honestly tell you about their stressors, their pressures, the things that are pushing them, especially if you think that those might be things that are pushing them towards behavior that they don't feel good about, you'd better be coming up with some examples of your own. That's a tough sell for a lot of leaders, to start off by saying, “So let me tell you about a time where I probably got a bit too close to the gray line.” Most leaders will say, “Absolutely not. No way. I'm not going there,” but if you really want people to be open and honest, you need to take that first step. You are in a position of power and safety relative to them. Use that for some good by saying, “Look, I'm going to reveal where it was tough for me.” And that's a first step, not the whole step, but a first step towards getting people to say, “You know what, I kind of had something similar, or here's what I'm wrestling with,” right? But again, it has to be real vulnerability, not the kind we use when we're applying for a job: “What's your biggest flaw?” “I'm too dedicated.” Like, come on. It's not that. It's real vulnerability, actually doing the stuff that matters.
Toni Dechario 29:08
So, let's get into your sweet spot. How would you say the pandemic has changed the way that people make decisions?
Mark Mortensen 29:17
There's been a lot of debate, honestly, going back and forth. A lot of people started saying, “Look, people are a lot more productive. I'm not distracted by all the stuff going on, that chatty person in the office who's always talking at me; I don't waste time with commuting; I don't, I don't, I don't. Look how much more productive I am.” At the same time, we have to be careful, right? There's productivity, and then there's collaboration, and it depends on what kind of productivity you're looking for. So, what I've been seeing: yes, I think people are more productive, at least in terms of all those hours put in. We also know, and we have lots and lots of data that shows, people everywhere across the world are working longer hours. They start earlier, they end later, that's more hours in the day, and they take less time for lunch. We also know, and we've known this since, oh, I don't know, the ’80s, that when people communicate via technology, they tend to be more task-focused. So, we're more scheduled, we're hyper-scheduled, we are more task-focused. It's not surprising that people say, “So, I'm more productive.” In a certain respect, we are. The problem is, we have to be careful of what we're losing. So, what if you're losing more collaborative work, work that you need to get other people's opinions on, right? Come back to the focus of our conversation: you're talking about ethics. Maybe you have an environment where people are able to produce more widgets, whatever that might be, you know, sell more accounts or settle more deals. They're able to deliver more, but they spend less time talking to other people about it. That could be a very big red flag for ethical breaches. “You know, it was really funny. I was talking with this guy about this deal that we're closing, and he said that you could actually write off a certain something, and I hadn't thought about it that way.”
And you chime in and say, “Whoa, no, no, I actually just learned about this, or I just talked about this. That's actually not okay. You can't write that off, because of whatever.” Those random communications may stop me from doing something that could get me in trouble, could get the firm in trouble, could whatever else. So, are you more productive? Yeah, you got more stuff done. But maybe you're missing out on the collaborative opportunities to catch stuff. The same thing goes, obviously, for innovation. Innovation is a team sport. We know this. The idea of Newton, you know, the apple on the head, the bolt of lightning, the eureka moment. Does it happen? Yeah. Does it happen often? No. Does it happen most of the time? No. Collaboration is typically the vehicle of innovation, where we talk about things. We use analogical learning: I take something I learned from over here, and I plug it in over there, and that fits with this piece, and now suddenly we've got a new solution. The more people are working remotely, the more cognizant we have to be of the stuff that we've lost, right? And I always tell people, “You have to pay for the stuff that used to come for free.” We need to stop and make sure we've done the shopping list, the inventory list, of the stuff we used to get, and then think about how we can put it back in. Because otherwise it really does start to affect the way in which we're reaching our decisions. Some of it may be positive; some of it may very much not be.
Toni Dechario 32:24
You basically get a higher quantity of decisions, but probably lower quality of decisions, even when it's not an ethical question.
Mark Mortensen 32:31
Let's be clear, I guarantee right now somebody is listening and going, “No, that's not true. I've cranked out lots of stuff, and I've been totally ethical.” Sure, you may have been. Again, remember, when we talk about ethics, when we talk about ethical breaches, in the same way as when we talk about innovations and other things, it's a numbers game. And you have to ask yourself, “What's the cost of a failure?” Just like a venture capital firm doesn't expect to get a unicorn every time they fund something; one unicorn funds all the other stuff. The same thing goes for ethical things. Just because you didn't have an ethical breach doesn't mean that you're not at risk, or that there isn't a problem. Maybe you got lucky. Maybe you didn't. Maybe you're totally fine. But it's a question of how conservative, how risk-averse do you really want to be? And then you have to balance that out.
Toni Dechario 33:16
So, what have we missed in terms of any recommendations that you have, for organizations that are looking to build in more ethical decision making?
Mark Mortensen 33:27
I come back to the same two things; I sound a bit like a broken record. I'm a huge fan of all of Amy Edmondson's work around psychological safety. And not just because she is an amazing scholar and a truly delightful person, but because she's absolutely right. One of the best ways that you can avoid breaches and, basically, failures in the decision process – whether they're ethical breaches or not; they can also be planes falling out of the sky, they can be surgical goofs, they can be, you know, whatever – the foundation of that is open conversation to surface when something goes wrong, right? It comes back to what I was saying before: step one, the issue has to be surfaced so that you know that there's something to talk about; then step two is, how do you deal with that? Psychological safety is the fundamental underpinning of that. It is an attribute of an organization, it's a cultural thing, it is something that you can build over time, but it's also something that is personally felt. You do or don't feel psychologically safe, and whether you do or don't, I have no control over that. I mean, I can influence it; I can do things to try to make you feel more psychologically safe. But look, if you, Toni, just don't feel safe psychologically around me, there's nothing I can do to change that. That is you who owns that. But we can create environments that, on average, raise the level of psychological safety, create an environment where people are more willing to bring these issues to bear, you know? So, what can organizations do? One, take that seriously and really invest in it. Invest in building a culture where people are talking.
What you want is a place in which people say, “Look, I'm willing to say, ‘I'm not sure if I just did something wrong; can somebody help me?’ Because I know that they're gonna say, ‘Wow, Mark, you totally screwed up, that was really not okay. We're not gonna judge you for it; we're going to try to help you fix it, and we're going to do something about it.’” So, I think psychological safety from a cultural standpoint… and then again, think about process. Remember: outsource what you're bad at. Think about putting in place the things, not in terms of monitoring, but in terms of helping people to be more honest and open and have those discussions. I think that's really at the crux of what is necessary.
Toni Dechario 35:35
You know, if I were the CEO of a relatively big organization, the reality is that you have subcultures, right? And some of them might be subcultures that are psychologically safe. I might feel safe in one environment with one set of people but not in a separate one, or that might be a subset of a larger environment in which I don't feel psychologically safe. How would you recommend organizations try to understand whether they have a psychologically safe environment?
Mark Mortensen 36:05
It is a process. Again, it is something that you create by modeling the right behavior, by showing exemplars, by keeping your eyes open and making sure that when something comes up, it is truly dealt with, and people feel like you actually take this seriously. But at the end of the day, it's a bit like parenting: you can do everything, you try your hardest to instill the right values and all this stuff, and they're still their own people. Sometimes they don't do exactly what you wanted them to do, frustrating as that might be, you know. And I think that that's the reality. I think leaders need to be thinking about this from the perspective of saying, “Look, I'm going to prioritize this. I'm going to send strong signals to the organization that I legitimately and sincerely think this is a priority, and I'm going to do everything I can to try to instill that. I'm also going to recognize that it might not work 100%. It might not work in certain pieces; there might be subcultures and everything else. I don't close my eyes. We say, ‘Look, we're going to create psychological safety to bring these issues to the surface. We're also going to keep our eyes open, to make sure that we're keeping an eye out.’ And if we see something that's going wrong, we say, ‘Hey, what's going on over there? Little funky thing in the gray over there; that doesn't feel right.’” And you know, again, as long as that's part of the contract upfront, employees can't really complain about you keeping your eyes open. You're looking after your own shop, and that's your job. The main thing is making sure that, again, those conversations, that contract, implicit and explicit, are upfront. Make sure that gets set and is clear before there's an issue on the table. Because as soon as there's an issue on the table, it pollutes the whole discussion, and you can't disentangle the two.
Toni Dechario 37:50
What resources would you recommend on these topics? If people want to dig further?
Mark Mortensen 37:55
I'm Amy's cheerleading section. The Fearless Organization is a fantastic book. Again, it gets people thinking about these sorts of issues. What I like is, again, it's important to be able to ground these in reality. It's not just an abstract thing. It is, “Let's actually see where this comes in over and over and over, and what is the concrete impact of this on people's real behavior?” Because that's really what matters. Culture doesn't matter for culture. Culture matters for the behavior that it drives.
Toni Dechario 38:22
Alright, my last question is kind of a fun one. It doesn't have to be about culture. What are you reading, watching, or listening to that you're really enjoying, that you want to tell the world about? Could be anything.
Mark Mortensen 38:34
I love Ted Lasso. It is a great show, for two reasons. I love it because it's really funny. But the nerd answer is, it’s all about working across cultures, which is something I happen to be passionate about. You know, you have a sports setting, but it's working across the US, the UK, etc. But it's also all about the power of positivity and kindness, which I think we need a lot more of. So, it's kind of a feel-good. It's silly. It's funny, it is really funny. That's kind of what's on my mind at the moment. I'm really enjoying it.
Toni Dechario 39:06
Thank you. Is there anything we didn't cover Mark that you want to make sure to include?
Mark Mortensen 39:12
No, I think this was a great conversation. I had a lot of fun. I encourage people just to think about this stuff. In a weird way, it's not rocket science; it's just something that we don't often give the time it needs to actually reflect on. We have, as a society and an industry, a huge action bias towards, “We've got to get stuff done, things have to be happening.” A little bit more thought wouldn't always be the worst thing. Huge grain of salt: I'm an academic; it's what I do all the time, I sit and nerd out and think about things. But a little bit of a pause, you know, the 80/20 rule, I think can give a lot of benefit. But this is what we also know about behavior change: behavior change, cultural change, neither one is achieved through big actions of, “I will now be this going forward.” No, you won't; you were something else up until now for a lot of good reasons. Behavior change is accomplished through small nudges, through small shifts, repeated over and over and over. We're creatures of habit. Sometimes you just need to force yourself into that new habit enough that eventually it becomes the way you think. That's really the game that we're in right now.
Toni Dechario 40:16
Thanks so much, Mark. It was really good to see you again.
Mark Mortensen 40:19
Great to see you, too. Thanks so much for having me.
Toni Dechario 40:22
If you want to hear more from Mark, watch the New York Fed’s webinar on “Culture in a Post-Pandemic Workplace,” which you'll find at newyorkfed.org/governance-and-culture-reform.
Mikael Down 00:01
We say culture is important, culture matters. Well, how?
Zab Johnson 00:05
We're powered by our brains. We're not always rational. We are influenced by others.
David Grosse 00:10
I think I'd start with looking at what's called the “Better-than-average Effect.”
Holly Ridings 00:16
So, one of the things we look for in picking a flight director is what we call command presence, right? You kind of just look at people like, “You better get it together,” you know, “We’ve got stuff to do,” right?
Mark Roe 00:25
Good practice would be for a leader to say, “Well, this is a mistake I made recently, and this is what I did about it.”
Betsy Paluck 00:30
I can't emphasize enough the disempowerment of the lonely decision maker.
Taya Cohen 00:35
So, the opposite of what we think of as an ethical decision frame, we can think of as a game frame.
Mark Mortensen 00:40
It doesn't make you unethical to wrestle with this. The smart money is we help you to try to figure this out.
JB 00:48
This is “Bank Notes: Banking Culture Reform.” The views expressed in this podcast do not represent those of the New York Fed, or the Federal Reserve System.
Toni Dechario 00:57
Hi, and welcome to the Banking Culture Reform podcast, part of the New York Fed’s initiative to drive awareness and change in financial services culture. This series, “Norms, Mindsets, and Decision-Making,” explores questions like, “Why do ethical people do unethical things? How can organizations encourage staff to speak up and executives to listen up? And what role does context play in shaping people's behavior?” My name is Toni Dechario, and I'm a member of the New York Fed’s culture team. In this episode, we'll hear from Holly Ridings. Holly is the chief flight director at NASA. NASA has spent more than six decades thinking about how culture connects to outcomes. We first met Holly when she joined our web panel on “Trust and Decision-making.” In this episode, she'll talk us through how NASA teaches directors of human spaceflight to develop their command presence and remain laser-focused on the ultimate goal of getting astronauts home safely. Holly, welcome, and thanks for joining us.
Holly Ridings 01:50
Thanks so much for having me today. It'll be an exciting conversation, I think.
Toni Dechario 01:53
So, we want to kind of start with the basics of why we're talking to you today. What got you interested in culture? And how is it relevant to your job?
Holly Ridings 02:02
If you think about human spaceflight, and where we started – so the beginning of NASA, back in the ’60s – we have had a culture of excellence and leadership, trying to do really hard things since the very beginning. When I came to work here in the mid-’90s, there was a tremendous culture already established at NASA, and specifically in the operations community. We have foundations of flight operations that have been passed down for generations. Our roots are in military test pilots, with our very first astronauts. And so, you're surrounded by culture and indoctrinated, in a really positive way, to just be part of this amazing team from the very beginning. I've been a team person since I was young. That's always resonated with me; it's part of my DNA. And so, you take that sort of natural tendency and then the NASA environment, and as I've moved through the leadership ranks, I’ve really gotten more specifically interested in, you know, what does it mean? How does it grow? How do we teach it? What is its value? And also, when does it need to evolve and change? You can't be a victim of your culture as the world changes around you.
Toni Dechario 03:22
That's a great point. And one of the specific aspects of culture that we're interested in exploring through this series is decision-making. And so, I want to dive a little deeper into why decision-making is so important to you, and what it is that you've seen over time that's caused you to focus on decision-making as much as you have.
Holly Ridings 03:46
Okay, so for decision-making in human spaceflight, there are a couple of really important aspects, right? The first one, which we teach everyone who comes through the door, is that you have to lead your leader. And so, we really empower everyone to be a part of the solution. It's not a hierarchy where the person at the top tells you the answer and you go implement. From the very first day, there's this idea that you are valued for your thoughts, for your capabilities, for your technical knowledge, for your specific view of the world. So, lead your leader. It's a very specific thing that we teach. One of the other aspects of decision-making that we teach is, it's okay to not know the answer. So, when you do human spaceflight, you can find yourself in very stressful, very critical situations where you absolutely need to know what to do. But sometimes in that same situation, the information you're looking at doesn't make a picture, and you don't quite know. Now, you might still have to make a decision, but you have to communicate your confidence level and the information that you're looking at, and it's a really hard skill to teach. Because you tell people, you have to absolutely be able to make a decision, but at the same time, you have to admit you don't know the answer. So, everyone in the room is contributing, and you have to be both confident and humble at the same time.
Toni Dechario 05:04
One of the things you mentioned is pressure, and part of what we're interested in with respect to decision-making, for financial services at least, is that there can be a great deal of pressure on bankers with respect to the bottom line. For flight directors, the pressure is, well, definitely even more extreme, right? You're dealing with people's lives and their safety. How do you manage that pressure?
Holly Ridings 05:27
I think you have to teach flight directors – and really, operators, flight controllers – to narrow the focus. What is the most important thing? And what is the next important thing? So, the most important thing is always the safety of your crew. Safety of your crew: are we keeping them safe? Are we giving ourselves an opportunity to get them home, if that's the situation you're in? And so, the safety of the crew, that's always kind of your very top-of-the-food-chain thought. Anything you do gets evaluated against safety of the crew and safety of the vehicle. The next piece is what we call mission success. We want people in space to accomplish things, right? International Space Station, science, Artemis to get to the moon. So, there's a big, bright, shiny goal. That's why we take on all this risk to get people into space. And the third piece, beyond the safety of the crew and the vehicles that they fly in, is the success of the mission. All things are evaluated against those priorities, and it kind of narrows your focus. The more complicated piece is the risk of each of those potential paths, and we talk about risk trades. Evaluating risk, or risk assessment or risk management – obviously, risk is just the probability of success versus the probability of failure, right? And so, that skill takes a lot of time to learn, when you have a bunch of information in front of you. After a while, you can start to see, okay, well, I go ten steps down that path over here – it's a little bit like playing chess, right? You go ten steps down that way, and, “No, okay, somebody's gonna take my queen; maybe not the best path. I go ten steps that way, and then maybe I checkmate them; that's maybe a better path.” So, your brain tends to think like that in terms of the next steps down any given solution path, toward the best solution to keep the crew safe and get the mission done.
Toni Dechario 07:22
I'm thinking about the dual goals of safety and mission, and wondering – given that the mission is very exciting, and very shiny, as you described it – how do you ensure balance? How do you make sure that the safety component is as important as the mission component, despite the fact that the mission is much more shiny?
Holly Ridings 07:46
So, we have been taught throughout the process to weigh, as you said, both sides. But more importantly, we're talking here about flight operations. NASA is a much larger group of people who all, again, have the same goals that I just articulated. We have a safety community, a safety team. We have an engineering community and engineering team. And NASA overall has a culture of very methodically weighing the different options, and each of those groups of people gets to provide their assessment. Now, typically, that process runs before we launch. We have readiness reviews. If you were going to launch a product, you'd have a readiness review. I mean, we're launching people. So, obviously, a little bit higher risk than a product, but the thought process is the same. You go through and say, “Hey, are we really ready to do this? Do we think it's going to succeed? Do we think it's going to fail?” And so, all of the NASA entities weigh in with their assessment and their judgment based on their expertise. It's a really robust process that we call a certification of flight readiness. Now, when you're talking about sitting in mission control, and something happens and you’ve got to make a decision in the next five minutes, yes, a lot of that is delegated to the operations team. But that same methodical process, although it runs a little bit faster and maybe with fewer people, still exists. And you are always weighing the safety of what you're about to do against the success of the mission that you're going to accomplish. We talk about flying safely and successfully, right? You can't divorce one from the other. The safest thing to do is stay on the ground, right, instead of putting people into space. So, you know at the beginning you are accepting some amount of risk. There's no baseline of zero risk; you are accepting some amount of risk. And then the challenge is to not push too hard towards the goals that you mentioned in terms of the mission.
Toni Dechario 09:45
One of the other questions that I was curious about: we had a webinar recently on purpose and the relationship between purpose and culture. If ever there were a mission-driven organization, it's NASA. Your purpose is really clear. And so, how do you think having that sense of shared mission and shared purpose – and by mission here, I don't mean Artemis, I mean purpose – how do you think that has impacted individual decision-making at the organization, and among the flight directors specifically?
Holly Ridings 10:15
I think the impact is huge. I mean, you really almost can't put a number or a value on it, when you have a group of people who have a shared purpose, right? I mean, fly humans in space, make progress for the human race – that is an amazing thing to spend your life on. You feel very relevant to the world. Not just to NASA, but certainly to the world, and you feel like you have this responsibility to make it work. We don't want to go back in the opposite direction. We want to keep making progress forward. And so, I think it creates this shared purpose, this sense of team, that is not just NASA, but across all of our international partners and our commercial partners, where everyone really wants to do the right thing. Now, people have different opinions about what the right thing is. You may look at something and have a technical opinion different from the person sitting beside you. You may even have a culturally different opinion about priorities than the person sitting beside you. But at the end of the day, everyone wants people to fly safely in space, and so it creates this team feeling that really can't be, almost can't be, described. A lot like if you've ever played sports, and you go and win the championship, and you have that camaraderie. And certainly everybody has tough days, but at the end of the day, we're all here trying to do the right thing for the human race. NASA, the partners, and all of our providers have made progress for human spaceflight, and right now it's just growing exponentially. The barrier to entry for human spaceflight is lowering. There are more people involved. There's excitement about it. We're going to go back to the moon, and then sustainably on to Mars, and there's so much energy. So, I'm not too sure I'm answering your question here, other than to just tell you, there's a lot of energy, and it's really exciting.
Toni Dechario 12:07
Okay, so despite all of that, there are times though when bad decisions get made. And so, do you have any sense of when bad decisions are made, why bad decisions have been made? And is there a way to influence that or to change that?
Holly Ridings 12:20
You know, we have a great culture around evaluating our decision-making. Our tagline, our name for it, is "a lessons learned process." And a lot of people do this. The military has a hot wash. You do debriefs. What went right? What went wrong? And I'd say we actually tend to focus on what went wrong more than what went right. We have to remember to do the cheery parts. We tend to be a little bit hard on ourselves, but we need to be, right? We need to be harder on ourselves than anyone else is on us. So why do bad decisions get made? Some of them are really complicated. The space station, as an example, is a very complex piece of machinery. It's built by lots of different people. It has lots of different software, lots of different hardware. To seamlessly integrate all of that information so that it is available, so that everyone remembers, so that you understand what the space station might do in any given situation – that is challenging over time, right? And you probably see this in your systems, right? Somebody built a system ten years ago, and that person's gone, and then it gives you an error, and people are looking at that error like, "I don't know what this means," right? So, we do spend a tremendous amount of effort trying to keep all of that information current and documented. But you know, occasionally that doesn't work out and you learn something new. And so, we call those features, right? Where you learn a new feature, and then you take that lesson and go figure out what you need to do with it: document it, record it, fix it. From a people standpoint, right? Bad decisions get made sometimes from lack of training, where we feel like we've communicated the important skills, and we've evaluated them, but we were just maybe a little off the mark. And so, you go back and you look at your training program and see what you need to adjust. This is particularly challenging with all of the development that's going on in human spaceflight right now.
Because you never know what you need to know until you've done it, right? And so, you try to apply all of your lessons, again, that you've learned and your history to this next problem. But fundamentally, we haven't solved that problem yet. And so, you try to get it 100% perfect, but then you fly a brand new vehicle in space and you're going to learn things. So, you talked about mistakes. I categorize a mistake as: we don't get our crew home safe. And luckily, in human spaceflight at NASA, we have not lived that mistake very much. We have a couple of times, and there are books and reports and classes about it. It's a tremendous part of our culture, because that's really the mistake. Everything else to us is just an opportunity to learn. So, my threshold of pain is pretty high, right? If a human isn't in danger, then we have time, we will solve the problem. I tend to stay pretty calm. But when you're talking about the mistakes, Columbia and Challenger, we have that as part of our training, to go back and really look at that thread, and what happened. And if you read those reports, there are communication errors, there are technical information and documentation errors, there are training errors, all of those. Your system is usually robust enough to handle a couple of errors before it turns into this catastrophic mistake, but your system's not usually robust enough to handle multiple of those stacked on top of each other. So, you're trying to build some tolerance and some margin with all the work you do. And then guard against that day where they all stack up, which of course has unfortunately happened to us a couple of times. That's how I kind of see mistakes. It's an interesting question.
Toni Dechario 15:53
And do you conduct a “lessons learned” discussion? At what points do you conduct those discussions? How do you think about when it makes sense to have those conversations?
Holly Ridings 16:03
Depending on the length of the mission or the style of it, we do it periodically while the mission is in progress. So as an example, our crews live on the space station for six months. We call that an increment. And in the beginning, we would get them home and we'd do a bunch of debriefs, where they literally go sit in a room and talk to the people on the ground. And we realized after a while that that's not the best way to capture the information, because the thing that we're talking about might have happened five months ago. And so now, for the crew onboard the space station in that six-month period, we actually debrief after big – we call them dynamic operations. So, if we bring a new vehicle to the space station and they're involved in the docking or the berthing, or if we do a lot of maintenance where we've got to take something apart and fix it, or a new science experiment, we'll have a debrief with them, even on orbit, and are able to capture the lessons learned when they're much fresher. So that's one example where we've systematically looked at inflection points where you might gather a lot of data, where you need to have a discussion, and learned how to capture that better.
Toni Dechario 17:11
What do you think the cultural impact of having these regular debriefs and these regular lessons learned sessions has been?
Holly Ridings 17:19
I actually think it's been really important. And it's interesting, because when you first come here, you hear about lessons learned, and what that looks like to you is, again, at the end of this project, this mission you've been working on, you show up with your PowerPoint slides, and you talk to the flight director who's in charge or someone else. But because of the continuous operations of the space station, which I was just describing, where we have these inflection points, we now – I now, even personally – view it as a more continuous process than when I first started. And so very fluid, in my opinion, in a positive way, versus the static sort of milestone thing that you check off at some point. And even now, it's really funny, right? So, you'll take a bunch of folks from flight operations – when we were traveling more; we haven't as much with COVID – but when we would travel and be somewhere together for a launch or a meeting, going to see our international partners, and we're sitting in different parts of the room, we would actually naturally group together. A little bit of that can be, you know, "Hey, we're there, and we're a team and we know each other," but over time it really turned into this real-time lessons learned debrief process. I can remember a couple years ago, we went to, like, a women's conference in Houston, drove up to town, and there were maybe half a dozen of us sitting in different places. We didn't know anyone else in the room, and at the end, the room filters out and we're all standing in the middle of the room, like, the six of us, you know, comparing notes like, "What did we learn? What did we hear the speaker say? You know, did we think this was of value? Should we come back? Would we like to invite any of them to come and talk to us?" And so, then we kind of, like, realized, "Oh, wait, we're doing our quick-response, you know, lessons learned debrief process." And so now I notice that we do it everywhere.
Toni Dechario 19:04
You have a flight director voice? Am I hearing your flight director voice right now? Or is it different from what I'm hearing?
Holly Ridings 19:08
Yeah, different, different, not as cheery. It's very, like, serious. So, one of the things we look for in picking flight directors is what we call command presence, right? You kind of just look at people like, "You better get it together," you know, "We've got stuff to do," right? And so, we call that command presence, where you can be perfectly having a sort of normal conversation like we are, and if my phone went off and it was the control center and something was going on, then it's like, boom. You're just, you're in the zone.
Toni Dechario 19:36
But I wonder whether being kind of in that frame of mind influences your behavior in one way or the other, if you think it does?
Holly Ridings 19:46
We're talking about culture, right? So, one of the great things about human spaceflight is, I think, that you don't often run into inconsistencies between, you know, your sort of personal belief and what you do at work, right? Again, we're all here. We're trying to make progress for the human race and get back to the moon and on to Mars and keep our crew safe with the space station. So, yeah, to me, I think of it more as a tool set, right? Where even with friends, you know, some friends, you may act a little bit differently. If they're friends that you've known since you were little, you have a different personality and way that you communicate with them versus someone that you just met last week, right? You're talking to them with a different style. So, for me, it's more, you're using the tool that's appropriate for the moment. Flight directors in general have to have a lot of skills. You have to be a team builder, you have to be very technically competent, you've got to be pretty adaptable with all the things that are going on in human spaceflight. And you also are responsible for the lives of humans. And so, when you get in a mode where you're weighing the risks for people, it requires a certain amount of gravity that is applied to the situation. And I think then that starts to, a little bit, feed back into your personal life. We don't have a tremendous amount of separation between work lives and personal lives. I mean, people work nights, weekends, holidays in the control center. Your family, like it or not, is sort of involved in your job, right? You don't go home and never talk about it. The phone rings in the middle of the night, and you get up and leave, and you come to work, because that's what we do. We come when we're called, and we're responsible for keeping everyone safe. So, I actually think about it as different tools for different situations.
And I just have to remember, my nine-year-old might not need the flight director tool as an example.
Toni Dechario 21:42
But it sounds to me like your tools are almost like a framing, kind of a shortcut into, like, "this is the dead-serious me," and you can get there quickly through practice.
Holly Ridings 21:54
Yeah, you can cut out all the clutter, right? If you watch a flight control team, you can tell the difference, right? If they're working a really hard problem, or if the crew has the day off and people are a little relaxed. And my husband can tell – if I answer the phone, he can tell just by my face and the tone of my voice whether, you know, something's wrong or whether it's… because you're just necking down your focus. And so, you're kind of applying all your brainpower to really listen to the problem, think about what's going on; maybe you're considering resources you need, how we're going to go make a decision. I mostly think about it as, I block out all the clutter and I just focus. There have been scenarios in the last many years where the phone will ring and I'll tell my husband, "Something's going on," and I'll be like, "I'm going to work, I don't know when I'll be back." And I literally just disappear into the control center. And then he runs everything at home until I show up again. So, it says so much about, you know, total focus for the safety of the crew and the mission. It doesn't require that extreme all the time. But the term I used earlier was command presence.
Toni Dechario 23:04
It sounds exhausting. In thinking about the financial services sector, the kind of parallel that I would make would be these interns that work 100-hour, 110-hour weeks and can get exhausted. How do you ensure that you and your team are still kind of making good decisions after hours and hours and hours of this kind of really focused, intense work?
Holly Ridings 23:30
There are two kinds of work that I think we have to keep an eye on, right? So, one of them is what you mentioned: just the hours. Like, just how many hours have you been putting in? And, again, the space station is 24/7 operations, right? So, 24 hours a day, seven days a week, 365 days a year, we have a team in mission control responsible for the space station. And so, that's a lot of hours and a lot of shifts, you know, that the team has to do. And, again, we will fly the space station for 20-plus years. We have metrics, just like every good organization. How many shifts are people doing? How many nights? How many weekends? And now we can kind of predict when it's too much, right? So, in the beginning, you know, people would do too much and you'd need to get them to back off. Now, we can predict a little bit. So, we have numbers that we watch, and we try to make sure people don't get close to those. You know, you will end up working lots of hours if you have a specific tactical assignment, like, "You're responsible for the next crewed launches to the space station? Yeah, that's a lot of hours," right? But we try really hard then on the backside to give people a break, let them take vacation. So, just like any job, it has a cadence to it. And as a leader, I think you do have to really pay attention to the human aspect of that cadence and the work. If you just pay attention to the technical work, you're going to end up burning your good people out, right? Because they'll just take on more and more and more. And so, one of the first questions I talk to people about whenever it's, "Hey, we have a task," is, you know, "Just tell me what your world looks like." In the pandemic, it's been even more important to have that communication, because you're not seeing it every day. You're often talking to them a lot on the phone, and you can't see their faces. And even the video is not as good. And so, we're kind of doing regular check-ins on how they're doing.
The other part of that is really just the load you carry around all the time. It is a struggle to get people to completely unplug. The pandemic has actually made that maybe a little bit worse, but better in some ways. But let's tackle the worst first, which is that you can now, kind of, work remotely, at least when you're not having to physically be in Mission Control, right? One aspect of your job, you know, is that you run meetings and you get ready for the next mission. And some of that can be done remotely. And so everybody is, "Okay, well, I'm gonna go check on my parents I haven't seen in a year and a half, but I can keep working," versus, "I actually am going to unplug and put the phone down." So, I think that's a little bit where our struggle is right now: the baseline load that you carry around with you all the time, and encouraging people and finding ways for them to completely turn it off.
Toni Dechario 26:06
I could do like three hours of this with you, Holly. But I'm gonna move to a different topic. This question is about speaking up. So, what do you think makes people more or less willing to raise a concern?
Holly Ridings 26:20
I think a lot of it stems from leadership. I think that, one, you've got to be verbal, vocal about it – like, actually say it out loud: "I want to hear your opinion, it's okay for you to speak up." You have to give them permission to participate. But I also think, then, you have to show it in action as well. I mean, that's pretty basic advice. But I think that, you know, as you become a leader, your time is really valuable, you can be short with people, you can cut them off. And some of that is sort of time management and teaching them, you know, how to communicate, and skills that they need. But if you overdo it, then you're shutting down conversation and not allowing them to participate in it. I have new flight directors, and some of the things they want to talk about maybe don't make my list of top priorities, because we've already solved that problem ten times – but they haven't solved that problem ten times. And so, I have to continuously remind myself to, you know, not talk over people, to give them a little more airtime than maybe I think they need, because my brain's already moved on to solving the next problem. And to me, as you move up in leadership, you sort of get more clipped and more short and, you know, more focused, just because your time is so valuable. But then you have to remember, everybody's time is valuable, and they're at a different place in their journey. And part of your responsibility is to develop them and to grow them. So, that's maybe where I'm at in my personal growth: trying to remember that part and have a little patience and give a little more time and space, which then encourages people to speak up. Because if you don't do that, even if you tell them, over time they'll realize, "Well, she says she wants to hear my opinion, but every time I mention something, after three words she says, 'Oh, no, no, no, I understand already,'" and then they won't talk to you anymore, right?
Toni Dechario 28:12
Your flight directors also need to be able to do this, right? I mean, part of their job is to be able to run a team that's putting people in space, and the people that you're managing need to be able to hear the people that are part of that team. How do you help them do that?
Holly Ridings 28:30
So, it's interesting, because the communication is very different in mission control, you know, on console, so we do spend a lot of time training that communication to be efficient, to be crisp. Again, if someone does cut you off in the interest of "we're trying to solve a problem on the space station," you have to have kind of a thick skin in that room, because you really have to stay focused on what's going on. And even if it's just a normal crew day and they're just trying to get through the timeline, you've got many, many crew members, right? Seven at the moment, trying to get through their day, and everybody's got science and maintenance and experiments. And so you just don't have a lot of time for chitchat. But then there's also a team-building aspect of it. So, the first thing you have to get them to understand is that there's different communication that's acceptable in different situations. And once they understand that, then we tend to initially focus on going to mission control and how to have that focused, calm command presence that invites communication but also keeps everyone focused. People tend to maybe have a more conversational style off console already, so that's not as much of a ramp-up in terms of skills, but it can go really, really fast. And so, we practice a lot, right? We do simulations, where we practice a lot, and one of the tenets of our training is communication, and you are evaluated on it very specifically and receive a lot of feedback. And you will not get certified to work in Mission Control if your communication is not good. It doesn't matter how technically competent you are. So, it is a vital component.
Toni Dechario 30:08
What can banking organizations learn from NASA's experience in observing decision-making? Based on your experience, what recommendations would you offer to organizations that are seeking to reinforce better decision-making?
Holly Ridings 30:19
I think that NASA is really good at soliciting a wide spectrum of ideas and inputs, but then also good at driving to a decision, right? So, I think organizations can learn to a) open the aperture, but b) not be afraid to, you know, neck it down and focus and go, right? People tend to want to spend too much time evaluating, you know, to have the perfect solution. And sometimes you just need to try something. It needs to be safe, right? Certainly in our case. But sometimes you need to just try something, and be okay if it doesn't work, and try the next thing.
Toni Dechario 30:53
I can't imagine what your time management looks like. You also obviously spend a lot of time thinking deeply about these questions. When do you do that?
Holly Ridings 31:02
Let's start with time management. Time management is certainly an important skill for everyone. I think, as a flight director, or the chief flight director, it might be the most important skill. You're jumping between all the NASA programs – again, ISS and Artemis and all of the things we do there – and then trying to keep the team running and spend time. So, it's a little bit of a battle on a daily basis to make sure that you get everything covered. For me, there's a category of things that are associated with the missions we're doing, right? So, real time: we launched a cargo mission yesterday and it'll dock tomorrow morning. So, you've got that as your highest priority. And then, on the opposite end of the spectrum, we kind of have the strategy for the future. So, I kind of use this suitcase model in my head, where the tactical, highest-priority safety things are covered, and I'm making sure those are in good shape. And then I try to spend some bandwidth on the strategy as a leader in the organization. You know, it's a challenge. NASA is really busy. Human spaceflight is really busy. The first ten years or so of my career, we had two vehicles, and we were building the space station and the space shuttle. We were just bringing on the international partners. And now we have, you know, half a dozen or more vehicles, all the providers, and of course all the international partners. So, it has gone from just a couple of interfaces to this huge labyrinth. Every morning, I feel like I wake up and the starting gun goes off and it's just, "Go, go, go, go, go," until I get home and, you know, walk through the door and have to do the things at home that I need to. You have to learn how to protect yourself, right? So, one of the things I tell the flight directors is that no one will protect you but yourself. No one knows what your limits are. No one knows when you need to recover. No one knows what your recovery strategy is.
And you need one, and you need to execute it. Because otherwise any job – and especially this one – will just burn you out, even if your leaders have the best intentions. If you don't have some boundaries and enforce them for yourself personally, you will burn out. There's just so much going on, and people are excited about it. So, no one can help you but yourself. You have to be an advocate for yourself. I probably did not learn that until midway through my career, and I have been through the process of being, you know, burned out a couple of times. And so, I know what it feels like. So, that's one of the things I really try to encourage and teach people. You asked me what I do to think about this? I will tell you, I actually talk to people, right? Because I learn the most – certainly right now – from answering your questions, you know, the ones I haven't thought about. I'm like, "Huh." And for the ones I have thought about, hopefully someone else can find a nugget that helps them think about the challenges they're having. So, I'd say right now, I'm probably finding great people and talking to them, and that really helps me. We're about to go back to schools, but I really try hard to talk to folks. I've gotten a little bit into podcasts. I had not really been into it until recently, and a couple folks had asked me to do them, and then I'm trying to figure out the value of a podcast, and how do you communicate in a way that is valuable and useful to the people that listen? So, I do spend a lot of time thinking about it. Usually it's when I'm exercising, actually; that's like my free brain time. It just spins in circles and has these thoughts. And then I have a couple of really good friends and associates who I'll call, and I'll say, like, "You know, this is bothering me, how do you see it? What do you think?" So, right now I'm just using conversation and relationships more than written documentation.
Toni Dechario 34:40
A lot of at least early astronauts came from the military. And a lot of people at NASA have come from the military, and the military is pretty famous for their culture, you know, and thoughtful about – different branches of the military obviously are different, but – about how they form their culture. How much of the military culture has NASA kind of inherited? And what parts of it kind of work for you? And what parts of it have you had to retool?
Holly Ridings 35:07
I certainly think that we in flight operations – and, I think, NASA overall – inherited a lot of the discipline, right? Really be disciplined about your approach, be thoughtful: you know, what are your goals? What are the resources you have? How long do you have to make a decision? And then, as we talked about earlier, sort of the pros and cons – we'd say the risk trade – of that decision. So, I feel like our military roots gave us a lot of cultural and organizational and structural discipline. It also gave us a lot of focus on technical competence, right? You've got to get it right. It's got to be, you know, good. It's got to protect people, they've got to be safe, you've got to get them home. And so, that military mentality – and I've not been in the military, right? So, this is just me observing many of the people, like you said, the astronauts and a lot of our other community, who came from the military side. They do a lot of debriefs, they do a lot of lessons learned. We are a civilian organization, right? Our goal is a positive one for humanity. We have a lot of relationships and partnerships. And so, I do think we diverge from the military in our overarching goal and culture, in how we're able to present ourselves in the world. I mean, we have international partners all over the world. And so, depending on what year it has been in the last, you know, 60-70-year history of NASA, we may or may not have been, "You're gonna fly in space together, we're partners, we're friends." So, I do think, geopolitically, you know, we have just an amazingly positive goal that everyone resonates with. And so that is one of the things that I think is a real positive for NASA in the world.
Toni Dechario 36:59
That's a really great point. What happens to individuals at NASA who operate outside of normative expectations? Are they coached or disciplined or managed out? Or maybe this never happens, because they're vetted so thoroughly before they join?
Holly Ridings 37:11
I would say that the short answer is: all of the above, right? So, like any organization, there is a process where, if you break the rules enough – every organization has rules, and you guys do as well – you could definitely get managed out, as you put it. We have training every year: ethics, cybersecurity, some of the big things in the world that every employee of every organization needs to know their organization's stance on, right? So, we do a pretty thorough job of continuous education. And that's not just operations. I mean, that's everyone at NASA who has to do all of those things. So, there's some baseline education and expectations for any NASA employee, any government employee, and then also all of our contractors who work for us. If you're talking about operations expectations, there's the "might have showed up late for shift" example, right? That's kind of a small infraction, but one that we don't tolerate: it shows you're not prepared, it shows you're not taking this job seriously with the responsibility that you have. So, those things are much more in the coached area, right? If you have new people coming in, a lot of times the structure of this job – you show up on time, and you leave, and there's a dress code – is maybe not something they have experienced before, if they're coming in straight out of college. So, there's a lot of coaching. And then obviously, if there are multiple infractions, there's discipline, up to taking them off console, out of Mission Control, if they're not able to represent NASA in a way that we believe is required. So, that's kind of the spectrum. For flight directors, there is a lot of vetting beforehand, and so, luckily, I probably have the best supervisory job in the world, because I have this elite team of high performers. And so, we don't run into much of that.
We've got quite a bit of knowledge about them before they move into this position, because they do have to be the representatives, right? I mean, everyone's looking at them. And if you don't have it together, well, then obviously it's maybe not as important to have it together. And then the whole thing kind of breaks down.
Toni Dechario 39:28
Yeah, it does. Holly, it's such an inspiration to speak with you. Thanks so much for doing this. I really appreciate it.
Holly Ridings 39:35
Yeah, you guys too. Maybe we'll get to travel again. We can meet in person. We've done this twice now.
Toni Dechario 39:39
Thank you, Holly. If you want to hear more from Holly, watch the New York Fed’s webinar on “Trust and Decision-making,” which you'll find at newyorkfed.org/governance-and-culture-reform.
Mikael Down 00:01
We say culture is important, culture matters – well, how?
Zab Johnson 00:05
We're powered by our brains. We're not always rational. We are influenced by others.
David Grosse 00:10
I think I'd start with looking at the what's called the “Better-than-average Effect.”
Holly Ridings 00:16
So, one of the things we look for in picking flight directors is what we call command presence, right? You kind of just look at people like, "You better get it together," you know, "We've got stuff to do," right?
Mark Roe 00:25
Good practice would be for a leader to say, “Well, this is a mistake I made recently, and this is what I did about it.”
Betsy Paluck 00:30
I can't emphasize enough the disempowerment of the lonely decision maker.
Taya Cohen 00:35
So, the opposite of what we think of as an ethical decision frame, we can think of as a game frame.
Mark Mortensen 00:40
It doesn't make you unethical to wrestle with this. The smart money is we help you to try to figure this out.
JB 00:48
This is “Bank Notes: Banking Culture Reform.” The views expressed in this podcast do not represent those of the New York Fed, or the Federal Reserve System.
Toni Dechario 00:57
Hi, and welcome to the Banking Culture Reform podcast, part of the New York Fed's initiative to drive awareness and change in financial services culture. This series, "Norms, Mindsets, and Decision-Making," explores questions like, "Why do ethical people do unethical things? How can organizations encourage staff to speak up and executives to listen up? And what role does context play in shaping people's behavior?" My name is Toni Dechario, and I'm a member of the New York Fed's culture team. Today I'm speaking with Elizabeth "Zab" Johnson. Zab's a neuroscientist and the Executive Director of the Wharton Neuroscience Initiative at the University of Pennsylvania, where she harnesses insights from brain science to help understand and improve business outcomes. Zab was a panelist in our webinar on "Trust and Decision-making." In this episode, Zab helps us to better understand the neuroscience behind the behaviors and decisions that we see, and shares how practices like perspective-taking can help build stronger cultures. So welcome, Zab, we're excited to get to speak with you again.
Zab Johnson 01:55
Thanks, Toni, it's really great to be here.
Toni Dechario 01:57
So, what brought you to the Neuroscience Initiative? What interested you in thinking about the intersection between neuroscience and business outcomes?
Zab Johnson 02:08
I've spent a lot of my years as a neuroscientist actually thinking from the 20,000-foot view about how applications from neuroscience can be moved more into the real world, to address challenges that are cropping up in individual lives, in our organizations, and in our societies at large. For many years, I led the Duke Institute for Brain Sciences together with my colleague, Michael Platt. And when he moved to the University of Pennsylvania, there was this incredible opportunity to sort of shift our direction and really make an impact within the business community. This was a really interesting new dynamic: to take neuroscience into the real world, not just within the sphere of academia, but also to interact with those who are out there at the frontlines of trying to figure out challenges to our work and to our thinking.
Toni Dechario 03:12
Before we get into discussing decision-making and the linkages between our brains and decision-making, I wanted to ask you quickly about something that you introduced me to, which is neuro-economics, kind of bringing together learnings from neuroscience and economics, which are very different disciplines and very different ways of thinking about the world. And if you could just give us kind of a quick introduction to what that is and why it exists.
Zab Johnson 03:43
The first papers in academic work came out in probably the late 1990s and early 2000s, so it depends on your perspective of time, on recency, but neuro-economics was really this sort of burgeoning new interdisciplinary approach: combining some of the more theoretical and behavioral approaches from economics with the world of neuroscience, to think about the mechanisms by which we do things like calculate value, and how decisions are made under risk and uncertainty, but also the brain mechanisms that are at play in determining outcomes. So not just taking behavioral and theoretical approaches, but actually looking underneath the hood at the data that the brain produces, in order to get better predictions of how those things are calculated.
Toni Dechario 04:41
What are some things that we've learned about decision-making as a result of fusing neuroscience and economics into this neuro-economic approach?
Zab Johnson 04:50
I think one of the things that's been really critical is the idea that we don't always have access to the information that's part of our neural processing. And so things that people will report, self-report, or even do, aren't necessarily directly correlated with what the underlying brain activity is. I think this is really apparent in the sphere of decision-making, often because we rationalize our decisions. And sometimes, as we know from heuristics and other behavioral psychologists’ findings, we are, in fact, irrational much of the time. We rationalize our decision-making process as if it has rules governing it, but the reality is that underneath we are complexly organizing our emotions, our emotional responses, our internal motivations and incentives, as well as a weighting of accumulated evidence, as we approach different things. And the brain's mechanism for making decisions is at play constantly. We think of decisions on the order of things that are really, really important – making investment decisions, what to do with our financial resources, when to buy a house, those mammoth kinds of things. But we are also making decisions all the time at the subconscious level. These are things that help us navigate through the world almost without thinking – a more automatic kind of processing of information that we use in the moment and then disregard so that we don't become overloaded. And I think some of the most important work that has come out in the decision sciences, from the neuroscience perspective, is sort of a glimmer into all of that which goes on underneath. It has impacted theoretical models for how we can better understand more complex and real-world decision-making, from contexts where the stakes are very high to these very low-level decisions about which snack food we might grab in the line at the grocery store.
And I think it's a really exciting place to understand the rich complexity of the neural underpinnings across all of those kinds of decisions.
Toni Dechario 07:07
So that kind of gets us to the heart of the question that we really want to answer as part of this series, which is, why do otherwise ethical moral people make unethical decisions? Is there something that neuroscience can tell us about why that happens?
Zab Johnson 07:27
I mean, I think what's really interesting here is that we live and work in highly complex social environments, and our work environments and our societies are getting more and more complex, more global, and larger as communities. And so many of our most important decisions are actually made in the context of our interactions with others. And I think we have thought about moral reasoning and ethical behavior at the level of the individual for a long time. “What are the drivers and the motivations and the incentives of an individual?” And I think one of the challenges is that because we live in these highly dynamic, rich social worlds, we also know that an individual-based perspective is probably not going to give us all of the answers. We really do need to start to think about more complex social interactions and the social aspects that are at play in these environments. And we know that sometimes, in these kinds of behaviors, rather than a property of the individual, it's actually not the decision of an individual but the decision of a larger group, with the dynamics of the group at play. To further answer your question about this idea of why otherwise good people would do bad things, we also know that there are specific brain areas that are at play in things like moral decision-making and moral reasoning. And these are areas that are also implicated in our social cognition. So, you can sort of see why these things are intertwined. These are areas of the brain like the temporal poles and the temporoparietal junction, which we know is a really important and valuable area that calculates our theory of mind – our sort of calculation of the motivations and mental states of other individuals – and our ability to take the perspectives of others.
And also the medial prefrontal cortex. Work that was done at the University of Pennsylvania by my colleagues a number of years ago actually measured brain areas during moral reasoning. They had 700 MBA students take a moral reasoning questionnaire, and then they could separate them into different cohorts. And in a subsample of that 700, about 62 individuals, they measured their brain activity. And what they found was that there were differences in reward-related areas, especially around the dorsomedial prefrontal cortex, between individuals who were higher in moral reasoning and others who were lower. The really interesting thing about that particular study is that it's sort of a chicken-and-egg debate, right? Is this innate? Are these differences in brain structures and the strengths of those networks something that's malleable and can change over time? How much of it is driven by genetic predisposition alone? And probably it's an interplay between both of those things, a little bit of nature and a little bit of nurture.
Toni Dechario 10:39
Well, this is interesting, because there's this kind of play between the individual and the collective in driving decision-making. What do you think the balance is between those two?
Zab Johnson 10:50
I think that you have to align the motivations and the incentives of the individual along with sort of an emphasis on the community and cultural environment as well. We know that people who have high moral reasoning are more prosocial, right? They do more community service, they're more likely to do charitable giving. So, they're thinking about their impact on others. I think there's this other balance that we need to think about from both the scientific perspective and the business perspective – how these things can come together to really strategize about new solutions that emphasize both individual decision-making and this sort of rich social aspect of our work and our society.
Toni Dechario 11:38
Let's start with the individuals. If there are individuals who have high moral reasoning and individuals who don't, how do you find the individuals that do? How do you know whether somebody has high moral reasoning?
Zab Johnson 11:49
Well, in the lab, you can give people questionnaires. Should we give those to people as we onboard them in human resources? You know, maybe, right? It gives an indication. I mean, if we were going to push things even further – if we really do believe the predictions about structural brain differences amongst individuals – one might imagine a time in the future where we would actually measure biometric and neurometric signals directly from the brain. Now, that gets us into another kind of ethical dilemma, sort of the Minority Report idea, right? But I think, as a toolset, we are at a really pivotal moment, where we are using more data analytics to make different kinds of decisions on who we hire, who we assign to different kinds of teams and different kinds of work. And I think that there's always going to be this interplay between the sort of ethics and responsibilities of the leadership in our societies, to keep questioning how we're applying different methodologies to doing that. I do think that there needs to be a push towards modernizing some of the things that we've used routinely in our human resource management, things around personality measures and skill sets. Because the future of work is radically changing. What we're asking people to do is really different than it was even a decade ago, and definitely more than 50 years ago, and the kinds of diverse teams that we want to put together to strengthen organizations and to create better group decisions – that, you know, needs to catch up as well. It would be great if we could devise better methods to both take individual differences into account and also to say, “These people together would just be a fantastic team, where they will do innovative, creative problem-solving, and come up with brand new strategies that could really impact the future of a business, an organization, and our inner lives.” That's the key, that's the dream.
But one of the things that's still missing, I would say, as a scientist, is the applicability out of a well-controlled laboratory into the wild, into the real world. There's still a disconnect, there's still a long way to go. So, in the context of what we talked about earlier around neuro-economics and decisions, we can simulate value-based decision-making using very typical economic games – things like the Ultimatum Game or the Dictator Game, which are classics in economic theory. We can see how people in the lab respond under those circumstances, how they make their decisions, and what the underlying neural activity is. But that is a far cry from understanding what happens in a high-stakes environment, which you can't ethically simulate in the lab. You can't make people lose their entire retirement portfolio and know the kinds of differences that might happen as they age and make different kinds of investment decisions. You can't really put a team together and say, you know, that this might lead the business to go under, in any way except a theoretical way in the lab.
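[Editor's note: for readers who haven't encountered these games, the Ultimatum Game mentioned above works like this: a proposer offers a split of a pot, and a responder can reject it, leaving both players with nothing. A minimal Python sketch – the pot size and fairness threshold below are invented for illustration, not taken from any study:]

```python
def play_ultimatum(offer_fraction, rejection_threshold, pot=100):
    """One round of the Ultimatum Game.

    The proposer offers `offer_fraction` of the pot; the responder
    accepts only if the offer meets their fairness threshold.
    Rejection leaves both players with nothing.
    Returns (proposer_payoff, responder_payoff).
    """
    offer = offer_fraction * pot
    if offer >= rejection_threshold * pot:
        return pot - offer, offer
    return 0, 0  # offer rejected: both walk away empty-handed

# A purely payoff-maximizing responder would accept any positive offer,
# yet real participants routinely reject offers they see as unfair --
# exactly the gap between theory and behavior that neuro-economics
# studies in the lab.
fair = play_ultimatum(0.5, 0.3)     # fair offer: accepted, pot is split
lowball = play_ultimatum(0.1, 0.3)  # lowball offer: rejected, both get 0
```

The interesting measurement is not the game itself but, as Zab notes, the neural activity recorded while people play it.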
Toni Dechario 15:15
There are many avenues I want to pursue with you, based on that answer. Maybe I'll start with the most recent, which is, what are the biometrics that are kind of indicators of good decision-making?
Zab Johnson 15:28
Of course, you always have to define “good,” right? That's something, again, that in the lab you can do by designing the task so that there's an optimal decision that's not necessarily guided by personal preferences. At the individual level, we can use techniques like eye tracking. This is a technique that I use often, to look at where people gaze during a screen-based decision. A lot of work on this kind of decision-making has, unfortunately, been done with snack foods – something very low stakes, where you ask individuals to choose between two different kinds of snacks – and those gaze patterns are actually predictive of the ultimate choice behavior. It can be modeled really nicely with an attentional drift diffusion model that shows the interplay of the different variables and weightings of the gaze trajectories. How long you look at something gives an indication of how sure or confident people are in their decision; their starting point, and whether they gaze at the other alternative at all, or only briefly, gives an idea of biases and preferences. Those are some of the biometrics one can use here that are really simple and non-invasive. They don't require, you know, millions of dollars of equipment, and they're things that consumer insights teams in business are already using – mostly in the marketing and advertising context, but more and more in financial institutions, because there's sort of a rich data set already being gathered about how long people spend on a website and how often they might click on something. There are other aspects of mousing, or using your trackpad or cursor, that mimic the kinds of things you do with your eyes; eye and hand movements are known to be linked together for obvious reasons. We often look where we want to pick something up or choose something, and so those mechanisms are linked together.
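[Editor's note: the gaze-weighted drift diffusion idea Zab describes can be sketched in a few lines. This is a bare-bones illustration with invented parameter values, not a fitted model from any study: evidence accumulates toward whichever item is being looked at, with the unattended item's value discounted, until it hits a decision bound.]

```python
import random

def addm_choice(v_left, v_right, gaze_left=0.5, d=0.002, theta=0.3,
                noise=0.02, bound=1.0, seed=0):
    """Simulate one choice with a gaze-weighted drift diffusion process.

    At each time step, gaze falls on the left item with probability
    `gaze_left`; evidence drifts toward the attended item, with the
    unattended item's value discounted by `theta`, plus Gaussian noise.
    The decision is made when accumulated evidence hits +bound (left)
    or -bound (right). Returns (choice, reaction_time_in_steps).
    """
    rng = random.Random(seed)
    evidence, t = 0.0, 0
    while abs(evidence) < bound:
        if rng.random() < gaze_left:      # this step, gaze is on the left item
            drift = d * (v_left - theta * v_right)
        else:                             # gaze is on the right item
            drift = -d * (v_right - theta * v_left)
        evidence += drift + rng.gauss(0.0, noise)
        t += 1
    return ("left" if evidence > 0 else "right"), t

# Longer gaze at one option tilts the accumulated evidence toward it,
# which is how gaze patterns become predictive of the eventual choice.
choice, rt = addm_choice(v_left=10, v_right=2, gaze_left=0.7)
```

The follow-up point in the conversation – that gaze duration and whether the alternative is ever fixated reveal confidence and bias – corresponds here to how strongly and how quickly the evidence variable moves toward one bound.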
Toni Dechario 17:32
I want to go back to moral reasoning. We were talking about how and whether you can hire for moral reasoning. Is there a way to teach or grow someone's moral reasoning? Is that possible?
Zab Johnson 17:46
Yes, I think that it is. I mean, again, we don't have all the answers about how much is predetermined and how much is experiential learning. But there are indications, especially around social cognition, that suggest there is plasticity – there are changes that can happen with practice, and with pretty simple applications that include things like perspective-taking exercises and other practices that we know build more awareness of the others around us and their own motivations and goals. Perspective-taking practices can include things like listening to others, and then thinking about how they might see the same challenge in front of them from their own set of experiences. Of course, this is always biased, right? We always make assumptions about others. So, this is one of those areas where perspective-taking exercises can really grow that capacity. There usually needs to be a back-and-forth dialogue in this situation, where people can correct you when you've made an assumption about the way that they see the world. This is work that we did actually with a Swedish bank called SEB. We partnered with their organization and their risk team to look at how a professional perspective-taking workshop – a sort of training that teams would do over the course of several months – would actually impact their abilities to make better, more optimal decisions moving forward, and even to work together more effectively in light of the pandemic. So, we know that there's some flexibility in perspective-taking that can be exercised. Now, you can't take someone who lacks any perspective-taking skills – there are some people who are sort of born without that ability; it just seems to be miswired, clinically – and turn them from a one into a ten. You can sort of think of it as a dial, and there's some wiggle to the dial. Most people are somewhere in the middle, and then life might alter some of that.
We might be able to shift that knob up from a five to a seven, if you practice, if you really make a concerted effort to do so. Then there's the level of groups. And that's one of the things that I wanted to emphasize: a lot of organizations are really emphasizing how we might help revise behavior and share knowledge at the individual level for a lot of these things. This is also the same for things like implicit bias training, right? And things like anti-racist theory. There's a lot of evidence that individual fluctuations are pretty noisy, and what you really need to do is emphasize the collective, really emphasize the organization at large, to really shift those kinds of things. So, one of the things that we know is impacting that is work by Adam Galinsky and Emily Falk – Emily Falk is here at the Annenberg School at the University of Pennsylvania and one of my good friends and colleagues. Their work actually suggests that the higher up you are on a socio-economic ladder – your own perceptions of your position in the world – the more tamped down your perspective-taking is without practice. So, those who are in leadership positions are actually at the highest risk for deficits in their own social cognition, interestingly. We sort of reward them by elevating them to high status, but organizationally, I think we have disregarded the fact that we actually may need to do more work to make sure that our leaders continue to have very effective perspective-taking abilities – and that they are probably in greater need of continuing to work at that skill set than the workers who are in the trenches doing most of the groundwork. So that's a really, really important aspect of perspective-taking.
Toni Dechario 22:00
That's fascinating, especially for us, because the population that we are dealing with are bankers, who tend to be higher up the socio-economic ladder than society at large. And so, is there something to take away from that? Is there maybe something unique about banking and financial services that might lend itself to perhaps lower levels of moral reasoning, as a result of this perspective taking issue?
Zab Johnson 22:28
I think, on the whole, yes, probably. I mean, it's not going to be unique to financial services and banking. Any organization has a natural hierarchy, and our society does as well. Sort of biologically, it makes sense that the lower you are on the totem pole, the more you actually need to integrate the motivations of others around you in order to optimize your survival. And the higher up you are on that social ladder, the less important all the others around you are. So, there's a deep biological root to this. But the great news is that we can do a lot to practice this. And I think if we can move our bankers and financial services personnel to think of this not as sort of a wishy-washy soft skill, but as something that's really critical for organizational culture, for leadership development and balance, and for minimizing some of the risks that are inherent in this business, that will go a long way toward really helping. And like I said, it doesn't require any biometrics, doesn't require any expensive equipment, or even having a neuroscientist, you know, on consult. These are well-documented, behavioral-only strategies that teams and groups can start to implement right off the bat. But the reality is that these are things that need to be reinforced and practiced over and over again throughout our professional development. This is not a “one and done.” This is not a “tick the box, pass it, and we're moving on.” This is something organizationally that we actually need to build into our teamwork and into our cultures, and keep maintained throughout our professional lives.
Toni Dechario 24:22
It's fascinating, because I wonder whether, you know, a higher socioeconomic status, it seems to me is largely about money. And banking is the business of money. I wonder whether there is a connection between being in the business of money and the ability to take on others’ perspective. I wonder if there's a link there.
Zab Johnson 24:48
I wonder as well. I mean, it's a great question. You know, I think if we can do more real-world experiments and can think about individuals and their predispositions, even for different industries, I think we can begin to sort some of that out. Banking – and actually, you know, economics – are not the only aspects of power and hierarchy. They are linked, of course. As societies, we usually reward those who are higher in power monetarily. But there are examples where that's not the case. And I think it'll be interesting to see – we know from the neuroscience perspective that the areas that encode things around value are not just about money; these areas also encode other things that we prioritize. So, this can be, you know, the perception of your own family and its value to you. This can be other aspects of what you find rewarding and motivating. We, again, haven't thought a lot about individual differences in those kinds of motivations and goals, at the individual level as well as in the collective. But we also know that sometimes throwing more money at someone is not actually going to be the right motivator. I think we're right in the midst of watching that, for example, in certain industries during the pandemic, and I think we are going to become more sensitive and perhaps a little bit more creative about thinking about how people's motivations and values might rest on something other than money. You know, it might be time. It might be the flexibility of your work environment. It might be the team that you work amongst – things that could have as much or more significant impact in making an environment rewarding and stimulating than just making a higher salary. Again, I think this is something where, in financial services, there hasn't been that much creativity in thinking about the individual differences between values and motivations.
Toni Dechario 26:55
Before we move on from this topic, I wonder if you could walk me through – if I wanted to run a perspective-taking exercise – what that would look like? How would I go about doing that?
Zab Johnson 27:08
We have one online at Wharton – there's a Wharton Nano Tool that's freely available. But roughly, a professional version would be to think of a challenge that you are facing, either in your work environment or even in your personal life. Reflect on that for a little bit, and think about your strategies around it. And then think about the strategies and perspectives of the other person. So, this is a modest sort of perspective-taking – I wouldn't, like, fill the room with people, like a more complex team – but think about how the other person might be thinking about the same challenge. You can do that with past challenges that you might have had: figuring out who is going to pick up your kid from childcare, or how you asked someone to help with a more recent challenge in your work. Think about the ways that it went or didn't go as planned, and then think about the way it might have impacted the person that you are thinking about. So that's one really easy way to do it. I also do this kind of exercise in a different context, with small teams in an art museum. I really like the idea that how we see is also very personal, based on our own learning and experience. Although we navigate through the world thinking that others see the way that we do, we know that's not the case. So, if you get a small team of three to five individuals around an artwork – and you choose an artwork that doesn't have all the answers right in front of you; most of them don't – what you find, through an active discussion amongst team members, where they first write down the first five things that they saw and then share them one by one with each other, is that there are these sort of “Aha!” moments where people are like, “Oh, I didn't see it like that.” And that is a really excellent perspective-taking exercise and application that's really neutral.
You know, sometimes I get pushback from executives and business professionals because it seems really different from their work. But at the end of the exercise, when we go over the motivations and the goals, they realize that it actually has really enhanced their ability to work together with their team – to take a risk and feel more psychologically secure that they can say something others may have a different perspective and perception of, and come up with better strategies going forward. We are poised to try to do some data collection on that exercise and move it beyond sort of anecdotal evidence – hopefully to find some of those systemic changes that might be happening in our bodies and our brains that can corroborate why. I have found that it's a particularly strong exercise.
Toni Dechario 29:55
Let's take the SEB example. Several months on, after conducting this perspective-taking exercise, was there a difference noted in how decisions were made? Who was contributing to those decisions? Whose perspectives were being considered in those decisions?
Zab Johnson 30:11
There was more effective teamwork, especially in the context where teams were thrown into remote work. They inherently reported more trust and psychological safety with one another. So, in some of the early evidence, there has been some transformation. In our laboratory experiments, the different groups didn't actually reach the optimal decision. It was a small sample, and we don't actually know whether that was due to language- or context-related differences between Swedish culture and the tasks that we had devised. So, we'd have to do more work. But one of the things that was really interesting was that groups were more likely to have a balance of turn-taking in their discussions. So even though they may not have reached the optimal, laboratory-defined decision, ultimately the indication was that they were actually listening to other people, and that more voices were heard – which could be radically different from a team that had never gone through this kind of experience. So, I think there's more room to study, for sure, and in different contexts, but there are also some indications that teams are actually doing better work together by integrating more diverse perspectives.
Toni Dechario 31:32
Do you have any other thoughts on important elements to either encourage speaking up, or encourage listening on the part of the decision makers?
Zab Johnson 31:46
We know that strong cultures tend to retain workers who share a similar viewpoint, and that is sometimes not a very great environment. So, there's an interplay, I think, between optimizing culture in the sense of really strengthening teamwork, and this idea that multiple perspectives enhance your ability to problem-solve, innovate, and create new strategies – for that, you actually have to diversify. But I think right now there's sort of a counterpoint to this strong culture, right? This drive for strong culture. Because, again, we don't want to overweight the groupthink idea, the single-viewpoint aspect. I think that the more we can diversify everything from our leaders down, the stronger that will be, right? Again, it's building in the idea that what the culture of an organization values is the diversity of perspective. But there's so much work to do, right? You have to make individuals and teams feel more secure in that environment. And this gets to this idea of balance, of sensitivity to reward and punishment – that's a whole other dialogue of neuroscience. And realizing that when you have your voice heard, that information will not be used against you, personally and professionally, is one thing that I think is inherently a challenge in this particular kind of context. So, building more diverse teams is the first great step. And then, you know, perhaps integrating into our workplace this idea of real perspective-taking professional development opportunities for our groups – not just our individuals, but our groups – and, again, reinforcing that this might be really important at the level of our leaders and those at the tops of our organizations; that actually has to be prioritized. I think we have a long way to go to understand the difference between how we build an organization that we feel has our backs as employees, while also realizing that nothing can be permanent.
And we, as individuals but also as organizations, have to know what we can't guarantee – like the tenure system in academia, which is relatively a relic of the past as well. So, I think these things are really going to be some of the grandest challenges for business organizations moving forward.
Toni Dechario 34:24
You've already given a few recommendations. I think, perspective-taking is clearly a recommendation. Do you have any other recommendations for organizations that are looking to reinforce more ethical decision-making?
Zab Johnson 34:39
There are lots and lots of ways that we can reinforce that. I mean, I think as organizations we have to constantly think about how we're addressing motivations and incentives, for example. So, a little bit of what we talked about before – knowing that there are individual differences, and that there are strong drivers such that individuals can change their attitudes towards social norms relatively quickly. And I think, you know, in the financial services industry on the whole, we know that the most egregious episodes – where there have been ethical and moral challenges – have involved a misalignment of incentives: what the organization is actually asking its employees to do, without regard to the methods by which they get to that end product. And so, that reinforcement of considering the humans, and the brains, that are behind your business is a really integral aspect of the work that any organization is doing. We're powered by our brains. We're not always rational. We are influenced by others. And I think we can go a long way to actually share knowledge with one another and begin to come up with new and transformative applications that prevent those kinds of things, just through open dialogue and knowledge sharing. I don't have any great examples of how to build your team to be highly moral – that's, again, a really big thing. But you can devise ways to make your teams more effective. Again, with group decisions, there are ways that you can optimize feedback. Even in person, once we can return to in-person meetings, I think there's a lot of body signaling. If you can emphasize that people need to put their phones away during meetings and be fully present and attentive to the others in the room – and it comes from leadership – that goes a long way to enhancing our abilities to take into account how others are feeling and thinking and what their motivations are. It also emphasizes that everyone in the room is important.
So those are, you know, really simple ways of starting to rebuild some of this. And then to question: are the brains behind our business capable of reaching this goal, or are we asking them to bend toward something that's not possible for them?
Toni Dechario 37:20
There's nothing that sends more of a message of, “I don't care what you have to say,” than someone looking at their phone while you're talking.
Zab Johnson 37:27
I think we're gonna have a really big challenge moving forward, because we are all out of practice.
Toni Dechario 37:33
There are people who have meetings with me, and I know they're working. They're typing an email while they're talking to me, and it's like, “I can see you, I can still see you.” It's kind of the same thing, right?
Zab Johnson 37:44
It is the same thing, I think. And we also jump to conclusions, right? So, using a lot of different kinds of visual or non-visual cues. You know, when you hear the keys clicking, you believe that they're doing something else, just like if they're looking at their phone underneath the table. That's such an obvious cue. I don't know why people hide the phone underneath the table, except it's sort of an inherent admission that you're doing something surreptitious, that you're not really paying attention. So yeah, no, I think that's right. You can get really paranoid when those kinds of cues are either correct or incorrect. Your assumptions about others do matter in how you feel within your own environment. And I think those are really easy things to fix that, again, don't require a lot of resources, but do require some willpower.
Toni Dechario 38:40
Last question from me, what resources on these topics would you recommend for people that want to know more?
Zab Johnson 38:49
I would be remiss if I didn't mention my colleague's book, The Leader's Brain, published by Wharton School Press, which came out in the last year. It's an easy, quick read. It has a lot of Wharton Neuroscience thoughts and ideas behind it, encapsulated in something that's actually designed to be read in one sitting. It's not a coffee table book, but it isn't a painful, long investment either. It doesn't assume any neuroscience background, either. And, of course, keeping in touch with us at the Wharton Neuroscience Initiative. We are an open and inclusive community. We actually want people from the public and outside of academia to contribute to our knowledge and to our questions. And so, we actually have events that are free and open to the public. We have social media channels that people can follow and a general mailing list that people can find on our website. That's neuro.wharton.upenn.edu. So that's a really easy way of staying abreast and in touch with the work that is relevant and important in this area.
Toni Dechario 40:03
If you want to hear more from Zab watch the New York Fed’s webinar on “Trust and Decision-making," which you'll find at newyorkfed.org/governance-and-culture-reform.
Mikael Down 00:01
We say culture is important, culture matters – well, how?
Zab Johnson 00:05
We're powered by our brains. We're not always rational. We are influenced by others.
David Grosse 00:10
I think I'd start with looking at the what's called the “Better-than-average Effect.”
Holly Ridings 00:16
So, one of the things we look for picking flight director, is that what we call command presence, right? You kind of just look at people like, “You better get it together,” you know, “We’ve got stuff to do,” right?
Mark Roe 00:25
Good practice would be for a leader to say, “Well, this is a mistake I made recently, and this is what I did about it.”
Betsy Paluck 00:30
I can't emphasize enough the disempowerment of the lonely decision maker.
Taya Cohen 00:35
So, the opposite of what we think of as an ethical decision frame, we can think of as a game frame.
Mark Mortensen 00:40
It doesn't make you unethical to wrestle with this. The smart money is we help you to try to figure this out.
JB 00:48
This is “Bank Notes: Banking Culture Reform.” The views expressed in this podcast do not represent those of the New York Fed, or the Federal Reserve System.
Toni Dechario 00:57
Hi, and welcome to the Banking Culture Reform podcast, part of the New York Fed’s initiative to drive awareness and change in financial services culture. This series, “Norms, Mindsets, and Decision-Making,” explores questions like, “Why do ethical people do unethical things? How can organizations encourage staff to speak up and executives to listen up? And what role does context play in shaping people's behavior?” My name is Toni Dechario, and I'm a member of the New York Fed’s culture team. Today's conversation is with David Grosse. David's responsible for conduct risk, culture, and behavior at HSBC Global Banking and Markets. A longtime banker, David became convinced that understanding human behavior was the only way to really understand what was happening at a bank. So much so, in fact, that after a 20-year career in operations, audit, and operational risk, David went back to school to earn an MSc in behavioral science. In this episode, David will tell us about his own experiences with context and group behaviors, including at rugby and soccer matches. Welcome, David.
David Grosse 01:55
Thank you very much, Toni.
Toni Dechario 01:57
So, David, I want to start off by asking you how you landed in the spot where you did? You've had an interesting journey to get there. What motivated you to work on culture?
David Grosse 02:07
I've been working in banking for quite a long time now, over 25 years. I'd say that my background was kind of the more traditional route, in risk and control, coming through internal audit roles, head of operational risk roles, traditional roles. And I think I got a sense as time went on that maybe I was missing something, maybe there was a sort of sense of frustration there. Or maybe I shouldn't say frustration, maybe I should say curiosity. It felt to me that we were expending a lot of energy in traditional areas, but not necessarily getting the outcomes we were expecting. I think, particularly, I look at my life in operational risk. I got into operational risk at the time – it was the thing to get into around the turn of the millennium, with things like Basel II – and we were trying to make it much more of a scientific discipline, if you like. And I thought it was very interesting, and rather bizarre, that ever since then a lot of the issues we've had in banking have been in exactly that area. We don't seem to have tackled the root causes at all. We've had hundreds of millions – in fact, hundreds of billions – worth of costs and fines in the space, at the very time we thought we were focusing, through operational risk, on the areas which would help reduce the risk. It doesn't seem to have reduced the risk. It almost seems to have increased risk. So, I was frustrated, but curious, and really wanted to try and understand why. I've increasingly got into looking at the drivers of human behavior. Because we work in banks, and banks are full of tens of thousands – or, in the case of my institution, hundreds of thousands – of people. And that's what really drives what's going on. This is what we are. And so, without understanding human behavior, it felt to me like we were never going to really properly tackle risk within banks.
Toni Dechario 04:07
So how did you convince HSBC that this was something that they should let you do?
David Grosse 04:11
It really was just introducing it through the work we were already doing. So, we were already doing a lot of work around the broader conduct agenda, about trying to improve what we're doing around the conduct agenda. And in doing that, we took it apart and said, “Well, if we don't tackle – not just the hardware, not just the middleware, the areas where things might go wrong, conflicts of interest, etc., but actually – the underlying human behavioral point, we were never going to crack this.” And like many banks, we had a fair bit of scrutiny from some very important regulators, so we needed to focus on it. And so, there was definitely an appetite to look at this with a more scientific, behaviorally scientific sort of approach. I think a key part of it is not making what we do and how we think about behavior and behavioral science just about the bad stuff, if I can put it that way, just about looking at the downside of cultures that aren't working. There are many, many areas of applicability in how we look at our own staff, how we look at our behavior with our customers. Indeed, if you look at things like trader behavior, asset manager behavior, there are lots of very interesting areas of exploration, which are good for the P&L, but equally good for the internal culture of the organization. So, I think focusing on the broader benefits you can get from behavioral science is absolutely key, and not just pigeonholing it into one specific area of looking at the downside of people's behavior when they misbehave.
Toni Dechario 05:52
I am going to focus for a minute though, on the bad stuff. One of the central questions we're trying to explore through this series is, “Why do otherwise ethical people do unethical things?” And I wonder if you have some perspective on that, after a few years of being at this.
David Grosse 06:11
That is such a great question, many books have been written on the topic. I think I'd start with looking at what's called “the better-than-average effect.” As many people are aware, people think they're a better driver than average. Everybody thinks they're a better driver than average. I'm not great at statistics, but that seems unlikely. When it comes to your own self-concept, your own ethicality, it's even more extreme. People think they're better than the average around conduct and ethics, etc. And then there was an interesting piece of research in the UK, which actually looked at prison populations, and asked them questions about their view of their relative ethicality versus the wider society. And again, it came out that they believed they were more ethical than the wider society. And in their heads, they had all sorts of situational reasons which led them to do the unfortunate things. I actually did a little bit of research when I was doing my master's in behavioral science, looking at the better-than-average effect. I'd set out all sorts of different ethical scenarios, and I was getting people to rank how they thought they'd behave in certain situations, on a sort of scale from “they would always exploit the situation” to “they'd never exploit it.” And deliberately, after I did that, I also asked a question at the end: “How do you think all the other people in the survey would have responded compared to you?” The reason I did that is because, if you ever want to do any research and you want a cast-iron way of getting a result, look at the better-than-average effect. Off the back of that, I think it was something like they felt that 58% of people would exploit situations more than them. About 36% would exploit the situation about the same as them, you know, “about the same as me.” And only 6% would exploit the situations less.
So, you can very much see how people position themselves as being at the better end of things, ethically. Now, if you think about that, that means that people basically think that a lot of what they do is ethical. In effect, you are kind of blind to some of your own behaviors, because you have rationalized them internally. So, that is a really important thing to consider as to why people do unethical things, and then you can actually come up with other examples. I'd use an example from when I used to commute into London, before the COVID days. I would drive into a car park, and I'd have to tap into my phone to pay the car park fee. And then I'd head up into London and go to work. And then as I was coming back out again, on the train, I'd suddenly have this thought of, “Oh, no, I don't think I paid my car parking fee.” I'd get back to my car in the car park and celebrate because I didn't have a penalty notice on my car. I'd climb into my car, drive home, and that would be it. Is that in any way ethical, given I haven't paid the car parking fee? When I gave that example, people also very kindly gave me all the reasons why it was perfectly acceptable for me to do what I'd done, which is very kind of them. But what it shows is that in your daily life, you can have all these situations where you rationalize things which ultimately perhaps aren't ethical, because you understand the context of it yourself. And so, you know, beyond that, I think there are lots of other things that tip people towards perhaps potentially behaving in less ethical ways. There'll be things around being tired and emotional. There'll be the sense that things aren't fair. The slippery slope is often quoted, where a very small transgression gradually builds and builds, and you can't back your way out of it. And things like loyalty to others are really important.
Because ultimately, in many decisions you're in a conflict situation: “What's best for me? What's best for my team? What's best for my family? What's best for the bank? What's best for wider society?” Sometimes those things can be in conflict with each other. And then how do you balance those things? That kind of loyalty thing often comes into play. And maybe one other thing I'd say that sometimes allows people to rationalize the behavior is, they go, “Well, everyone else is, so I'm going to as well. If I don't do that, then, you know, I'm the outlier.” So, probably one of the key things is that the context and situation in which they make decisions is really, really key. And it's not just about their own personality in isolation.
Toni Dechario 10:54
I think that the concept of ethical blind spots, of maybe not recognizing the lack of ethicality in a decision that you've made, is a really central one. Do you have thoughts about how to help people recognize situations, decisions as ethical ones, how to help people kind of frame a decision as an ethical decision?
David Grosse 11:18
So, the question being, how do you help people recognize potential ethical blind spots in their own behavior or their own situations. And I think that is actually a really key point when we're looking at conduct and culture in the financial services industry, the reason being there are very many situations, scenarios, gray areas, and conflicts in financial services, over and above those of other professions. I think that's why we have many issues in banking: we've got a plethora of opportunity and a plethora of gray areas. Getting people to understand the behavioral drivers and the situations and scenarios where they may be treading into dangerous areas is really the key to tackling these things. I think a very important thing to do is look at things like conflicts of interest, and really have a detailed way of understanding, in a particular role, in a particular product, in a particular location, where you might be exposed to those conflicts of interest, and really try and think about it as broadly as you can. Because it's from those situations – from where you are conflicted – that you will start to nudge the interests of one party above the interests of another party. And obviously, there are some more basic ways you can do it when faced with a decision, which is to do what people often say: look at how you think you'd explain it to somebody outside of the bank. How would you explain it if a journalist asked you that question? How would you explain it if a regulator asked that question? And probably more importantly, how would you explain it to your family, to your friends, to your parents? But it is absolutely key, I think, to conduct and culture within financial services to understand the full population of the gray areas that you might be faced with on a day-to-day basis.
Toni Dechario 13:14
It's such an interesting area. We had a conversation with an academic at Carnegie Mellon, a woman named Taya Cohen, who studies guilt proneness and its linkages to behavior, using similar questions and scenarios to what you described. Guilt made for an interesting conversation, and you've touched on some of the same topics that she did.
David Grosse 13:38
Toni, you raised the question there a bit of guilt proneness, and how do you, in essence, get people to feel that, because that then helps moderate their behavior thereafter. And that's actually something I did look at in my work around my master's in behavioral science. I was particularly looking at a potential use of oaths in financial services, very similar to what's used in the medical setting with Hippocratic oaths. Can you use oaths in a banking setting, to see whether that can impact behavior? People may be aware that that's actually used in the Netherlands as a mandatory thing. It's used in Australia as an optional oath for bankers. And I was doing a bit of research to try and understand, does it actually make a difference? That kind of plays into this guilt proneness thing, and really, what you're looking at there is something called cognitive dissonance. If people have two thoughts – about the sort of person they are, and about the activities they undertake – and those are in dissonance with each other, or clash with each other, that makes them feel really uncomfortable. And they want to bring those two things into alignment with each other. So, they may well adjust their behavior if it doesn't come in line with the sort of person they think they are, and potentially doing something like an oath can set that up. It would have to be done in quite a thoughtful way. But you could set something up where you give people a sense of personal choice about what they're committing to, or how they're promising something, so they feel invested in it themselves, and then wrap around that quite a lot of ceremony about how you are publicly explaining to peers and to other people in your institution, and potentially to wider society and family, the sort of person you are.
So, if you kind of make it a personal thing, a thing with choice, a thing that resonates with somebody, then actually, there is evidence that that has an impact on subsequent decision-making. Because, obviously, it will then set up that guilt proneness: it will make you feel awkward, it will actually give you some physical signs that you're in discomfort, and then you might dial back on the sorts of unethical decisions that you might otherwise make.
Toni Dechario 16:01
The question of an oath brings me to another question, which is, if there were an oath to be instituted, it would likely be some sort of industry-wide initiative, right? It would be unlikely that one firm alone would have bankers take an oath. And in some ways, that culture issue is something of a collective action problem, because bankers move around, people move from institution to institution. Sometimes whole teams move. And so, what is one organization's either problem or success can quickly become another organization's problem or success. Do you have thoughts on a sector-wide approach that could be deployed beyond an oath?
David Grosse 16:45
The first thing I'd say is, of all the things that we can look at around banking – banking culture, risk-taking – I think this is an area which absolutely requires collective action, which absolutely requires institutions to share openly with each other. Because, as I say, all boats float in a rising tide. So, I would wholeheartedly encourage people to reach out to all the other institutions that you see out there that have behavioral science capabilities, that have thoughtful cultural capabilities, and discuss with them what you're doing. I don't think there's any sort of secret sauce in this where people should be holding on to their own cards as if it's some sort of magic P&L advantage over other institutions. So, I would definitely encourage people to speak to what I call the Coalition of the Willing of banks and bankers who are already engaged on this journey. I'm not going to name them in this podcast for fear of embarrassing them all, but I think they're relatively well known. So, I certainly encourage pan-banking engagement with institutions who are on this journey already. I would definitely encourage joint work with academia. It seems to me that academia has done so much work in this space which is of use to wider corporate life and to wider banking life, and they sometimes struggle with the applicability to day-to-day corporate situations. On the other side, we're in the banking world, and we struggle with getting the scientific underpinnings of what it is we want to do. I just think it's so important in this particular area that we have joint work with academia, that we have industry-wide relationships, that we have bilateral relationships, to try and improve what we're doing, and to do joint research projects. Then everybody wins. I think, equally, there are some key regulators who are much more advanced in this area.
I don't want to just deliberately be very nice to the New York Fed on this call, but definitely the New York Fed, the UK FCA, and others have some very good resources that are easy to find and that take people on the journey.
Toni Dechario 19:06
Thanks! So, I'm gonna shift gears a little bit and look at decision-making again. What, in your opinion and in your studies, do you find makes someone either willing or unwilling to raise concerns to decision-makers?
David Grosse 19:23
I think the first thing I'd say is, when we look at “speak up,” I always want to try and reframe it towards the “listen up” side. Because we're great as institutions at saying, “It's about ‘speak up,’” which puts the onus on the individual, which makes the individual feel, “Okay, it's something about me, something about me needing to be brave to raise a topic.” And actually, we should invert it as much as we can into the “listen up” construct, into, “How do we ensure an environment where the institution and management are listening?” rather than, “It's about the people – why aren't you speaking up?” So, that's a key inversion I'd like to put in. Then the other thing I'd say is, it's also very, very important to look at it in the context of day-to-day conversations, and the safety that people feel in meeting situations, in bilateral conversations. How do we get people comfortable in those environments? I think too often we frame it as if it's the egregious end of the spectrum, where something really bad happened, and somebody has to flag that and, you know, a whistleblowing case goes in, etc. And I think that mischaracterizes it. The most important thing we need to do is ensure we're looking at the day-to-day decision-making environment, because it's within that that issues will gradually come to the surface. If we get that right, it's very unlikely we'll need to spend too much time looking at the more extreme end of “speak up” and “listen up.”
Toni Dechario 20:52
What hinders listening up, and what can be done to improve it?
David Grosse 20:58
The key thing I'd say is that it's not necessarily deliberate. I think there's a sort of sense that maybe there is fear that has percolated through the organization, and that that erodes people's comfort in speaking up. But I think the key part of it is actually people not necessarily understanding fully the power dynamics. So, it may not be deliberate. But if you don't give people the sense of psychological safety, the sense of easily being able to discuss topics, then they will feel that they can't, even though you haven't deliberately put fear into them, if you like. So, understanding the power dynamics is really, really key, and giving people permission to talk in an open environment is important.
Toni Dechario 21:46
The BSB has written and spoken extensively on the “fear versus futility” roots of not speaking up. How do you think about the futility angle of that? How do you advise management at HSBC to combat any potential for your staff to feel a sense of futility in speaking up?
David Grosse 22:10
I think there are a number of elements to it. One is, if people have raised points before, and no action has come off the back of it, then it comes into the “Why should I bother?” point. That's one element to it. And I think the other point is, when people raise concerns or escalate issues, if it's felt that the onus of responsibility, or the hard work, falls on the person who raised it, that's definitely also going to impact how people want to engage. Because if they feel that it's a time absorber for them, if they feel that raising an issue puts a lot of onus on them to do a lot of hard work, then that's going to knock their sense of wanting to get engaged. So, you know, I think it's both, “Have I raised something before, and how has it been actioned?” but also, “How much hard work is this going to put on me?” And so, obviously, it's so important that people feel points are actively listened to and actioned, and you need to give very quick, engaged, and transparent feedback to people, even if, in effect, it's bad news. I think what people sometimes forget is that, if there's no specific good news that can be handed to somebody about how a decision's been made, how their life may be made easier, how something has been escalated, you still need to communicate to them what you've done, how you've done it, how they've been listened to, because otherwise they feel it's just gone into a void, and that no action has been taken.
Toni Dechario 23:43
So, I want to move on a little bit to a question we think can be framed somewhat as a debate between personal motivations on one side, and culture, or context and norms, on the other. And I want to ask how you think about the relative weight of personal motivations versus culture in shaping people's behavior and shaping decision-making?
David Grosse 24:12
I think the first thing I'll say is, it's hard to ascribe an exact weight. There's a huge amount of work and research that's gone into this, but the most important thing I'll say is that the impact of situation, context, and environment is often underplayed, and is far more important than it is often given credit for. The rebalancing we need is to bring in the situational part. You know, psychologists call this the fundamental attribution error: when we look at the behavior of others, we always tend to ascribe a personal, behavioral-trait reason for it, and when we look at our own behavior, we can see all the situational factors that played into it. So, ascribing an exact balance or relative weight is hard. You know, my professor at the London School of Economics, when I was doing my master's, always used to say, “Context, context, context,” as to the importance of situational factors. So, it's really important that we bring those much more to the fore. People don't act alone; they act generally as part of a team or a subculture. Those sorts of subcultures are really important in how they drive people's behaviors. You can have two trading desks alongside each other, literally on the same trading floor, but they may have their own separate subcultures, which could drive the behaviors of the teams in potentially different and divergent ways. I have my own example, from my own life and my own behavior, of how a situation can drive a different set of behaviors in what would seem to be a very similar setting. I'm a relatively rare person in the UK, insofar as I support a football team – which in America would be a soccer team – and I also support a rugby team. And those are two relatively different sports. If you go to football, it tends to be quite an aggressive atmosphere, it tends to be quite tribal. And I've often found myself, when I'm in a football crowd, getting quite aggressive.
You get involved with the chanting, you get involved with certain behaviors between different opposing sets of fans. And when I'm in the crowd for a rugby game, it's very different. You're mixed in with all the fans from the different teams, and it's a very sociable situation. And it seems very odd to me that I can be the same person watching two different sports, but behave differently, given different situations. So, you know, I think bringing in the situational attributes is very important.
Toni Dechario 26:59
Such a great example. That's actually a great segue to my next question, which is, is there something about the context of financial services, and banking, that influences people's behaviors, and specifically, their decision-making behaviors in a particular way?
David Grosse 27:19
You know, there has been a fair bit of work on this, a fair bit of research looking into, “Do bankers behave better or worse than others?” And what I can say from having looked at it is: the jury is out. Some research has shown that bankers appear to behave less well. One study looked at a group of Swiss bankers using some gamification, and when reminded of their own role in banking, they appeared to behave worse. But that hasn't been replicated. There have been further studies since which didn't come up with the same results, and which tended to show that bankers didn't actually necessarily behave worse. I think what is different about banking and finance, though, is that there is a very high prevalence in banking of opportunity, of conflicts of interest, of gray-area situations. And that gives bankers more room to have a, if you like, self-interpreted way of behaving in a conflict situation. There's just basically much more opportunity, and much more opportunity to favor one's own interests, than in other professions' situations. So, I think it's less that bankers themselves behave necessarily worse or better; it's that there's much more opportunity, and many more such situations, which tends therefore to lead to more issues. I think you could actually take this into the world of biology and neurobiology: in a situation where there are big rewards and uncertainty, that can impact dopamine, and that can impact behavior. And there's also a lot of work around things like elevated cortisol levels and elevated testosterone levels, which, if they happen in conjunction with each other, can lead to more unethical decision-making. So, in some ways, it's unsurprising that in certain situations, behaviors are driven by the environment of banking particularly.
I think a few other areas are probably worth mentioning about banking which maybe play into this. One is probably the remoteness of bankers from the end people that they are sometimes dealing with. When you can't see the impact on an individual, that gives a little bit of extra moral wriggle room. Another would be the framing of a lot of the activity in banking in economic terms, which again can lead to ethics seeming to be a little bit further down the track. And a lot of that sort of situation played out in 2008 in the GFC – the Global Financial Crisis – in the complexity of the products, and in people not necessarily seeing the impact of their decisions on the people on Main Street.
Toni Dechario 30:06
The pandemic obviously has created distance between all of us. And so, I wonder whether you've seen an impact on decision-making, or whether you have thoughts on how it might have had one?
David Grosse 30:17
The first thing I'd say is, you know, what a once-in-a-lifetime – hopefully, once-in-a-lifetime – opportunity for us all to learn. If any of us in the industry were to try and plan such a massive natural experiment, we would plan and plan and plan and never get around to doing it. So, the fact that over a few weeks there was a remarkable change in work patterns – how we worked and where we worked – and that we've lived with the longer-term consequences of that over an extended period of time: it is absolutely an area, I think, that we as an industry and as banks should be furiously mining to learn from, because it is such a rich population of interesting behavioral experiments, if you like. We actually did some work with a psychologist, trying to look specifically at people's lived experience – staff's lived experience – of the pandemic. And I think one very important thing that came through on the impact on decision-making, or on how people felt about their jobs, is actually on the positive side, which I'm going to raise, insofar as people felt that, because of the speed of change, there was an agility and an empowerment that hadn't necessarily been seen before. And people actually took that as a positive. Then, equally, something has come through on a sense of being more trusted, and of there being more agency for people in their day-to-day work. Because, necessarily, when people have been working more remotely from each other, they've tended to be judged more on the outcomes of projects rather than on day-to-day micromanagement. So, an interesting thing that came through strongly from our research is that those things were seen as an absolute positive. And I think, therefore, it's very important that when we look at the impact of the pandemic – where people work, hybrid working, etc. – we don't just focus on physical location, we don't just think about office, home, home office, isolation, etc.
We actually look at what it was that changed in the way we did work, which people took as a positive and would like to continue with, regardless of whether they're working in an office location or a home location. And those things around trust and empowerment and agency, I think, are really important. And across all the banks, you know, if we look at operational losses, or conduct issues, etc., it'd be important to look at those and say, “Well, you know, what actually happened during the pandemic? Did they go up? Did they go down? Did they go sideways?” Because maybe we'll learn something there about the nature of, not necessarily it being hyper-control, hyper-surveillance that impacts people's behavior, but maybe there's something else around trust, agency, empowerment. And I think that's definitely an important lesson to learn. There are obviously a lot of other issues within the pandemic that definitely need to be taken into account, and people's living situations. For some people, it's much harder than others – for new joiners or junior staff at the start of their career, it's definitely much harder to get a sense of, if you like, the corporate culture, harder to have the watercooler conversations. So, there are definitely some downsides. But I definitely would also like to raise the upsides of behavioral change, which it's important for us to weave through into the new world, the new ways of working.
Toni Dechario 33:52
You know, we've kind of come to understand over the years that conduct is much more likely to be impacted positively by things like trust, empowerment, agency – by providing people positive motivation – versus trying to police negative behaviors, right? So that's such a great point. I want to ask you one last question, which is especially important for us to hear from you. Tell us about a success story: what's worked for HSBC? And what would you recommend for other organizations that are trying to reinforce better decision-making?
David Grosse 34:30
I think the first thing I'd say at the macro level is to embrace the topic of organizational psychology, behavioral science, behavioral economics, and get on the journey. Try and ensure you have some sort of SME capability and build on that, but don't think it's something that somebody else does for you. You might need some expertise, but, absolutely, you want business engagement, you want business ownership. You want it to be something that everybody owns. You don't want it to be seen as something that a department does for you. So that's the sort of macro point. But for us, to look at one example, which I think maybe was a bit different: we did look at doing some work around the positive side of the behavior curve. So, rather than always looking at the negative side, rather than looking at the behavioral outliers on the downside, we did a piece of work – again, using a psychologist – to sort of say, “Okay, do we have some examples out there? Who appear to be the people who have strong ethics, have a strong conduct profile, who people look up to? And if we unpeel that, what is it in their DNA which is different? Or, what is it that we would like to build upon?” You know, we did quite a lot of deep-dive interview work with those people, to try and peel the onion, and what came through on that was very much that these people had a sense of agency over their own work. It wasn't work they were doing for other people. They had a sense of agency in what they were doing, and a sense of trust in what they were doing, which they took as absolutely key. They had a common work and home persona. So, they didn't put on, if you like, a different hat when they went to work. They had a strong sort of lineage through their personal life and their work life, about what they saw as the right way to work and the right way to behave, which continued all the way through.
They also had the ability to sort of push on through problems, to take ownership of problems and see them as things to tackle and not things to be defeated by. And I think there is a lot more depth in looking at the positive side of the behavior curve, which as an industry we could go into, and try to rebalance the fact that, probably across the industry, in all our risk departments and compliance departments, etc., you know, we spend 99% of our resources and money looking for the 1% egregious outliers, and the opposite – you know, only 1% of our resources covering the 99% of people. And I think rebalancing that, and looking at the 99% and the positive side of the behavior curve, would definitely give us all a lot more bang for our buck in the industry.
Toni Dechario 37:21
That's great. David, there are so many common threads that you've woven. I think that the sense of agency is a good one. I want to just thank you so much for joining us today and for being part of this conversation. We were thrilled to learn from you.
David Grosse 37:37
Okay, well, thank you very much as well.
Toni Dechario 37:40
For more conversations like this, as well as publications and other resources related to banking culture reform, please visit our website at newyorkfed.org/governance-and-culture-reform.
Mikael Down 00:01
We say culture is important, culture matters – well, how?
Zab Johnson 00:05
We're powered by our brains. We're not always rational. We are influenced by others.
David Grosse 00:10
I think I'd start with looking at what's called the “Better-than-Average Effect.”
Holly Ridings 00:16
So, one of the things we look for in picking flight directors is what we call command presence, right? You kind of just look at people like, “You better get it together,” you know, “We’ve got stuff to do,” right?
Mark Roe 00:25
Good practice would be for a leader to say, “Well, this is a mistake I made recently, and this is what I did about it.”
Betsy Paluck 00:30
I can't emphasize enough the disempowerment of the lonely decision-maker.
Taya Cohen 00:35
So, the opposite of what we think of as an ethical decision frame, we can think of as a game frame.
Mark Mortensen 00:40
It doesn't make you unethical to wrestle with this. The smart money is we help you to try to figure this out.
JB 00:48
This is “Bank Notes: Banking Culture Reform.” The views expressed in this podcast do not represent those of the New York Fed, or the Federal Reserve System.
Toni Dechario 00:57
Hi, and welcome to the Banking Culture Reform podcast, part of the New York Fed’s initiative to drive awareness and change in financial services culture. This series, “Norms, Mindsets, and Decision-Making,” explores questions like, “Why do ethical people do unethical things? How can organizations encourage staff to speak up and executives to listen up? And what role does context play in shaping people's behavior?” My name is Toni Dechario, and I'm a member of the New York Fed’s culture team. Today we're speaking with Betsy Levy Paluck. Betsy is a professor of Psychology and Public Affairs at Princeton University. She studies the way group norms are shaped and changed, including how social perceptions and networks can be used to influence behavior change. We first featured Betsy when she was a panelist in our webinar on how Diversity, Equity and Inclusion influence organizational culture. In this episode, Betsy will explain, among other things, why we're really all just middle-schoolers at heart. Thanks for joining us, Betsy. Welcome.
Betsy Paluck 01:50
Thanks for having me here, Toni.
Toni Dechario 01:53
Maybe we could start off with kind of a broad introduction to your work. And you could talk a little bit about what motivated you to study social psychology.
Betsy Paluck 02:03
I'll tell you one story about the first study I ever worked on as a research assistant, what it says about the appeal of social psychology for me; it's relevant to some of the things we're going to talk about today. I worked on a study of sexual harassment. It was an experiment in which the professor who I was working for, Marianne LaFrance, and her graduate student, Julie Woodzicka, posted an ad in a Boston newspaper for research assistants, and they invited in women who really believed they were coming in for an interview. And they were randomly assigned, experimentally, to either be asked about God or asked about whether they would wear a bra to work, in the course of this interview with the same guy who interviewed all of them. So, the idea was, ask everyone something weird and kind of inappropriate, but in one case, it was a sexualized question. And unbeknownst to these women, who were fully debriefed afterward, there was a camera recording their reactions to these questions. So, I'm wondering if you can guess… Did people answer the question about whether they believed in God and whether they would wear a bra to work? Do you have any idea? Was there a difference?
Toni Dechario 03:11
Oh, yeah, I suspect that people did answer the question about whether they would wear a bra to work and thought that the God question was totally inappropriate, and said so.
Betsy Paluck 03:18
So actually, there was no difference. Everyone answered it. Everyone wanted the job. And it was pretty undetectable what question they were asked. So, my job was to turn off the volume and just watch these women's faces as they were asked this question. And it turned out that the only way you can really tell the difference between these women, in the way that they smiled and self-presented in these interviews, was just how much they smiled. So, when they were asked about whether they would wear a bra, they didn't have what, at the time, in this study, we described as the Duchenne smile – the smile when you smile with your whole mouth and even your cheeks. With the God question, they were more likely to actually smile – out of surprise, out of shock, somewhat, you know, uncomfortable amusement – but they answered. And so that was something that I think a lot of people wouldn't have predicted at the time: given the situation in which women were in this position of very little power – they needed the job – many people thought that women would refuse to answer the question. And the experimenters even gave them an out; they gave them a chance to complain about the interviewer afterward, and almost no one did. I just thought, you know, I want to be one of the scientists who is working on questions of such macro-political importance, but boiling it down to those micro-level moments, day by day, where you don't realize how power really rears its head. It got me hooked, and later on, when I was a graduate student, I started working on questions of social norms and how people could – surprisingly, you wouldn't predict it – participate in things like mass violence or discrimination or hate crime, given the right circumstances, given the right kind of leadership or peer pressure, etc. So, that's how I got into this.
Toni Dechario 05:07
I love the way that you describe these kinds of big macro issues and these issues of such import that feel really overwhelming. And the ability to break that down into tangible, bite-size learnings is so important. So, one of the central questions that we're trying to answer with this podcast is, why do otherwise ethical people do unethical things?
Betsy Paluck 05:32
Why do otherwise ethical people do unethical things? I think that one of the big ideas of social psychology is that it's kind of hard to just label someone “an ethical person.” It's hard to label oneself anything that will endure situation-to-situation for eternity. We definitely like to think of ourselves as ethical, and many of us try very hard. And being ethical is a top value, in all of these different situations in our lives. But I think that the big idea is that there are individual differences, but also there are these situational, contextual pressures on us. And so, it's better to ask about ethical and unethical cultures or ethical and unethical situations. And it's not to say that we have no free will or possibility to assert our values in these various situations, but social psychology really recognizes that sometimes we can even think of ourselves as acting in an ethical way. But the water that we're swimming in is directing us, in certain types of ways that we now come to think of as ethical, but we wouldn't have judged it in that way to start. It's very hard to overcome the perspectives afforded by your surroundings. I would say that we would point to contexts and not always people, in terms of, “Why would an ethical person do this?” It's more like, “Why wouldn't you do this in an unethical, fundamentally unethical situation?”
Toni Dechario 07:02
Explain what you mean by, “Why wouldn't you make the unethical choice if the context around you is encouraging it?” Is that what you mean?
Betsy Paluck 07:09
I think that we can start from the premise that most humans are striving to see themselves and to be seen as upstanding people, as ethical people, as people who can be counted on, etc. This is part of our drive for self-esteem and so forth – not in terms of just feeling so good about oneself, but the true meaning of esteem, which is to think of oneself in a positive light. I think that, you know, certain circumstances define behaviors for us. This is one way in which circumstances can change the way in which we behave even when we're striving to be ethical. So, certain behaviors that perhaps outside of the situation we might label as unethical start to seem ethical – perhaps because most people are doing them, and perhaps because, along with that, they are justified using different standards than we would normally use. And so, all of these kinds of ideas – framing, how certain behaviors are framed; norms, how many other people do them or think them to be either typical or desirable; and then personal accountability and responsibility – I think you see them in the sort of post-mortem breakdowns, when people are trying to explain why they've done something unethical. I think that we're really good at seeing them in 20/20 vision, right, that we're looking back and saying, “Well, these are all the things that I realized pushed me to do these things.” We're less good at recognizing them in the moment, right? You'd really have to, you know, be bicultural – you'd have to be a fish that swims in two seas – and very few of us have that privilege; we're immersed where we are. And so, maybe we can see it in retrospect, when someone comes to punish us or to question us. But it's very hard to detect that in the moment. That's the compelling thing about being a human: we get our reality from those around us, from the way issues are framed, from the way that we're led, etc.
Toni Dechario 09:11
One of the things you talked about was framing. The thing that I think about often in this context is when I first learned the phrase “normalization of deviance.” It was actually reading a study of the Challenger disaster by Diane Vaughan on the normalization of deviance. And there was a problem with an O-ring. And so, there were people raising this issue, saying, “This is a problem.” But those who had lots of experience, who had been there before, were saying, “Yeah, yeah, but we're on a tight deadline. We've got a lot of pressure on us. And this has never been a problem before.” And so, this is kind of… how do you help people gain the confidence and the clarity to say, “No, this isn't okay,” you know, to push the topic?
Betsy Paluck 09:58
So, dissent is a really great example. It's a topic that I've studied a lot in the past, in part, because I've worked in countries where there's been a genocide, and one of commentators’ favorite things to say about countries that have had a genocide is that they must be a very authoritarian context with very little dissent, people must just be ready to follow orders, that must be somehow in their bloodstream. I don't think that's the case. I do think that dissent can be something that is cultural, however, that is encouraged, and that's framed, as we're talking about, as a named practice, a recognized cultural tool. And I think that two things have to go together. So, one would be this kind of framing or naming of the behavior. So, in this case, it would be dissent or speaking up in a meeting. I think that what has to happen is dissent itself has to be named as an action, as a behavior, and almost branded in a way so that when you're doing it, you can say, “Well, I'm going to do this now.” It has to have a positive connotation. A lot of people speak up in meetings, I think, when they're worried about disagreeing with people to say, “Well, what about this devil's advocate position, right?” So, people sometimes frame it this way to maybe soften the blow or, I'll say to my graduate student, “Well, what if a reviewer were to say this to you in your paper?” So, they frame it in ways that make it more socially acceptable to speak out in disagreement or to critique. But the other thing that has to happen is that they have to see that it is acceptable by watching others do it. 
This is really the critical tipping point issue for many cultures, for many organizations, is that not only do you have to name it as a desirable practice, but you have to show people who are doing their social reality testing all the time, they're looking around others and saying, “Okay, you say that dissent is desirable, but is anyone else doing it?” You need to seed it – S-E-E-D – seed it within the folks at your organization or in a society, so that you can both name it as a desirable cultural practice – and believe it. Not just, you know, declare it so, but actually look around and see that other people recognize that you're doing this thing that has been encouraged, and that they're doing that themselves. And lots of people have different ideas about how to spread these ideas through an organization. Some people really argue that you have to focus on the ground up, that you have to look at a social network and say, “Get the people who are, you know, sort of most surprising to adopt these behaviors first, maybe they're not really the highest places in the hierarchy.” Because as the work of Jen Daniels and Dale Miller would show, they would say, “It's always really informative to look at people who are actually at the bottom of the totem pole, look at what their behaviors are, because we just assume that they're following the norms, the regulations, right? They're not like improvising here, they don't have enough power to do so.” So, if you look and see what people who are in the lower strata of this hierarchy, what they're doing, you sort of think, “Well, if they're dressing up, then it's probably a dressier place.” Other people have ideas that it should simultaneously come from the top, that leadership should name the practice and that opinion leaders should be adopting it so that those who get the most attention in an organization are the ones who are spreading it. And we've done some work on that, showing that that can successfully spread norms. 
So that's the idea, is that you have to try to make it cultural. And starting to think about where you encourage this practice is part of what I do in my own work, is thinking about networks and how to do a little bit of social engineering with them to make things seem more cultural.
Toni Dechario 13:42
So, you said that in cultures that go through that experience, genocide, there's always been this assumption that there's not dissent. But that's actually not true. There is. And people are speaking up. It would seem to me as if maybe they're not being heard, or at least not being effective in their dissent. And so what makes dissent effective?
Betsy Paluck 14:07
Well, what makes dissent effective? Sometimes all it takes is a minority opinion. Sometimes that's a very powerful thing in a group. And there's been a long tradition of work on group discussions in psychology that shows that a single person or, you know, a pair of people who are the minority of a group – their dissenting voices can really change, if not the nature of the decision, the degree of the decision, and it really can shape group discussion in very helpful ways. You know, things can also be really overtly structural, first, that we say, “Well, we're going to red-team this idea.” So that's a militaristic idea of, “We have this official, actual group of people that comes in to question a decision that's been reached.” And so, I think, having these sort of meso-level structures – groups of people who are formed with the express purpose and the license to do this – is another way to think about this until this gets into the water. I think also, then, we need to be very careful about the way power is distributed, too. I mean, if we want to go back up to the macro level and to politics – well, dissenters are often silenced, because they're simply crushed by political power, machinations. And so, even at the organizational level, one must be very careful that the red team has sufficient power, that it isn't the case that this is done just for performance, right? Or even just to help the original team justify their views, just to flush out all of the objections and come up with ways to argue against the dissent – but that there's a balance of power in terms of the people who are discussing diverging viewpoints.
Toni Dechario 15:56
So, going back to “the water that you're swimming in” – do you think that there is something about finance, potentially, or banking? Is there something about the water that influences the ethicality of decisions, in your view?
Betsy Paluck 16:16
I have to say that one thing that psychologists aren't very good at is history. We're trained to look at the world around us right now, and the constraints on us right now as we're making a decision. We're not alone in this, though. I think that, you know, everybody forgets history. And that's part of what makes culture: just this built-up, routinized way of doing particular things. And there might be something about the history of the sector that could contribute to certain types of decision-making – maybe individualized, concentrated in certain pockets of power, whose voice counts in these sectors. So that's something I'd like to point to as a weakness in my analysis, but one that I'd want to attend to, from a psychological standpoint. I think, also, though, you need to think about the structure of an organization. I think this is something maybe we're better suited to do, is to say, “Who in this moment is making most of the decisions? Who was consulted? Are there consultation processes that are put in place?” And psychologists, we think a lot about voice and identity, right? And so, would this seem as ethical to a decision-maker if they were making the decision in front of this group, in front of that group? “Whose identities are considered as different decisions are made?” I am not an expert on the financial sector. And so, I think that those are the questions that I would ask. I mean, part of the reason that I was attracted to social psychology is that – to harken back to this earlier example I was giving of women being sexually harassed in an interview – if you ask most women whether they would speak up if they were sexually harassed in an interview, even back then, in the 1990s, when this experiment was being done, I think they would say, “Well, at some point, I would, maybe. Not right exactly in that moment.” But they didn't, you know? We're not always good at predicting our behavior.
And so, our actual behaviors don't align with our projections, our values. And so that's the point, though, that it's helpful to ask these questions about ourselves, rather than, you know, try to make judgments about whether this or that sector is particularly primed for unethical decisions.
Toni Dechario 18:24
I want to get back to what you were talking about in terms of our ability to predict our own behavior and identity, and what our visions are of ourselves. And get a sense from you: how much do you think that day-to-day decisions at work are driven by a particular identity versus social norms and others' expectations of us, as manifested through culture?
Betsy Paluck 18:56
What's more important, your personal identity or the norms, the cultural norms, in terms of what's pulling on you to influence your behavior? I would say that it's just impossible to pull those two apart, because norms belong to certain identities. And that's what's tricky about norms: you could have a norm at your company about compassion or care for the climate or ethics, but if you don't feel that that norm applies to you, right? We really think that norms have these little identity tags with them, you know? So, this group of people thinks that this is really important, “But I'm not one of those types of people,” right? So, the trick is to really try to communicate a norm, suffuse it throughout the entire organization, so that, you know, the norm is about belonging to the company or working at this institution – that really holistic identity – rather than the social justice warriors in this branch of the company or, you know, the younger people who care about this stuff, or the old-timers who are really interested in moving in this direction, right? And so, the real trick of a cultural engineer who wants to change the patterns of behavior in a multicultural, multi-identity space is to work on creating that shared identity. And then you have the grounds on which to build norms about that identity. If people don't feel like they belong to your organization, then you've lost your ability to really reach everyone in that powerful way. I still think there's a way to build shared norms, even when there isn't a very strong common identity, but that's harder. That means that you have to try to build the norm in each of the groups, right? And we've tried doing things like that in middle schools, which are notoriously sort of defined by different grades, and different cliques within grades defined by race and class and interests.
So, you know, working with the people who are very important to each of those identity groups, we were trying to change conflict levels and norms about conflict at the overall school level. And we had some success. But I'd say that's a really uphill challenge. If you have the capacity to do it, really working on that shared identity, making people feel like they belong to some common home with a shared identity, that's a really strong basis to work off of.
Toni Dechario 21:27
So, you said that you had some success in trying to create some shared identities, can you talk about what you think contributed to that success?
Betsy Paluck 21:36
I think that the success that we had, Toni, in this particular case, working in middle schools, was actually not in creating a shared identity. And identity is especially key at that developmental age – that's when you're actually trying to establish an identity, you know, as separate from your parents; as you're going out into the world, you're trying to find your group. So, we actually didn't try to create a shared identity at these middle schools. What we tried to do instead was to work with all of the different groups, and choose people in those different groups, networks, who were important to them, and work with those folks as opinion leaders to try to change everybody's ideas simultaneously of what was desirable at that school. For that reason, it was really critical for us to work with people who are normally not picked for opinion-influencer positions, because they were coming from groups that don't even have high status within the school. So, I think about this with respect to organizations quite a bit as well, because you don't have to be a 14-year-old to be entering an organization and trying to establish yourself, and establish identity, and distinguish yourself from others. It's not always the case that you come to an organization and try to be like everyone else; that just doesn't make sense, career-wise, right? So I think that, you know, the middle school comparison fits uncomfortably well for a lot of adults, actually, even at this very different developmental stage. And so, sometimes it could be really difficult to build a shared identity. And instead, it really makes sense to try to understand an organization very well, and who the different identity groups are there. And to address the norm that you want to build – of dissent, of ethics, of fair treatment, of compassion, whatever the case may be – within each of those groups, and with the people in those groups that matter to those different identity groups.
Toni Dechario 23:20
When I think about the organizations that we are thinking about, they range in size, but their cultures are enormous, and have a ton of those subcultures that you talked about. And it sounds like from your description, there's really no way to skip the hard work of trying to understand them one by one.
Betsy Paluck 23:44
I think that, like so many things, thinking locally is often the most effective, right? And so, I would say that that applies to identities and norms as well. Because the people who you really care about are those who you go to work with and see every day. And the feedback that you care about is not just from a more distant supervisor, but from your peers who you're working with. This is the other hard thing: when you want to try to, you know, have leaders model some kind of behavior that you hope to diffuse throughout a large network, a large place, sometimes the examples of that behavior don't make sense to everyone. You know, there are many people in an organization who actually aren't even in a decision-making position. They're supporting those decisions. So, what kind of information do they send up to the decision-maker? What kinds of objections might they raise? What cautions would they attach to certain bits of information that they're sending along to decision-makers? So, if people have different roles in the process, then also think about the role that they play, and what ethical behavior looks like for them.
Toni Dechario 24:52
Do you have thoughts about how the pandemic has potentially influenced our decision-making behaviors?
Betsy Paluck 24:59
I have to say, like everybody else, one reason why I'm struggling with this question is that I just don't know what other people are thinking right now. We've been isolated from one another. We haven't had time to check in and to form impressions of what others are doing. I'm not sure. I haven't done research on it. And so, it's hard to make generalizations. I think that this is a time of such unsettledness. There's a sociologist, Ann Swidler, who talks about settled and unsettled times. Settled times are those times in between disasters, you know, pandemics. She actually studied the AIDS epidemic, which was the unsettled time, when everyone's trying to figure out what to do again, how to think again. I mean, psychologists talk a lot about short-term versus long-term thinking. I think that folks aren't even sure what kind of thinking to do right now. And so, this is where I think guidance goes a very long way. Some more centralized guidance, or feedback from peers, goes a very long way toward forming up and firming up our impressions of how to behave going forward.
Toni Dechario 26:07
That's interesting – it is actually probably a good time to try and establish norms, if ever there were a time to step into the space. You gave some recommendations about if you're an organization that's trying to effect better decision-making, more ethical decision-making. One of the things you mentioned that you might think about is really trying to engage with and understand individual subcultures within an organization and what they value. Do you have any other recommendations for an organization looking to imprint positive decision-making?
Betsy Paluck 26:45
I would want to know more about whether the folks who you're working with – your colleagues, your reports – what are they trying to do? Are they trying to make better decisions? Is this a goal of theirs? As a psychologist, you always want to know which way the tide is going, right? Are you working with or against the tide? If you're working with the tide, if you feel that people are already trying to do better, and they have this motivation to do better, and something's in their way, then it's a detective story about what the obstacles are to them making better decisions, more ethical decisions, and/or broader thinking, etc. And that is more likely to be successful. When the tide is against you, when they don't share your goal, that's a lot more difficult. Then you need different tools. Then it's not about removing obstacles from their path. It's not about the famed behavioral science advice of make it easy, make it available. You can make arsenic easily available to me; I'm not going to drink it, and I don't want to drink it. Then you want to try to pair their motivation to try something new with something that they're already very motivated to do. I'll use this example – which was made into a great example of norms by Debbie Prentice – of trying to get men in the American South to stop dueling, back around the time of the inception of the United States. They were dueling all the time, as many people remember from Hamilton. And it was against the law, but it happened all the time. And men were not refusing a duel because it would be against their upper-class, gentlemanly honor to do so. So how do you fight against that? I mean, we've been asking this for a very long time now: "How do you fight the tide of masculinity?" Right? Well, it turns out that the most successful project for reducing dueling was to disqualify men who dueled from running for office.
And what was so compelling about that was that these men were in positions where running for office was also considered a masculine duty for men of their class. And so, it gave them not just motivation to not duel, but a face-saving out, to say, "I cannot duel; I owe a duty to my county, to my state, etc., to represent them." So, you know, I just offer this up as a metaphor. It's not exactly the same as carrots, right? It's not saying, "We'll pay you $100 to come get vaccinated." I think it's more psychologically sensitive. It says, "Here's something that's very important to your identity, perhaps in some ways important to the same identity that's driving you to do this other thing. So, can we let these two compete? And can we give you an out?" Now, the important thing about this case of dueling is that you could die if you duel, so probably a lot of men did want to get out of dueling. It just shows you the difficulty, right? That if people really want to do what they're currently doing and you're trying to stop them, well, psychology – these psychological manipulations – might not always be the ultimate answer in that case. So, trying to figure out how to frame and introduce and enforce new regulations – maybe psychology will work with you on that. But this is my basic advice: to first go in and ask, "Which way is the tide running? Are they going with you? Are they going against you?" And so, it's a different conversation. And it's a very hard hill to climb if they don't want to do this. I do think that if you find situations where you can characterize it as, you know, in their heart of hearts they wish they didn't have to do this to have high status, you've found a moldable feature of your organization. That's good news.
Toni Dechario 26:45
I want to share with you something that one of our other interviewees talked about, because I think you touched on this a little bit, and I'm curious about whether you have thoughts on it. One of the academics we talked to is a woman named Taya Cohen. She's at Carnegie Mellon, and she does a lot of work on guilt proneness – and how much guilt proneness leads to good decisions, ultimately, is what she finds. But another thing that she studies is game mindsets. So, she says that a lot of what can lead to unethical behavior is when you're in a context where you're playing a role. And so, you're not bringing your own kind of moral awareness or reasoning to that game, because that's not one of the rules of that game. You kind of touched upon that, and I'm curious about whether you have any reactions to that framing of why people might make bad decisions.
Betsy Paluck 31:29
It sounds to me like what you're describing is one way of making people feel more accountable: reminding them that they are themselves within this role. There's a lot of interest among behavioral scientists in general, I think, in trying to get people to slow down, to stop and reconsider from different angles, rather than rotely follow the rules of the role that they're in, of the group that they're in, and so forth. I think all of these are really interesting tactics. I'm really interested in what happens when you can do that with a whole group, though, because I think that what a lot of these kinds of tactics rely on is the individual decision-maker. And I just, from my perspective, I can't emphasize enough the disempowerment of the lonely decision-maker. You know, I want to get away from this individualization of the problem. I think that roles are roles because they're in relation to others. And so, you feel an obligation to carry out this role vis-à-vis others. And so, maybe we can provoke you to feel more yourself within this role and critically evaluate, "What would I really like to do?" as opposed to, "What is this role asking me to do?" But I think that it's really easy for the role to win out, because it's ultimately obligations to other people. And so, a lot of these kinds of fast-thinking-versus-slow-thinking, reconsider-from-a-different-position tactics – I'd love to see these be more social as interventions. So, getting a team together to think, "How do all of your roles contribute to these kinds of outcomes? How do your interrelationships with one another, your obligations to one another, end up in you deciding the same kinds of things every time, things that are maybe suboptimal for the following reasons?" I'd be really interested in those kinds of interventions. So, basically, making things more social.
Toni Dechario 33:19
My last question is whether there are any resources on any of the topics we've talked about today, that you'd recommend to listeners?
Betsy Paluck 33:27
I think that, you know, one resource that a lot of your listeners may be paying attention to, if they're interested in decision-making and psychological and economics views on it, is the new edition of Nudge that was just published. It's a look back at all of that work, a reconsideration of what's worked well, what really holds up over time, and how to think about this. So, that would be good summer reading, I think, for everyone interested in this. I would hasten to say that there's this middle level that I just want to keep encouraging everyone to think about. On one hand, you've got behavioral scientists talking about individuals and how individual people decide. On the other hand, they talk about, "Well, here's choice architecture, and how to design policies at your organization that affect everybody." I think somewhere in the middle, we have to think about groups of decision-makers, their relationships to each other, and their identities. I'll recommend a little paper that I wrote with my student, published this past year, which is sort of a nice takeoff on the Nudge book. We were trying to think about how to recognize culture and identities when you design nudges. We were working in a Chinese factory, trying to help them with this problem of how to get people to change their behavior on the floor of the factory – how to get them to direct their waste in a different way. My student came up with the most ingenious nudge, which was to paste these religious symbols on the floor – Buddhist symbols that the workers were told had been blessed by monks.
And there was a really sharp drop-off in people dropping some of this waste on the floor, because that had such meaning for people – they understood, "This is a different space now." My student is herself Chinese and worked with the monks to have these blessed and to introduce them to the factory workers. It was a very simple way to take into account some of the things that I've been talking about.
Toni Dechario 35:26
That's fascinating. Okay, we have one kind of quirky question. Betsy, is there anything else that you are reading, watching, or listening to that you can't get enough of, that you want to tell everybody you know about and share with us?
Betsy Paluck 35:40
In my spare time, I'm actually reading a lot of narrative nonfiction to try to redesign this policy class, but I'm reading in that way where you can't even let yourself get completely engrossed and excited. I'm reading with a purpose, frantically, in small bits. I don't recommend it to anyone.
Toni Dechario 35:59
For more conversations like this, as well as publications and other resources related to banking culture reform, please visit our website at newyorkfed.org/governance-and-culture-reform.
Mikael Down 00:01
We say culture is important, culture matters – well, how?
Zab Johnson 00:05
We're powered by our brains. We're not always rational. We are influenced by others.
David Grosse 00:10
I think I'd start with looking at the what's called the “Better-than-average Effect.”
Holly Ridings 00:16
So, one of the things we look for in picking a flight director is what we call command presence, right? You kind of just look at people like, "You better get it together," you know, "We've got stuff to do," right?
Mark Roe 00:25
Good practice would be for a leader to say, “Well, this is a mistake I made recently, and this is what I did about it.”
Betsy Paluck 00:30
I can't emphasize enough the disempowerment of the lonely decision-maker.
Taya Cohen 00:35
So, the opposite of what we think of as an ethical decision frame, we can think of as a game frame.
Mark Mortensen 00:40
It doesn't make you unethical to wrestle with this. The smart money is we help you to try to figure this out.
JB 00:48
This is “Bank Notes: Banking Culture Reform.” The views expressed in this podcast do not represent those of the New York Fed, or the Federal Reserve System.
Toni Dechario 00:57
Hi, and welcome to the Banking Culture Reform podcast, part of the New York Fed's initiative to drive awareness and change in financial services culture. This series, "Norms, Mindsets, and Decision-Making," explores questions like, "Why do ethical people do unethical things? How can organizations encourage staff to speak up and executives to listen up? And what role does context play in shaping people's behavior?" My name is Toni Dechario, and I'm a member of the New York Fed's culture team. This episode's conversation is with Mark Roe. Mark is the head of risk culture at the Australian Prudential Regulation Authority, otherwise known as APRA. With a background in criminology, Mark brings a unique perspective to the supervision of governance, behavior, and culture. Today, Mark will share his perspective on why risk culture matters to supervisors and how organizations can impact decision-making among staff. Thanks for joining us today, Mark. Welcome.
Mark Roe 01:48
Hi, Toni. Good to see you.
Toni Dechario 01:50
So, we're focused primarily on decision-making in this podcast. But we want to kind of take a step back first and think about culture more broadly. So, I wanted to start off by asking you why you're interested in culture and why should others care about this topic?
Mark Roe 02:08
I mean, I've been interested in why people behave the way they do pretty much all my life. As a kid, I read lots of crime novels, and nonfiction around crime as well. My background is criminology – I ended up studying criminology back in the 2000s. So, it's always been a fascination of mine, why people behave the way they do. Behavioral science has become really front of mind now; it's a popular discipline. Why should we care? I can't think of really any major issue which has not had a cultural element to it. So, that's why I care. It's also really important to go deeper, to think about what we call risk mindsets – the attitudes toward risk, why people behave in that way, what it is about the culture and the environment that's making people behave in a certain way. And by doing that we can go much deeper and try to fix that cultural issue, which should give us a better outcome.
Toni Dechario 03:05
You mentioned a fascination with why people deviate from kind of the norm, from normal behavior. And so, I want to ask you why that is? Why do people make unethical decisions? What's driving that?
Mark Roe 03:18
It's a really big question. I think most people get out of bed in the morning and want to do the right thing. Where things can go wrong is a number of factors, actually. There's probably not one reason why people will deviate from that. And that's why we at APRA have established the risk culture 10 dimensions. We think there's a whole range of dimensions that we should look at when we're assessing culture, or as we do, risk culture. How are we incentivizing staff? We've had a Royal Commission in Australia, which has shown bankers, for example, charging fees for no service, charging fees to dead people, mis-selling products. There are examples of that across the globe – you know, the Wells Fargo "Eight is great." Unfortunately, it wasn't great for the customers involved, or ultimately for the staff who lost their jobs. And the reason for that, you know, that's an incentives piece. So, we will look at how incentives are aligned to risk management, for example. Tone from the top, leadership – all of those things are important as well, if you're not driving the right culture at the top. I think, as well, we shouldn't forget about context. That's a really important part of culture – coming into work, working with a particular team. That's where you get your cues from: your manager, your immediate team. If the tone from the top is not aligning with the voice of the middle or the echo from the bottom, that's an issue in itself as well.
Toni Dechario 04:37
What do you think about the relative weighting of individual motivations, versus context and culture in driving decisions?
Mark Roe 04:48
I do think context is really important. So, for example, a very well-used example is people driving on the motorway. You know what the speed limit is, because the signs are up regularly, and yet, as you look around you, you'll see people going much faster than that speed limit, and you go at that speed or even faster – "Why shouldn't I?" So, even though you might feel that you have those personal values, that you're always going to obey the law and obey the regulations, that may not actually be the case. So, I think it's a difficult one to answer. I think we all like to think we will follow the rules. Where individual decision-making is also interesting is the cognitive biases – we tend to try and think quickly. Daniel Kahneman has spoken and written about that as well – Thinking, Fast and Slow – making those speedy decisions using heuristics, or shortcuts. So, there are challenges on both sides, whether you're making a decision based on aligning to the culture or making your own individual decision. Ultimately, I think it's difficult to call what the actual percentages look like.
Toni Dechario 05:52
Do you have thoughts on what organizations can do to create the conditions to allow people to think slowly, and maybe not necessarily rely on those heuristics all the time, in cases where an organization wouldn't want them to? Or kind of put more simply, do you have any recommendations for how organizations can influence people's thinking and people's decision-making?
Mark Roe 06:17
It's important for leaders to really ensure their staff understand, for example, what the risk appetite of the organization is and how it applies to their role. So, I do think it's about staff being aware of what the purpose and values of the organization are. What is the risk appetite? What does it mean for them? But even more importantly, how do they fit within the end-to-end processes the organization is undertaking on a day-to-day basis? You might just be involved in a small part of that process. If you don't really understand why you're following a process or control, then you can quite easily try and go around it, or not really take it as seriously as you should. And I think it's about communicating that in an interesting way. Behaviorally, it's important to design training and to communicate with staff in a really real way. Put yourself in their shoes and actually explain risk, and explain why it's important to follow the controls from an end-to-end perspective – you're all in this together. And I think it's important to have those discussions within your team and say, "Well, you know, what would you do in this situation?" Or even better, as part of team meetings, I think a good practice would be for a leader to say, "Well, this is a mistake I made recently, and this is what I did about it. And this is how I learned from it." Because it really opens up people feeling able to speak up about mistakes they've made, raising issues in a good environment like that. And you can all learn from that as well, because we all do make mistakes.
Toni Dechario 07:43
So many communications you can receive and kind of ignore as corporate speak, or as just, you know, the next compliance training that you need to get through in order to get to the real work. And another thing that you mentioned is the importance of team leaders – not necessarily just the people at the top, but individual teams having conversations about maybe ethical gray areas that they've encountered. Combining those two things, how do you get your middle managers to feel like this is relevant, something they want to devote their time to, these conversations with their teams?
Mark Roe 08:21
It's really about helping to equip them to have those conversations with their teams around, "What is ethical decision-making? How do you do it?" There was a really interesting article, I think, in the Financial Conduct Authority's essays, where middle managers were called out as being that really important buffer between the top and the bottom of the organization. So, we talk about "speak up" a lot – "Speak up about incidents that go wrong" – but it's also really about the "listen up and act," because part of that reaction is, does the person who's made that mistake feel terrible? And, whilst that's being dealt with, will it put them off raising incidents in the future? So, I think it's really about talking to middle managers, about understanding the pressures they're under. Looking at that as well – you know, how much time do they have to do their jobs as well as manage a team? And also giving them some tips and some guidance on how they might react in a better way, or think about how that's playing out within the team. Because often it's not black and white. There are a lot of gray areas, and there are some tools out there which middle managers can use to talk through cases: "What would people within the team do in that situation?" And there may not be a right or wrong answer, really; there could be, within that gray area, a number of options to take. I think it's about really empowering middle managers to be able to have that conversation with their team, and for the team to feel able to do that as well, as part of team meetings – as I say, talking about mistakes that have happened, from the leader's perspective, in an open way, so that mistakes are seen as something to learn from rather than something to blame people for.
Toni Dechario 09:53
You've talked about kind of creating an environment of psychological safety, without using those words – creating a "speak up" environment. How do you know whether those environments have been created, where people can speak up and there's a "listen up and act" attitude among leadership?
Mark Roe 10:13
We actually do survey entities directly. So, we survey staff within entities directly, and some of the questions will be around, you know, how easy is it to speak up? So that is one data point; we recognize there are other things that you can do as well, but that's certainly one area you can look at. If you're not doing that, you can ask for the entity's own survey results, get the full report, and just see how those staff have answered. We're looking at metrics at the moment as well that we can take. Because we also go in and do deep-dive reviews that include interviews and focus groups with staff within entities, and that's something we're promoting for the entities to do themselves, and to provide that information back to us. You can glean a lot of really strong, robust information from those processes – as well as, obviously, whistleblowing, which is right at the other end of the spectrum.
Toni Dechario 11:06
Have you seen any examples of organizations that have recognized issues with their culture, perhaps because you've told them, or perhaps because they've done the kind of legwork that you've just described themselves?
Mark Roe 11:18
We are working on that. It's a fairly long journey to change culture or risk culture. At APRA, for example, the risk culture 10 dimensions are becoming more known, and we're going to be publicizing them even more – you know, what does that look like? What do we look for as part of that? Already, a lot of entities are asking those kinds of questions – "How easy is it to speak up? Do people feel safe to speak up?" – and we can get that information back from them. We obviously did a huge inquiry into the Commonwealth Bank of Australia, which had a huge impact on the industry when that report was published in 2018. And again, we've seen a lot of entities do more work in the culture, governance, accountability, and remuneration space since that point as well. So, I think it's about being transparent with entities about what our expectations are, and the more that we can do to get the message out, but also to monitor what's happening within entities, the better. So, we will also go in and review what organizations are doing in relation to culture and risk culture on a fairly regular basis. We've trained our supervisors in the risk culture 10 dimensions framework, so it gives them something tangible, a consistent framework to really tie into what they're seeing when they liaise with those entities, because sometimes culture can feel quite ephemeral. It can feel quite intangible to people.
Toni Dechario 12:40
So, you've mentioned the 10 dimensions a number of times, and you've touched on a couple of them, you talked about incentives, you talked about leadership and tone from the top. Maybe you could just share with us what are the 10 dimensions of risk culture, according to APRA.
Mark Roe 12:52
I mean, I'll give you some of the key ones around the behavioral side. One of the 10 dimensions is communication and escalations. And that's what we've just been talking about around psychological safety: "How easy is it for staff to really feel able to speak up? And how is that escalation taken on board when it happens?" So again, the "listen up and act" piece as well. There is another one called decision-making and challenge: having diversity of thought as part of your decision-making processes, and also welcoming constructive challenge. It's really important for managers and leaders to be able to take challenge and be open to it. Again, that aligns a little bit with learning from mistakes as well. There's also leadership – really important. That tone from the top is very important, and also the authenticity of that tone from the top. Leaders might say the right thing, but if their actions do not align with that, it's not authentic, and staff will pick up on that very quickly. We also have a risk culture and board oversight piece that is important for us as a regulator, because we want to see how entities are really assessing culture and risk culture within the organization, and how the board is overseeing it. We have a requirement in Australia for the board to form a view of the risk culture of their organization, take action when necessary, and also drive toward a desired risk culture. So, we're really interested in seeing how they're doing that. So again, it's back to the entities to do this work. We have a framework and the tools, but it's really up to them to drive it. Another one I'll mention is risk capabilities: the ability for staff to really have the training, to have the skills in relation to risk management as well. Responsibility and accountability is another one.
Toni Dechario 14:37
What's the response been, you know, among the banks that you supervise to the introduction of these 10 dimensions and to having examinations for risk culture?
Mark Roe 14:45
It's been interesting, actually, because we sometimes get questioned around, you know, if a regulator is sending a survey to an entity, then how open are the staff going to be in responding to that? The survey is part of our deep-dive piece. But we also have a new project called the risk culture industry-wide survey, where we're looking to send a survey to up to 70 entities across Australia to get a benchmark of risk culture across banking, superannuation, and insurance. The response so far has actually been, on the whole, quite positive. From our perspective, we have a full range of supervisory options, from the kind of carrot – which is really holding a mirror up to them, doing a deep-dive review, saying, "These are some issues that you need to resolve," and then getting on with it – all the way through to enforceable undertakings, or additional capital requirements that they need to hold, on the harsher end of the scale. So, they're aware of that full range of tools and enforcement tools, which I think is important. I think it's always important to have carrots as well as sticks. If we're approaching some of our work by getting them to improve themselves – for example, we'll go and do a deep dive, and we'll give them an assessment aligned to the risk culture 10 dimensions – then we can really show what's working well, because it's really important to show the strengths, but also the areas of opportunity to develop further. That gives them something to take away and to really action going forward. We've had quite a lot of experiences as well of entities seeing the value of having that, in a way, independent assessment undertaken by a regulator, which hopefully they take in the spirit of, "You need to improve." And it's important that you do improve, and we do monitor improvement following a deep dive to make sure that those actions have been completed as well.
Toni Dechario 16:33
I wonder if you think that there is anything about the financial services industry that might influence people's decision-making behaviors in a particular way?
Mark Roe 16:45
It's a really interesting question. I think it's difficult to answer. What I would say is, there have definitely been plenty of examples where things have gone wrong. In Australia, we had the Royal Commission in 2018, which found a number of issues: fees for no service, dead customers being charged fees, mis-selling of products. So, there's definitely more work to be done. Customers have, in some cases, lost their life savings or their homes; they've been mis-sold products. It's very important following that Royal Commission that we look at culture, governance, remuneration, and accountability, and look to improve them. Having said that, I do think financial services are probably more under the spotlight than many other sectors, arguably. There have been incidents, as we know, across a number of other industries: Volkswagen, with their emissions issues; Boeing, with their recent issue; Rio Tinto in Australia, a mining company, also had some issues recently, blowing up a particular part of Aboriginal land. So, there are examples across all industry types, but that doesn't get away from the fact that there are areas we do need to improve within financial services. And again, it comes back to what we're looking at from the 10 dimensions point of view: incentives, remuneration, how governance is working. "What are those governance structures? How accountable are people within the entities as well?"
Toni Dechario 18:12
Do you have any sense of whether the pandemic has been a major influence on decision-making?
Mark Roe 18:19
I mean, obviously, it has, because decisions have had to be made fairly rapidly in an uncertain environment. Often people would want more information at hand before making decisions; they've had to do that fairly rapidly. Again, coming back to the cultural cues that people get from working within a team – how have they changed? You know, how do you communicate with your manager? How are you supervised now, when you're working from home? How do you feel aligned to the team, or even to the organizational purpose and values, and to that risk appetite? All of these things will impact decision-making. And it's certainly something that we need to look at as regulators as well, to make sure we're on top of it and to assess the longer-term impacts over time.
Toni Dechario 19:01
So, looking forward, what recommendations would you have for organizations that are looking to improve their decision-making among staff?
Mark Roe 19:09
I think it's about having the conversation, first of all, and actually identifying that. Because at an individual level, you'll be used to using heuristics, or cognitive biases, and it's important to be aware of that at an individual level. In the workplace, when you're having those kinds of group discussions, I think it's important to have that overlay of an ethical decision-making framework. We've probably all heard the mantra now, "Can we? Should we?" Often in the past, we may have taken more of a legalistic approach and thought, "Well, you know, there's nothing specifically written in the standard or the regulation to say we can't do it, so let's just do it." Whereas more recently, it's around the "Should we?" – you know, "Would we actually feel comfortable being on the front page of the newspaper having made this decision?" So, it's about having a frame of reference which aligns with the corporate values and purpose of the organization. But your own personal values should also, I think, have some alignment there. You should feel comfortable with the decisions that you're making as part of that organization. And I think there are frameworks that you can use and talk to teams about in order to help drive that ethical decision-making. You can also look at examples – they could be theoretical case studies, where you talk through a case that may not have involved you, and say, "Well, what would you have done in this circumstance?" That can be tricky, because we always think we're going to act in a more positive way than we probably actually will. So, it's important then to also draw on things that have actually happened in the team.
And it comes back to that idea of, where mistakes may have happened within the team, discussing them in a supportive environment – which also aligns to psychological safety – and really learning from those mistakes, because we all make mistakes. That can be driven by leaders talking about the mistakes they've made, and framing that. Some of the most powerful things I've seen are leaders doing that in a town hall, or having it as part of an agenda, you know, up at the front, saying, “Well, the mistake I made in the last week was this, and this is how I identified it. This is how I escalated it. And this is how we ultimately managed it.” Or, you know, “What might the root cause of this have been?” With that openness, you're able to really get to the bottom, hopefully, of what's driven an issue. Was it human error? Was it because of a manual process? And hopefully, that will also drive improvements across the organization. So, I do think the frameworks are there. It doesn't have to be a difficult framework. It's more about aligning it to the purpose and values of the organization – the “Can we? Should we?” – and also aligning it to your personal values and talking through the options. Because often these decisions can be gray; they can sit in a gray area where there's no black and white. And this is why it can be complicated, and why it's important to discuss them more regularly as a team.
Toni Dechario 21:57
Yeah, that's great – modeling curiosity and a willingness to acknowledge weakness and mistakes. The other thing that you mentioned, which we've talked with a number of people about, is bringing your own personal moral compass to a question, and how powerful that can be, and how it doesn't always happen.
Mark Roe 22:21
It's a really important thing to think about your own personal values, and whether the organization that you're working for espouses them. I think that is important. As far as you can – we all need to make money and pay the mortgage or the rent – you need to try and find an organization where those are in alignment. And where you're feeling uncomfortable, this is where it will be helpful to be able to speak up. It is difficult with context: you can go into a team which is very different in terms of the culture. It may be a subculture of the organization, a culture that the leaders are trying to promulgate within your little team. And that can be tricky – coming back to the motorway analogy, where you're going faster than the speed limit because everybody else is doing it. This is where it gets tricky. But as much as possible, where something doesn't quite feel right and is out of alignment with your personal values, that is a big red flag to put your hand up in some way and try and call that out, and not just get used to that way of working.
Toni Dechario 23:18
So, you've mentioned a couple of books, you mentioned Daniel Kahneman. Are there other resources that you'd recommend for anyone who's interested in learning more about the topics we talked about today?
Mark Roe 23:30
I'm currently reading a book by Max Tegmark called Life 3.0. It's about artificial intelligence. And it's actually fascinating just reading the first chapter of that, which talks about where artificial intelligence could go. And the reason I mention that is, we do need to be really aware, from a cultural perspective, of developments in the longer term. Who's coding these robots? Who's coding the algorithms in the first place? We always need to keep on the ball about how we can assess culture and risk culture, and looking at areas like artificial intelligence and the ethics around them is also really important, because in a few years’ time, that could be an area of focus for regulators as well.
Toni Dechario 24:12
And there's the famous example of the robot that was let loose on social media and became outrageously racist within, like, a day or something. And I wonder how much AI can be used as a tool by management to understand culture within an organization. Right? If, indeed, the AI takes on the characteristics and the norms of that organization, then it would be an interesting case study.
Mark Roe 24:37
I think it's going to be fascinating looking at artificial intelligence in the long term. I mean, there are already companies that are looking – not necessarily at artificial intelligence in a huge way, but at how to use technology more to really gain a better understanding of culture. Whether it's looking at how people are using social media, or at informal power structures within an organization as well. It's also really fascinating work. There's the work on unobtrusive indicators of culture as well: not necessarily having to go out to an entity and ask them to fill in a metrics return, but looking at what data is available already, externally, potentially, that we might be able to get hold of. So, there's a huge amount of work that we can do beyond the frameworks, tools, and assessment techniques that we have now to really drive that going forward. But artificial intelligence is fascinating. It could go in a positive way, a negative way, or somewhere in between. It could be used for good, but it could also be used in a more negative way. And we just need to stay alive to those issues.
Toni Dechario 25:35
My last question for you is about your background in criminology, which I'm fascinated by, and how that's influenced your perspective now. You've talked about learning from other industries that are focused on culture-related issues. What do you think criminology has to teach us about culture broadly, or decision-making specifically?
Mark Roe 25:58
I think there's a lot to learn. I mean, alongside psychological theories, I also look at criminological theories and sociological theories. And I just think there are a lot of things that have been tested already in the past. So, you know, Zimbardo's Prison Experiment, again, which was a very context-driven experiment, with students taking on the role of either prisoner or officer, and the kind of abuse that then happened because of the contextual nature of that. Stanley Milgram's experiments, again, getting members of the public or students involved in an experiment to potentially electrocute other people – and people went ahead with it. They wanted to test how open people would be to obedience, those kinds of areas. So, I think it's really important that we're across that literature as well. There's something called techniques of neutralization that Sykes and Matza were looking at: when people have done something wrong, they neutralize it using these techniques. And again, with those kinds of research, I would ask people to stay alive to, or maybe read up on, some of those theories, because they're really important. And they will teach you a lot around things that we're looking at now – you know, why people might commit offenses or might deviate from the culture that we would hope to promulgate within an organization. So, there are a number of things there. Something else to mention is: is enforcement the right approach? Or is it the first approach we should use? Does it put people off if they know there's ultimately a fine, or they can go to prison? Or do people go ahead and commit the offense anyway? So, I think there's a whole range of literature there. White collar crime has been researched for many, many years. Again, it feels like quite a recent thing, but it has actually been researched for a number of years now. And there are things that we can learn in relation to white collar crime, rogue trading, those kinds of issues.
So, I find it fascinating. And I think there is, again, there's alignment there with behavioral science, with some of the psychological theories that we read about as well.
Toni Dechario 27:49
I agree. It's fascinating. Thanks, Mark. For more conversations like this, as well as publications and other resources related to banking culture reform, please visit our website at newyorkfed.org/governance-and-culture-reform.
Mikael Down 00:01
We say culture is important, culture matters. Well, how?
Zab Johnson 00:05
We're powered by our brains. We're not always rational. We are influenced by others.
David Grosse 00:10
I think I'd start with looking at the what's called the “Better-than-average Effect.”
Holly Ridings 00:16
So, one of the things we look for in picking a flight director is what we call command presence, right? You kind of just look at people like, “You better get it together,” you know, “We’ve got stuff to do,” right?
Mark Roe 00:25
Good practice would be for a leader to say, “Well, this is a mistake I made recently, and this is what I did about it.”
Betsy Paluck 00:30
I can't emphasize enough the disempowerment of the lonely decision maker.
Taya Cohen 00:35
So, the opposite of what we think of as an ethical decision frame, we can think of as a game frame.
Mark Mortensen 00:40
It doesn't make you unethical to wrestle with this. The smart money is we help you to try to figure this out.
JB 00:48
This is “Bank Notes: Banking Culture Reform.” The views expressed in this podcast do not represent those of the New York Fed, or the Federal Reserve System.
Toni Dechario 00:57
Hi, and welcome to the Banking Culture Reform podcast, part of the New York Fed’s initiative to drive awareness and change in financial services culture. This series, “Norms, Mindsets, and Decision-Making,” explores questions like, “Why do ethical people do unethical things? How can organizations encourage staff to speak up and executives to listen up? And what role does context play in shaping people's behavior?” My name is Toni Dechario, and I'm a member of the New York Fed’s culture team. Today's guest is Mikael Down. Mikael is the Executive Director for assessment, policy, and insights at the Financial Services Culture Board, an industry body that assesses culture at financial institutions in the UK and also helps to design cultural interventions at member firms. Mikael was a speaker at the New York Fed’s 2017 workshop on culture measurement, and at the New York Fed’s 2018 conference exploring progress and challenges in reforming culture and behavior in the financial services industry. In this episode, Mikael shares with us, among other things, some of the surprising ways the pandemic transformed firms' relationships with their employees. Hi, Mikael, thanks for being with us today. Before we get into our conversation, could you please explain your role and what it is you do at the FSCB?
Mikael Down 02:06
So, my name is Mikael Down and I'm Executive Director for assessment and insights at the Financial Services Culture Board in the UK. We were originally called the Banking Standards Board. We were set up in 2015 by seven of the biggest banking firms in the UK, to help raise standards of competence and behavior across the UK banking industry, with a particular focus on organizational culture. So, I joined pretty much from the start. And my role has evolved many times since then. But really, at the core of it are two things. One is providing annual cultural assessments for all of our member firms on the basis of a framework and methodology that we developed for measuring aspects of organizational culture that we believe to be important. And secondly, trying to figure out what works in terms of changing culture and raising standards of behavior and competence – designing behavioral interventions and running those as trials, developing good practice, and bringing together our member firms to learn from each other so they can share and exchange good practices as well. That's really the way that we operate, and we have been going since 2015. And the reason for the name change, drum roll, is because we have extended out across financial services – broadening our scope beyond banking to cover all of FS.
Toni Dechario 03:31
Can you talk a little bit, Mikael, about what motivated you to work on culture and what drew you to what was then the BSB?
Mikael Down 03:38
I was working for the UK government when the financial crisis happened. I was actually out in Brussels at the time, working in a diplomatic posting, and came back to the Treasury, which was my home department. Everything was pretty scary around that time, as you guys will know as well as anyone. I became involved in the early regulatory response to the crisis, focusing very much on the prudential side of things – new capital requirements, structural changes to banking – and then a number of conduct reforms as well, culminating with going across to the FCA, the Financial Conduct Authority, which was at the time a new regulator in the UK. There I developed the senior managers and certification regime, which, again, focuses very much on governance and accountability of senior leaders and material risk takers within, originally, banks, and now all financial services firms, including a designated responsibility for culture. And, taking a step back from all of this, one of the things that I observed was this kind of progression in the focus of policymakers and regulators: from safety and soundness, which was necessary after the events of the crisis, through to factors that we now consider pretty fundamental but weren't perhaps seen as quite so fundamental before – treatment of customers, conduct, duty of care towards customers, and market integrity – on to responsibility and accountability within governance frameworks, and looking at the failures that happened on that front as well, that being an important part of why the crisis happened. But there was no focus on how organizations actually functioned and operated all the way through this process.
It was actually, to their credit, the UK Parliament who really first called this out and said, “Look,” in a report that they produced, “there have been numerous failings, but actually the biggest failure really has been in professional standards and culture.” So, this idea of an industry body to help with that process of reform – not through regulatory fiat, not through compulsion, but certainly not through advocacy either – was born. I was working at the FCA at the time, and it was a tiny little part of my job to keep an eye on the review into this new body that was ongoing. And it felt very important. It felt like something that had been missing from the debates. And it was also something the regulators have a very, very strong interest in, but it's very difficult for them to compel – they can only get so far into the details. It has to be something that is owned by organizations, and specifically their boards and their executive teams. It felt incredibly important. And I was really excited to, you know, join this from the ground up and figure out – “Okay, we say culture is important, culture matters. Well, how? What sorts of aspects of this kind of huge, amorphous topic do we focus on? Where can we make the difference? And how can we track progress?” That was, I suppose, the intellectual challenge that we faced, but also, I would say, the moral imperative for the industry as well.
Toni Dechario 06:58
So much of what you talked about strikes me. When you were talking about how this can't be done by regulatory fiat, it's so different from what regulators – and frankly, bankers – are used to, because you can have all the right rules and regulations and controls. And so, I guess the first thing that I'd love to explore with you is the question of, why ethical people do unethical things? What is it that happens to bring about bad decisions from people who otherwise have perfectly good moral compasses?
Mikael Down 07:37
I mean, this is such an important question. I fundamentally believe that almost all people are fundamentally good or have the potential to be good. And what we see, and all too often have seen, within the banking industry – though it's not just the banking industry – is that when something goes wrong, it's typically the people who are directly involved in that incident who are punished. And often, that is entirely appropriate and merited, particularly when people are acting illegally, negligently, or unethically. But what about the environment in which that took place? You know, there are some things that are genuinely an anomaly, but that's often the minority. There is too much focus, and has historically been too much focus, on bad apples, and not enough focus on the barrel. So, I think this brings it back to why culture is important. Every single person has individual responsibility and accountability – to their organization, to their stakeholders, to their customers, to the regulators, whomever. And that's important. But collectively: the organization they're in, the environment they're operating in, what sorts of behaviors are championed and rewarded, explicitly or implicitly, what behavior they see other people undertake, how able they are to give voice to concerns and the extent to which those concerns are listened to and acted upon, the degree of pressure that they feel, the sense of purpose and connection that they feel with their organization and its objectives – all of these things will have a huge bearing on the way in which they behave, and there's so much research on this. We saw, during some of the scandals in the UK, upstanding citizens doing terrible things – signing forms for PPI products on behalf of their customers in order to get the commission. But these people don't go home, look at themselves in the mirror, and call themselves a bad person. Everyone needs to be a hero in their own story.
And it's very, very seldom that you see people believe they are themselves predisposed to act unethically. But we all have to recognize that we have that potential. And it's important for an organization to create an environment in which that sort of behavior is not incentivized – is actively disincentivized and discouraged – and for senior leaders to set that tone. So, I think there's such an important connection between an organization's climate, its culture, and all of our weaknesses and fallibilities to sometimes do things that we really shouldn't.
Toni Dechario 10:13
You mentioned a lot of things that influence norms and behavior. One of the things that you mentioned was escalation, and what the environment is for people to speak up, and whether they feel that they can speak up, as well as whether they feel that they can be heard. If and when they do speak up about something they see that's gone wrong, what elements do you think are necessary to create a culture in which people do feel comfortable speaking up, and what's required of management to be able to hear those concerns?
Mikael Down 10:47
This is a really, really critical part of understanding organizational culture, and it's a big part of the survey and our assessment framework, the questions that we ask. And one of the questions that we ask is whether people feel comfortable challenging a decision that's been made by their manager. And when we first asked this question, less than two-thirds of people said they felt comfortable. So, we wanted to understand why that's the case. The next time around, we asked them, “Why do you feel able to speak up, and if not, why not?” And the results that we got were surprising to us, because there were significant numbers of people who said that the reason they didn't speak up was because they were afraid of recrimination or blame. So, they attributed fear as one of the reasons, which is probably something that you would expect. We often associate speaking up with whistleblowing – people reporting egregious issues that would have severe consequences for the organization or for other people – so there's a natural element of fear in doing that. But the thing that surprised us was that an equal number of people said they didn't speak up because they felt nothing would happen. In other words, futility was the barrier, not fear. And this is really important to understand, because the intervention that you need to address one is not the same as for the other. And this tells us that in order for people to be able to speak up, psychological safety is vital. And again, there's been a huge amount of research on how to create psychologically safe cultures, but that isn't all that you need. In order to have a voice climate, you also need a listening culture – to see listening as a form of communication.
So, for example, Professor Jim McNamara at the University of Technology Sydney has done some amazing work on this, seeing listening as an integral part of communication, and particularly focusing on listening by senior leaders – who are often better at talking than they are at listening – and on listening to external stakeholders, listening to customers, and providing a feedback loop, even if that feedback is to say, “We've heard you, but we don't agree.” His research shows that even closing the loop in that way has a really positive impact on voice climate, over and above silence, which has an extremely negative effect on voice climate.
Toni Dechario 13:07
That's fascinating. I haven't heard about that research. You touched on this a little bit, and I'd love to hear more about your thoughts on the apple versus barrel question. There definitely are academics out there who focus on the individual aspect of morality, especially in ethical decision-making. Though I think the debate that you and I have heard for the most part – or the side of it that you and I have heard, for the most part – is the barrel argument: that your ethical decision-making is mostly influenced by your environment. I'm curious as to what your thoughts are on that debate. How much of decision-making is driven by individual motivations versus group norms?
Mikael Down 13:52
It's a big question. So, there's an interesting angle to this, which is looking at the role of purpose within organizations, because that's one of the things that connects the organization to the individual, right? There's been some really interesting research done on this by Nava Ashraf at the London School of Economics, suggesting that the extent to which employees see their roles as meaningful is linked to the extent to which they see them as benefiting others. This is the whole concept of what she calls altruistic capital. And we incorporated this into our survey one year, where we asked people: to what extent do you see your work benefiting others? And we ran correlations between this question and some of our core questions. What we found was that employees who said that their work had a positive impact on others, and who also felt valued by others, were more likely to say that their firm responded effectively to staff feedback, and they were more likely to say that they were treated with respect. Those who felt that their work had a negative impact on others were more likely to disagree with both of those statements. So, I think there's something really powerful in there about organizational purpose and meaning, and how to create an environment in which there is an alignment between what the organization says it's about and how people see their role within that organization. And trying to achieve that alignment is, I think, an important part of leadership and part of a positive culture.
Toni Dechario 15:26
How does having a sense of purpose impact staff’s ability to make good decisions?
Mikael Down 15:34
I think it increases levels of social capital and trust within organizations and allows decisions to be delegated. Because if everyone is effectively aiming at the same target, then you need far fewer controls in place to monitor whether subordinates are acting in the way that superiors want them to. It enables a flatter organizational structure that relies less on hierarchy. And, you know, again, there's been a huge amount of research to show that's much more effective. The interesting angle to this, though, is diversity, equality, and inclusion. Saying that there's a common organizational purpose shouldn't imply there's a single view about how to tackle difficult problems. In fact, it's absolutely vital that there's a diversity of views, and that those views are all brought to bear in taking decisions and judgments. This is particularly critical for organizations such as yourselves, where you have a public duty, but also for organizations such as banks and building societies – building societies in the UK, savings and loans in the U.S. – who have a public mission, really, because their license to operate derives from their ability to serve their customers and the needs of society. They need to be representative of the society they serve in order to do that – in order to ensure that the decisions affecting customers are being taken factoring in a broad range of perspectives. And I think that's something that has certainly, over here, and I know in the US as well, come to the fore, particularly since the events of last year. But there is a long way to go. So, when I talk about having a uniting organizational purpose, it's really important to understand that as something which enables organizations to move forward in the same direction while welcoming a diversity of views on how to do that and how to solve individual issues and problems as they arise.
Toni Dechario 17:33
You mentioned a few things that have happened during the course of the pandemic to influence how organizations expect decisions to be made and how they expect staff to incorporate various perspectives into decision making. Are there other ways in which you think the pandemic has changed the way that people make decisions? One of the examples that occurred to me was the potential kind of isolation of working from home, away from your colleagues, might impact decision making in certain ways?
Mikael Down 18:09
Yeah, so again, we did some research on this when we ran our survey last year. We did ask a number of questions about whether employees saw their firm's response to the pandemic as being positive, across a whole number of metrics – towards employees, towards customers, towards society. And actually, overwhelmingly, the results were very, very positive, which is to the credit of all of those firms operating in phenomenally challenging circumstances and delivering an essential service when we all needed it. The interesting thing is when you cut that by where employees were physically located, because, as you know, it wasn't possible for all employees, particularly in banking, to work remotely. And there was a difference in the perception of fairness of the organization between those two groups. Those who were remotely based were more likely to feel that the organization was fair than those who were based on site, which is interesting, and maybe slightly counterintuitive. It's possibly because, you know, leaders were hugely focused on ensuring that the person who's on their own at home is okay. And actually, the other thing that we found from our research last year is there was much greater connectedness than there had been before between line managers and the colleagues they're responsible for. They checked in a lot more. And there was much more of a human type of interaction, even though it wasn't physical. When you asked people to describe their leaders, they were more likely to describe them as authentic and human than before. And actually, again counterintuitively, seeing senior leaders in their very nice, kind of big, home offices was actually welcomed by a lot of people. They understood that not everyone had the same setup in their home environment, but the fact that they could actually see their bosses, or people very senior compared to them, in home environments made them more human.
There was evidence we had to show that a number of people felt that Zoom meetings were actually a leveler for people who would never normally have been able to intervene, or felt able to intervene, because they couldn't get their hand up, or their voice in, quickly enough. They always felt shy, or they were talked over by someone. Now they had the hand-raise function, which gave them permission to speak. And these are some really interesting dynamics that at first we didn't predict or think about. I think the really interesting question is going to be – and it's the question on everyone's minds at the moment – “When things start to unwind, what kind of workplace are we going to design?” And one of the things that we are really trying to emphasize is that you need to build that around some aspects of culture that are really important to bear in mind. Things like communication, decision-making, inclusion, and, fundamentally, organizational justice: how are opportunities distributed? That's going to have a huge impact on the culture of organizations and their effectiveness, and on levels of staff engagement and well-being.
Toni Dechario 21:17
I wonder what you think about whether there's something about financial services that drives ethical or unethical behavior and decision-making? Is there something unique to financial services that makes this problem more salient for the financial sector than it is for other sectors?
Mikael Down 21:40
I think, in the many sorts of post-mortems that were conducted of the financial crisis, one of the findings from some of those pieces of research was that the fact that financial services deals in money makes money more salient, and therefore encourages greed. I'm slightly on the fence on that one. But I do think that saliency, and the sorts of issues that are salient, are really important, and actually drive behavioral norms and influence culture very heavily. Now, if you look at the work that's happened in the banking industry since the crisis, certainly in the UK, a number of firms have made conscious efforts to make the customer perspective more salient, combined with other interventions, too. There's one firm that introduced, quite early on, what's called the “yes check,” which is a number of questions – four questions, I think – that you had to answer before recommending a product to a customer, one of those being something like, “Would you sell this product to a family member?” – putting the customer at the center of the ethical decision that you need to make. The other issue, I think, that has become more salient – and I mentioned it before – is equality and inclusion. That's spoken about much more now, particularly since the killing of George Floyd and the protests that followed, both in respect of race specifically and actually more generally. You see senior leaders talking much more frequently, and with more authenticity, about the need for greater diversity and inclusion. Now, that's not enough. But it helps to provide the context in which those further conversations and further action can take place. I think the next challenge for the industry is how to make salient environmental issues around climate and biodiversity, where the financial sector has an absolutely vital role to play if we're going to meet the Paris goals and the UN Sustainable Development Goals more generally.
I think a big part of that is about making these issues part of the day job for people, not something that's done on the side of the desk. I think progress has been made on that front in respect of customer outcomes, but there's a lot more that needs to be done – on that, and on a broader range of societal issues, too.
Toni Dechario 23:49
There's also the question, of course, of whether banking in particular bears a greater responsibility because of the unique role that banks play in society. Do they actually have to be better than everyone else?
Mikael Down 24:03
Yeah, absolutely. The regulation they benefit from, the depositor protection they benefit from, the liquidity support they have access to – these are fundamental reasons why banks have to continue to earn and reinvest in that social license. My personal reflection is – you know this, right? – that I'd been so used to dealing with banks as institutions. When I was a regulator, when you're a policymaker, you get the front end, which is normally the person put forward to give you the formal position from the firm, and you see it as a monolithic institution: Firm ‘X’ thinks this. Take a step back, and it's ridiculous to summarize the view of one firm that has maybe 50,000 employees. And you realize when you speak to people that people really, really care. Most people turn up to do a great job. It sounds kind of cheesy, but it's true: there is a very significant number of people across the industry who want to do the right thing. It's about creating the environments for them to do the right thing, and continuing to highlight where things need to improve, but doing so in a way that supports them as well as shows them where the barbed wire fence is, as it were.
Toni Dechario 25:17
That's so interesting – there's research showing that you will behave in the way you're expected to behave, that other people's expectations have a significant impact on your behavior. And I was thinking about that in relation to what you were saying about people working in banks, reading every day in the press and on social media about how people think they're terrible. How people think they're greedy. How people think you must be awful, because you're an investment banker who works for this terrible – what was it? The vampire squid? Is there the possibility that people's expectations of your behavior grind you down so much that ultimately you are going to act in the way in which you're expected to act?
Mikael Down 26:07
Absolutely. And the more you can encourage people to bring their whole selves to work – this goes back to the diversity and inclusion point – the more you can move away from that sort of stigma. You've got people who are there to do a job in a way that is consistent with their values and consistent with their beliefs, and encouraging that has got to be the way forward.
Toni Dechario 26:26
Mikael, you've been doing this for a while now. I'm curious about what keeps you at this, what really energizes you about the work that you're doing?
Mikael Down 26:34
I think it's the fact that the industry is changing, and the issues are changing, so rapidly. There are common themes that are always important in pretty much any organizational context, and those are the core things that we try to measure and monitor, but the external environment is changing all the time. Look at the issues that boards are grappling with now: diversity and inclusion, the technology change we've talked about, environmental change, ESG-type issues, inequality. These issues are changing and shifting all the time, and so are the pressures on boards and executive teams and the needs of society, which banks serve – demographic changes, societal attitudes, intergenerational differences. There's never a time to stand still. You've got to keep alive to how the external environment is continually shaping the internal environment. What are the things you need to continue to measure and monitor, and the things that you need to always ask yourself? What's next? What's coming that we need to start asking questions about now, so we can anticipate where the next challenges might be? And that's what keeps this fun.
Toni Dechario 27:44
Mikael, I know that in your personal studies, you're looking at artificial intelligence and ethics. Can you talk a little bit about how your work over the past few years led you to this and what you're hoping to achieve?
Mikael Down 27:56
The thing that led me on to it – and the connection to my studies originally – was that we were thinking about our data. We were starting to build up a really amazing dataset, and I was interested in how we could use it, but I wanted us to use it responsibly. And I came across Cathy O'Neil's book Weapons of Math Destruction, which is now very famous but at the time hadn't been around for very long. I used it as an example with my team of how data can be used irresponsibly, with a broader set of undesirable, unethical outcomes – both to understand it myself and to give them some case studies. But it inspired me to think about this issue in more depth, and the more I thought about it, the more I found it quite concerning. We're used to talking about issues related to AI now, because there's a lot more awareness about it and research going on into it. But at the time, there wasn't much. And it struck me as something that hadn't really hit the banking industry yet. There were a few fintechs on the fringes experimenting with it, but it was, of course, used heavily within social media and big tech. And I worried about what happens when this translates across to organizations that have absolutely vast amounts of data on customers' behavior. There is an opportunity for that to be used as a force for good. But there's also a huge opportunity for it to cause detriment, whether intended or not.
And so, I just thought, “Well, this feels important and interesting.” And there is, of course, always a cultural aspect to this. When you look at the way machines operate, it's tempting to think of them as autonomous black boxes, but they're designed by humans. The decisions that go into every single process leading to the creation and deployment of a machine learning algorithm – the data cleansing, the engineering process, the choices about how and in what way the AI will operate – are taken by humans. And it goes back to the point we were discussing at the start: humans are influenced by their environment, and therefore the environment is going to have a big bearing on risks such as those that are particular to AI. Of course, the fact that AI is so complex makes these risks very difficult to spot and monitor. There is an interesting analogy to the complex derivatives that were traded ahead of the financial crisis: they were hugely complex, traded in large volumes by people who didn't fully understand what they were trading and the risk that they were bearing. There are quite interesting parallels with the use of AI. So, what I wanted to get at in my research – which will probably need to get narrower, because these are lots of big questions – was the lessons from the financial crisis that can inform and guide us as we seek to deploy AI responsibly within financial services, and the organizational hygiene factors that need to be attended to so that we stand the best chance of making sure the design and use of these algorithms are a force for good rather than a force for bad.
Toni Dechario 31:04
Is there any further reading or other media out there that you would recommend for anyone who's interested in learning more about any of the topics we talked about today?
Mikael Down 31:15
Well, I think if you're new to organizational culture, you can start with Edgar Schein – it's where everyone starts. It's pretty much the best place, I think, to get a grounding in it: why it's important, the methodology, and some really, really interesting case studies within that. In a more contemporary vein, Gillian Tett's work – her new book in particular – is, within the context of financial services, a really interesting study of using the discipline of anthropology to understand and observe organizational cultures, including within finance. So that's a good book. I would also recommend Taking the Floor by Daniel Beunza, who has done an ethnographic study on Wall Street, on the culture of the trading floor. That's an extremely good read – I really like the discipline of ethnography as a really powerful lens through which to view organizations, and we use some of that as well in the work that we do. Of course, I would recommend the FSCB website. But also the compendium that you have at the New York Fed, which I think is an excellent go-to. I use it a lot to find out who's saying what on this topic from across the world, and it's an excellent place to find key speeches, pieces of research, and initiatives happening across the regulatory community and beyond.
Toni Dechario 32:33
Well, that's kind of you! So, final question: you've answered this a little bit in talking about some of your own work, but is there anything – it doesn't have to be related to culture – that you're reading, listening to, or watching that you can't stop talking about and want to share with the world?
Mikael Down 32:49
Well, one of them is Mariana Mazzucato's new book Mission Economy. That's, I think, a really powerful way of looking at organizing a system across government, the private sector, and academia to achieve a common goal. There's a big case study of the Apollo moonshot in it, but the real application is to some of the challenges that face us today, including the imperative to get to net zero. That's something I've read recently which I thought was very thought-provoking. And going back to the stuff I study off the side of my desk on the AI side, I really recommend Deep Medicine by Eric Topol, which is a really good way of getting into AI. If you're curious about AI and want to understand it, he describes it really, really well.
Toni Dechario 33:34
Thanks so much, Mikael, for being with us today. It was good to talk to you.
Mikael Down 33:38
Good to talk to you, too.
Toni Dechario 33:40
For more conversations like this, as well as publications and other resources related to banking culture reform, please visit our website at newyorkfed.org/governance-and-culture-reform.