Addressing the Bias Problem
Though virtually every party involved was at fault to some degree, the 2008 financial crisis was largely caused by bias on multiple fronts. Because bias in risk management can produce a disaster of this magnitude, protecting against it is critical. Jim DeLoach presents several strategies for overcoming these blind spots and effectively addressing operational risks.
Few would dispute that the 2008 financial crisis was the most spectacular failure in risk management recorded to date. There are so many causal factors and culpable parties that we cannot possibly cover them all. One of my favorite books on the subject is All the Devils Are Here: The Hidden History of the Financial Crisis. The promo for this outstanding, highly readable book reads as follows:
As soon as the financial crisis erupted, the finger-pointing began. Should the blame fall on Wall Street, Main Street or Pennsylvania Avenue? On greedy traders, misguided regulators, sleazy subprime companies, cowardly legislators or clueless homebuyers? According to [the authors], the real answer is all of the above – and more. Many devils helped bring hell to the economy. And the full story, in all of its complexity and detail, is like the legend of the blind men and the elephant. Almost everyone has missed the big picture. Almost no one has put all the pieces together.
The book focuses on motivations of all of the parties – “from famous CEOs, cabinet secretaries and politicians to anonymous lenders, borrowers, analysts and Wall Street traders.” Its basic premise is that the crisis was all about human nature.
With warning signs ignored by regulators, financial institutions and academia, the question is, why? Perhaps the answer lies in how human nature manifests itself; one manifestation is bias and the groupthink that results from it. For example, one view of the cause of the financial crisis is that there were two primary types of bias:
- “Not invented here” bias, which is the unwillingness to adopt an idea because it originates from outsiders, leading to errors in group judgments such as missing out on new opportunities or risks.
- Confirmation bias, which is the tendency to search for, filter or interpret information in a way that confirms existing preconceptions or initial decisions and ignores contrary insights.
Both types of bias contribute to groupthink, in which participants suppress their divergent views in an effort to create consensus.
There are many forms of cognitive bias, including the two above. Other forms of bias were likely involved in spawning and sustaining the crisis (e.g., framing effect, anchoring, belief, availability heuristic, hindsight, outcome and even the ostrich effect, among others). The various forms of bias and the groupthink phenomenon they encourage often result in a desire for harmony in an organization, meaning greater weight is placed on “getting along” rather than on expressing disagreement on the things that really matter. As a result of efforts to minimize conflict and maximize conformity, a group of this nature tends to avoid critical evaluation of alternative views and salient contrary information and, as a result, reaches risk-reward decisions that may miss the mark badly. In a rapidly changing environment, this behavior creates lethal “blind spots” in an organization.
With respect to risk management, bias has always existed. It is not unusual to find evidence of groupthink, dominant personalities, overreliance on numbers, disregard of contrary information, disproportionate weighting of recent events and tendencies toward risk avoidance or risk-taking in any organization. After all, it’s human nature, right? So, the question is not whether bias exists, but rather how bias within the risk-reward decision-making process can be managed.
As individuals, each of us has our own unique paradigm of how things work based on our past experiences and learnings. Therefore, it is healthy to recognize that biases exist and everyone has them. The following are some thoughts on how to overcome bias in risk management:
Focus on improving processes rather than blaming people. Most of us have a strong bias to avoid pain, which includes having to admit our failures to others. This is where traders get into trouble: taking on more risk to cover past accumulated losses until they ultimately go rogue. That’s why it’s important to focus on the process and encourage people to come forward and escalate issues so they can be addressed in the cool of the day rather than allowed to fester and become a formidable problem with limited remediation opportunities.
Recognize that risk management inevitably leads to conflict and that this conflict should be expected and encouraged. Tension between value creation and value protection is inevitable. For example, how does an organization balance its credit policy with its sales strategy? Does a trading operation establish appropriate limit structures when empowering personnel to authorize trades? If public safety is deemed more important than cost and schedule considerations, how does management know that decisions are being made appropriately across the organization on matters that could infringe on public safety? The point is that each of these matters leads to dialogue between the independent risk management function and front-line, customer-facing personnel. If risk is to be managed, healthy tension is a good thing. For this to happen, risk management must be positioned properly.
For example, the chief risk officer (CRO) (or equivalent executive) should be viewed by line leaders as a peer and have a direct reporting line to the CEO as well as a reporting line to the board or a committee of the board. Furthermore, directors should conduct mandatory and regularly scheduled executive sessions with the CRO. With the board supporting risk management’s independent role within the organization in this way, the function becomes a viable line of defense.
When making decisions, reduce the risk of groupthink. It is not unusual for groups to form opinions or make decisions without having engaged in robust debate or listened to dissenting views. That is why efforts should be made to ensure that all views are heard from the right sources and considered. The following are 10 ideas for doing this:
- Keep the group a manageable size. Invite the appropriate stakeholders and avoid observers and multiple parties from the same team.
- Focus on risks that truly matter (rather than the trivial many). Think about what the organization doesn’t know. Risk assessments directed to cataloging known risks are not going to generate new insights for management and the board. Focus the company’s risk assessments more on circumstances or potential outcomes that have not been considered by the organization.
- Designate a facilitator and don’t allow a few to dominate. Be careful about relying on the smartest or most dominant people in the room. Allowing the higher-ups, experts and dominant personalities to drive what should be a divergent conversation to a point of convergence too soon is a common mistake. Get the facts out and make sure all sides of the issue are voiced, all relevant facts are obtained and everyone whose opinion is valued is heard. Once that’s accomplished, initiate the convergence process to a conclusion.
- Engage diverse experiences and points of view and avoid “yes people.” Diversity in backgrounds and perspectives enriches the dialogue and leads to better decisions.
- Avoid starting with a desired outcome or structuring data to fit a preconceived decision. Ultimately, managing risk is about seeking the truth. The earthquake model used by the Japanese nuclear power operator whose plant was struck by the 2011 tsunami, which caused the meltdown of three nuclear reactors, was based on empirical data dating back only to 1896. It disregarded important scientific evidence that a major quake had occurred more than 1,000 years earlier, producing a powerful tsunami that hit many of the same locations as the 2011 disaster. Geologists had also found evidence of two additional large tsunamis striking the same region during the past 3,000 years, supporting the view that a catastrophic tsunami was, in effect, a 1,000-year event. A model based on just over 100 years of data could not possibly offer much insight into a 1,000-year event, but it certainly supported the conclusion that the nuclear plant’s existing configuration was satisfactory. Had the additional scientific data been considered, or had a different question been asked about the consequences of a catastrophic tsunami hitting the plant, the operator would have faced formidable investment decisions. Geological time is impervious to arbitrary assumptions that ignore the available facts.
- Distinguish between divergent and convergent dialogues. Recognize when the group is in a divergent mode and when it wants to converge. Remember that divergent thinking leads to better problem solving and more creative solutions. Conversely, convergent thinking shuts down dialogue that isn’t focused on a single solution.
- Accept conflict and devil’s advocacy as the norm; understand why dissenters disagree. These practices are the essence of effective brainstorming.
- Seek diverse external perspectives. From formulating hypotheses to presenting alternative scenarios and their attendant considerations to encouraging healthy debate, management must minimize the impact of bias by encouraging the pursuit of all potentially relevant information, accepting a contrarian voice in the dialogue and, if necessary, seeking diverse opinions from informed third parties. Sometimes, it helps to obtain viewpoints from outside of the organization. Techniques for viewing the situation in different ways or using different frameworks can be used to minimize groupthink.
- Consider the consequences if a decision is wrong. Management should incorporate the more extreme scenarios into stress tests of financial models supporting critical investment decisions and operating plans. Contingency or exit plans should be explored in case a proposed plan or decision doesn’t work out.
- Value the differences by looking for synergies in multiple points of view. Recognize the limitations of consensus. In traditional risk maps derived from electronic voting, the collective input of the group is captured in the form of a single point on a grid, as if “consensus” has been reached. However, that point on the grid results from aggregating divergent views. It is possible that one of the divergent views is correct; therefore, the group should determine whether there are outlier views resulting from important information the rest of the group doesn’t have.
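The last point above can be made concrete with a small sketch. The function below is an illustration of my own, not a method from the article, and the voting scale and outlier threshold are assumptions: given each participant’s (likelihood, impact) vote from an electronic voting session, it computes the single “consensus” point a traditional risk map would plot, but also flags votes that sit far from the group centroid so those outlier views can be discussed rather than averaged away.

```python
# Illustrative sketch (not from the article): a "consensus" point on a risk
# map hides divergent votes. The z_threshold value is an assumed cutoff.
from statistics import mean, stdev

def consensus_and_outliers(votes, z_threshold=1.5):
    """votes: list of (likelihood, impact) pairs from individual participants.
    Returns the aggregated consensus point and any outlier votes that
    deserve follow-up discussion rather than being averaged away."""
    center = (mean(v[0] for v in votes), mean(v[1] for v in votes))

    # Distance of each vote from the group centroid.
    dists = [((l - center[0]) ** 2 + (i - center[1]) ** 2) ** 0.5
             for l, i in votes]
    avg, spread = mean(dists), (stdev(dists) if len(dists) > 1 else 0.0)

    # Flag votes whose distance from the centroid is unusually large.
    outliers = [v for v, d in zip(votes, dists)
                if spread > 0 and (d - avg) / spread > z_threshold]
    return center, outliers
```

For example, four votes clustered near (3, 3) plus one vote at (9, 9) yield a consensus point of roughly (4.4, 4.4), a spot nobody actually voted for, while the (9, 9) vote is flagged for discussion: that voter may hold important information the rest of the group doesn’t have.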
Conduct a pre-mortem. While we can never say with certainty that we know what we don’t know, we can apply techniques that encourage managers to think strategically on a comprehensive basis by focusing on the big picture. The “pre-mortem technique” is a process for engaging managers in contrarian, “devil’s advocate” thinking without encountering resistance. The idea is to assume, from a vantage point in the future, that a critical strategic assumption is no longer valid, explain why it failed and assess what that development (i.e., an event or a combination of events) might mean to the organization. Alternatively, more extreme scenarios can be incorporated into stress tests of financial models supporting critical investment decisions and operating plans.
We may not be able to identify “black swans” until they happen, but at least we can assess how much they might hurt by considering the cost of being unable to execute aspects of the strategy. If management doesn’t like what it sees as a result of this contrarian analysis, steps should be taken to improve early-alert capabilities, contingency plans and response readiness.
Avoid compromising the quality of your decision-making process. Give the following “don’ts” careful consideration:
- Don’t structure data to fit a preconceived decision. Ultimately, managing risk is about seeking the truth and acting on it. The aforementioned catastrophic 2011 tsunami event in Japan illustrates what can happen when salient facts are conveniently ignored.
- Don’t rely on the smartest people in the room. As noted earlier, allowing experts and dominant personalities to monopolize the dialogue can stifle the sharing of useful insights.
- Don’t focus on risks everyone knows about. Assessments directed to shuffling known risks around a heat map can’t be expected to generate new insights for management and the board. Think about what the organization doesn’t know. Focus the company’s risk assessments more on circumstances or potential outcomes that the organization has not considered.
- Don’t extrapolate the past into the future. Change is not linear. It can be dangerously disruptive. Stuff happens.
- Don’t draw false security from probabilities. Throughout the process, acknowledge that no one can predict the future with certainty. Playing numerology games with probability estimates that are, at best, mere guesses can create a false sense of comfort over “what the numbers say.” However, this does not make the threat of a plausible or extreme crisis situation or significant emerging risk scenario go away. That is why a high-impact, high-velocity and high-persistence threat warrants an assessment of an organization’s response readiness. If response readiness is low, a focused response plan may be needed to enhance preparedness.
- Don’t manage toward a singular view of the future. Given the complexity of the business environment, executives should avoid the kind of overconfidence that is often driven by past success. It is common for leaders to make bets based on what they see in the future. But for the big bets that matter, what if they’re wrong? “What if” scenario planning and stress testing are tools for evaluating management’s “view of the future” by visualizing different future scenarios or events, what their consequences or effects might be and how the organization can respond to or benefit from them – providing more insight into the hard spots and soft spots in a proposed investment or plan.
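To illustrate the last “don’t,” here is a minimal stress-test sketch. It is my own illustration under assumed figures (the growth and inflation distributions, starting revenue and costs are all invented), not the author’s model: rather than managing toward a single forecast, it projects a simple plan’s cash flow across thousands of randomly drawn scenarios and reports the downside tail alongside the median view.

```python
# Minimal "what if" stress-test sketch. All figures are illustrative
# assumptions, not data from the article.
import random

def project_cash_flow(revenue_growth, cost_inflation, years=5,
                      revenue=100.0, costs=80.0):
    """Cumulative cash flow of a simple plan under one scenario."""
    total = 0.0
    for _ in range(years):
        revenue *= 1 + revenue_growth
        costs *= 1 + cost_inflation
        total += revenue - costs
    return total

def stress_test(n=10_000, seed=42):
    """Run the plan across randomly drawn scenarios and report the
    downside tail rather than a single expected value."""
    rng = random.Random(seed)
    outcomes = sorted(
        project_cash_flow(rng.gauss(0.05, 0.06), rng.gauss(0.03, 0.02))
        for _ in range(n)
    )
    worst_5pct = outcomes[int(0.05 * n)]   # 5th-percentile outcome
    median = outcomes[n // 2]              # the "singular view"
    return worst_5pct, median
```

The gap between the median and the fifth-percentile outcome is a simple way to visualize the “hard spots and soft spots” in a proposed plan: if the downside tail is unacceptable, that is a signal to revisit the bet or prepare a contingency plan.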
While the above ideas are not exhaustive, they suggest that overcoming bias in risk management is all about improving risk-reward decision-making processes continuously so that alternative views are expressed and considered and relevant facts are placed on the table. Ignoring dissenting viewpoints, suppressing creative thinking and isolating the organization from outside influences are sure ways for executive management to lose touch with business realities.
Questions for Executives and Boards
Executive management and the board of directors may want to consider the following questions in the context of the risks inherent in the entity’s operations:
- Do we understand the critical assumptions underlying the organization’s strategic, operating and investment plans and evaluate those assumptions with appropriate information from internal and external sources?
- Do we make sufficient use of scenario planning and stress testing to challenge assumptions and expected outcomes, address “what if” questions and identify sensitive external environment factors that should be monitored going forward?
- Are we satisfied that requests for investment funding are presented with a balanced view of rewards and risks?
 All the Devils Are Here: The Hidden History of the Financial Crisis, by Bethany McLean and Joe Nocera; available at www.amazon.com/All-Devils-Are-Here-Financial/dp/159184438X.
 “The Financial Crisis and the Systemic Failure of Academic Economics,” David Colander, Hans Föllmer, Armin Haas, Michael Goldberg, Katarina Juselius, Alan Kirman, Thomas Lux and Brigitte Sloth, Kiel Institute for the World Economy, February 2009; see www.ifw-members.ifw-kiel.de/publications/the-financial-crisis-and-the-systemic-failure-of-academic-economics/KWP_1489_ColanderetalFinancial%20Crisis.pdf.
 “Japanese Nuke Plant Downplayed Tsunami Risk,” Justin Pritchard and Yuri Kageyama, Associated Press, March 27, 2011.