One night, over dinner, Daniel Kahneman changed my perspective on value creation. Why do so many people, including leaders of large corporations, draw back from acting more boldly, even when taking incremental risks can create much greater returns? And perhaps more important: Can anything be done about it?
Kahneman explained to me that while it’s hard to change human nature, it’s much easier for organizations to take practical steps to overcome human decision biases. By understanding the inherent irrationality of how people make decisions—including a strong tendency toward loss aversion—companies can successfully encourage managers to allocate more resources to the projects that create more value.
His insights were a revelation to me. I’ve been a consultant for more than 40 years, nearly all of that span at McKinsey. Over those decades, I’ve been blessed to work with not only brilliant colleagues but also many of the most effective business leaders in the world. From the very start, I sought to understand and explain how companies could allocate their resources to maximize value over the long term. With that mission in mind, my coauthors and I wrote what turned out to be a best-selling book on corporate valuation, Valuation: Measuring and Managing the Value of Companies; next year will mark its eighth edition.
Yet from the time I began analyzing companies (this, even before I was a consultant), and for years afterward, I was puzzled: Why don’t companies take obvious, rational actions to maximize long-term value? It’s up to senior leaders to allocate resources to the projects that create the largest cash flows over time. Some projects deliver outsize, sustainable returns; it’s clear that these initiatives should receive more company resources. Other investments deliver lower or even negative returns. These businesses should receive fewer resources—or none at all.
Yet even when the data and conclusions are indisputable and highly capable CEOs and CFOs recognize what needs to be done, their companies often still fail to allocate resources as they should. Instead, managers and their teams fall back on the same ways of doing things. The companies don’t maximize value. Or worse, they watch upstarts and competitors disrupt their businesses. It doesn’t seem rational.
But it’s human. Thanks to Daniel Kahneman, we now understand human nature—and all of its apparent irrationality—much more clearly. Kahneman and his research partner, psychologist Amos Tversky, found that people didn’t make decisions in the ways that economists had traditionally assumed. Harvard University psychologist and author Steven Pinker said of Kahneman: “His central message could not be more important, namely, that human reason left to its own devices is apt to engage in a number of fallacies and systematic errors, so if we want to make better decisions in our personal lives and as a society, we ought to be aware of these biases and seek workarounds. That’s a powerful and important discovery.”
Kahneman’s published work was primarily about individual decision making. I, along with my collaborator Dan Lovallo, had the opportunity to work with Kahneman on how organizations might improve their decision making. It was while we were having dinner that I asked him straight out, “If people don’t behave in an economically rational way, is there any hope for organizations?”
Yes, Kahneman assured me. “I’m much more optimistic about organizations than individuals,” he said. “Organizations can put systems in place to help them.” Managers can develop rules and processes that help overcome inherent decision-making biases.
Drawing on Kahneman’s insights, my colleagues and I have proposed (or adopted from others) several techniques to help organizations understand and improve their decision making in resource allocation. In particular, our research finds that large organizations often suffer from four decision-making biases that Kahneman’s work helps counteract: groupthink, loss aversion, confirmation bias, and anchoring.
Groupthink and its sibling concept “sunflower management” lead to a lack of debate about important decisions. In groupthink, individuals are reluctant to raise ideas that might differ from what they perceive as the emerging consensus. With sunflower management, executives try to guess what the most senior person wants to hear and bend their own views accordingly, rather than expressing their own points of view. The best solution is a culture of debate, but a strong obligation-to-debate culture can take years to develop (and, unfortunately, much less time to destroy). Short-term techniques to ensure that valuable insights are heard include assigning someone to be a devil’s advocate, insisting that the most senior people not express their views until everyone has had a chance for input, bringing in subject matter experts (not just the top management team), and, for really big decisions, putting in place a red team–blue team approach in which two teams are assigned to take opposite points of view.
Loss aversion is the idea that human beings weight losses more heavily than gains and, therefore, pass up promising investment opportunities. Dan and I cowrote with Kahneman the Harvard Business Review article “Your Company Is Too Risk-Averse.” We showed that what mattered was not the riskiness of an individual project but how that project contributed to the overall risk of the company’s portfolio of projects. We found that, too often, companies focus on the risk of each project in isolation and pass up attractive ones. But when companies aggregate projects, they achieve a natural diversification effect and can realize a better financial result without taking on much, if any, additional risk.
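A minimal sketch of that portfolio logic, using purely hypothetical numbers (they are assumptions for illustration, not figures from the article or the HBR research): each project below has a positive expected value but a 50 percent chance of a loss, so a loss-averse manager might reject it on its own; aggregated across twenty such projects, the chance that the portfolio as a whole loses money falls to roughly 6 percent.

```python
import random

# Hypothetical illustration (assumed numbers, not from the article):
# each project has a positive expected value but a real chance of a loss.
LOSS, GAIN = -50, 100        # payoff in $ millions
P_LOSS = 0.5                 # 50% chance of the bad outcome
N_PROJECTS = 20              # size of the aggregated portfolio
TRIALS = 100_000             # Monte Carlo trials

def project_outcome() -> int:
    """One project's payoff: a loss with probability P_LOSS, else a gain."""
    return LOSS if random.random() < P_LOSS else GAIN

# Chance a single project ends up in the red (~50% by construction).
losing_single = sum(project_outcome() < 0 for _ in range(TRIALS)) / TRIALS

# Chance the whole 20-project portfolio ends up in the red (~6%),
# and its average payoff (~$500M), thanks to diversification.
portfolio_totals = [sum(project_outcome() for _ in range(N_PROJECTS))
                    for _ in range(TRIALS)]
losing_portfolio = sum(total < 0 for total in portfolio_totals) / TRIALS
avg_portfolio = sum(portfolio_totals) / TRIALS

print(f"Chance a single project loses money:      {losing_single:.0%}")
print(f"Chance the 20-project portfolio loses:    {losing_portfolio:.1%}")
print(f"Average portfolio payoff: ${avg_portfolio:.0f}M")
```

Evaluated project by project, every one of these bets looks frightening; evaluated as a portfolio, the downside largely washes out while the upside compounds, which is the diversification effect the paragraph above describes.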
Confirmation bias is the tendency to search only for data or evidence that supports a hypothesis. For example, a management team might develop a hypothesis about a strategic direction for a business and, consciously or otherwise, highlight only confirming data. As a result, they will miss information that might send them in a different direction. Overcoming confirmation bias is relatively simple, so long as organizations follow Kahneman’s counsel to put rules in place that must be followed. All it takes is discipline and enforcement by top management to insist on asking the right questions.
Anchoring is the tendency to make decisions based on past (or even irrelevant) information. Large organizations tend to anchor their budgets and resource allocation decisions on what they did last year, with only minor deviations. As a result, their budgets won’t be aligned with any strategic shifts that top management may want to make. Once again, the kinds of rules Kahneman recommended can counteract a significant bias. For example, an organization might create a rule that every strategic initiative in the budget must be fully funded.
These rules may feel unnatural, and indeed they often go against human nature. But while you can’t change human nature, thanks to Kahneman, I’ve learned that you can change how organizations approach resource allocation. Debiasing decision making helps companies achieve value-creation opportunities that were there all along.