The Neuroscience Of Climate-Change Apathy (And How To Fix It)

Mark Trexler and Laura Kosloff

Scores of studies and analyses suggest that the costs of ignoring climate change are likely to far outweigh the costs of avoiding it, but estimates of business-as-usual climate change continue to tick upward. Before we can understand and manage climate-change risk, we need to understand and manage the brains that evolution gave us.

This article has been adapted from The Changing Profile of Corporate Climate Change Risk, the first in a series of “DōShorts”, which distill sustainability best practice into short ebooks aimed at busy professionals and designed to be read in 90 minutes or less. It is available for purchase here.


9 October 2012 | In 2006, the British government published the Stern Review on the Economics of Climate Change, which warned that climate change was all but certain to hobble future economic growth, with higher temperatures and extreme events taking an ever higher toll on global economies. The Stern Review estimated that GDP could be reduced by 5-20% within just a few decades, while an investment today of just 1% of GDP in mitigation efforts would avoid many of the worst consequences of climate change.

The report has not been converted into policy, although it did generate an active debate over how variables like the “social rate of time discount” were handled in Stern’s analysis, and how we should value the health and well-being of future generations. Ultimately, what was most notable about the Stern Review was how little impact its extensive cost-benefit analysis had in engaging the general public – let alone decision-makers – in addressing climate risk.
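
To see why the discount-rate debate loomed so large, consider a back-of-the-envelope present-value calculation. This is only a sketch: the damage figure and discount rates below are illustrative assumptions, not numbers taken from the Stern Review itself.

```python
# Illustrative sketch: how the "social rate of time discount" drives
# cost-benefit conclusions about climate change. The damage figure and
# rates below are assumptions for illustration, not the Stern Review's.

def present_value(future_cost, discount_rate, years):
    """Discount a cost incurred `years` from now back to today's dollars."""
    return future_cost / (1 + discount_rate) ** years

future_damage = 10e12  # assume $10 trillion of climate damage in 2100
years_ahead = 100

# A Stern-like low rate versus the higher rates favored by some critics
for rate in (0.014, 0.03, 0.05):
    pv = present_value(future_damage, rate, years_ahead)
    print(f"discount rate {rate:.1%}: present value ≈ ${pv / 1e9:,.0f} billion")
```

Depending only on the rate chosen, the same assumed future damage is worth roughly $2.5 trillion, $520 billion, or $76 billion today, which is why seemingly technical choices about discounting largely determined how urgent the Review’s conclusions appeared.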

The fact is that, at an emotional level, we don’t really know how to use cost-benefit analysis in perceiving and interpreting risk. In most policy arenas we don’t knowingly and explicitly make decisions that risk dramatic negative impacts on future generations, but that restraint does not come from cost-benefit analysis. Nuclear-waste disposal policy, generally evaluated against the requirement that the waste be sequestered for thousands of years, offers an interesting contrast with how we think about the risks of climate change over decades rather than millennia. Yet the alleged “high costs of climate change mitigation,” implicitly a cost-benefit argument, continue to be a key rallying cry in opposition to climate policy.

Rethinking Classic Cost-Benefit Analysis

Many economists argue that focusing on total discounted dollars is the wrong way to think about climate risk. Even modest reductions in future GDP caused by climate change mitigation efforts can add up to a large figure in absolute dollars. But what if the question is phrased differently: is it worth accepting a 95% increase in 2030 GDP, rather than a 100% increase, in order to make the investments necessary to manage climate change risk? This puts the question in an entirely different light, and allows for a more risk- and values-based discussion.
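
A simple calculation shows how much the framing matters. The figures below are purely illustrative assumptions (global GDP of roughly $70 trillion today, doubling by 2030 under business as usual), not estimates drawn from any of the studies discussed here.

```python
# Illustrative sketch of the reframing described above. All figures are
# assumptions: suppose global GDP doubles by 2030 under business as usual,
# and mitigation investment shaves that growth from +100% to +95%.

gdp_today = 70e12                   # assumed current global GDP, in dollars
bau_2030 = gdp_today * 2.00         # +100% growth, business as usual
mitigated_2030 = gdp_today * 1.95   # +95% growth with mitigation spending

absolute_cost = bau_2030 - mitigated_2030
print(f"Absolute cost in 2030: ${absolute_cost / 1e12:.1f} trillion")      # looks enormous
print(f"Growth forgone: {(bau_2030 - mitigated_2030) / gdp_today:.0%} of today's GDP")
print(f"2030 economy is still {mitigated_2030 / gdp_today:.0%} the size of today's")
```

The same mitigation effort can be described as a multi-trillion-dollar loss or as the difference between an economy 195% and 200% the size of today’s – arithmetically identical, but very different in how the risk trade-off is perceived.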

Other economists argue that there is a good chance the world economy would actually be better off in the future if we tackled the climate change problem today, even without factoring in the costs of future climate change. They argue that replacing today’s global energy systems, with their high environmental, social, and security costs, with alternative energy systems would pay off quickly. So we can’t simply take for granted that the costs of climate change mitigation are a net negative for society over the medium to long term.

Economic analysis of climate change risk at the local and project level is still relatively rare, primarily because the ability to forecast the localized impacts of climate change is still evolving. One credible economic analysis of hurricane risk to New York City concludes that by 2030 climate change will increase the risk of a category 4 or 5 hurricane hitting the city by as much as 25–30%, and that the city’s vulnerability to such an event is so great that it should be willing to pay more than $47 billion per year in 2030 to avoid that risk (if there were anyone to pay). That is more than 50% of New York City’s annual expenditure budget ($60 billion in 2009), simply to purchase insurance against the climate change-induced increase in hurricane risk.

The same analysis concluded that investors funding hurricane-susceptible infrastructure should be prepared for a significant decline in their expected rates of return, for the same basic reason: rising extreme-event risk. For infrastructure with an estimated 12% return based on the historical climate, for example, the analysis suggested that expected future rates of return may prove closer to 4%.
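
The mechanism is straightforward expected-value arithmetic: as the annual probability of a catastrophic loss rises, it eats directly into the return an investor can expect. The sketch below uses invented probabilities and a simple total-loss assumption for illustration; the analysis cited above relied on a much more detailed model of hurricane frequency and damages.

```python
# Minimal sketch of why rising extreme-event risk erodes expected returns.
# The probabilities and loss fractions are invented for illustration; the
# cited analysis used a far more detailed hurricane risk model.

def expected_return(gross_return, event_probability, loss_fraction):
    """Expected annual return on an asset exposed to a catastrophic event."""
    return gross_return - event_probability * loss_fraction

gross = 0.12  # nominal return assuming no catastrophic loss

historical = expected_return(gross, event_probability=0.01, loss_fraction=1.0)
with_climate_change = expected_return(gross, event_probability=0.03, loss_fraction=1.0)

print(f"Expected return, historical climate:      {historical:.1%}")
print(f"Expected return, with higher event risk:  {with_climate_change:.1%}")
```

Even this crude model shows a visible drop in expected return from a modest rise in event probability; a fuller treatment of frequency and damages can produce declines as large as the 12%-to-4% example described above.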

What You See Is All There Is

Climate risk has been a source of active discussion for 25 years, yet clear policy to address the climate change problem is not yet in place. It is easy to attribute this failure to economic externalities, too-high discount rates, or campaign financing and political influence. But research over the last 20 years has also opened a growing window of investigation into the “how” of human decision-making. When it comes to perceiving and responding to complicated risks, it is increasingly clear that the human brain often works against what we would normally consider rational outcomes.

One explanation is that human society, and the nature of the problems we are trying to address, have evolved more quickly than the physical capacity of our brains to understand and respond to developing risks. As Daniel Kahneman put it in his 2011 book Thinking, Fast and Slow, humans suffer from the ‘what you see is all there is’ (WYSIATI) phenomenon.

WYSIATI suggests that human decision-making is based primarily on Known Knowns, namely phenomena we have already observed, and rarely considers Known Unknowns. Most importantly for a problem like climate change, human decision-making appears almost oblivious to the possibility of Unknown Unknowns, including the risk of climate change “tipping points” that worry many scientists.

Several of the ways in which our thinking can yield counter-intuitive results in an area like climate change are briefly profiled below:

• The availability heuristic describes how humans actually make judgments. We assume we are rational creatures, using all available information to make decisions; in fact, we judge based on what we best remember, which tends to be very recent experience. Thus, in the days or weeks after a major earthquake, homeowners flock to buy earthquake insurance. Over time, the number of people buying earthquake insurance declines, even as the objective likelihood of another earthquake rises with accumulating tectonic stress. The availability heuristic has major implications for how we think about aspects of climate risk that no one has yet witnessed.

• The optimism bias is a term used to describe the human tendency to expect things to turn out better than data-supported forecasting may suggest. Regardless of the data presented, we tend to believe that the future will be much better than the past. The bias holds true across ethnic and socioeconomic groups. Taken collectively, people can grow pessimistic about world economies or the future of their country, but individually, they will tend to believe their lives will get better. This bias also interferes with evidence-based risk perception for a problem like climate change.

• The ‘neglect of probability’ bias also kicks in under conditions of uncertainty, and reflects the difficulty our brains have in working with probabilities. In a field as data-intensive as climate change, where many of the key uncertainties can best be characterized as probabilities, this bias suggests that much more focus is needed on communicating climate risk in ways that keep it from interfering.

• Patternicity is another cognitive bias with huge implications for climate risk perception. The search for patterns in everything around us has clear evolutionary benefits (telling us, for example, which plants and animals to avoid), but it is exactly that kind of pattern-seeking that has helped political discourse bog down in debates over whether climate change is or isn’t already happening, and exactly how much may have already occurred, rather than focusing on the much clearer risks posed by future climate change.

This short review only scratches the surface of what recent research implies about cognitive biases for risk perception and risk management, but it makes clear that far more policy and communications attention will be needed if these barriers to perceiving and managing climate risk are to be overcome. Cost-benefit analysis, no matter how analytically convincing, is not going to tip the balance.

Dr Mark C. Trexler is Director, Climate Risk, DNV-KEMA Energy & Sustainability in Portland, OR, USA. Laura H. Kosloff is an Attorney, also in Portland.

Please see our Reprint Guidelines for details on republishing our articles.
