There is ample evidence that our minds consistently distort our perception of the world when we make decisions under uncertainty, even within our own constructs of reality. Distortion in perception can stem not only from differences in experience and education but also from the necessary use of heuristic judgments.

We use heuristics[1] — mental shortcuts that help us to make decisions in complex or uncertain situations without exceeding our cognitive capacity — in our work and daily lives. Like many shortcuts, however, heuristics do not always yield accurate results.

In the early 1970s, psychologists Amos Tversky and Daniel Kahneman found that reliance on heuristics produces cognitive biases that lead people to make “systematic and predictable errors” in judgments under uncertainty.[2]

One well-known heuristic, “anchoring and adjustment,” involves estimating a (usually numerical) value by starting from a relevant initial value, the anchor, and adjusting from it.[3], [4] The use of this heuristic leads to a consistent bias in judgments, as adjustments from the anchor are typically insufficient to arrive at an accurate value.[5]

An example: Asked whether the population of Colombia is more or less than 10 million, and then asked to estimate the population, most people will give an estimate closer to 10 million than to the correct figure of approximately 48 million.[6]

An anchor may be created to exploit this bias intentionally. A retailer may, for example, advertise a significant “mark-down” from an artificially inflated retail price, luring shoppers to buy more through the false promise of a bargain.

Another heuristic, availability, relies on how readily relevant information comes to mind and can skew one’s assessment of the probability that an event will occur. In some cases, availability bias stemming from the retrievability of an instance — how readily it is recalled — alters one’s perception of the likelihood of an occurrence. For example, we might inaccurately perceive a higher crime rate in an area merely because we have witnessed or heard of a recent crime. Imaginability — how easily we can imagine an occurrence — can also lead to availability bias.[7]

The anchoring/adjustment and availability biases can both cause people to overemphasize the importance of past information. It may not be surprising that people tend to underestimate the probability of events that are relatively less visible or that occur relatively infrequently. Conversely, we tend to overestimate the probability or frequency of an occurrence that looms large in our consciousness, whether because of retrievability or imaginability.

Take auto accidents versus airplane crashes: Rarely do the former make big news, precisely because they are so frequent and kill or maim only a few people at a time. Though relatively rare, airplane crashes result in hundreds of fatalities at a time, can devastate whole communities, and often linger through dozens of news cycles. In this case, imaginability and retrievability may work in concert to lead us to overestimate the likelihood of dying in a plane crash relative to an auto accident.

Cognitive Bias Related to Corporate Sustainability Issues

Anchoring/adjustment and availability heuristics may be particularly prevalent within the realm of corporate sustainability, or attention to environmental, social, and governance (ESG) issues. Because awareness of these concepts is not yet widespread in business management, the relevance of ESG issues to financial performance remains unclear to many investors and company managers. Lack of familiarity with ESG issues (irretrievability and unimaginability) can skew perceptions of their associated risks downward. Conversely, many companies have over time become increasingly attuned to the materiality of certain ESG issues to their business and have focused on them in public reporting; in this case, heuristics can lead to elevated perceptions of ESG risk.

Can Widespread Use of Integrated Reporting Counter Cognitive Bias?

Context would seem to be critical in countering cognitive biases, wouldn’t it?

In the retailer example above, if nearby shops are offering the same product at an undiscounted price that is lower than that of the “discounting” shop, consumers may be less likely to buy more from the latter. The anchor loses its impact in the context of the competition, and the market operates as it should.

Integrated reporting is intended to provide the context that is so critical for stakeholders, primarily investors, to make informed decisions vis-à-vis the reporting entity. Done well, an integrated report presents both performance and strategy within the broader societal, economic, and environmental context in which the company operates. An integrated report is most useful to investors when it demonstrates how the company is deploying and conserving not only financial capital but a variety of other capitals: natural, manufactured, intellectual, human, and social and relationship.

The Value Creation Process


Capital inputs and outputs in the value-creation process, taken in the broader context in which a company operates, are important components of integrated reporting. Source: IIRC.

If only one or two companies within a sector engage in integrated reporting, however, we may have an anchoring problem. Moreover, estimates of the broader operating context may be skewed by the unavailability of relevant, accurate information. Investors, particularly those less schooled in the application of ESG factors to investment processes, might make assumptions about performance that are based on a dearth of relevant information.

Companies should be able to provide context and relevant information sufficient to defend their own assumptions about their performance. The more companies that do, the better equipped investors will be to make informed decisions. An integrated approach to public reporting is one vehicle for doing so. Likewise, investors will serve themselves and their clients by being attuned to potential biases embedded in the information they receive from companies, as well as to their own heuristics and biases.

Indeed, we could all benefit from awareness of our very human tendency toward cognitive bias. We must always question our assumptions. In doing so, we have an enormous opportunity to make better decisions in all human pursuits.

Then again, what’s “better” depends on where you start.

Kate Rebernak is the founder and CEO of Framework LLC, a specialty management consultancy that helps companies create value by understanding, managing and communicating performance on environmental, social, and governance issues that are likely to have direct and indirect financial impacts on their business.

[1] Wikipedia describes a heuristic as “any approach to problem solving, learning, or discovery that employs a practical method not guaranteed to be optimal or perfect, but sufficient for the immediate goals.” Merriam-Webster defines heuristic as “involving or serving as an aid to learning, discovery, or problem-solving by experimental and especially trial-and-error methods.”

[2] A. Tversky and D. Kahneman, “Judgment under Uncertainty: Heuristics and Biases,” Science (Washington, DC), 185, 1124–1131 (1974).

[3] Ibid.

[4] “Behavioral Finance: Cognitive Errors – Information-Processing Biases,” CFA Tutor, https://cfatutor.me/2013/10/03/behavioral-finance-cognitive-errors-information-processing-biases/.

[5] A. Tversky and D. Kahneman, “Judgment under Uncertainty: Heuristics and Biases,” Science (Washington, DC), 185, 1124–1131 (1974).

[6] As of 2013. Source: the World Bank.

[7] A. Tversky and D. Kahneman, “Judgment under Uncertainty: Heuristics and Biases,” Science (Washington, DC), 185, 1124–1131 (1974). Tversky and Kahneman noted that “the risk involved in an adventurous expedition, for example, is evaluated by imagining contingencies with which the expedition is not equipped to cope.”