Beyond Rationality, or “Why We Should Stop Being Surprised That Change is Hard”

Posted: June 19, 2017
It is a cliché as old as Plato and Socrates (actually it is older than Plato and Socrates, but 2,500 years makes the point): change is difficult.
In response, the usual set of approaches includes various “change management” tactics: “engaging” people in discussion and “active learning,” “coaching” them, using strategic combinations of “positive and immediate consequences,” “measuring that which we want to manage,” and so on.
Implicit in these approaches, however, is an assumption, stated or unstated, of human rationality in how we tend to our systems of belief.
A paper published last year in the Journal of Economic Perspectives, “Mindful Economics: The Production, Consumption, and Value of Beliefs” by Roland Benabou of Princeton and Jean Tirole of the Toulouse School of Economics, presents a framework for thinking about beliefs that adds to the growing school of thought that people actively ignore evidence, information, and facts when they conflict with their beliefs.
A key point, as I see it, in their framework is that our beliefs are not just a set of calculated outcomes to which we hold no attachment beyond their utility; they are valued, even treasured, personal possessions. When their beliefs are threatened, people mount a defense that goes beyond a simple ego need to be “right”; it is almost akin to the response to a “break and enter.”
According to the authors, better-educated people are especially adept at “motivated reasoning.”
Here is a brief excerpt in which they summarize the three ways we engage in “motivated reasoning” to protect and foster our beliefs:
The strategies of self-deception and dissonance-reduction used to protect valued beliefs are many and varied, but we can group them into three main types: strategic ignorance, reality denial, and self-signaling.
Strategic ignorance consists in avoiding information sources that may hold bad news, for fear that such news could demotivate us, induce distressing mental states, or both. For instance, many at-risk subjects refuse to be tested for Huntington’s disease or HIV even though the test is free, accurate, and can be done anonymously.
Reality denial is the failure to update beliefs properly in response to bad news. When credible warning signs are received but the feared state of the world is not yet materially incontrovertible, these signals can be processed and encoded in a distorted or dampened manner. Thus, accumulating red flags may indicate an ever-rising probability of disease, or of a housing-market crash, yet agents find ways of not internalizing the data and rationalizing away the risks, as revealed by their unchanged life plans, failure to divest or diversify from risky investments, and so on.
Self-signaling refers to a set of strategies by which the agent manufactures “diagnostic” signals of the desired type, by making choices that he later interprets as impartial evidence concerning his own underlying preferences, abilities, or knowledge about the state of the world. In the health domain, for instance, this corresponds to people who “push” themselves to overcome their symptoms, carrying out difficult or even dangerous activities not only for their own sake, but also as “proof” that everything is fine.
What does this all mean? To the extent that their scholarly paper captures a real and significant part of human psychology, it indicates just how difficult it is to change important, fundamental beliefs, and it suggests we should view with skepticism any claims to enact belief change through relatively superficial means such as offsites and workshops, no matter how impactful they seem on the surface.
It also suggests that it is wise to distinguish carefully between beliefs that are core to a person’s self-identity and competence and beliefs that are less so. For example, a beginner at some task may have little personal capital invested in a belief about the best way to use a tool, but an “expert” might deploy all sorts of motivated reasoning to protect their belief in the best way to do something.