Julian Barnes opens Changing My Mind, his brisk new book about our unruly intellects, with a quote famously attributed to the economist John Maynard Keynes: “When the facts change, I change my mind.” It’s a fitting start for an essay on our obliviousness to truth, because Keynes didn’t say that—or not exactly that. The economist Paul Samuelson almost said it in 1970 (replacing “facts” with “events”) and in 1978 almost said it again (this time, “information”), attributing it to Keynes. The attribution stuck, flattering our sense of plausibility—it’s the sort of thing Keynes would have said—and now finds itself repeated in a work of nonfiction. Our fallibility is very much on display.
Not that Barnes would deny that he makes mistakes. The wry premise of his book is that he’s changed his mind about how we change our minds, evolving from a Keynesian faith in fact and reason to a framing inspired by the Dadaist Francis Picabia’s aphorism “Our heads are round so that our thoughts can change direction.” (In this case, the citation is accurate.) Barnes concludes that our beliefs are changed less by argument or evidence than by emotion: “I think, on the whole, I have become a Picabian rather than a Keynesian.”
Barnes is an esteemed British novelist, not a social scientist—one of the things he hasn’t changed his mind about is “the belief that literature is the best system we have of understanding the world”—but his shift in perspective resonates with a host of troubling results in social psychology. Research in recent decades shows that we are prone to “confirmation bias,” systematically interpreting new information in ways that favor our existing views and cherry-picking reasons to uphold them. We engage in “motivated reasoning,” believing what we wish were true despite the evidence. And we are subject to “polarization”: As we divide into like-minded groups, we become more homogeneous and more extreme in our beliefs.
If a functioning democracy is one in which people share a common pool of information and disagree in moderate, conciliatory ways, there are grounds for pessimism about its prospects. For Barnes, this is not news: “When I look back at the innumerable conversations I’ve had with friends and colleagues about political matters over the decades,” he laments, “I can’t remember a single, clear instance, when a single, clear argument has made me change my mind—or when I have changed someone else’s mind.” Where Barnes has changed his mind—about the nature of memory, or policing others’ language, or the novelists Georges Simenon and E. M. Forster—he attributes the shift to quirks of experience or feeling, not rational thought.
Both Barnes and the social scientists pose urgent, practical questions. What should we do about the seeming inefficacy of argument in politics? How can people persuade opponents on issues such as immigration, abortion, or trans rights when interpretations of the evidence seem biased? Like the Russian trolls who spread divisive rhetoric on social media, these questions threaten one’s faith in what the political analyst Anand Giridharadas has called “the basic activity of democratic life—the changing of minds.” The situation isn’t hopeless; in his recent book, The Persuaders, Giridharadas portrays activists and educators who have defied the odds. But there is a risk of self-fulfilling prophecy: If democratic discourse comes to seem futile, it will atrophy.
Urgent as it may be, this fear is not what animates Barnes in Changing My Mind. His subject is not moving other minds, but rather changing our own. It’s easy and convenient to forget that confirmation bias, motivated reasoning, and group polarization are not problems unique to those who disagree with us. We all interpret evidence with prejudice, engage in self-deception, and lapse into groupthink. And though political persuasion is a topic for social scientists, the puzzle of what I should do when I’m afraid that I’m being irrational or unreliable is a philosophical question I must inevitably ask, and answer, for myself.
That’s why it feels right for Barnes to approach his topic through autobiography, in the first person. This genre goes back to Descartes’ Meditations: epistemology as memoir. And like Descartes before him, Barnes confronts the specter of self-doubt. “If Maynard Keynes changed his mind when the facts changed,” he admits, “I find that facts and events tend to confirm me in what I already believe.”
You might think that this confession of confirmation bias would shake his confidence, but that’s not what happens to Barnes, or to many of us. Learning about our biases doesn’t necessarily make them go away. In a chapter on his political convictions, Barnes is cheerfully dogmatic. “When asked my view on some public matter nowadays,” he quips, “I tend to reply, ‘Well, in Barnes’s Benign Republic …’” He goes on to list some of BBR’s key policies:
For a start … public ownership of all forms of mass transport, and all forms of power supply—gas, electric, nuclear, wind, solar … Absolute separation of Church and State … Full restoration of all arts and humanities courses at schools and universities … and, more widely, an end to a purely utilitarian view of education.
This all sounds good to me, but it’s announced without a hint of argument. Given Barnes’s doubts about the power of persuasion, that makes sense. If no one is convinced by arguments anyway, offering them would be a waste of time. Barnes does admit one exception: “Occasionally, there might be an area where you admit to knowing little, and are a vessel waiting to be filled.” But, he adds, “such moments are rare.” The discovery that reasoning is less effective than we hoped, instead of being a source of intellectual humility, may lead us to opt out of rational debate.
Barnes doesn’t overtly make this case—again, why would he? But it’s implicit in his book and it’s not obviously wrong. When we ask what we should think in light of the social science of how we think, we run into philosophical trouble. I can’t coherently believe that I am basically irrational or unreliable, because that belief would undermine itself: another conviction I can’t trust. More narrowly, I can’t separate what I think about, say, climate change from the apparent evidence. It’s paradoxical to doubt that climate change is real while thinking that the evidence for climate change is strong, or to think, I don’t believe that climate change is real, although it is. My beliefs are my perspective on the world; I cannot step outside of them to change them “like some rider controlling a horse with their knees,” as Barnes puts it, “or the driver of a tank guiding its progress.”
So what am I to do? One consolation, of sorts, is that my plight—and yours—predates the findings of social science. Philosophers like Descartes long ago confronted the perplexities of the subject trapped within their own perspective. The limits of reasoning are evident from the moment we begin to do it. Every argument we make contains premises an opponent can dispute: They can always persist in their dissent, so long as they reject, time and again, some basic assumption we take for granted.
This doesn’t mean that our beliefs are unjustified. Failure to convert the skeptic—or the committed conspiracy theorist—need not undermine our current convictions. Nor does recent social science prove that we’re inherently irrational. In conditions of uncertainty, it’s perfectly reasonable to put more faith in evidence that fits what we take to be true than in unfamiliar arguments against it. Confirmation bias may lead to deadlock and polarization, but it is better than hopelessly starting from scratch every time we are contradicted.
None of this guarantees that we’ll get the facts right. In Meditations, Descartes imagines that the course of his experience is the work of an evil demon who deceives him into thinking the external world is real. Nowadays, we might think of brains in vats or virtual-reality machines from movies like The Matrix. What’s striking about these thought experiments is that their imagined subjects are rational even though everything they think they know is wrong. Rationality is inherently fallible.
What social science reveals is that we are more fallible than we thought. But this doesn’t mean that changing our mind is a fool’s errand. New information might be less likely to lead us to the truth than we would like to believe—but that doesn’t mean it has no value at all. More evidence is still better than less. And we can take concrete steps to maximize its value by mitigating bias. Studies suggest, for instance, that playing devil’s advocate improves our reliability. Barnes notwithstanding, novel arguments can move our mind in the right direction.
As Descartes’ demon shows, our environment determines how far being rational correlates with being right. At the evil-demon limit, not at all: We are trapped in the bubble of our own experience. Closer to home, we inhabit epistemic bubbles that impede our access to information. But our environment is something we can change. Sometimes it’s good to have an open mind and to consider new perspectives. At other times, it’s not: We know we’re right and the risk of losing faith is not worth taking. We can’t ensure that evidence points us to the truth, but we can protect ourselves from falling into error. As Barnes points out, memory is “a key factor in changing our mind: we need to forget what we believed before, or at least forget with what passion and certainty we believed it.” When we fear that our environment will degrade, that we’ll be subject to misinformation or groupthink, we can record our fundamental values and beliefs so as not to forsake them later.
Seen in this light, Barnes’s somewhat sheepish admission that he has never really changed his mind about politics seems, if not entirely admirable, then not all bad. Where the greater risk is that we’ll come to accept the unacceptable, it’s just as well to be dogmatic.