Chris Mooney has been hard at work lately, through his journalism, his blogging, and his podcast at the Center for Inquiry, trying to understand why denialism is so pervasive. In the new issue of Mother Jones, he lays out some of “The Science of Why We Don’t Believe Science”:
an array of new discoveries in psychology and neuroscience has …demonstrated how our preexisting beliefs, far more than any new facts, can skew our thoughts and even color what we consider our most dispassionate and logical conclusions. This tendency toward so-called “motivated reasoning” helps explain why we find groups so polarized over matters where the evidence is so unequivocal: climate change, vaccines, “death panels,” the birthplace and religion of the president, and much else. It would seem that expecting people to be convinced by the facts flies in the face of, you know, the facts.
The theory of motivated reasoning builds on a key insight of modern neuroscience: Reasoning is actually suffused with emotion (or what researchers often call “affect”). Not only are the two inseparable, but our positive or negative feelings about people, things, and ideas arise much more rapidly than our conscious thoughts, in a matter of milliseconds–fast enough to detect with an EEG device, but long before we’re aware of it. That shouldn’t be surprising: Evolution required us to react very quickly to stimuli in our environment. It’s a “basic human survival skill,” explains political scientist Arthur Lupia of the University of Michigan. We push threatening information away; we pull friendly information close. We apply fight-or-flight reflexes not only to predators, but to data itself.
We’re not driven only by emotions, of course–we also reason, deliberate. But reasoning comes later, works slower–and even then, it doesn’t take place in an emotional vacuum. Rather, our quick-fire emotions can set us on a course of thinking that’s highly biased, especially on topics we care a great deal about.
These psychological insights are crucial building blocks for anyone who wants to convince other people of just about anything. We might wish to live in a world of purely rational beings, but the only way to make the world more rational is, alas, to play upon the same psychological quirks that make science so unacceptable to so many people.
when we think we’re reasoning, we may instead be rationalizing. Or to use an analogy offered by University of Virginia psychologist Jonathan Haidt: We may think we’re being scientists, but we’re actually being lawyers. Our “reasoning” is a means to a predetermined end–winning our “case”–and is shot through with biases. They include “confirmation bias,” in which we give greater heed to evidence and arguments that bolster our beliefs, and “disconfirmation bias,” in which we expend disproportionate energy trying to debunk or refute views and arguments that we find uncongenial.
And this is the central challenge in trying to change someone’s mind, whether it belongs to the creationists Chris uses in some of his examples or to anti-vaccine activists. I ran into this recently when a friend of mine mentioned in conversation that she planned to delay vaccinating her new baby, and to skip some vaccines entirely. In this case, she’s got a background in medicine, so I couldn’t just say “I’m a biologist,” and the details of immunology she’d gleaned from anti-vaxxers got into sufficiently obscure areas that I had to tread lightly so as not to accidentally step in a trap. Getting a basic fact wrong would have made me no longer a credible source.
How had this friend — a smart person with medical training and a deep and abiding love for her child — gotten sucked into the dangerous world of vaccine denial? It started, I think, with her skepticism about “western medicine,” a skepticism which journalist Seth Mnookin argues convincingly tends only to be fed by the typical ways doctors respond to patients’ concerns about vaccine safety. Doctors are rushed, and they don’t tend to like having to explain fairly basic things, let alone defend the very edifice of modern science and medicine, so someone primed not to trust doctors will find confirmation of that bias in the doctor’s behavior.
The anti-vaxx movement exploits that by offering a welcoming community, and by offering simple explanations that play to the fears and cognitive biases of frightened parents. And once someone’s inside that community, it’s hard to get them out. I considered unleashing the great and mighty Orac on her, but I feared his brand of “respectful insolence” wouldn’t bring her back from the brink. Attacks on the mendacity of Andrew Wakefield were tempting, too, but I feared that someone who believed Wakefield was a martyr to an embattled medical establishment might well take a harsh Wakefield debunking as evidence of that martyr framing. People always look for ways to dismiss evidence, so finding the right sources to cut through the anti-vaccine pseudoscience was tricky, and remains an ongoing process.
Of course, I see this in my day job, too, where the challenge is the same. It’s easy for creationists to dismiss new research on evolution, claiming all scientists are engaged in a massive atheistic propaganda enterprise anyway. Why should they trust those atheist scientists telling them what they don’t want to hear, when there are nice Christians happy to play to their biases? By writing off any source of evidence contrary to their beliefs, they can easily pretend that their debunked beliefs are well-supported!
Mooney explains this dynamic with research Dan Kahan and his group have been publishing:
people rejected the validity of a scientific source because its conclusion contradicted their deeply held views–and thus the relative risks inherent in each scenario. A hierarchical individualist finds it difficult to believe that the things he prizes (commerce, industry, a man’s freedom to possess a gun to defend his family) could lead to outcomes deleterious to society. Whereas egalitarian communitarians tend to think that the free market causes harm, that patriarchal families mess up kids, and that people can’t handle their guns. The study subjects weren’t “anti-science”–not in their own minds, anyway. It’s just that “science” was whatever they wanted it to be. “We’ve come to a misadventure, a bad situation where diverse citizens, who rely on diverse systems of cultural certification, are in conflict,” says Kahan.
The case of vaccine denial is the only major example Mooney offers of science denial coming from the left, a point which Kevin Drum (blogging for Mother Jones) flags as problematic:
Chris wrenches his spine out of shape bending over backward to find an example of liberals denying science as much as conservatives. It might be true that you can find vaccine deniers in the aisles of Whole Foods, but if there’s any rigorous evidence that belief in the vaccine-autism link is especially pronounced or widespread among liberals, I haven’t seen it. Surely there’s a better, more substantive example than that floating around somewhere?
I’m not aware of any polling regarding the autism/vaccine link, and not for lack of trying. I’d be fascinated to see such a survey, and I suspect that you would see an ideological component to it. The highest rates of vaccine rejection in California occur, after all, in liberal Marin County, and I see more of it in San Francisco than I recall seeing in Kansas.
The closest I’ve come is a question from a 2009 survey by Pew, in which people were asked whether “Parents should be able to decide NOT to vaccinate their children” or if they thought “All children should be required to be vaccinated.” More than 2/3 favored requiring vaccinations (68.5%), and conservatives (65%) were less likely to favor compulsory vaccination than liberals (73%). The trend is statistically significant (by my own analysis of the raw data), though it’s a fairly modest effect on a practical level, and probably driven not by political ideology but by other underlying attitudes and affects.
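For readers curious what a significance check like that involves: a standard way to test whether a gap like 65% vs. 73% could be chance is a two-proportion z-test. The cell counts below are purely illustrative (I’m not reproducing Pew’s actual sample sizes here), but the calculation sketches the idea:

```python
from math import sqrt, erfc

# Hypothetical counts -- NOT Pew's real cell sizes, just chosen to
# match the reported percentages (73% of liberals, 65% of conservatives
# favoring required vaccination), with 500 respondents per group.
lib_yes, lib_n = 365, 500
con_yes, con_n = 325, 500

p_lib = lib_yes / lib_n          # 0.73
p_con = con_yes / con_n          # 0.65
p_pool = (lib_yes + con_yes) / (lib_n + con_n)  # pooled proportion under the null

# Standard error of the difference, assuming the null of equal proportions
se = sqrt(p_pool * (1 - p_pool) * (1 / lib_n + 1 / con_n))
z = (p_lib - p_con) / se
p_value = erfc(abs(z) / sqrt(2))  # two-sided p from the normal approximation

print(f"z = {z:.2f}, two-sided p = {p_value:.4f}")
```

With groups of this size the gap comes out significant at conventional thresholds; with much smaller samples the same 8-point difference would not, which is why raw percentages alone can’t settle Drum’s question.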
The problem is that conservatives are less likely, on average, to favor any collective action, so it’s not surprising that they’d be less likely to think vaccination should be “required.” You’d expect to see different answers if folks were asked whether all parents should choose to vaccinate, as opposed to whether all parents should be forced to vaccinate. And, of course, the question doesn’t get to people’s reasons for not vaccinating. Conservatives may think it’s some Strangelovian plot to pollute our essence and make us into communists, but I can’t think of a major conservative outlet that has advanced the absurd and oft-debunked attempt to link autism to vaccines. Nor, to my knowledge, have liberal opinion journals like Mother Jones or The Nation endorsed it, but you see plenty of it in the hippy-dippier magazines that cater to decidedly liberal audiences.