Why, sometimes I’ve believed as many as six impossible things before breakfast.
– Lewis Carroll
Recently, a few political scientists have begun to discover a human tendency deeply discouraging to anyone with faith in the power of information. It’s this: Facts don’t necessarily have the power to change our minds. In fact, quite the opposite. In a series of studies in 2005 and 2006, researchers at the University of Michigan found that when misinformed people, particularly political partisans, were exposed to corrected facts in news stories, they rarely changed their minds. Indeed, they often became even more strongly set in their beliefs. Facts, they found, were not curing misinformation. Like an underpowered antibiotic, facts could actually make misinformation even stronger.
This bodes ill for a democracy, because most voters — the people making decisions about how the country runs — aren’t blank slates. They already have beliefs, and a set of facts lodged in their minds. The problem is that sometimes the things they think they know are objectively, provably false. And in the presence of the correct information, such people react very, very differently than the merely uninformed. Instead of changing their minds to reflect the correct information, they can entrench themselves even deeper.
“The general idea is that it’s absolutely threatening to admit you’re wrong,” says political scientist Brendan Nyhan, the lead researcher on the Michigan study. The phenomenon — known as “backfire” — is “a natural defense mechanism to avoid that cognitive dissonance.”
These findings open a long-running argument about the political ignorance of American citizens to broader questions about the interplay between the nature of human intelligence and our democratic ideals. Most of us like to believe that our opinions have been formed over time by careful, rational consideration of facts and ideas, and that the decisions based on those opinions, therefore, have the ring of soundness and intelligence. In reality, we often base our opinions on our beliefs, which can have an uneasy relationship with facts. And rather than facts driving beliefs, our beliefs can dictate the facts we choose to accept. They can cause us to twist facts so they fit better with our preconceived notions. Worst of all, they can lead us to uncritically accept bad information just because it reinforces our beliefs. This reinforcement makes us more confident we’re right, and even less likely to listen to any new information. And then we vote.
This effect is only heightened by the information glut, which offers — alongside an unprecedented amount of good information — endless rumors, misinformation, and questionable variations on the truth. In other words, it’s never been easier for people to be wrong, and at the same time feel more certain that they’re right.
Noam Chomsky has reiterated a similar point many times; namely, that the government and corporate clerisy, despite being “intelligent” by the standards we generally use to measure such things, and despite an impressive pedigree of formal education, is much more susceptible to absorbing propaganda than the rabble, who are more likely to believe what their experience tells them rather than twist it to fit their preordained conclusions. And while your everyday bumpkin is certainly capable of believing all sorts of stupid shit, I’ve noticed a similar proclivity for open-mindedness myself among people with no emotional stake in forcing facts to conform to a rigid worldview. Being more widely read, having a larger vocabulary, and taking an active interest in worldly affairs can just as easily provide an ideologue with that many more excuses, hiding places, and smokescreens to preserve their dogmatic principles.
My brother is a typical Fox News foot soldier, a Glenn Beck devotee, the whole nine yards. And yet, he’s not truly a drooling idiot, unless you agree with the maxim of the philosopher Forrest Gump, who said, “Stupid is as stupid does.” Some months back, I tried to play the role of Matt Taibbi to his Byron York on the issue of the bailouts and the more general theme of taxpayer money. I asked him why he was so outraged over the thought of “welfare” for shiftless minorities, while being apparently unruffled by the fact that he and I and everyone else, for that matter, had just been utterly robbed blind by a bunch of white suit-and-tie-wearing free-marketeers who almost certainly worship Ayn Rand as a goddess. His response was something along the lines of, well, the banksters were forced to make those bad loans to “those” people, even though they knew it was a bad idea, because the bleeding hearts in the government decided that every black and brown person in America should be a homeowner whether they were capable of handling the responsibility or not, so therefore we shouldn’t be angry at them for simply figuring out a way to profit from a disadvantageous situation. I pointed out to him that in one of the debates with John Kerry, Bush himself had bragged that his administration had done more than any other to increase the number of minority homeowners in America, and how did that square with his assertion of liberal do-gooders being responsible for it? He dismissed it the same way that his hero Beck has attempted to dismiss all the aspects of Bush’s presidency that don’t fit with teabagger dogma: by basically calling Bush too liberal on social and domestic issues. George W. Bush. Too liberal. The mind reels.
The other day, when it was widely reported that the “Climategate” pseudo-scandal had officially been debunked, I couldn’t resist taunting him with the headline in the New York Times. Immediately, of course, despite only having heard of this story two seconds prior, he snorted at the idea that the NYT could even report the weather accurately without injecting a commie liberal bias into it, before segueing effortlessly into blasting the study as a Justice Department conspiracy against capitalism, followed by a lightning-quick detour into ranting about the Fox-derived obsession with Obama’s DoJ and the New Black Panther party. I barely had time to inform him that this was a study done by a British parliamentary commission, and that the American government had nothing to do with it, before he sarcastically responded with, “Oh! Well, if the British say so, it must be true!”
You see what I have to deal with on a regular basis.
What kind of rejoinder is that? Who’s asserting anything about blindly trusting anything British? This is what’s so wearying about even trying to have arguments with people who possess no intellectual integrity — every single thing they say is built on a foundation of ridiculous assumptions and outright bullshit. You can’t even begin to address their ostensible “argument” — assuming you can even find a logical train of thought in there — until you clear up all the incoherent debris underneath it all. Trying to argue with them without challenging their assumptions plays right into their hands, much like taking the bait if asked when you stopped beating your wife. You’ve already validated half of what they’re saying just by answering it seriously.
To any neutral observer, he just proved twice in a matter of seconds that he has no clue what he’s talking about, and that facts mean absolutely nothing unless they conform to what he’s already decided he wants to hear. Yet he manages to blithely continue on, unhampered by the crushing weight of self-awareness that would shame him into silence, simply erasing such incidents from his memory as if they never happened, before changing the subject to whatever else Beck has started ranting about in the meantime. Like I said, he’s not an idiot. His brain does work. He can form complete sentences, spell fairly well, and even read for pleasure. But there’s something going on psychologically, on a different level, that subordinates those qualities to supporting preconceived conclusions, because it would be too stressful and chaotic for him to have to admit being wrong and start again, without being able to lean on one ideological crutch or another.
This is why I tend to be pessimistic about the idea that humans will always rationally choose to better themselves, that all we need is more information to lead us out of ignorance and self-destruction. Truth-seeking for its own sake is a relatively new ideal; far more ingrained in our makeup is the drive to see what we want to see, to cater to our own narcissism, to reinforce our hopes and fears rather than challenge them.