Radical ideas furnished an entertaining social pastime for the aristocracy, but neither they nor anyone else discerned that all of France’s institutions as well as its social fabric would face a devastating assault. When he finished compiling a list of the grievances contained in the Cahiers de doléances (the 60,000 pamphlets written in response to the king’s request for comments on the state of the nation), Tocqueville was thunderstruck. “I realize with a kind of terror that what is being demanded,” he wrote with stupefaction, “is the simultaneous and systematic abolition of all the laws and customs of the country. I see that it will be a question of one of the most dangerous revolutions that ever took place in the world.”
And yet the proponents of this radical change, who would be its unwitting future victims, had no notion of the violence that would accompany such a total and sudden transformation of French society. They naïvely thought that the whole process could be carried out through “reason” alone.
This is the point Gray elects to miss and has elected to miss many times before. Human beings are social creatures whose sociability manifests itself in feelings of empathy and altruism. But these feelings are not always in evidence and sometimes they give way to hatred and to violence. Hatred and violence are not exceptional. History, as Gray never tires of reminding us, is strewn with the corpses of the murdered and maimed. But nor are hatred and violence the rule. And when we encounter them – sometimes, not always – our better selves are mobilised. Moreover, it is in this spirit – and not in any post-Christian attempt to take a lathe to the crooked timber of humanity – that we try to improve the lot of our species: so that Mary Turner’s descendants are not strung up and emptied of their progeny; so that orphans with tears in their unseeing eyes are taken in and given a bowl of soup; and so that our own children can have a decent education and the chance of a job at the end of it. Is this a hubristic belief in progress? The very suggestion dies on the lips.
Human beings may not develop but human institutions do. Sometimes they develop in good ways and sometimes they develop in bad ones, and whether the development is good or bad it is never irreversible. Of course such freedoms and rights and securities as we have won could all be swept away if another Hitler came to power. That is what makes the fight for justice not just worthwhile but necessary. Gray wants us to believe that this fight is no different from the one waged by Christians and communists alike. He is wrong. To seek to make things better is not the same as thinking that they can be made perfect. The problem we face – that humanity faces – is not faith in the future but indifference to it. Resource wars are already in progress and population growth is out of control. A catastrophic change in our climate, growing inequality, the prospect of a nuclearised Middle East: these problems are not on the horizon – they are upon us. In The Silence of Animals, Gray talks about the ‘current fad for evolutionary theories of society’. I don’t know what theories he means. But there is one thing I do know, or think I do: without a little ‘evolution’ or ‘progress’ in the political sphere our flawed and wonderful species is doomed.
First of all, let me just say that this article was a delight to read. Critical yet fair, it’s an engrossing overview of Gray’s recent oeuvre as well as a specific review of his latest book. I wish that sort of thing didn’t deserve special mention, but there you go.
Now, then, let me suggest that, rather than seeking to make anyone “believe” anything, what Gray is doing is giving book-length exposition to the thinking of the Chinese farmer:
The situation we always live in is like the wise Chinese farmer whose horse ran off. When his neighbor came to console him the farmer said, “Who knows what’s good or bad?”
When his horse returned the next day with a herd of horses following her, the foolish neighbor came to congratulate him on his good fortune.
“Who knows what’s good or bad?” said the farmer.
Then, when the farmer’s son broke his leg trying to ride one of the new horses, the foolish neighbor came to console him again.
“Who knows what’s good or bad?” said the farmer.
When the army passed through, conscripting men for war, they passed over the farmer’s son because of his broken leg. When the foolish man came to congratulate the farmer that his son would be spared, again the farmer said, “Who knows what’s good or bad?”
Ad infinitum. Like waves, the events of our lives have no true essence of their own; they simply ebb and flow. If that thought depresses you, then you might just share a little more of that faith in perfectibility than you think.
People are very good at identifying specific things that need to be made “better”. Often, though, they lack broader perspective, and so the improvements they make in one area give rise to new problems in a different area, and off we go again. Like Todd Anderson in Dead Poets Society, no matter how they pull and stretch the blanket, some other part of their body is left exposed and cold. However much weight you want to give to humankind’s sociable nature, we can at least agree that humans are certainly pattern-seeking animals, and with that in mind, it’s not perverse to recognize a certain disheartening pattern in the efforts people make to control and optimize their world. No, hatred and violence are not the rule. But neither are empathy and altruism. There is no “rule”, only endless give-and-take. Perhaps that’s what Nietzsche was driving at in his conception of the Eternal Recurrence — is humanity capable of accepting such nullifying insights that make a mockery of all they hold dear? Are people capable of unhumanizing their views a little along with Robinson Jeffers?
Almost certainly not. A perspective like this inhabits a forbidding perch; the intellectual air is cold and thin. And humans in general are social enough that they will happily stick together on more hospitable terrain, continuing to dream of a blanket big enough.
That final line is the weakest point of the whole essay. The entire species, doomed? You mean every single last one of us? How likely is that? Human culture, as we understand it today, may not last much longer, but even if a mutated supervirus, an asteroid, a murderously enraged Gaia or all-out nuclear holocaust reduced humans to a few million post-apocalyptic hunter-gatherers scattered in isolated bands, the species would likely continue in some form. Even the dinosaurs are still with us in the form of birds, after all. And perhaps that thought scares us even more — even our dethronement as the dominant species on Earth wouldn’t rate a Götterdämmerung of significance. It would just be one more wave in the endless flow of history.
The invention of mobile camera-phones. On the one hand, as XKCD says, they’ve unwittingly eliminated whatever tiny likelihood there was of the existence of paranormal activity. On the other hand, they’ve helped remind us that we take our lives in our hands anytime we eat food we haven’t prepared and cooked ourselves.
After all, if we’d been getting smarter these last 15-plus years, you’d expect that humanity might have formed new and deeper insights into the nature of existence, and used those insights to update our collective goals: world peace, eliminating hunger, and flying taxis driven by cats wearing little chauffeur’s caps. But not only haven’t we gotten wiser and developed new collective goals, we haven’t even gotten any cleverer and moved closer to achieving the same old ones we’ve always pined for. There’s still the endless butchery of war and the terminal ache of starvation.
Of course, none of it’s a surprise. There are at least two obvious reasons why the existence of a cheap, even free, storehouse of knowledge, the likes of which could not have even been imagined by most people a generation ago, has done little to make us all a whole helluva lot smarter.
For starters, people can be lazy and superficial. Whether you prefer a Marxist interpretation, an existential one, or something equally incisive but less Eurocentric, the conclusion is the same: Lots of people are largely obsessed with chasing pleasure and shirking meaningful work. They’d rather read about celebrity gossip than learn about mechanical engineering or medicine. They’d rather indulge a neurosis or compulsion than work towards the common betterment. And they’d rather watch funny cat videos than try to figure out how those ghastly little beasts can better serve us.
This is why, when you plop an unfathomably rich multimedia warehouse of knowledge in front of them, they’ll mostly use it to while away the hours on Facebook and Twitter. In much the same way that if you give them an actual book, and eliminate the social stigma that says books are sacred, instead of reading it they might be inclined to rip out the pages and make paper airplanes. The creative ones might set them on fire before pitching them out the window, in a quest to create a modern, aerial Viking funeral.
We had a traveling academic friend spend a weekend with us last month in between conferences, and one of the attempts we made to camouflage our boring home life was to take her to visit Thomas Jefferson’s home, Monticello. Afterward, somewhere in the course of the conversation reflecting on his accomplishments, character and legacy, one of us joked about the likelihood of him being suicidally depressed were he to come back and witness the spectacle of millions of people having instant access to the sort of education and information that took him a lifetime to accumulate, yet not being interested in the slightest. ‘Course, it’s obviously debatable just how far Jefferson’s Enlightenment idealism actually extended; he clearly had no problem believing that some classes of people were exempt from that “all men created equal” jazz. Maybe he would take it all in stride, who knows.
Point is, pace a fine fellow like David Cain, being human isn’t something you really “get better” at. “Being human” encompasses everything between Genghis and Gandhi, not just the parts that appeal to highly specific modern cultural mores. Even in “civilized” nations, during episodes of sport — something that supposedly distinguishes us in its complexity from the mere animals — the most horrific barbarism doesn’t need much encouragement to rear its atavistic head. Should educated Western suburbanites miraculously manage to herd the entirety of humanity into thinking, behaving and believing in accordance with their values, humanity en masse would not then level up and unlock all sorts of fantastical knowledge and abilities. You hear a lot of condescending scorn about the various opiata populi, but the opium of the intellectuals, the pleasant, colorful haze between them and the rictus grin on the face of existence, is the belief that human existence is for something, aimed at this or that telos, and the idea that true humanity could be achieved if only humans would stop being so, well, human, is just as much a delusion as anything the great unwashed have ever come up with.
Zuckerman, however, is not a knee-jerk naysayer about all things digital. The director of the MIT Center for Civic Media and cofounder of the international bloggers’ website Global Voices, he is extremely enthusiastic about the potential of using technology to connect people across cultures. He wants the Internet to be an empathy machine. The difference between him and the full-throated apostles of cyber-utopianism is that he does not believe that the online world is foreordained to fulfill this purpose, nor does he naively assume that the outcomes of cross-cultural connections will always be desirable.
Everything depends in the end on whether we can find direct, causal evidence: we need to show that exposure to literature itself makes some sort of positive difference to the people we end up being. That will take a lot of careful and insightful psychological research (try designing an experiment to test the effects of reading “War and Peace,” for example). Meanwhile, most of us will probably soldier on with a positive view of the improving effects of literature, supported by nothing more than an airy bed of sentiment.
The archetypal image of the sage is one of serenity and an almost-otherworldly lack of concern for the mundane obsessions of everyday life. In other words, the accumulation of knowledge and experience, rather than simply being a super-sum of positive integers, tends to produce a state of being that would seem very much at odds with the sort of simplistic, progressive partisan vision of “better” or “nicer” citizens. Knowledge is a double-edged blade that can be used for the same values and desires people have always had; increasingly complex experience isn’t likely to express itself in platitudes.
So Owl gave me the first intimation in my life that all are not wise who claim to be learned. And Owl was a hint also that the clever could be the most foolish of all.
But why did owls symbolise wisdom in the first place? The splendid photos in my book, succinctly titled Owls, suggested a reason: owls seem to have only two states, the serene calmness of sleep and the most intense alertness when awake. Try as we might not to anthropomorphise, owls look serious; they indulge in no foolish or redundant movement. This is nonsense, of course: owls are bird-brained. And one of the things that I learnt from this book, delightful to me because completely useless, is that the Owl of Minerva does not necessarily spread her wings at dusk: nearly forty per cent of the 133 extant species of owls are diurnal, not nocturnal. I bet you didn’t know that.
…The law of unintended consequences is one of the hardest for people to learn because it is so unflattering to our conception of ourselves as rational beings, and because (if it is a law) it suggests inherent limits to our power. We shall never fail to commit errors.
Those excerpts are indeed all from the same essay, an essay which just so happens to be about two of my favorite things: owls and unintended consequences. Naturally, I had to acknowledge it.
Once in my teenage years, after a soccer game, some teammates and I were eating dinner at a restaurant. Somehow, the conversation turned to deciding which animal we each resembled. The consensus was that I was, of course, an owl. Possibly because of my wide eyes, serious expression and quiet bookishness. Or possibly because of my ability to move silently and swivel my head 270°.
Whatever the case, I shortly thereafter underwent the ritual to adopt the owl as my spirit animal. Climbing a tree under a full moon, I hooted and prayed for a vision, while doing my best to resemble a feathered harbinger of death. Soon, my sacred quest was rewarded by the rustle of prey in the leaves below, which turned out to be my mom who had come looking for me. She did admit that my downward swoop was silent and terrifying, at least.
Since then, I have been blessed with the supernatural abilities to win any staring contest and to snatch up a swiftly running rodent with my bare hand.
On February 11, 2011, children the world over gathered about their television sets to watch a new episode of the smash hit cartoon My Little Pony: Friendship Is Magic. On this particular program, one of the ponies manifests the ability to predict the future, and Twilight Sparkle, the show’s resident rational positivist, sets out to debunk the phenomenon. What follows is that she is physically and intellectually humiliated at every possible turn, resulting in a climactic breakdown in which she finally rejects her own role as a rational observer and declares that, “I now realize there are wonderful things in this world you just can’t explain, but that doesn’t necessarily make them any less true. It just means you have to choose to believe in them.”
DeBakcsy is concerned about what he sees as a recurring theme in cartoons. You know me; I’m too sanguine to worry about the political ramifications of wrongthink in popcorn entertainment. Reason and the scientific method are made of sterner stuff than that, and overly literal-minded propagandists often fail to reckon with the creative detours their supposedly unambiguous message can take in the minds of the audience.
Within my working lifetime, the pattern of antibiotic resistance in healthcare has transformed from a rare but notable event, to a problem of epidemic proportion. If we are to avoid a return to the pre-antibiotic landscape with all its excess mortality we must be bold. To squander the advantage we have so recently gained against microorganisms in the fight for life would be unthinkable.
Undesirable, sure; unthinkable, huh? It’s always been the bacteria’s world, we just live in it. Now pardon me while I return to paranoiacally monitoring my incisions for signs of infection and whimpering softly.
Jonathan Haidt, following Jerry Muller’s lead, distinguishes conservatism from orthodoxy:
Orthodoxy is the view that there exists a “transcendent moral order, to which we ought to try to conform the ways of society.” Christians who look to the Bible as a guide for legislation, like Muslims who want to live under sharia, are examples of orthodoxy. They want their society to match an externally ordained moral order, so they advocate change, sometimes radical change. This can put them at odds with true conservatives, who see radical change as dangerous.
Muller next distinguished conservatism from the counter-Enlightenment. It is true that most resistance to the Enlightenment can be said to have been conservative, by definition (i.e., clerics and aristocrats were trying to conserve the old order). But modern conservatism, Muller asserts, finds its origins within the main currents of Enlightenment thinking, when men such as David Hume and Edmund Burke tried to develop a reasoned, pragmatic and essentially utilitarian critique of the Enlightenment project.
…Muller went through a series of claims about human nature and institutions, which he said are the core beliefs of conservatism. Conservatives believe that people are inherently imperfect and prone to act badly when all constraints and accountability are removed. Our reasoning is flawed and prone to overconfidence, so it’s dangerous to construct theories based on pure reason, unconstrained by intuition and historical experience.
A similar theme which I heard years ago differentiated conservative from radical, not from liberal. Liberal is rather the opposite of authoritarian. Others have juxtaposed empiricism and rationalism in a similar manner. And this theme is also characteristic of the thinking of John Gray and Isaiah Berlin, two of my favorite authors:
Gray, like his friend and mentor Isaiah Berlin, sets himself against all proponents of the grand idea – of progress, of perfectibility, of the right and only way to live. He would, one suspects, champion the bureaucrat over the ideologue any day. We love to castigate bureaucracies – look what a hate-word “Brussels” has become for our latter-day Jacobins of right and left – but consider the alternative. People who kiss their spouses goodbye in the morning, stick from nine-to-five at their humdrum desks, and come home in the evening looking forward to a nice dinner and something on the telly, are surely to be preferred to those cold-eyed demagogues, “the prophets with armies at their backs”, as Isaiah Berlin has it, who conceive a burning vision of exactly how the world should work and are prepared to spill the blood of millions to ensure the imposition of their system.
That is not to say Leiter’s argument is watertight. The claim that religion deserves no special exemptions from generally applicable rules may be right, but not because there is anything particularly irrational or otherwise lacking in religious belief. After all, what counts as a religious belief? Aware of the difficulty of defining religion, Leiter devotes a section of the book to the question. His discussion is more sophisticated than many on the subject, but he still draws a categorical distinction between religious and other beliefs that is difficult, if not impossible, to sustain. Among the distinctive features of religious beliefs, he maintains, is their insulation from evidence. Religious believers may cite what they consider to be evidence in support of their beliefs; they tend not to revise these beliefs in the light of new evidence, still less to cite evidence against them. Instead, their beliefs are part of what Leiter describes as a “distinctively religious state of mind . . . that of faith”.
The trouble is that it is not only avowed believers who display this state of mind.
…Again, nothing infuriates the current crop of evangelical atheists more than the suggestion that militant unbelief has many of the attributes of religion. Yet, in asserting that the rejection of theism could produce a better world, they are denying the clear evidence of history, which shows the pursuit of uniformity in world-view to be itself a cause of conflict. Whether held by the religious or by enemies of religion, the idea that universal conversion to (or from) any belief system could vastly improve the human lot is an act of faith. Illustrating Nietzsche’s observations about the tonic properties of false beliefs, these atheists are seeking existential consolation just as much as religious believers.
If religion does not deserve a special kind of toleration, it is because there is nothing special about religion. Clinging to beliefs against evidence is a universal human tendency. The practice of toleration – and it is the practice, cobbled up over generations and applied in ethics and politics as much as religion, that is important – is based on this fact. Toleration means accepting that most of our beliefs are always going to be unwarranted and many of them absurd. At bottom, that is why – in a time when so many people are anxious to believe they are more rational than human beings have ever been – toleration is so unfashionable.
Years ago, I would have read this and bristled over the facile equation of atheism to religion. And if that were his main point, I’d probably still react that way. But the more interesting — and true — point here is the almost banal reminder that, insistence to the contrary notwithstanding, we don’t actually have any meaningful idea what would happen if the whole world adopted western-style atheism. People might no longer be stupid in uniquely monotheistic ways anymore, but I think it’s a safe bet that we would just find new ways to express our bottomless reserves of stupidity. The point is not that we shouldn’t care about pursuing truth or making improvements; the point is just that we can observe how the same perennial themes of human nature reassert themselves even when, especially when, we pride ourselves on our supposed accomplishments. The Greeks were on to something with all that stuff about hubris, no less so for having expressed it in mythological story-form. As certain segments of the online atheist community have made brutally clear this year, reasoning your way to the nonexistence of God is not necessarily any protection against being insanely stupid in other ways.
How do people react when they’re actually confronted with error? You get a huge range of reactions. Some people just don’t have any problem saying, “I was wrong. I need to rethink this or that assumption.” Generally, people don’t like to rethink really basic assumptions. They prefer to say, “Well, I was wrong about how good Romney’s get-out-the-vote effort was.” They prefer to tinker with the margins of their belief system rather than concede, “I fundamentally misread US domestic politics, my core area of expertise.”
A surprising fraction of people are reluctant to acknowledge there was anything wrong with what they were saying. One argument you sometimes hear, and we heard this in the abovementioned episode, but I also heard versions of it after the Cold War. “I was wrong, but I made the right mistake.”
More and more, I find the kind of issues explored by authors like Daniel Kahneman, Dan Ariely, the Brafman brothers, Thaler and Sunstein, Chabris and Simons, etc., to be far more interesting and pertinent than the details of ideological differences.