- Marsheaux — Hanging On
- Gil Scott-Heron — Me and the Devil
- Tony Joe White — As the Crow Flies
- Wrathchild America — Time
- Richard Thompson — Stony Ground
- Fun Lovin’ Criminals — The View Belongs to Everyone
- Alabama Shakes — Hold On
- Metric — Blindness
- The Morning After Girls — Chasing Us Under
- Eddy Grant — Electric Avenue
- American Head Charge — Take What I’ve Taken
- Midfield General — Reach Out
- Killing Joke — Requiem
- Primal Scream — Autobahn 66
- Sepultura — Bottomed Out
- Philip Boa and the Voodooclub — Rome In the Rain
- Ween — The Rift
- Black Grape — A Big Day In the North
- Chumbawamba — Timebomb
- Danger Mouse and Sparklehorse feat. Suzanne Vega — The Man Who Played God
Some of the most memorable pages here restate an argument Camus had already developed at length in “The Rebel”: not all means are acceptable, even when employed for noble ends; terrorism and torture destroy the very goals they are supposed to serve. This position was criticized as “idealist” (it was the reason for the famous break with Sartre), but Camus sticks to it — admirably, in my opinion: “Although it is historically true that values such as the nation and humanity cannot survive unless one fights for them, fighting alone cannot justify them (nor can force). The fight must itself be justified, and explained, in terms of values.”
Even more eloquent, perhaps, are his remarks on the responsibility of intellectuals in times of hatred: “It is to explain the meaning of words in such a way as to sober minds and calm fanaticisms.” Great writer that he was, Camus placed hope in the calming power of language carefully used, and of reason; in the preface, he asks his readers to “set their ideological reflexes aside for a moment and just think.”
“Times of hatred” strikes me as an odd designation. Are there ever times when people aren’t hating each other? Anyway, yes, The Rebel is probably my favorite of his books; I remember being puzzled after reading it and learning that intellectual opinion of the time had it that Sartre got the better of their argument with his criticism of the book. Hindsight, I presume, has rectified this and elevated Camus above the overrated, wall-eyed, Stalinist toad-man.
Funnily enough, even though I loved philosophy class, the thinkers who most impressed me, like Nietzsche, Camus, Dostoevsky and Kierkegaard (whose recent bicentennial has inspired a few good posts), were idiosyncratic philosophers at best, if not better described simply as writers.
It’s not shocking that a magazine called Time would be interested in the march of human generations. But the weekly’s much-discussed cover story on the late-’80s to mid-’90s “millennials,” “Generation Me Me Me,” glossed over (as do the inevitable retorts) the possibility that the year of one’s birth just isn’t very important. A broad study three years ago, based on perhaps the largest available data sets measuring American youth, was skeptical that “generational” cohesion—of the sort we obsess over—exists at all.
…The two psychologists’ findings seem like common sense. A woman born in Boston, raised in a bilingual environment, with two siblings; who served in Iraq with the Air Force; is married with two children; subscribes to HBO; and earns $52,000 a year working in commercial property administration, is supposed to share primary character traits with a man from a fourth-generation Wyoming family, monolingual, single, no kids, earning $24,000 as a painter, so long as they were both born in 1991, and are Facebook friends?
Fair play to Time, I suppose, for achieving their aim of getting tons of people chattering about their story, even if only to complain about it. Guitar magazines can only look on enviously, wishing that their next pointless cover story on “The 106 Greatest Shredders of All Time or at Least Since the Last List and Until the Next One” could attract so much attention. Generational analysis is astrology for people who fancy themselves too educated and sophisticated to take astrology seriously. The perfect marks, in other words.
From Bruce Hood’s book The Self Illusion, this is a dynamic I’ve certainly seen occur online:
As soon as we blend into the crowd, we no longer feel the need to put in as much effort if it is not recognized. It is only if the group appreciates our efforts that we try harder. This need for recognition also explains why groups can become more polarized on issues that would normally generate only moderate views. In an effort to gain group approval, individuals adopt increasingly extreme positions that they feel represent the group, which in turn drags the group further toward that position. If you couple that dynamic force with “groupthink,” the tendency to suspend critical thinking when we are one of many decision-makers so as to try and galvanize the gathering, then it is easy to see how we behave so differently in groups than we would as individuals.
It’s not hard to imagine a future when neurohumanities and neuroaesthetics have become so adulated that they rise up and out of the academy. Soon enough, they may seep into writers’ colonies and artists’ studios, where “culture producers” confronting a sagging economy and a distracted audience will embrace “Neuro Art” as their new selling point. Will writers start creating characters and plots designed to trigger the “right” neuronal responses in their readers and finally sell 20,000 copies rather than 3,000? Will artists, and advertisers who use artists, employ the lessons of neuroaestheticism to sharpen their neuromarketing techniques? After all, Robert T. Knight, a professor of psychology and neuroscience at Berkeley, is already the science adviser for NeuroFocus, a neuromarketing company that follows the engagement and attention of potential shoppers. When neuroaesthetics is fully put to use in these ways, it may do as Alva Noë said: “reduce people and culture to ends, simply to be manipulated or made marketable.”
And he has a point. Today, there’s the sudden dominance of so many ways to quantify things that used to be amorphous and that we imagined were merely expressive or personal: Big Data, Facebook, ubiquitous surveillance, the growing use of pharmaceuticals to control our moods and minds. In other words, neurohumanities is not just a change in how we see paintings or read nineteenth-century novels. It’s a small part of the change in what we think it means to be human.
Perhaps I have an answer to the question I asked a few months ago, then. Maybe people will look back on the early 21st century and laugh at the way so many educated people thought that the colored lights of fMRI studies would offer truer or deeper explanations of human existence rather than simply rewording what we already know.
I’m enjoying this site very much.
Yet you identify yourself as an atheist.
Yeah, but I really don’t care if God exists or not. If people can lead good lives by believing in God, that’s perfectly fine with me as long as they are not overly dogmatic. But some atheists have also become dogmatic.
Why are you bothered by atheists who don’t like religion and want to smash it?
Because religion is so inherently human that I don’t know what happens if you kick it out of society. Sigmund Freud wrote a whole book against religion and in the end he says that he is still not sure what would happen if we would remove it from society. It might not be good.
…Humans do terrible things to each other, sometimes in the name of God, sometimes without any religious reference. There is no proof that without religion we would be treating our enemies any better. We’re just not a particularly nice species when it comes to the out-group.
I think again of Razib’s recent post, which clarified a lot of things for me. The intellectual tendency he described is present in many atheist critiques of religion — they engage with belief as a body of rational propositions which, when taken to their logical conclusions, are obviously found wanting. As Razib noted, theology (and its rational opposition) leaves a comprehensive intellectual fossil record in the form of texts, which helps present a misleading picture of ideology’s importance in the actual lives of believers and non-dogmatic skeptics. But even the followers of creedal religions are often inconsistent and moderate in practice. Stated reasons are more like the makeup we apply before special events, not the face we wake up with every morning. And so, when latter-day positivists talk about reasoning religion out of existence and the objectively better world that will result, I just laugh at their naïveté. They might be surprised to find how much they have in common with a religious fanatic like U Wirathu, who shares the conviction that if only everyone would agree with him on everything of importance, then there would be no more problems. Isaiah Berlin made this mindset a recurring theme in his lectures and writing:
It was further believed that methods similar to those of Newtonian physics, which had achieved such triumphs in the realm of inanimate nature, could be applied with equal success to the fields of ethics, politics and human relationships in general, in which little progress had been made; with the corollary that once this had been effected, it would sweep away irrational and oppressive legal systems and economic policies the replacement of which by the rule of reason would rescue men from political and moral injustice and misery and set them on the path of wisdom, happiness and virtue.
Once people become convinced that there is one true answer to be found for questions of ethics, politics and relationships, it often seems perfectly reasonable then to marginalize opposition for the good of us all — peacefully if possible, violently if regrettably necessary. Religion has in many ways ceased to be an interesting issue for me; now, I mostly just stay vigilant for moralizing busybodies who have a grand plan for fixing everything and are looking for recruits and victims.
Speaking of nonexistent connexions once again: as a non-academic, an autodidact, and a Bear of Very Average Brain besides, I’m always on the lookout for the givens in an argument, the unfounded and unquestioned assumptions from which the rest of the assertions flow. Like we learned in geometry class, it doesn’t matter how many elaborate steps follow in a proof if your given is flawed. On that note, I can appreciate the usefulness of Daniel Dennett’s little heuristic:
Not always, not even most of the time, but often the word “surely” is as good as a blinking light locating a weak point in the argument. Why? Because it marks the very edge of what the author is actually sure about and hopes readers will also be sure about. (If the author were really sure all the readers would agree, it wouldn’t be worth mentioning.) Being at the edge, the author has had to make a judgment call about whether or not to attempt to demonstrate the point at issue, or provide evidence for it, and—because life is short—has decided in favor of bald assertion, with the presumably well-grounded anticipation of agreement. Just the sort of place to find an ill-examined “truism” that isn’t true!
Let’s examine the most egregious Facebook ad of them all: “Dinner” (in the video above). On the surface, it portrays an intergenerational family meal where a young woman escapes from the dreariness of her older relative’s boring cat talk by surreptitiously turning away from the feast and instead feasting her eyes on Facebook Home. With a digital nod to the analog “Calgon, Take Me Away” commercials, the young woman is automatically, frictionlessly transported to a better place: full of enchanting rock music, ballerinas, and snowball fights.
But let’s break Zuckerberg’s spell and shift our focus away from Selfish Girl. Think off-camera and outside the egocentric perspective framed by the ad. Reflect instead on the people surrounding her.
Ignored Aunt will soon question why she’s bothering to put in effort with her distant younger niece. Eventually, she’ll adapt to the Facebook Home-idealized situation and stop caring. In a scene that Facebook won’t run, Selfish Girl will come to Ignored Aunt for something and be ignored herself: Selfishness is contagious, after all. Once it spreads to a future scene where everyone behaves like Selfish Girl, with their eyes glued to their own Home screens, the Facebook ads portend the death of family gatherings.
Remember what I was saying about Hume and his connexions? This is a good example. If you’re already inclined to believe that most social media activity is frivolous, it’s very easy to thoughtlessly nod along with the next few steps in his reasoning. But people are not integers. Silly ads are not chains of logical propositions understood as literal commands. The Great Unwashed are not blank slates incapable of resisting or criticizing suggestions without the help of a philosophy professor. And when it comes to human behavior, like does not necessarily beget like. Ignored Aunt might very well possess the wisdom of several decades’ experience and be gracious enough to realize that Selfish Girl will eventually outgrow her moody adolescent self-absorption, as adolescents always have, without bearing her a grudge over it. Like most humans ever, Selfish Girl will likely become bored by her normal routine — in this case, frittering away spare moments on Facebook — and go looking for greener grass elsewhere. Perhaps even in flesh and blood relationships.
It’s typical of intellectuals, in their drive for logical consistency, to trick themselves into seeing logical necessity where none exists. Overthinking things, basically. Really, the idea that millions of years of social instincts honed by evolution will be destroyed or reshaped in a couple decades by some cool new gadgets is only fit for laughing at.
I have no pretensions to any special knowledge, let alone anything like wisdom; I am just some guy, a PERSON IN WORLD looking around and noticing things and saying what I think. If what I say doesn’t reflect your own experience, it’s possible that it isn’t about you. It’s also possible that something that’s not About You might still be of some interest or use. There is even some remote possibility that I am oversimplifying, missing something obvious, or just speaking ex rectum.
I’ve lately been rereading Montaigne, generally considered the first essayist, inspired by Sarah Bakewell’s literary biography “How to Live.” Ms. Bakewell singles out the end of one passage in which Montaigne suggests that being self-aware of your own silliness and vanity at least puts you one up on those who aren’t, then shrugs, “But I don’t know.” It’s that implicit I don’t know at the heart of Montaigne’s essays — his frankness about being a foolish, flawed and biased human being — that she thinks has endeared him to centuries of readers and exasperated more plodding, systematic philosophers.
My least favorite parts of my own writing, the ones that make me cringe to reread, are the parts where I catch myself trying to smush the unwieldy mess of real life into some neatly-shaped conclusion, the sort of thesis statement you were obliged to tack on to essays in high school or the Joycean epiphanies that are de rigueur in apprentice fiction — whenever, in other words, I try to sound like I know what I’m talking about.
A raised glass to both Kreider and my old pal Montaigne. Too many smart people, especially online, are more concerned with winning arguments than actually saying anything insightful or interesting. Punditry, both professional and amateur, has become intolerably boring for me.