David Hume was — at least on the matter of death and dying — a Socratic man. Even in his most canonical works, A Treatise of Human Nature and An Enquiry concerning Human Understanding, Hume was largely preoccupied with establishing limits. The way Hume saw it, our brief lives, crowned by unavoidable death, are unlikely to put us in touch with any grand absolutes. On the other hand, the human mind is an indubitably powerful tool and its powers of reasoning have penetrated many an enigma. Hume was as amazed by human knowledge as the next guy. He simply wanted us to be honest about its failings and limitations. Most of the things we know come from observing what happens around us and making the reasonable inference that what happens one day will continue to happen the next. The sun will rise, dropped objects will fall, harsh words will bite, etc. We don’t know the greater “why” of such things, suggested Hume, and there is no reason to think we ever will.

Hume always found himself astraddle those two great pleasures, study and society. When he spent too much time studying, the mechanisms of human reason would take flight, leading him into seemingly logical conclusions that defied his actual experience of the world. Thinking hard in his study, Hume would reach the conclusion, for instance, that there is no such thing as causality. Then he would step outside and go about the business of daily life in the full assurance that cause and effect operates just as we’ve always experienced it. Everyday experience would do its work, grounding him again in reality. That contrast between reason and experience never failed to amuse, trouble, and delight Hume.
Monthly Archives: April 2011
Arthur sent me an inspirational quotation from Oscar Wilde to encourage me as I go through this turbulent period of my life:
Liverpool midfielder Lucas Leiva believes this season has proved he has the quality to be a long-term success at Anfield.

The Brazil international was a much-derided figure among fans during his first three years after signing from Gremio in July 2007.

However, following the departure in successive summers of Xabi Alonso and Javier Mascherano, the 24-year-old has grasped the challenge of added responsibility and is enjoying the best form of his Reds career.

In previous campaigns he was criticised by supporters for being a favourite of former manager Rafael Benitez, but this term he has fully justified the belief the Spaniard had in him when others had their doubts.
Nietzsche was famously close to Richard Wagner for a time before splitting from him and subsequently heaping much scorn upon him. Most accounts of their relationship have focused on things like Wagner being a disciple of Schopenhauer, as Nietzsche was early in his life, thus making their split inevitable when Nietzsche became disenchanted with Schopenhauer’s philosophy. Or they mention that Wagner was the same age as Nietzsche’s father, who died when Nietzsche was only five, thus making Wagner the father-figure that Nietzsche had to rebel against for his own intellectual independence. You get the picture. Serious issues of philosophy and psychology here.
Only nine days after the composer’s death in 1883, Nietzsche confided in a letter to a friend, “Wagner was by far the fullest human being I have known.” However, he went on, “Something like a deadly offence came between us; and something terrible could have happened if he had lived longer.”

Details about this “deadly offence” emerged only in 1956, when correspondence first came to light between Wagner and a doctor who had examined Nietzsche. It related to a consultation Nietzsche had in Switzerland in 1877. The doctor, a passionate Wagnerian, examined Nietzsche and found his health poor—indeed Nietzsche was at risk of going blind. This was when Nietzsche and Wagner were still friends and so, following the examination, Nietzsche wrote to Wagner, reporting the diagnosis, but also enclosing an essay on The Ring, which the doctor had written and given to Nietzsche, on the understanding that it would be passed on. Wagner replied to the doctor, thanking him for the essay, but also raising the matter of Nietzsche’s health, apparently referring to the belief, common at the time, that blindness was caused by masturbation. The doctor, in his reply to Wagner, behaved extremely unprofessionally, confiding that, during his examination, Nietzsche told him he had visited prostitutes in Italy “on medical advice.” (This was sometimes recommended then as treatment for chronic masturbation.)

Even at this distance, the set of events is shocking; how much worse it must have been then. It is now known that the details of this exchange circulated during the Bayreuth Festival of 1882, coming to Nietzsche’s own notice later that same year. He confessed in a letter that an “abysmal treachery” had got back to him. More than one observer has concluded that this episode helped to unbalance Nietzsche.

It is a story that diminishes two great men.
This is a loooong essay by Raymond Tallis, and I fully admit up front that I don’t have the time to give it the careful attention and engagement it deserves. But when an argument starts off with such weak points, it doesn’t bode well for the rest of it, or make me inclined to spend much more time on it:
The failure to distinguish consciousness from neural activity corrodes our self-understanding in two significant ways. If we are just our brains, and our brains are just evolved organs designed to optimize our odds of survival — or, more precisely, to maximize the replicative potential of the genetic material for which we are the vehicle — then we are merely beasts like any other, equally beholden as apes and centipedes to biological drives. Similarly, if we are just our brains, and our brains are just material objects, then we, and our lives, are merely way stations in the great causal net that is the universe, stretching from the Big Bang to the Big Crunch.
Most of those who subscribe to such “neuroevolutionary” accounts of humanity don’t recognize these consequences. Or, if they do recognize them, then they don’t subscribe to these accounts sincerely. When John Gray appeals, in his 2002 book Straw Dogs, to a belief that human beings are merely animals and so “human life has no more meaning than the life of slime mold,” he doesn’t really believe that the life of John Gray, erstwhile School Professor of European Thought at the London School of Economics, has no more meaning than that of a slime mold — else why would he have aspired to the life of a distinguished professor rather than something closer to that of a slime mold?
Anyway, when you started talking about the world being all mystical it made me think of Jungian archetypes, which you don’t approve of. So I’m wondering why you draw the line of artistry at constructing a narrative. It’s alright to admire, to compose paeans and songs, but not stories? Why? I’ve always felt you don’t approve of the stories people tell themselves, as if constructing a narrative in order to self-soothe were somehow an act of weakness. But what I see here is simply a variation of degree, not of type.

Given that most people cannot (nor want to) resist their own pattern-making capabilities, I think my approach is most useful for those people who don’t desire the rigour of Zennish asceticism.

That’s always been my beef with Buddhism. I don’t need the capital-R real, capital-T truth. I don’t actually think it exists, and even if it did, I honestly don’t think it’s useful in a pragmatic, day-to-day reality. It’s kind of like pure science. I acknowledge that it’s useful, perhaps even necessary, but applied science is really where the action (and my interest) is.
We instinctively graft abstract concepts like “time,” “theories,” and “humor” onto more concrete concepts that are easier to visualize. For example, we talk about time as if it were a physical space we’re traveling through (“We’re approaching the end of the year”), a moving entity (“Time flies”) or as a quantity of some physical good (“We’re running out of time”). Theories get visualized as structures — we talk about building a case, about supporting evidence, and about the foundations of a theory. And one of my favorite metaphors is the one that conceives of humor in terms of physical violence. A funny person “kills” us or “slays” us, witty humor is “sharp,” and what’s the name for the last line of a joke? The “punch” line.

Interestingly, a lot of recent research suggests that these metaphors operate below the level of conscious thought.

Associating the future with the forward direction and the past with the backwards direction seems pretty harmless. But cases like “morality equals cleanliness” start to suggest how dangerous metaphorical thinking can be. If people conflate dirtiness with immorality, then the feeling of “Ugh, that’s disgusting” becomes synonymous with the judgment, “That’s immoral.” Which is likely a reason why so many people insist that homosexuality is wrong, even though they can’t come up with any explanation of why it’s harmful — any non-contrived explanation, at least. As the research of cognitive psychologist Jonathan Haidt has shown, people asked to defend their purity-based moral judgments reach for logical explanations, but if they’re forced to admit that their explanation has holes, they’ll grope for an alternative one, rather than retracting their initial moral judgment. Logic is merely a fig leaf; disgust is doing all the work.

So far I’ve been discussing implicit metaphors, but explicit metaphors can also lead us astray without us realizing it.
We use one thing to metaphorically stand in for another because they share some important property, but then we assume that additional properties of the first thing must also be shared by the second thing.

Since our ancestors’ genetic fitness depended on their sensitivity to each other’s mental states, it feels very natural for us to speak about plants or inanimate objects as if they were agents with desires, wills, and intentions. That’s the kind of thinking we’re built for. In fact, even the phrase “built for” relies on the implication of a conscious agent doing the building, rather than the unconscious process of evolution. Metaphors are so deeply embedded in our language that it would be difficult, if not impossible, to think without them. And there’s nothing wrong with that, as long as we keep a firm grasp — metaphorically speaking — on what they really mean.
Most of the time, I don’t have the sort of dreams that make the slightest bit of sense or relate to anything going on in my life at the time. But last night, I dreamed about one of my dogs. It was like I walked into a room, and all of my dogs jumped down off the bed to greet me, and when they did, I saw he had been lying behind them, hidden in the covers. I whispered hoarsely, “Hey, big bear (one of his dozens of nicknames), where have you been? Come here, buddy.” He laid his ears back like he did when he was happy, mouth wide in a dog-smile, wagging and starting to get up. Then I was suddenly awake in the dark, heart jackhammering in my chest, breath hitching in my throat, eyes brimming with tears.
A thought occurred to me earlier, so I went back and checked, and I was right: I’ve barely written about politics at all in the past few months. Why, I’m quite proud of that fact.
Then I Went Down Into the Basement Where My Friend, the Maniac, Busies Himself with His Electronic Graffiti
But here is the thought that makes death formidable again: it is the moment after which I will never post to my blog again, after which I will never write another Facebook status update, I will never again tweet. My soul cries out: “But I can’t live without doing these things!” And death answers back: “No. But you can die.”

This is to say that my life is wrapped up with an activity from which I will have to leave off at death. But it is also the activity, I am increasingly coming to think, of actively constructing my self, and this activity, when it leaves off in death, will leave an accurate and vivid trace of a life. My online activity is, as I already put it, both mask and gravestone at once.

This may not be an entirely different experience from the one Robert Burton had when he brought out the fourth and fifth and sixth editions of The Anatomy of Melancholy. But the immediacy of blogging, and its lack of finality, make it much more like life itself than a book ever could be, with the backlogged publishers who promise it and the slow-churning presses that produce it. There is no publisher to blame if my blog is not sufficiently far along in its perfection at the moment of my death, no editorial wrangling or grinding production process. There is only me, and what I hope might be a mirror of me, diffused then through the Web of my culture by means I don’t understand: a mirror of me mirroring the world by sitting in a chair and looking at a screen and, every now and again, out the window.