Eric Cantona has added to his long list of unique and bizarre speeches after collecting the UEFA President’s Award.
The former United forward was on stage ahead of the Champions League draw in Monaco to receive the award, which “recognises outstanding achievements, professional excellence and exemplary personal qualities”.
Dressed in a rather casual shirt, jeans and flat cap and sporting a familiarly large beard, the 53-year-old began by quoting William Shakespeare’s King Lear: “As flies to wanton boys, we are for the gods.”
The audience, including Lionel Messi, Cristiano Ronaldo and Virgil van Dijk, looked on perplexed as Cantona continued: “They will kill us for the sport.
“Soon the science will not only be able to slow down the ageing of the cells, soon the science will fix the cells to the state and so we will become eternal.
“Only accidents, crimes, wars, will still kill us but unfortunately, crimes, wars, will multiply.
“I love football. Thank you.”
August 2019
Thursday Throwback
[Originally published Feb. 1, 2011.]
As Mr. Choprah has learned, people will pay good money to be told by a religious authority figure that they will live forever. People have paid damn good money to hear that from religious authority figures for a very long time and in cultures across the globe. It is quite a reliable strategy for making a living.
…I don’t necessarily think that Mr. Choprah is cynically exploiting his readers by telling them lies. He says what he says in order to create a reassuring feedback loop from himself to his readership and back again that helps relieve his own fears of death. This is also a time-tested strategy and appears to work for some people.
The question of the exact ratio of witless fraud to manipulative liar that constitutes Deepak Chopra is an academic one, and not all that important. The results are the same in any case, no matter what he truly believes in his heart of hearts.
But “believe” is kind of a funny term when you really think about it. What does it mean to really believe something? Does it mean you are about as sure as you can be without having direct proof, as in believing that gravity will still be in effect and the sun still in the sky tomorrow? Or does it mean something more like hope? Either way, as with all abstract concepts, the term exists in contrast to its opposite, doubt. Claiming to believe something implicitly allows the possibility of doubt. And when we’re talking about such a momentous and emotionally charged issue as the possibility of life after death, you can bet your sweet bippy that the boundary between “things we know to be objectively true” and “things we desperately want to believe are true”, poorly demarcated at the best of times, becomes practically nonexistent.
What I find interesting is the way Brad puts it: Chopra is just as much a prisoner to his audience as they are to their own perceived need for someone wiser than them to tell them all the answers. Yes, I know, the poor bastard, imprisoned in his several mansions. Forced to endure a retinue of underlings fanning him, popping grapes into his mouth, and throwing themselves across mud puddles for him. What I mean is: Deepak, being a confused shaved ape like the rest of us, though a fantastically rich one, is no different when it comes to falling prey to the fear of nonexistence. I’m willing to allow that he has genuinely wondered about all the big questions in life just like anyone else. But if there ever were a time when he felt free to do so, surely he doesn’t have that freedom now. Whether by happenstance or design, he’s been presented, or presented himself, as a guru. Does he truly believe he has all the answers? Well, the positive reinforcement from his devoted fans probably helps assuage any private doubts he might have. As the bank account keeps growing and the adulation pours in, and as his identity as a sage gets ever more firmly cemented in place by the feedback from hundreds, thousands, millions of others, he had better believe it for the sake of his psychological well-being. There’s no way he can ever allow himself to say those simple words, “I don’t know.” No one has time for a wise man who qualifies his statements constantly with things like, “Of course, that’s just my opinion…I could be wrong…I’m no more of an expert on this than you are…I’ve never considered that before, actually…fucked if I know.” Well, if you don’t have any clear answers, then why should we listen to you? And if no one’s listening to me anymore, who am I?
Being able to freely admit ignorance is what keeps us mentally supple. It’s what keeps us from being too proud to learn anything new, too afraid of losing face to ask a question, too stubborn to change our minds when necessary. But few, if any, of us would have the courage to face an audience trained to hang on our every word and do just that. The more you come to see yourself as a person with a message, the less chance you’ll retain that flexibility. The worst thing that could happen to any sincere truth-seeker would be to glance back over their shoulder and see an army of followers staring back at them.
There’s No Shame In Going Out of Style
… my brother asked me why I didn’t just have a newsletter, like everyone else, instead of writing whole paragraphs on Twitter—which isn’t designed for paragraph-length thoughts. I didn’t really have a good answer to that, except to say that I like Twitter. Probably because play consists of whatever a body is not obliged to do. He reckoned I should start a newsletter, populate it with my random thoughts, get a bunch of subscribers, then ask people to pay. He was sure this would work.
I admit to being a bit confused about why this newsletter thing has caught on. I subscribe to Micah Mattix’s, because I like that sort of books-and-art link aggregation. I also subscribe to Alan Jacobs’s, though I don’t understand why he feels the need to separate “things that give me delight” from “analysis or critique.” Why can’t a blog do both? Why can’t it be anything and everything, from pictures to brief comments to long essays? I know WordPress makes it easy to “follow” a favorite blog and get new posts sent straight to your inbox, and I’m pretty sure Blogspot does too. And if you don’t want to hear from the rabble, you can just turn the comments off. How is that any different from a newsletter? (And whatever happened to that other recent attempt to reinvent the wheel, Tweetlonger or Tweetstorm or whatever it was called?) I guess I’m just amused by the way people seem determined to overlook a perfectly flexible, convenient platform for writing online that’s been around for twenty years now.
In Anthony Kronman’s newest book, he spends a fair amount of time writing about what he calls the “conversational ideal,” which, in his view, should be the animating spirit of academia, a golden mean between the stifling, therapeutic ideal of safe-space culture and the market-oriented rough-and-tumble free-speech culture. While reading that, it struck me that the conversational ideal is what I’ve always loved about blogs. A blog is a way for isolated individuals to connect with like-minded people over shared interests, but on a much smaller, personal scale, where actual written conversations can take place. Once Facebook and Twitter took over and leveled the landscape, communication quickly flowed down to sea level, and now you might as well be trying to have a conversation in a crowded pub, a football stadium, or a metropolitan street at rush hour. (Some junkies are honest enough to admit that the cheap thrill of “statistical dopamine” is the only reason for anyone to degrade themselves like that.) A common practice on blogs back in the day, one which I still follow, was to use an excerpt from someone else’s post, or from a book, as a springboard for one’s own thoughts. This helped maintain the sense of a blog as a record of ongoing conversation. Now it seems that “quote-tweeting” someone on Twitter is widely considered creepy, if not outright harassment. Everyone wants to speak, but no one wants to listen and respond intelligently. Conversation becomes just another zero-sum battle, as Bill Watterson foresaw.
The Lady of the House, who is much more worldly in these matters than me, says that a lot of “content creators” {shudder} prefer newsletters in order to have a captive email list for future marketing purposes. Well, I suppose that was inevitable. Again, though, I prefer the busker ethos of the old blogosphere, where a blogger could put a tip jar or an Amazon wish list in the sidebar so grateful readers could contribute if and when the spirit moved them. As for me, I have nothing to sell you, but I appreciate the fact that you voluntarily show up here, even though I question your taste.
All the Words We Say to Be Believed
By now, you are probably aware that the millennial and “Gen-Z” generations are far more supportive of “socialism” and redistributive economic policies than any of their elders. And yet, according to Pew’s new survey, Americans under 30 are also way more distrustful of their fellow citizens and government than any other age group.
…Historically, socialism has been a utopian creed marked by its faith in humankind’s capacity for altruism. But Pew’s research suggests that America’s most socialistic age bracket is also its most misanthropic.
This will no doubt come as a shock to those of us blessed enough to have woke acquaintances on social media, where their cheerful equanimity, thoughtful consideration, and effusive delight in the motley variety of opinions belonging to their fellow citizens are on constant display. But yes, it seems that many people in these Yoo Ess of Ay are firmly in favor of spending vast amounts of other people’s money, especially when saying so raises one’s own status among peers with no cost to oneself. That’s a real head-scratcher, all right. Blinking in bewilderment, bereft of answers from the social-science literature, our correspondent concludes, “A definitive explanation of this paradox in public opinion will probably require a more credentialed authority.”
Well, I’m fairly sure Ivan Karamazov wouldn’t count as a credentialed authority, being a fictional creation and all, but he did note that it’s much easier to love one’s neighbors “at a distance,” where we can avoid being inconvenienced by their body odor, their clumsiness, and their stupid faces. Likewise, James Fitzjames Stephen probably doesn’t have any relevant citations to his name, but he recognized that the utopian socialist was precisely the sort of man “who is capable of making his love for men in general the ground of all sorts of violence against men in particular.” Christopher Lasch, though — I know he wandered off the reservation in his later years, but perhaps his credentials might earn him a perfunctory hearing? He traced the makings of misanthropic progressivism to a full century ago, when it started to become fashionable to demonstrate one’s political idealism by expressing contempt and disgust for one’s benighted neighbors. Too bad these men chose to express their observations in the vague, wispy form of novels and artful rhetoric rather than the data and studies of social science.
As to Levitz’s befuddlement over why the U.S. doesn’t seem inclined to replicate the welfare states of northern Europe, well, greater minds than his have found the question difficult and left it unaddressed.
Immortal Longings
What is this joie de vivre that they talk about nowadays? Our hunger for God, our thirst of immortality, of survival, will always stifle in us this pitiful enjoyment of the life that passes and abides not. It is the frenzied love of life, the love that would have life to be unending, that most often urges us to long for death. “If it is true that I am to die utterly,” we say to ourselves, “then once I am annihilated the world has ended so far as I am concerned—it is finished. Why, then, should it not end forthwith, so that no new consciousnesses, doomed to suffer the tormenting illusion of a transient and apparential existence may come into being? If, the illusion of living being shattered, living for the mere sake of living or for the sake of others who are likewise doomed to die, does not satisfy the soul, what is the good of living? Our best remedy is death.” And thus it is that we chant the praises of the never-ending rest because of our dread of it and speak of liberating death.
…And they come seeking to deceive us with a deceit of deceits, telling us that nothing is lost, that everything is transformed, shifts and changes, that not the least particle of matter is annihilated, not the least impulse of energy is lost, and there are some who pretend to console us with this! Futile consolation! It is not my matter or my energy that is the cause of my disquiet, for they are not mine if I myself am not mine—that is, if I am not eternal. No, my longing is not to be submerged in the vast All, in an infinite and eternal Matter or Energy or in God; not to be possessed by God, but to possess Him, to become myself God, yet without ceasing to be I myself, I who am now speaking to you. Tricks of monism avail us nothing; we crave the substance and not the shadow of immortality.
— Miguel de Unamuno, Tragic Sense of Life
****
That man desires immortality is understandable, but were it not for the influence of the Christian religion, it should never have assumed such a disproportionately large share of our attention. Instead of being a fine reflection, a noble fancy, lying in the poetic realm between fiction and fact, it has become a deadly earnest matter, and in the case of monks, the thought of death, or life after it, has become the main occupation of this life.
…Many people have substituted for this personal immortality, immortality of other kinds, much more convincing—the immortality of the race, and the immortality of work and influence. It is sufficient that when we die, the work we leave behind us continues to influence others and play a part, however small, in the life of the community in which we live. We can pluck the flower and throw its petals to the ground and yet its subtle fragrance remains in the air. It is a better, more reasonable, and more unselfish kind of immortality. In this very real sense, we may say that Louis Pasteur, Luther Burbank and Thomas Edison are still living among us. What if their bodies are dead, since “body” is nothing but an abstract generalization for a constantly changing combination of chemical constituents! Man begins to see his own life as a drop in an ever flowing river and is glad to contribute his part to the great stream of life. If he were only a little less selfish, he should be quite contented with that.
— Lin Yutang, The Importance of Living
****
However, in an entirely friendly spirit, I would like to take issue with Alan Harrington’s fascinating article “The Immortalist” (May 1969), on the desirability of abolishing death, and the possibility of doing so through medical techniques.
The immortalization of any biological individual runs into the same logistic problems as building indefinitely high skyscrapers: the lower floors are increasingly taken up with channels for elevators. It’s called “the law of diminishing returns.” A brain that continues intact for 100, 500, or 1,000 years is increasingly clogged with memories, and becomes like a sheet of paper so covered with writing that no space is left for any visible or intelligible form. Thus a human being 500 years old would be as inert as a turtle of the same age.
Consider the following points: (a) Death is not a sickness or disease; it is an event as natural and as healthy as childbirth or as the falling of leaves in the autumn. (b) As the “natural childbirth” obstetricians are training women to experience the pains of labor as erotic tensions, there is no reason why the “pangs of death” should not be reinterpreted as the ecstasies of liberation from anxiety and overloads of memory and responsibility. (c) Suppose that medical science achieves a method of getting rid of the overload of memories and anxieties: Isn’t this what death accomplishes already? (d) The funk about death is the illusion that you are going to experience everlasting darkness and nothingness as if being buried alive. (e) The “nothingness” after death is the same as the “nothingness” before you were born, and because anything that has happened once can happen again you will happen again as you did before, mercifully freed from the boredom of an overloaded memory.
Along with most of us, Alan Harrington doesn’t see that this “nothingness” before birth and after death is simply the temporal equivalent of, say, the space between stars. Where would stars be without spatial intervals between them? The problem is simply that civilized and brainwashed human beings lack the perception that we are all one Self, marvellously varied and indefinitely extended through time and space with restful intervals. As St. Thomas Aquinas said, “it is the silent pause which gives sweetness to the chant.”
— Alan Watts, The Collected Letters of Alan Watts
Thursday Throwback
[Originally published Mar. 19, 2016.]
Linji Yixuan, founder of a school of Ch’an Buddhism in 9th-century China, was said to have scornfully dismissed Buddhist terms such as “Bodhi” and “Nirvana” as stakes for tethering donkeys. In reading Sarah Bakewell’s At the Existentialist Café, I found myself wishing that the prominent figures in phenomenology and existentialism, Edmund Husserl, Martin Heidegger, and Jean-Paul Sartre, had studied under a similar instructor who might have imparted a sense of skepticism toward the possibility of capturing the essence of experience in words like “Being” and “Authenticity,” or, as in the case of Heidegger, torturous compound German phrases. Bakewell’s previous book, an unconventional biography of Montaigne, was a treasure. I was eagerly anticipating this one as a result, especially given that I have become increasingly convinced that I am the victim of a hypnotist’s lingering prank, one in which I was rendered incapable of remembering the slightest interesting thing about Heidegger’s philosophy within mere minutes of having finished reading about him. If anyone could help me understand why this man is worth my time and attention, Bakewell could.
As it happens, I can’t say I’m any closer to a desire to read Being and Time, though that’s no fault of Bakewell’s. As a character-driven summary, rather than a dense philosophical history, the story of this group of thinkers is indeed absorbing. It begins with Husserl’s slogan, “To the things themselves!” As Bakewell explains:
It meant: don’t waste time on the interpretations that accrue upon things, and especially don’t waste time wondering whether the things are real. Just look at this that’s presenting itself to you, whatever this may be, and describe it as precisely as possible.
There is at least a superficial similarity to Zen philosophy here, but in practice the phenomenological project appears more like an attempt to attend to the quotidian details of experience in a greedy, possessive manner. I once heard a metaphorical description of conscious awareness as a thin penlight being used to search within a darkened warehouse. Bakewell’s paraphrases of the original material, however lucid, seem to me to describe an attempt to shove as much experience as possible out of the shadows and into that narrow beam of light.
At this point it’s curious to note that Charles Darwin is mentioned only once, in passing. Not that he should have played a role in Bakewell’s story, of course, but his absence from the thought of her subjects, nearly a century after his discoveries, seems significant (for that matter, one might think Kant’s criticisms of empiricism, a half-century before Darwin, should have likewise tempered the urge to wallow in the deluge of sensory stimuli). Evolution did not mold the human brain as a truth-seeking machine. Like all other evolved beings, we adapted to our environments at a snail’s pace, according to our most pressing needs. The fact that we ruthlessly filter our experience along established patterns and ignore most things that seem irrelevant to our needs and interests is evidence of our minds working as efficiently as possible within their constraints, not evidence of a moral or intellectual failing. Subjecting our experiences to hyper-focused scrutiny can only be done in short bursts; the cognitive burden is too taxing. Is the effort worth it, though? Will sustained attention to this, whatever this may be, burn a hole through the layers of illusion to the authentic truth beneath? I’m skeptical, to say the least.
In his book Fools, Frauds and Firebrands, Roger Scruton referred a few times to Sartre’s “Satanic” pride (Satanic in the sense of Milton’s anti-hero). We see evidence of this here as well, with both Sartre and his partner Beauvoir struggling to accept the necessity of death. As Bakewell says of Sartre, “everything in his personality revolted against being hemmed in by anything at all, least of all by death.” Phenomena, of course, will go on after our individual deaths; countless living beings will partake of the same experiences that we have had. What offends Sartre’s immense pride is the fact that death means “I” can no longer hoard one particular collection of memories as “mine.” I found it unexpectedly poignant to consider this particular tragic flaw, especially as so many of his others failed to inspire much sympathy.
That said, I consider it a notable accomplishment that I found myself in near-agreement with Bakewell when she confesses to coming to respect and even like Sartre, despite disagreeing with much of his philosophy and acknowledging his “monstrous” nature. For me, it was more a feeling of grudging acceptance of someone attempting to think his way through one of the darkest, most turbulent times in history, when it was very easy to make wrong choices. My general revulsion for Sartre as a man was tempered by being reminded of his concept of bad faith, one which has become increasingly central to my own thinking. Passages like this —
All these devices work because they allow us to pretend that we are not free. We know very well that we can always reset the alarm clock or disable the software, but we arrange things so that this option does not seem readily available. If we didn’t resort to such tricks, we would have to deal with the whole vast scope of our freedom at every instant, and that would make life extremely difficult. Most of us therefore keep ourselves entangled in all kinds of subtle ways throughout the day.
Or this —
For Sartre, we show bad faith whenever we portray ourselves as passive creations of our race, class, job, history, nation, family, heredity, childhood influences, events, or even hidden drives in our subconscious which we claim are out of our control.
— stir my blood and quicken my pulse, just like when I first discovered existentialism in philosophy class many years ago. We could do with a lot more emphasis on “radical freedom” in our own time, when the urge to shrug off the burden of our own agency is as tempting as ever, especially when encouraged by the latest theories in sociology and neuroscience. All in all, a satisfying read, even if Heidegger is destined to remain opaque to me.
The Importance of Living
I am not original. The ideas expressed here have been thought and expressed by many thinkers of the East and West over and over again; those I borrow from the East are hackneyed truths there. They are nevertheless my ideas; they have become a part of my being. If they have taken root in my being, it is because they express something original in me, and when I first encountered them, my heart gave an instinctive assent. I like them as ideas and not because the person who expressed them is of any account. In fact, I have traveled the bypaths in my reading as well as in my writing. Many of the authors quoted are names obscure and may baffle a Chinese professor of literature. If some happen to be well-known, I accept their ideas only as they compel my intuitive approval and not because the authors are well-known. It is my habit to buy cheap editions of old, obscure books and see what I can discover there. If the professors of literature knew the sources of my ideas, they would be astounded at the Philistine. But there is a greater pleasure in picking up a small pearl in an ash-can than in looking at a large one in a jeweler’s window.
I am not deep and not well-read. If one is too well-read, then one does not know right is right and wrong is wrong. I have not read Locke or Hume or Berkeley, and have not taken a college course in philosophy. Technically speaking, my method and my training are all wrong, because I do not read philosophy, but only read life at first hand. That is an unconventional way of studying philosophy—the incorrect way. Some of my sources are: Mrs. Huang, an amah in my family who has all the ideas that go into the breeding of a good woman in China; a Soochow boat-woman with her profuse use of expletives; a Shanghai street car conductor; my cook’s wife; a lion cub in the zoo; a squirrel in Central Park in New York; a deck steward who made one good remark; that writer of a column on astronomy (dead for some ten years now); all news in boxes; and any writer who does not kill our sense of curiosity in life or who has not killed it in himself…how can I enumerate them all?
— Lin Yutang, The Importance of Living
If Montaigne had been a Chinese man writing in 1937, he might have been Lin Yutang. If nothing else, a time-traveling Montaigne would have surely found many enjoyable conversations to be had with Lin. Like his French predecessor, Lin presents his humble credentials without embarrassment and proceeds to consider, examine, and pontificate on any topic that catches his wondering eyes.
Immediately following this preface, by page five he’s already musing over the possibility of classifying national types according to an admittedly pseudo-scientific, quasi-chemical formula “by which the mechanism of human progress and historical change can be expressed.” R stands for a sense of reality (realism); D for dreams (idealism); H for humor; and S for sensitivity, with 4 being the highest quantity and 1 being the lowest. Therefore, “we may put it thus: 3 grains of realism, 2 grains of dreams, 2 grains of humor and 1 grain of sensitivity make an Englishman.” After averring that this is, of course, all provocative rather than authoritative and open to revisions, he proceeds to apply the same chemical classification to writers and poets, from Shakespeare to Li Po. (Shakespeare only needs one more grain of humor to be all fours.) Already, I suspect his claims of unoriginality are a bit overstated.
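Rendered in Lin’s own quasi-chemical shorthand (my reconstruction from the description above, so the exact typography in the book may differ), the recipe reads:

R3 D2 H2 S1 = the Englishman

Shakespeare, by this accounting, sits at R4 D4 H3 S4, one grain of humor shy of all fours.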
Again, like his spiritual kinsman, he unabashedly writes from a first-person perspective; the idea of hiding behind some pretense of objectivity would never occur to him. These are his thoughts, stated plainly and firmly, and if you find one or another unappealing, don’t worry, there will be many others arriving presently. “I have no doubt that lyrical poetry would not have developed if our hairy human ancestors had had no lice on their bodies.” Or: “Germany lost the war because Wilhelm Hohenzollern did not know when to laugh, or what to laugh at. His dreams were not restrained by laughter.” Then there are the sections on the proper way to lie in bed (“upholstered with big soft pillows at an angle of thirty degrees with either one arm or both arms placed behind the back of one’s head”), or to sit in a chair (“I want to write about the philosophy of sitting in chairs because I have a reputation for lolling…I contend that I am not the only loller in this modern world and that my reputation has been greatly exaggerated.” Apparently the Communists especially hated him for his “most leisurely of all leisurely writers” reputation.) No subject is too prosaic for reflection and discussion, and it’s wonderful to see quotidian details being attended to in earnest, with no cynicism or irony.
Looking up the book on Goodreads before re-reading it, I saw a review from a woman complaining that she was going to have to bail on it before finishing because of his retrograde opinions on women. I was surprised, then, to encounter, at first, numerous sympathetic references to the “subjection” of women (he blames bipedalism as the original sexist sin). Eventually, he does proclaim his view that women who don’t have children are incomplete and probably unhappy, that careers are no substitute for the engrossing work of motherhood. He also qualifies this by saying that he’s speaking of the average woman, and that if he asks her to raise the children and wash the dishes, he also asks the average man to forget about the arts and concentrate on being a breadwinner. All in all, it’s a pretty mild view for a Chinese man writing more than eighty years ago. I can’t imagine being upset over the discovery that Chinese men of 1937 weren’t intersectional feminists, or not being robust enough to dismiss such an unimportant perspective in a spirit of charity. Imagine Lin (or Montaigne) being offended at encountering something in an old book contrary to their modern sensibilities. If anything, they would have been interested, if not delighted, by the surprising reminder of the true diversity to be found throughout world history.
Well, You Can Call Me Crazy, But I Know Where I Belong
It seems the Eye of Social-Science Sauron has turned toward generalists. Forget intensive specialization; being a Renaissance man is what it’s all about now. Reading the interview transcript, I felt the same unease and annoyance as I did several years ago when Susan Cain’s book Quiet was trendy, and introversion became something of an inverted status symbol. My impression of Cain’s message was, “Introverts can be productive, contributing members of corporate and academic environments too!” Well, that’s nice, but I was perfectly fine with being ignored and overlooked, thanks. Likewise, I’m a generalist because I have a genuine interest in many things and I’m not particularly ambitious. Hearing an author cite a bunch of studies to prove that dabbling in a variety of activities and disciplines can be a more efficient way of increasing one’s chance of success sounds like using the methods and idiom of specialization to justify the existence of generalization, i.e. a Pyrrhic victory. With any luck, within the next few years, this data will be considered fatally flawed and outdated, the social-science geeks will become fixated on a new shiny object, and I can go back to enjoying the benign neglect of the wider world.
An Ideological Fur Coat
She was raised by a traditional family. She planned on having a traditional family. But she maintained that traditional families are old-fashioned and society should “evolve” beyond them.
What could explain this?
In the past, upper-class Americans used to display their social status with luxury goods. Today, they do it with luxury beliefs.
In Daybreak, Nietzsche wrote: “The Greeks, in a way of life in which great perils and upheavals were always present, sought in knowledge and reflection a kind of security and ultimate refuge. We, in an incomparably more secure condition, have transferred this perilousness into knowledge and reflection, and calm ourselves down with our way of life.” In the parlance of our times, he might say that shallow people, bored with their own comfort and affluence, enjoy theoretically deconstructing civilization without being willing to risk or sacrifice any of their actual privileges. Personally, I blame Foucault. (I’ve always loved Nietzsche’s writing, but his numerous epigones are a virulent plague.)
Thursday Throwback
[Originally published Jun. 14, 2016.]
The immediacy and excitement of social media and online journalism have encouraged people to ignore this and hold forth on everything. Mouthing off with insufficient knowledge of one’s subject can entail devising and promoting arrant nonsense. What it also does, however, which is far less acknowledged, is make it more difficult to stop promoting nonsense: having publicly endorsed a particular opinion, one has made a personal investment in its success, integrating the idea into one’s identity and gambling one’s social status on its being impressive.
In the depths of an old-growth forest, a tree falls. In a study, a blogger glances over the headlines and trending topics in his feed reader and rolls his eyes in disdain before closing the laptop and picking up a book. Do either of them make a sound? It depends, of course, on how you define the term. Are the soundwaves that ripple out from the tree’s crash into the undergrowth significant if they’re only registered by the ears of forest creatures incapable of recursively reflecting on them? Is the blogger’s weariness with the fatuity of social media meaningful if it isn’t publicly performed in that same venue in return for validation by clicks, likes and retweets?
It’s no fun ignoring people if they don’t know they’re being ignored, is it? How are people to know that the weeklong lapse between posts is due to my principled rejection of the popular topics and viral essays on offer, rather than my being occupied with work or other hobbies, if I don’t say so? However, conspicuous disdain, the kind that wants to make sure the audience knows just how unworthy of your attention you find the current object of your attention, should be beneath us. It’s akin to the fraudulent spectacle of a football player peeking through his fingers to see if the referee has taken note of his simulation of mortal injury. In the social media marketplace, sense and nonsense are not polar opposites, however invested consumers might be in believing otherwise; they’re competing products, like Coke and Pepsi. Performance, signaling, and seeking validation — these are the currency, the bills and coins which dirty the hands of the enlightened and benighted alike.