Nietzsche’s best-known attempt to break the hold that a philosophical or moral picture might have over us is genealogy. At least, genealogy can be liberating in this way if it can show us that practices and norms could have been very much otherwise, that some assumption or norm we take for granted as inevitable and unavoidable in fact has a contingent, quite avoidable origin, and an origin considerably more complicated than any notion of “rational commitments” or “reflective endorsement” or “faith in revelation” or the like would allow.
Research shows introverts are precise and cautious with their language and go straight to the big issues, whereas extroverts are vague and abstract. This apparently has the benefit of making for briefer conversations, which is handy for introverts because we’re rubbish at small talk. We know exactly what we want to say and then we want out, because being around others can be tiring after such intense interactions.
Online might just be the introvert’s natural environment, where conversations can be staged, staggered and stopped at their discretion – all from a distance. Thoughts can be edited to perfection, solitary hobbies and pursuits can be meticulously researched before being shared online, friendships maintained without the obligation to meet face-to-face … plus it’s never been easier to uncover other introverts and forge friendships without the inconvenience of meeting. The internet has become an introvert’s playground, where they can dress up as extroverts and perform to a captive and sympathetic audience.
It’s wonderful that introversion is shedding its stigma. I just hope that finding your secret introvert does not turn into claiming moral superiority because you now think you belong to some kind of elite club, or, as one introvert puts it, “a minority in the regular population but a majority in the gifted population.”
Introversion for me was not about wanting to be gifted. I only wanted to build my Asterix village in peace and not have to play with some random boy just because he was my age. That’s all, and I don’t need 23 signs to tell me that.
Finally, most work in the psychological and social sciences suffers from a lack of conceptual rigor. It’s a bit sloppy around the edges, and in the middle, too. For example, “happiness research” is a booming field, but the titans of the subdiscipline disagree sharply about what happiness actually is. No experiment or regression will settle it. It’s a philosophical question. Nevertheless, they work like the dickens to measure it, whatever it is—life satisfaction, “flourishing,” pleasure minus pain—and to correlate it to other, more easily quantified things with as much statistical rigor as deemed necessary to appear authoritative. It’s as if the precision of the statistical analysis is supposed somehow to compensate for, or help us forget, the imprecision of thought at the foundation of the enterprise.
Be damned if I can remember where I read it, but it was just recently that I saw someone lamenting that “we talk about numbers for lack of any meaningful vocabulary to address these issues otherwise,” or words to that effect. I can’t help but feel something similar is going on when I read an article telling me that I should be concerned about the gender disparity among newspaper crossword editors.
Which do I leave first: Facebook or Twitter? I’ve been mulling that question for about a year now, but it always seemed theoretical. On top of that, it would have been a no-brainer. Twitter is essential for work, while Facebook – increasingly burdened by adverts, security issues and intellectual property disputes – is not. But I am now watching Twitter become morally depopulated and seriously considering how long I’ll be on there.
…The solution has to be radical and collective, for the stakes are high. Twitter is the first mass, global conversation, and if it becomes fatally polluted by frat-boy perversity and the verbal spew of sociopaths it will take some time to rebuild elsewhere. Policing it comprehensively is impossible.
I laughed until I nearly choked, I did. Not just at the absurd spectacle of a supposed adult going all garden-of-Gethsemane over such a pathetic trifle, though that is indeed funny. No, my favorite part is that second paragraph — he still holds fast to his utopian vision of the potential for “the first mass, global conversation” even as he’s frantically backpedalling in horror upon hearing just what it is the global masses have to say. Ain’t that always the way? The starry-eyed morons who wax eloquent about an ideal form of humanity are the same ones who want nothing to do with the way most people actually live. Mass communication would be great if only the masses could somehow be excluded from it. We had to censor the conversation in order to save it.
The biggest problem with the digital zombie is not the walking into signposts. It’s more insidious than that, and involves an incurable disharmony with the outside world. A walker strolling down the sidewalk mulling how many exclamation points to append to his or her Facebook status update about walking down the sidewalk is not really here. Studies suggest that he or she is even more disconnected from the actual world than is someone driving 80 miles an hour across west Texas at two in the morning while cranking Carlos Santana.
…Stopping, looking, lifting your head up, searching around you for something… it’s all part of what being human means, of what we evolved to do.
Yes, Wayne Curtis is back again to remind you that you’re doing this walking thing all wrong. Getting to your destination in one piece without causing mayhem isn’t good enough; you’ve got to be absorbing the proper sensory stimuli and thinking about it in the right way to earn his approval. That’s all predictable romantic tripe, of course; I just enjoy the facile way he invokes evolution, of all things, to support his rigid, retrograde definition of what counts as “the actual world,” when, as far as evolution gives a fuck, humans could end up as half-android beings on spaceships eating freeze-dried nutritional powders and getting all their exercise through use of machines.
“Facial hair for the past century has been thought to reflect a suspicious streak of individuality and defiance,” says Christopher Oldstone-Moore, a history lecturer at Ohio’s Wright State University.
…”There’s a long history in our civilisation of anxiety about facial hair, and hair in general, as being unhygienic: hairs will fall into the chocolate and soil the food,” says Oldstone-Moore.
Even in the case of firemen, the waters are muddy. “The mask argument is in part a tool to be used for a larger argument, which is it’s just not uniform, it’s not respectable, it’s not proper, for disciplined professional men to have facial hair. That’s the bottom line.”
Currently reading Mark Forsyth’s The Horologicon, where I happened upon this passage:
Slavery has been abolished but shavery survives. This latter is rather a shame, as it lessens the need for all of the technical beard words, of which there are many. They all involve the Greek root pogo, which is pronounced in exactly the same way as the stick (although the two are etymologically unrelated). So there’s pogonology (the study of beards), pogonate (having a beard), pogoniasis (a beard on a lady), and pogonotomy (shaving). As we live in an essentially misopogonistic society of beard-haters, most men must start the day by taking a razor from the pogonion or tip of the chin up to the philtrum, which is the name for the little groove between your nose and your upper lip. Then you have to work carefully to avoid a neckbeard, which the Victorians called a Newgate fringe. Newgate was the name of a London prison where people were hanged. So a Newgate fringe was meant to resemble the rope that was slipped around the felon’s neck before he took the plunge into eternity.
Two lessons here, then. One, Forsyth is an immensely entertaining writer, and I have greatly enjoyed his books. Two, the next time you see someone using “neckbeard” as a synonym for a virginal, basement-dwelling loser, you may confidently accuse them of classism before instructing them to check their shaveowner’s privilege.
A profound contradiction, of which he was well aware, informs Leopardi’s philosophy. Although he saw in the will-to-truth the primary cause of the nihilism that he believed was drawing modern civilisation into its vortex, Leopardi fully embraced reason, logic, science and this will-to-truth. He followed the truth wherever it led him, refusing to shy away from its conclusions or to seek refuge in mystifications and self-deceiving consolations.
Leopardi’s open-eyed, disabused thinking led him ultimately to a monistic view of reality. All that exists is matter, he concluded, and whatever the tradition calls mind, soul or spirit is in effect only matter. Yet Leopardi’s concept of matter was so original, heterogeneous and self-expansive as to have little in common with the inert matter of the dualists who believe that mind is one thing, matter another. Late in the Zibaldone he declares that everything points to the conclusion “that matter can think, that matter thinks and feels”. Like many of the other thoughts that make the Zibaldone an ongoing conversation with the future, Leopardi’s inspirited concept of matter is one that calls on us to take it up and give it new life in our own time.
I’d never heard of Leopardi before reading this fascinating review, and while I hate to think of having missed out on something I might have enjoyed, there’s also a childlike delight to be had in the continued existence of hidden surprises. How many similar intellectual companions do I have yet to encounter?
A recent paper by Richard Topolski at Georgia Regents University and colleagues, published in the journal Anthrozoös, demonstrates this human involvement with pets to a startling extent. Participants in the study were presented with a hypothetical scenario in which a bus is hurtling out of control, bearing down on a dog and a human. Which do you save? With responses from more than 500 people, the answer was that it depended: What kind of human and what kind of dog?
Everyone would save a sibling, grandparent or close friend rather than a strange dog. But when people considered their own dog versus people less connected with them—a distant cousin or a hometown stranger—votes in favor of saving the dog came rolling in.
…We jail people who abuse animals, put ourselves in harm’s way in boats between whales and whalers, carry our childhood traumas of what happened to Bambi’s mother. We can extend empathy to another organism and feel its pain like no other species. But let’s not be too proud of ourselves. As this study and too much of our history show, we’re pretty selective about how we extend our humaneness to other human beings.
My own dog over a stranger? Pfft, it doesn’t even qualify as a choice. Even in the case of a strange dog and a strange person, ehh, I’m still leaning toward the dog. Sorry, but the mere fact of membership in Homo sapiens doesn’t mean anything to me. I refrain from actively inflicting pain or making anyone else miserable, but that’s as far as it goes. I’m certainly not moved by quasi-religious exhortations to believe in humanity’s elevated importance.
You’ve probably seen a lot of the same articles I have on the web — you know, those achingly stupid pieces about how “the ancient Egyptians communicated in what could be understood as a rudimentary form of social media!” Or, “The relationships between characters in Jane Austen’s novels reflect the ways in which we communicate through social networks!” Nothing is too staggeringly facile or utterly uninformative to be published as long as it somehow references social media, because lord knows, Facebook is the teleological endpoint to which human existence has been striving lo these interminable eons, and Twitter is the ground of all being, through our relationship to which we come to truly know ourselves.
Most of the time, I deal with the brainache by making a drinking game out of Maria Popova’s use of the word “timeless”, and in no time at all, unconsciousness claims me and the torment ends, at least until the next time I open my laptop. But do you know how to recognize when you’ve really spent way too much time obsessing over gadgets and filtering the world through a narrow, tech-related prism? When you write shit like this in earnest:
This is kind of blowing my mind…because of the compression of history, I’d always assumed all these people were around the same age. But in thinking about it, all startups need young people…Hamilton, Lafayette, and Burr were perhaps the Gates, Jobs, and Zuckerberg of the War.
Yes. The Revolutionary War as a Silicon Valley start-up. What a totally illuminating metaphor that in no way sounds myopic or narcissistic. Holy fuck. Maria, save me.