The conflict between the self as social performance and the self as authentic expression of one’s inner truth has roots much deeper than social media. It has been a concern of much theorizing about modernity and, if you agree with these theories, a mostly unspoken preoccupation throughout modern culture. Whether it’s Max Weber on rationalization, Walter Benjamin on aura, Jacques Ellul on technique, Jean Baudrillard on simulations, or Zygmunt Bauman and the Frankfurt School on modernity and the Enlightenment, there has been a long tradition of social theory linking the consequences of altering the “natural” world in the name of convenience, efficiency, comfort, and safety to draining reality of its truth or essence.
…These theories also share an understanding that people in Western society are generally uncomfortable admitting that who they are might be partly, or perhaps deeply, structured and performed. To be a “poser” is an insult; instead, the common wisdom is “be true to yourself,” which assumes there is a truth of your self. Digital-austerity discourse has tapped into this deep, subconscious modern tension and brings to it the false hope that unplugging can deliver catharsis.
…Of course, digital devices shouldn’t be excused from the moral order — nothing should or could be. But too often discussions about technology use are conducted in bad faith, particularly when the detoxers and disconnectionists and digital-etiquette police seem more interested in discussing the trivial differences of when and how one looks at the screen than the larger moral quandaries of what one is doing with the screen. Ultimately, the disconnectionists’ selfie-help has little to do with technology and more to do with enforcing a traditional vision of the natural, healthy, and normal. Disconnect. Take breaks. Unplug all you want. You’ll have different experiences and enjoy them, but you won’t be any more healthy or real.
According to two recent books, many people today believe in the Internet the way that the denizens of the Age of Faith believed in God, or that many on the left once believed in Marxism: as the exclusive source of universal personal and political salvation and the basic organizing principle of history. Varieties of this twenty-first-century faith can be found in the most disparate places. While the geek elite of Silicon Valley are its natural constituency, other converts include young Egyptians who took part in the 2011 uprising, free-market libertarians, members of the Obama administration, “hacktivists,” open government activists, and a growing tribe of calorie-counting “self-trackers.”
And according to some prophet in the wilderness, there are already signs that people are losing faith in this newest quasi-religion.
After leaving Twitter in 2011 and helping to incubate, among other things, the blog network Medium, Williams found himself rethinking his original formulation. Computers have proliferated and diversified, in size and function, to the point of being unremarkable. Information has become similarly abundant, rendering the term unsatisfyingly generic. And after 20 years, the types of people and groups you find online are basically identical to the people and groups you find in the physical world. What’s now important are the connections between the people and the machines.
…“The internet is not what I thought it was 20 years ago,” Williams said. “It’s not a utopian world. It’s essentially like a lot of other major technological revolutions that have taken place in the history of the world.” He compares it to, well, agriculture. “[Agriculture] made life better. It not only got people fed, it freed them up to do many more things — to create art and invent things.”
The rub is that we often take convenience too far. “Look at the technology of agriculture taken to an extreme — where we have industrialized farms that are not good for the environment or animals or nourishment,” he says. “Look at a country full of people who have had such convenient access to calories that they’re addicted, obese, and sick.” He likens this agricultural nightmare to our unhealthy obsession with internet numbers like retweets and likes and followers and friends.
That warning wasn’t so much a slam on Twitter, which Williams helped create, as it was an observation about human nature. People will be people. The internet wants to give them exactly what they’re looking for. And people who understand how to channel that tendency will be disproportionately powerful.
The biggest problem with the digital zombie is not the walking into signposts. It’s more insidious than that, and involves an incurable disharmony with the outside world. A walker strolling down the sidewalk mulling how many exclamation points to append to his or her Facebook status update about walking down the sidewalk is not really here. Studies suggest that he or she is even more disconnected from the actual world than is someone driving 80 miles an hour across West Texas at two in the morning while cranking Carlos Santana.
…Stopping, looking, lifting your head up, searching around you for something… it’s all part of what being human means, of what we evolved to do.
Yes, Wayne Curtis is back again to remind you that you’re doing this walking thing all wrong. Getting to your destination in one piece without causing mayhem isn’t good enough; you’ve got to be absorbing the proper sensory stimuli and thinking about it in the right way to earn his approval. That’s all predictable romantic tripe, of course; I just enjoy the facile way he invokes evolution, of all things, to support his rigid, retrograde definition of what counts as “the actual world,” when, as far as evolution gives a fuck, humans could end up as half-android beings on spaceships eating freeze-dried nutritional powders and getting all their exercise through use of machines.
Consider the invention of the mobile camera-phone. On the one hand, as XKCD says, it has unwittingly eliminated whatever tiny likelihood there was of the existence of paranormal activity. On the other hand, it has helped remind us that we take our lives in our hands anytime we eat food we haven’t prepared and cooked ourselves.
How we write, in other words, affects what we write. You compose in a different way using pen and ink than you do on a computer. You think in a different way. It may even be that you are, to that extent, a different person, much as we take on a different personality when we speak a foreign language.
We are becoming different people now: Our brains are almost certainly being restructured by interacting with computers all day long. There’s nothing wrong with that. But it would be a shame if that were all we knew — if one day we found ourselves cut off from our ancestors, unable to fully comprehend the thoughts they composed, having forgotten how they used to compose them.
And no less an authority than Socrates/Plato thought that it was a devolution for an oral culture to become text-based, because it would make people forgetful and only superficially learned. Writing eats your brain! No, typing eats your brain! Paper eats your brain! No, pixelated screens eat your brain! Hey, you know, perhaps the popularity of the zombie motif in mythology and pop culture speaks to this apparently-ancient fear that something malevolent is trying to eat our brains. Anyway, before I digress any further, I’m going to go ahead and suggest that the narcissism of small differences plays a huge role in this perennial argument. I mean, come on, man; these are trivial differences in style we’re talking about here, not yawning chasms of comprehension. I read Chaucer and Beowulf in high school with the help of footnotes, and my brain has been able to segue into the idiom of text-messaging and back again without stalling and belching black smoke.
And if we’re going to trade anecdotes, well, I find longhand writing laborious and tiresome; my interest fades long before I get close to saying everything I could. On a computer, the fact that the actual text can be set down (legibly!) in a matter of seconds reserves that much more time for contemplation and mental composition. Despite the supposed irresistible logic of the machine working its hoodoo on me, my writing process consists of multiple re-readings of whatever excerpt I’m using as a springboard, followed by a lot of intent staring at the screen as I organize my thoughts, ending with a brief flurry of typing. Whatever my limitations as a writer, I likely would never have even tried exploring it as a hobby were it not for the ease a keyboard affords.
Sometimes drudgery is just drudgery. Not everyone who chops wood and carries water becomes spiritually profound as a result. Forming letters individually with your fingers doesn’t make your thoughts deeper or your writing better. Clearing mental space and making time for reflection — assuming you truly want to do so — is much more important. Meaning cannot be reduced to mechanism.
Once there was a man who filmed his vacation.
He went flying down the river in his boat
with his video camera to his eye, making
a moving picture of the moving river
upon which his sleek boat moved swiftly
toward the end of his vacation. He showed
his vacation to his camera, which pictured it,
preserving it forever: the river, the trees,
the sky, the light, the bow of his rushing boat
behind which he stood with his camera
preserving his vacation even as he was having it
so that after he had had it he would still
have it. It would be there. With a flick
of a switch, there it would be. But he
would not be in it. He would never be in it.
— Wendell Berry
The digital age gives a new (and almost opposite) meaning to having a photographic memory. The experience of the moment has become the experience of the photo.
And it’s not only the subjects of the photos who are affected. In the age of the real-time, social web, the person taking the photos is often distracted by the urgent desire to share near-real-time photos of an experience. Is it worth reducing an entire real-life experience to what can be seen through a tiny screen? I recently attended a concert where I was the only one in my section who had no device between my eyes and the performance — and that was only because I forgot my iPhone.
I enjoy photography as a spectator, but I’ve never been able to get into the habit of snapping photos of stuff I’m experiencing. Just can’t seem to inhabit that removed perspective long enough to think, “This would make a great picture.” As for our increasingly-popular inclination to filter life through a gadget screen in order to experience it later at a more convenient time that will likely never come, well, it makes me think that maybe there’s some metaphorical use to be made of the stories about American Indians who believed that photographs would steal the subject’s soul.
In the midst of another paint-by-numbers essay lamenting the baleful influence of digital media upon our reading and thinking, this part stood out to me:
As Nicholas Carr (2013) thoughtfully points out, Friedrich Nietzsche is an example worth considering. When Nietzsche started to compose on the typewriter rather than by hand due to poor eyesight, it changed the way he wrote philosophy. His prose changed from arguments to aphorisms, thoughts to puns, and rhetoric to telegram style. If we significantly change how we write, it significantly changes what we write. If we significantly change how we read, it significantly changes what we read.
He bought a typewriter in 1882. That was the year The Gay Science was published, a book many scholars feel to be the first expression of his “mature” philosophy. The following year saw the publication of Thus Spoke Zarathustra, which I might describe as a book-length epic poem/irreligious parable. Over the next few years, Beyond Good and Evil and The Genealogy of Morals, neither of which can be said to consist primarily of aphorisms, puns, or telegram-style terseness, completed this roughly-agreed-upon phase of his career in which his most important ideas were developed to their fullest potential. Two of his most aphoristic works, Human, All Too Human and (my personal favorite) Daybreak, were published before his supposedly fateful encounter with the typewriter, the former’s consciously-chosen style owing more to the French writers he admired than to any crude technological determinism.
Funnily enough, a different sort of determinism — physiological — has also been suggested as responsible for his change of style. While rebutting that notion, Steven Michels stresses that however the aphorism first appeared in Nietzsche’s style, he quickly recognized the value it held for his philosophical project, a conscious rejection of the ponderous thought and argument that had traditionally dominated philosophy, and he used it as a tool to that end:
Certainly illness played a role in Nietzsche’s brief bouts of creativity. But his aphorisms started much earlier than his chronic illnesses; many of his later books are less aphoristic, and his literary masterpiece, Thus Spoke Zarathustra, is anything but aphoristic. Nietzsche’s philosophising had more to do with the change in his philosophy than his physiology did. His aphoristic turn marked a turn in his intellectual development, to be sure, but it also allowed for a great advance in his philosophy, in presentation if not in development. His philosophy was limited by conventional forms, such as the treatise and the essay, and the aphorism allowed him to break from that. Although it was initially involuntary or subconscious, Nietzsche soon understood what this change of style allowed him to do philosophically, and he embraced it.
Unfortunately, Nietzsche wasn’t totally satisfied with his purchase and never really mastered the use of the instrument. Many people have since tried to understand why Nietzsche did not make more use of it, and a number of theories have been suggested: that it was an outdated and poor model, that it could only write upper-case letters, and so on. Today we can say for certain that all of this is speculation without foundation.
The writing ball was a solidly constructed instrument, made by hand and equipped with all the features one would expect of a modern typewriter.
You can now read the details about the Nietzsche writing ball in a book, “Nietzsches Schreibkugel,” by Dieter Eberwein, vice-president of the International Rasmus Malling-Hansen Society, published by “Typoscript Verlag.” In it, Eberwein tells the true story of Nietzsche’s writing ball, based upon a thorough investigation and restoration of the damaged machine.
Friedrich Nietzsche was not aware that his trouble in using the machine was caused by damage it had suffered during transportation to Genoa, Italy, where he lived at the time. And when he turned to a mechanic who had no typewriter-repair skills, the mechanic managed to damage the writing ball even more. In his book, Dieter Eberwein presents all the typescripts Nietzsche ever wrote on the machine (about 60) and reveals the true story of the damage. Nietzsche also did not know how to change the direction of the colour ribbon, so he had to ask the mechanic for help each time the ribbon ran out.
Zuckerman, however, is not a knee-jerk naysayer about all things digital. The director of the MIT Center for Civic Media and cofounder of the international bloggers’ website Global Voices, he is extremely enthusiastic about the potential of using technology to connect people across cultures. He wants the Internet to be an empathy machine. The difference between him and the full-throated apostles of cyber-utopianism is that he does not believe that the online world is foreordained to fulfill this purpose, nor does he naively assume that the outcomes of cross-cultural connections will always be desirable.
Everything depends in the end on whether we can find direct, causal evidence: we need to show that exposure to literature itself makes some sort of positive difference to the people we end up being. That will take a lot of careful and insightful psychological research (try designing an experiment to test the effects of reading “War and Peace,” for example). Meanwhile, most of us will probably soldier on with a positive view of the improving effects of literature, supported by nothing more than an airy bed of sentiment.
The archetypal image of the sage is one of serenity and an almost-otherworldly lack of concern for the mundane obsessions of everyday life. In other words, the accumulation of knowledge and experience, rather than simply being a super-sum of positive integers, tends to produce a state of being that would seem very much at odds with the sort of simplistic, progressive partisan vision of “better” or “nicer” citizens. Knowledge is a double-edged blade that can be used for the same values and desires people have always had; increasingly complex experience isn’t likely to express itself in platitudes.
We don’t always strive for perfection. On the contrary: as a species, we often embrace imperfect conditions instead of attempting to better them. A desire for perpetual progress isn’t encoded in our genes. Large periods of human history were relatively static. For many generations, our forefathers lived contentedly without desiring radical change. We also know of contemporary tribes in inaccessible regions of the earth that appear to be quite happy with the world as it is and with their place within it. These tribes don’t necessarily aspire to “be different” or to be more like us.
In the modern West, many people are similarly sceptical of radical change even if it promises great technological, social or medical benefits. As Shakespeare already knew, it’s often easier for us to endure existing hardship than it is to aspire to an unknown future.
Why does this matter? Because attempts to develop new technologies and to propel engineering forward often fail to account for the diversity of views on progress. We are presented with shining examples of scientific possibilities that will change our everyday lives whether we desire it or not. Yet from the perspective of an objective observer, the introduction of many new technologies is unnecessary. Their development isn’t driven by some innate need for progress and survival but by our own curiosity and enthusiasm for new gadgets and by economic interests.
This is something I think about a lot. It’s only been in the last few hundred years that knowledge and technological prowess have begun to increase exponentially, but we’ve quickly become accustomed to thinking of this as something inevitable. As is often the case, Hume is there wagging his finger at us, reminding us that we ultimately have no solid foundation upon which to rest our blithe assumption that tomorrow will be a teleological improvement upon today. The 165 million years that the dinosaurs existed ultimately meant nothing. Whatever we might hope or think will happen to a species of hairless ape that has only been around for a fraction of that time, we don’t know. There is no precedent here. And so, what might it be like if humanity were forced to reacclimate itself to such a static, cyclical way of life?