Jesus was a Hebrew rabbi. Unusually, he was unmarried. The idea that he had a romantic relationship with Mary Magdalene is the stuff of fiction, based on no biblical evidence. The evidence, on the other hand, that he may have been what we today call gay is very strong. But even gay rights campaigners in the church have been reluctant to suggest it. A significant exception was Hugh Montefiore, bishop of Birmingham and a convert from a prominent Jewish family. He dared to suggest that possibility and was met with disdain, as though he were simply out to shock.

After much reflection and with certainly no wish to shock, I felt I was left with no option but to suggest, for the first time in half a century of my Anglican priesthood, that Jesus may well have been homosexual. Had he been devoid of sexuality, he would not have been truly human. To believe that would be heretical.

Heterosexual, bisexual, homosexual: Jesus could have been any of these. There can be no certainty which. The homosexual option simply seems the most likely. The intimate relationship with the beloved disciple points in that direction. It would be so interpreted in any person today.
Speaking very generally, ordinary people are most likely to deny the existence of free will when they see our deliberations, choices and actions being overridden or bypassed in some way or another. For the folk, or most of them, the dominant idea in attributing free will to themselves and others seems to be a denial of fatalism.…Harris appears, then, to think that free will means acting (1) in circumstances such that I could have done otherwise (in the strong, mysterious sense), and (2) by means of a process of deliberation that is entirely conscious. Since this does not happen, he concludes, we do not have (what he calls) free will.…Importantly, the concept of free will that Harris attacks so relentlessly bears little resemblance to either the dominant folk ideas (roughly speaking, that fatalism is false, and that we commonly act without coercion, with adequate time to think) or the technical concept used by most philosophers (we have the capacity to act in such a way that we are morally responsible for our conduct).…Furthermore, the folk (and perhaps philosophers) are not worried only by outright coercion but also by other circumstances, such as whether there was adequate time to think. But where do we draw the line with something like that – for example, how much time is “adequate”? Again, how should we handle such things as compulsions and phobias – are they just another part of our desire-sets, or are they more analogous to external barriers to our actions?…But even if we press such points as hard as possible, folk ideas of free will might survive. Perhaps whether we act freely becomes a matter of judgment and degree, and the question of whether we do so in various particular cases does not have an entirely compelling answer. Nonetheless, it might remain more false than true if we tell the folk, “You do not have free will.”
First, let’s underscore what the whole Slow culture quake is about. It’s not anti-speed. It’s not about doing everything in slow motion. It’s about doing things at the right speed – what musicians call the tempo giusto. Every act has the right rhythm for it, and if you find that rhythm you’re going to do it better and enjoy it more. Particularly in cities, we get infected by this virus of hurry, where our default mode is to do everything as fast as possible. We fall into the trap of trying to do more and more things in less and less time, putting quantity before quality in everything we do.
Some of biology is essentially a pause: sleep, for example. Pauses serve a purpose, breaking the flow. Like rests in music or caesuras in verse. Like the old nightly break in the news cycle and the financial markets, gone in our 7 x 24 era. Even a confirmed atheist and Sunday driver must believe that the Sabbath served a therapeutic purpose, too, in the epoch when people observed it. Now, of course, Puritanical blue laws are mostly long gone, and Federal Express boasts of delivering on Sunday “because the world works seven days a week.” Haydn may have been the first great master of the rest in musical composition; he used rests for surprise, rests for tension, and even rests with fermatas. Silence indefinitely prolonged. Rest and pause. A rest with a fermata is the moral opposite of the fast-food restaurant with an express lane.
As we get deeper into the Internet age, and in particular the Twitter age, it’s getting easier to become less thoughtful. One-liners fly out into the ether and then disappear forever. For some, every single thing they do is broadcast online. Mourning, on the other hand, is traditionally a deeply private practice. If a loved one or family member dies, you feel a pain that only you know. Taking to Facebook or Twitter to express that kind of hurt seems trite, almost like a slight to the deceased. If you really cared that much, would you be able to sum up your thoughts in 140 characters, or with a sad face emoticon? Does it matter what other people thought about exactly how sad you really were?…The Internet can get a bad rap for being a wasteland of dumb content and dick shots, all of which it has in spades. What it also has, though, is potential. With its unlimited capacity and nearly unlimited number of contributors, the Internet has the ability to spin out in a variety of different directions. Sure, it can go brusque and glib, and that works sometimes. It can also spark real conversation or host excellent essay work and heartfelt remembrances of those we’ve lost. What we have to figure out as content producers—and any one of us with a Facebook or Twitter account is a content producer—is what kind of message we want to put out into the ether. It’s not easy to do, especially when dealing with something as emotional as death and mourning, but it’s that modicum of thought that can mark the difference between a tribute and a shrug.
The University of Southern California has been given $40,000 by the National Endowment for the Arts to develop Walden, in which “the player will inhabit an open, three-dimensional game-world, which will simulate the geography and environment of Walden Woods”. With the game drawing from the detailed notes Thoreau wrote about the area and its landscape, flora and fauna, users will be able not only to walk in the author’s footsteps but also, said the university, “discover in the beauty of a virtual landscape the ideas and writings of this unique philosopher, and cultivate through the gameplay their own thoughts and responses to the concepts discovered there”.
…The team behind the video game Walden said it “posits a new genre of play, in which reflection and insight play an important role in the player experience”. While the player travels through the virtual world of Walden, and deals with everyday life at Walden Pond, they will also be asked, the team said, to “focus on the deeper meaning behind events that transpire in the world. By attending to these events, the player is able to gain insight into the natural world, and into connections that permeate the experience of life at Walden.”
Now, I know what you’re thinking—it seems like an incredibly ironic joke to use a video game to gain insight into the natural world. But this does sound like the sort of roughing it that Henry was most comfortable with:
The inestimably priggish and tiresome Henry David Thoreau thought nature was splendid, splendid indeed, so long as he could stroll to town for cakes and barley wine, but when he experienced real wilderness, on a visit to Katahdin in 1846, he was unnerved to the core. This wasn’t the tame world of overgrown orchards and sun-dappled paths that passed for wilderness in suburban Concord, Massachusetts, but a forbidding, oppressive, primeval country that was “grim and wild…savage and dreary,” fit only for “men nearer of kin to the rocks and wild animals than we.” The experience left him, in the words of one biographer, “near hysterical.”
At this point, two forms of language coexist in societies: choppy speech and crafted prose. No ancient Roman spoke the way Virgil and Cicero wrote. Even today, only about 100 of the world’s 6,000 languages are written much, and none of the 5,900 unwritten ones are spoken in Gibbonesque paragraphs.
…Yet the brevity, improvisation and in-the-moment quality of e-mails and texts are those grand old defining qualities of spoken language. Keyboard technology, allowing us to produce and receive written communication with unprecedented speed, allows something hitherto unknown to humanity: written conversation. In this sense, they are not “writing” in the sense we are accustomed to. They are fingered speech.
A sense that e-mail and texting are “poor writing” is analogous, then, to one that the Rolling Stones produce “bad music” because they don’t use violas. Note that one cannot speak capital letters or punctuation. If we accept e-mail and texting as a new way of talking, then their casualness with matters of case and commas is not only expected but unexceptionable.
Even if we call this new usage a form of writing, we can see Anglophone societies as now having two kinds of writing. For formal contexts, there is the long-lined kind that requires schooling and practice to master. Then there is a more natural form paralleling the way we speak for informal contexts – which are, after all, most of our lives.
For curmudgeons like me, the problem isn’t that texting and tweeting are an unfortunate diversion from the nuanced thinking and writing that people would be doing normally. Most people are trite and boring and I don’t waste a valuable moment lamenting the possibility that it could have ever been otherwise. No, I agree fully that it makes sense to see texting as a form of speech rather than writing. But as in speech, so in print: I’m irritated by people who never shut the fuck up long enough to allow contemplation to work its magic and produce thoughts that are interesting and worth hearing. This is why I complain about the hyperactive, overstimulated environment of social media—it’s an incubator for the qualities I despise in relationships, like superficiality and haste. There’s little time to do anything but react reflexively, which ensures that your output is primarily going to be banal and inchoate.
Yesterday’s post, for example, I worked on all day. I started on it in the morning, let it sit while I went to work, proceeded to mull it over while on the job, came home, worked on it a little more, did chores outside while mulling it over some more, and finally finished it in the evening. And I still think it turned out to be a pretty weak effort, despite all that. But I felt like there was something more to be said on the topic, something less obvious and possibly more penetrating, so I kept wrestling with it, hoping that maybe lateral thinking would kick in at some point and provide me with an insight that would make the difference between a good post and one that was just okay. I could have just blurted out the first thought that came to mind and been done in five minutes, quickly on to the next shiny object, but then I would have likely forgotten about the topic before I had a chance to explore any nuance in it.
It’s not that social media and texting inevitably doom their participants to drooling idiocy, or that only drooling idiots would want to use them to begin with. But even I, an unambitious blogger, know that the further you allow yourself to get lured into that environment, the more difficult it becomes to do the sort of writing and thinking that gives us such joy. You have to seek out that open, quiet space away from the clamor and din of nonstop chattering, where ideas formerly lurking hesitant and unseen out on the margins might finally be able to approach you.
I get annoyed whenever anyone slaps a label on something and then presumes that the label itself says all that needs to be said. Whenever a critic or a potential audience member sniffs about “dad rock” or “chick lit” or “one for the fanboys,” it raises my hackles. If you’d rather not engage with what a piece of art actually is—as in, what it expresses and how well it expresses it—then fine. But don’t presume some kind of superiority because of that choice. One of the biggest fallacies in the way we talk about art is this idea that somehow personal taste equates to quality: That each of us miraculously only enjoys movies and music that are the best of their respective medium, and ergo, any movies and music we don’t enjoy must be terrible. It’s a standard we generally only apply to art. (Well, and politics.) If we dislike salmon, we don’t presume salmon itself to be bad; we just understand we don’t have a taste for it, and we’re generally willing to acknowledge that if prepared properly, we might even be capable of enjoying the occasional piece of salmon. It’s not that degrees of “good” and “bad” don’t exist, but ultimately our taste in art isn’t so different from our taste in food, in that it’s personal, and—if we’re being honest with ourselves—fairly malleable.
Contemporary strivers lack the tools with which people in the past have differentiated themselves from their peers: They live in a post-virtue, post-religion, post-aristocracy age. They lack the skills or inspiration to create something of genuine worth. They have been conditioned to find all but the most conventional and compromised politics worthy of contempt. They are denied even the cold comfort of identification with career, as they cope with the deadening tedium and meaninglessness of work by calling attention to it over and over again, as if acknowledging it somehow elevates them above it.

Into this vacuum comes a relief that is profoundly rational in context—the self as consumer and critic. Given the emptiness of the material conditions of their lives, the formerly manic competitors must come to invest the cultural goods they consume with great meaning. Meaning must be made somewhere; no one will countenance standing for nothing. So the poor proxy of media and cultural consumption comes to define the individual. In many ways, cultural products such as movies, music, clothes, and media are the perfect vehicle for the endless division of people into strata of knowingness, savvy, and cultural value.

These cultural products have no quantifiable value, yet their relative value is fiercely debated as if some such quantifiable understanding could be reached. They are easily mined for ancillary content, the TV recaps and record reviews and endless fulminating in comments and forums that spread like weeds. (Does anyone who watches Mad Men not blog about it?) They are bound up with celebrity, both real and petty. They can inspire and so trick us into believing that our reactions are similarly worthy of inspiration. And they are complex and varied enough that there is always more to know and more rarefied territory to reach, the better to climb the ladder one rung higher than the person the next desk over.
Ahahaha. It’s a good thing I didn’t go to college; everything I would have been interested in studying turns out to be a financial dead end. Oh, it’s the life of an autodidact for me, where the status is none, but the learning is free…
The past is always with us, and where we come, what we go through, how we get through it; all this shit matters. I mean, that’s what I thought he meant. Like at the end of the book, you know, boats and tides and all, it’s like you can change up, right, you can say you somebody new, you can give yourself a whole new story, but what came first is who you really are and what happened before is what really happened, and it don’t matter that some fool say he different cause the only thing that make you different is what you really do, what you really go through. Like, you know, like all the books in his library, now, he frontin’ with all them books but if we pulled one off the shelf ain’t none of the pages ever been open. He got all them books and he ain’t read one of them. Gatsby, he was who he was and did what he did, and because he wasn’t ready to get real with the story, that shit caught up to him. That’s what I think, anyway.
Poole, who was voted Time’s most influential person of 2008 – two years before Facebook’s Mark Zuckerberg was declared the magazine’s Person of the Year – believes Facebook’s commercial motivations shut down the online experience: “Mark and Sheryl have gone out and said that identity is authenticity, that you are online who you are offline, and to have multiple identities is lacking in integrity. I think that’s nuts.”
“We went from a web that was interest-driven, and then we transitioned into a web where the connections were in-person, real-life friendship relationships,” adds Poole. “Individuals are multifaceted. Identity is prismatic, and communities like 4Chan exist as a holdover from the interest-driven web.”
Allan believes such attitudes are naive. The millions who have gone online over the past decade want a safe place where they won’t experience bad behaviour, have their identities stolen or be duped by impostors, he says: “Pretend identities don’t work very well now that the web has moved from a minority sport for geeks to a mainstream occupation.”
I was checking in on the progress of the Mark Sandman documentary this week, and read through a few related articles in the process. One thing that struck me was that, in addition to the general umbra of mystique he cultivated around his personal life, he was particularly sensitive about keeping his age a secret. Having been born in 1952, he was already on the cusp of turning 40 at the beginning of Morphine’s recording career, which, for a rock musician, is an age more commonly associated with long-since-exhausted creativity, “Vegas Elvis” irrelevance and unwitting self-parody. Marketing executives worry about whether the youth demographics are going to identify with someone old enough to be their parent, that sort of thing. Anyway, it was kind of quaint to read about a reporter for Rolling Stone trying fruitlessly to find out his real age. Really? As recently as the mid-nineties, it was possible for a minor celebrity to keep something like that a secret without a reporter simply looking up his high school yearbook or something? How long would he be able to pull that off today before half of his acquaintances from adolescence would be tweeting pictures and anecdotes about him?
My affection for anonymity stems from my impatience with the irrelevance of the quotidian details of life. I love the interest-driven web; I love being able to find discussions of ideas that I would never hear otherwise; I love the creative stimulation; I love the excellent writing. Most of all, I love the fact that participation in all this revolves around what you think, not who you are, in that small-town sense of your identity being cemented in place by the opinions of everyone else. And I resent the fact that social media has brought all that small-town idiocy and superficiality back to center stage.
Continuing with a theme, here’s Gary Gutting:
The Sermon on the Mount, however, does not offer a clear view of what makes for a good life. Many seem to think Jesus is saying little more than be nice to everybody. Others see a call to a heroic life of total non-resistance or self-sacrifice. Still others hear him as requiring little more than an enhanced version of the Ten Commandments (e.g., avoiding not only murder but also anger, not only adultery but also lustful desires).
Almost all Christians ignore many of the things Jesus said on the Mount. Who literally takes no thought for their lives or for tomorrow? Who never resists evil? Who gives to anyone who asks? Who says “Hit me again” to an unjust attack? There may be ways of integrating such injunctions into our morality without reducing them to banalities, but the bare text of Jesus’ sermon doesn’t tell us how to do this.
Yeah, it’s true; a literal belief in the imminent end of the world does tend to render the advice you offer people a bit short on nuance and practicality. Seriously, it amazes me how much effort people put into trying to salvage something inspirational from this dreck. Even if you set aside the irrelevance of his preaching once the centrality of the apocalypse is removed, there’s just nothing profound there. As J.L. Mackie said:
Richard Robinson has examined the synoptic gospels as the best evidence for Jesus’s own teaching, and he finds in them five major precepts: “love God, believe in me, love man, be pure in heart, be humble.” The reasons given for these precepts are “a plain matter of promises and threats”: they are “that the kingdom of heaven is at hand,” and that “those who obey these precepts will be rewarded in heaven, while those who disobey will have weeping and gnashing of teeth.” Robinson notes that “Certain ideals that are prominent elsewhere are rather conspicuously absent from the synoptic gospels.” These include beauty, truth, knowledge and reason:
As Jesus never recommends knowledge, so he never recommends the virtue that seeks and leads to knowledge, namely reason. On the contrary, he regards certain beliefs as in themselves sinful…whereas it is an essential part of the ideal of reason to hold that no belief can be morally wrong if reached in the attempt to believe truly. Jesus again and again demands faith; and by faith he means believing certain very improbable things without considering evidence or estimating probabilities, and that is contrary to reason.
Jesus says nothing on any social question except divorce, and all ascriptions of any political doctrine to him are false. He does not pronounce about war, capital punishment, gambling, justice, the administration of law, the distribution of goods, socialism, equality of income, equality of sex, equality of colour, equality of opportunity, tyranny, freedom, slavery, self-determination, or contraception. There is nothing Christian about being for any of these things, nor about being against them, if we mean by “Christian” what Jesus taught according to the synoptic gospels.