Gnosticism + Critical Theory = “Truth”
August 2014
The Whole Ass
When it comes to the internet, the word revolutionary has been bandied about with such profligacy that educated people now seem to have confused a technical revolution, if one with real consequences, with actual social and political revolution of the kinds seen in 1776, 1789 and 1917. Now why hasn’t anyone written anything sensible about that? Is it because we are too busy publishing screeds about polyamorous trans rights or cris de coeur about hurt feelings on the internet? The ever expanding empire of emotion long ago captured our minds; it’s only now that anyone is noticing that amped-up and, ultimately fraudulent, sentiment has also captured the remains of the press via the brave new world of social media. Emotional incontinence, whether in the form of rage or abuse, is nothing less than the esprit du temps. The internet troll and the Twitter victim are, put simply, two cheeks of the same arse.
Brilliant. Yes, too often, my web browsing experience makes me feel like Joe Bowers in the theater.
I Disappear
“Some people want me to be this warm and fuzzy person. All filled with friendly hermit wisdom. Just spouting off fortune-cookie lines from my hermit home.”
…Anyone who reveals what he’s learned, Chris told me, is not by his definition a true hermit. Chris had come around on the idea of himself as a hermit, and eventually embraced it. When I mentioned Thoreau, who spent two years at Walden, Chris dismissed him with a single word: “dilettante.”
True hermits, according to Chris, do not write books, do not have friends, and do not answer questions. I asked why he didn’t at least keep a journal in the woods. Chris scoffed. “I expected to die out there. Who would read my journal? You? I’d rather take it to my grave.” The only reason he was talking to me now, he said, is because he was locked in jail and needed practice interacting with others.
“But you must have thought about things,” I said. “About your life, about the human condition.”
Chris became surprisingly introspective. “I did examine myself,” he said. “Solitude did increase my perception. But here’s the tricky thing—when I applied my increased perception to myself, I lost my identity. With no audience, no one to perform for, I was just there. There was no need to define myself; I became irrelevant. The moon was the minute hand, the seasons the hour hand. I didn’t even have a name. I never felt lonely. To put it romantically: I was completely free.”
That was nice. But still, I pressed on, there must have been some grand insight revealed to him in the wild.
He returned to silence. Whether he was thinking or fuming or both, I couldn’t tell. Though he did arrive at an answer. I felt like some great mystic was about to reveal the Meaning of Life.
“Get enough sleep.”
“You’re not a real hermit!” “You’re not a real hermit!” Fellows, fellows! Surely there’s enough room in the most sparsely populated brotherhood in existence for us all to abide each other’s idiosyncrasies without resorting to competition or excommunication! That way lies {shudder} social status!
Excellent article, at any rate. Give it a read.
An Older, Colder Voice
Because his plays express no sense of a nearly divine vocation, of a mission to save humanity by transmitting ethical truths, Shakespeare cannot be the equal of Dante or Milton or Goethe, of the Greek dramatists or the Russian novelists, all of whom wrote to commune with the divine and to bring light to the world. What had in the Romantic tradition long been seen as Shakespeare’s unique strength — what Keats famously called his “Negative Capability,” his capacity for “being in uncertainties, Mysteries, doubts, without any irritable reaching after fact & reason” — on this view becomes a liability, a social irresponsibility, a feckless acceptance of humanity’s doomed and ignorant lot without any attempt to improve it. Shakespeare can be seen as the paradigm of the apolitical artist, the dissolute aesthete reviled not only by the religious conservatives of all faiths but also by those who nurse radical political hopes, such as the anarcho-pacifist Tolstoy, the Soviet sympathizer Wittgenstein, and even the socialist-feminist Lynn Stuart Parramore. From this perspective, we find Shakespeare at the origin of that dangerously aloof aestheticism for which Roberto Bolaño’s By Night in Chile has given us the most memorable picture in contemporary letters: the literary soirée above the torture chamber.
…Hamlet’s — and Shakespeare’s — charismatically demonic knowledge of the void at the heart of reality, the death that is the essence of life, catches something very real in our experience (or mine, anyway), a basic metaphysical uncertainty that should disturb all of us, a faithlessness and despair that no doubt has the poisonous potential to ruin the plans of our reformers and revolutionaries, of our dispensers of Christian charity and our disseminators of socialist-feminist politics, but a grim knowledge that nevertheless murmurs constantly beneath the busy clamor of everyday life and that seeks passionate expression in the face of all protest. Maybe Shakespeare sucks because — and to the extent that — life sucks. It doesn’t and shouldn’t please us if we want to believe in a better world, and it may not cheer the fans of NPR, but Shakespeare’s visionary perception that precisely nothing is at stake in each of our lives will probably continue to worry us as long as there are playgoers and readers to experience it.
It’s fascinating, this divide between those who feel artistic genius is timeless, above and beyond mere political reproach, and those who would place it in service to society’s contemporary mores. Even more fascinating is the way in which both types could be contained in the singular figure of Beethoven, who, despite having done more than anyone to free “the artist” from the bonds of church, state and aristocratic patronage, and despite almost singlehandedly creating the template of the Romantic artist, tormented by earthly misfortune but convinced of his posthumous importance, storming the empyrean gates to steal a bit of transcendence from God himself, nonetheless was quite the moralist when it came to the social function of art. As Harvey Sachs wrote:
To Beethoven, the word “philosophy” could probably have been defined as ethical guidance; when he said that music was a higher calling than philosophy, he meant that it was potentially more important as a moral force. Artists, he believed, must strive to contribute to humanity’s well-being — must help mankind to find the right path. This is why he condemned Mozart’s opera Don Giovanni on moral grounds, despite his admiration for the opera’s music, and why the Viennese triumph of Rossini’s operas, which, by Beethoven’s lights, lacked moral fiber, greatly upset him, as did the local public’s adoration of vocal and instrumental virtuosity for its own sake.
Most of those who share his hectoring tendencies can’t be redeemed by anything like his talent, unfortunately. As for me, I am one of those who believe that there is an amoral, trickster-like quality to great art that keeps it forever one step ahead of the socially responsible types who seek to yoke and domesticate it in service to this or that cause. That’s probably inseparable from the fact that I’m also one who believes in the cyclical nature of human existence, as opposed to those who imagine some sort of teleological dialectic at work in history, carrying us through progressive stages of development toward some moral/cultural apogee. Wisdom and transcendence, to me, are more like mountain peaks than broad plateaus — briefly attainable, but not fit for settling and building upon. King Lear and the Ode to Joy do not represent a state which humanity as a whole will one day achieve collectively; they are the awe-inspiring peaks from which we must inevitably descend to return to the chores and tribulations of daily living.
I’ve Come to See You Fall
PZ Myers on Robin Williams: “a wealthy white man” whose death distracts us from “news about brown people”. http://t.co/0z2VsKVPWd
— Richard Dawkins (@RichardDawkins) August 12, 2014
P.Z. Myers shows again that he’s a prick without empathy. Not really news, but still. http://t.co/a4rpr9sDny
— Massimo Pigliucci (@mpigliucci) August 13, 2014
Coyne in PZ: This is one of the most contemptible and inhumane things I’ve ever seen posted by a well-known atheist. http://t.co/9H5qVdi54m
— Massimo Pigliucci (@mpigliucci) August 13, 2014
Terry Firma at Friendly Atheist on P Z Myers’s dehumanizing & toxic response to Robin Williams’s suicide: http://t.co/dYAVi6Im3X
— Jeremy Stangroom (@PhilosophyExp) August 13, 2014
I wonder if God created P Z Myers, and his merry band of bloggers, as part of a cunning plan to discredit the cause of atheism…
— Jeremy Stangroom (@PhilosophyExp) August 12, 2014
You’ve had the same experience, I’m sure — standing in line at the grocery store, glancing at the tabloid covers, seeing this week’s magic fad diet promising that you can lose weight while eating ice cream in bed, or some such thing. You marvel for a moment at the brazen chutzpah required to pitch such a scam. How does anyone fall for this? Then you remember that people have a bottomless appetite for being told that they can have whatever they want without having to sacrifice anything for it.
Some people pride themselves on being more rational than that, though. In fact, their pride in their ability to see through the fallacies and cognitive biases that plague the rest of us mere mortals approaches outright hubris. They would never fall for anything so blatantly obvious as a miracle diet or a get-rich-quick scheme. Instead, they prefer to convince themselves of more rarefied delusions, like the one where a world of social justice will be created through them acting like the most relentlessly miserable, hateful assholes they can be. I mean, consider what thankless work is involved in being an activist for progressive causes. Imagine how much time and energy must be spent just to achieve even the most modest accomplishments. And imagine how, even after a lifetime of such work, one could still look toward the future and see an endless line of tasks yet to be tackled. So, wait, you’re telling me that none of that is necessary? I can instead spend my time online acting like a raving prick, surrounded by equally nasty sycophants, all of us stewing in our impotent rage and venting at the world at large like angry teenagers, making no attempt to change the mind of anyone who doesn’t already agree with us? And by doing this, I’ll be helping lay the foundations for a glorious future based on peace and equality? Well, sign me up already!
Like I said — a bottomless appetite.
Anyway, if you haven’t already read about it, you can click through the links for the details. In a nutshell, Peezus decided to use the occasion of Robin Williams’ death to bitterly complain (so unlike the author of The Happy Atheist, I know) about how the media focuses on trivia instead of the issues that matter. At least, once the backlash started, that was the ass-covering rationalization he came up with — he was just critiquing “celebrity culture”, not attacking Williams per se.
But I stress that word, rationalization. As we’ve seen before, he sometimes hardly seems aware of the gibberish he’s writing even as he’s doing it. The ad-hoc and incoherent nature of his “argument” here made clear that he only felt compelled to intellectually justify his post in self-defense. His reason for writing the post, the intuitive urge that actually motivated it in the first place, was the basic fact that he is simply a shit-encrusted asshole of a man. He saw a bunch of people commiserating over a sad event, and his visceral reaction was to go over and trivialize their feelings for self-aggrandizing purposes. He decided that the death of a beloved actor and comedian would serve as the perfect background scenery for one of his juvenile propaganda points. Were he not the throbbing, inflamed, hemorrhoidal asshole that he is, he could have simply ignored the spectacle and gone about his “important” business. Or he could have indulged some curiosity and thought, “Gee, I never paid any attention to his work, but he seems to have meant a lot to so many people. I wonder if I might learn anything interesting about him by listening to their recollections?” Instead, he decided to sneer about how all this chatter over a mere dead, wealthy white male meant that not enough people were paying attention to whatever P.C. causes he would otherwise assign them to.
His complaints about “real” news as opposed to trivial fluff have nothing to do with any Marxist-derived analysis of the media as adjuncts of the ruling class, doing their part to keep the masses in a narcotic daze, too stupefied and pacified to heed the calling of class warfare. His trite points about being well-informed ring hollow when you consider that he, like so many others, is a mere consumer of news who will not be motivated or empowered to act in any way upon what he takes in. His implication that the masses will be inspired to rise up and act against injustice if only they are fed a steady diet of informative articles by a progressively-inclined media is too absurd to withstand direct consideration. His generic ranting about how news is a zero-sum game where attention given to an undeserving topic can only come at the expense of a vitally important one betrays a borderline-retarded understanding of how human psychology and the news business actually work, and the logical conclusion of his inchoate suggestion that our attention needs to be rationed so that we only pay attention to news according to its rank on the Peezus-Approved Importance Index would be that only the threats of loose nukes, catastrophic climate change and maybe airborne superviruses would deserve column space until they have been solved, something that even he can’t possibly be stupid enough to endorse.
No, none of these fatuous brain belches are the real motivation behind the post that’s made him so many new friends. He just wanted to brag about his superior taste in news consumption. “Celebrity deaths and sporting events? Never pay any attention to ’em. I only read important news.” His post amounted to the text-based equivalent of a selfie taken against a backdrop of “serious” news headlines. Not only an asshole, but a cynical asshole at that — his main concern when it comes to news is that he looks good posing with it.
And this has become his niche in life. He has created an online space where he and his fellow cultists indulge their addictions to rage-blogging and wallowing in their own dysfunctional resentment. Different in the cosmetic details, but no different in essence from a typical Rush Limbaugh dittohead — they spend their days stoking their fury against all those stupid people out there thinking stupid thoughts and fucking up the world by failing to recognize how right they are about everything that matters. So much time spent maintaining a constant state of unproductive outrage. So much energy devoted to dwelling in miserable, negative states of mind. It would all be such a sad, pathetic waste if it weren’t for the fact that one day, all their efforts will help bring about a glorious future where all these social ills have been conquered. Oh…
I’m not sure why this particular incident seems to have been the final straw for so many people who, until now, have been hesitant to criticize him directly, but hey, whatever, I’ll take it. Of course, widespread opprobrium will only reinforce his supercilious adolescent attitude and his cult’s belief that the world can’t handle the truth they bring. Really, the worst thing that could happen to them would be to experience a single, crushing moment of clear, undiluted self-awareness.
And If My Interest Is Waning, I Can’t Fake It
By ‘thick’ description, I mean an extended, detailed, evidence-based, written interpretation of a subject. If you want to write a feature, or blog, or wikipedia entry, be it about the origins of the first world war; the authoritarian turn in Russia; or the causes and effects of the 2008 financial crisis, in the end you will have to refer to a book. Or at least refer to other people who have referred to books. Even the best magazine pieces and TV documentaries — and the best of these are very good indeed — are only puddle-deep compared with the thick descriptions laid out in books. They are ‘thin’ descriptions and the creators and authors of them will have referred extensively to books to produce their work.
In this sense, authors and publisher-curators are in the ‘civilisation business’, trafficking in the knowledge that provides the building blocks for culture and society. They probably shouldn’t go around talking about ‘civilisation’ too often, but it’s true nonetheless. Books are a different class of object, profoundly unlike magazines, newspapers, blogs, games or social media sites. The world they evoke is richer, more dense and, literally, more meaningful.
At times, tired from other responsibilities and bereft of inspiration, I’ve forced myself to consider whether I might have taken this blogging thing as far as I can. The answer I came up with is no, I enjoy writing too much to simply stop doing it. But in the course of my musings, I did have to acknowledge that I was spending too much of my already-limited free time looking for things to write about on a web that, to my curmudgeonly eyes, at least, was increasingly filled with ephemera and effluvia. The pop culture/current events section of the web, to be more specific, is overflowing with clickbait trash and superficial treatments of topics that deserve better.
I’m not pining for some lost Golden Age when the web was truly scintillating or anything; I realize that it’s just my own perspective which has changed over time. But with that aging perspective comes the realization of the many ways in which the web is a younger person’s game, and the recognition of how uninterested I am in keeping up with what today’s undergrads find important and exciting. Mundy reaffirms what I’d already been thinking, that there’s something unique about the form of the book that enables deeper consideration of a topic. I haven’t given up on the idea that there are still interesting blogs to be found out there somewhere far from the bright lights and big traffic of the mega-blogs, with authors diligently setting down their thoughts on whatever interests them, taking no notice of the changing fashions or stat-measuring contests on social media. But it appears increasingly clear to me that I will need to feed my writing habit with material found in books rather than on blogs. I’m sure that means my output will continue to be much reduced from what it once was, but so be it. You have to stop talking if you want to do some worthwhile thinking.
In the Shadow of Reason
If human beings were potentially capable of applying reason in their lives they would show some sign of learning from what they had done wrong in the past, but history and everyday practice show them committing the same follies over and over again. They would alter their beliefs in accordance with facts, but clinging to beliefs in the face of contrary evidence is one of the most powerful and enduring human traits.
Outside of some areas of science, human beings rarely give up their convictions just because they can be shown to be false. No doubt we can become a little more reasonable, at least for a time, in some parts of our lives, but being reasonable means accepting that many human problems aren’t actually soluble, and our persistent irrationality is one of these problems. At its best, religion is an antidote against the prevailing type of credulity – in our day, a naive faith in the boundless capacities of the human mind.
…The refusal to see clear and present danger shows that the idea that human beings base their beliefs on their experience is just a fairy-tale. The opposite is closer to the truth – shaping their perceptions according to what they already believe, human beings block out from their minds anything that disturbs their view of the world. Psychologists who examine this tendency – sometimes called cognitive dissonance – have speculated that refusing to face the truth may confer an evolutionary advantage. Screening out unpleasant or disturbing facts may, in some circumstances, give some people a better chance of survival. But at the same time this tendency leads us all into one folly after another. Many regard science as the supreme embodiment of human reason, but science may yet confirm what history so strongly suggests – irrationality is hard-wired in the human animal.
Certainly unreason can be tempered by the hard-won practices of civilisation, but civilisation will always be a precarious achievement. To believe that human beings can be much improved by rational argument is to assume that they are already reasonable, which is obviously false. The old doctrine of original sin contained a vital truth – there are impulses of irrational destructiveness in every one of us.
In Pascal’s vocabulary, heart is not a word for feelings or emotions but for a mode of thinking best understood as “intuition”. Intuition for Pascal is a compelling and effective method of comprehending certain things without having to reason our way to them. In addition to being a mode of thinking distinct from reason, intuition also supplies the basic apprehensions that reason requires for its own operation…Pascal knew well what he was saying in implicating intuition as the seat of religious belief, though there is no reason to view intuition as somehow less intelligent or rational than other, more explicit, modes of thinking. Indeed, much of the “reasoning” we do, as well as many of the “reasonable” conclusions that we come to, owe their existence to the supporting work of the kinds of intuitive thinking processes discussed in this and previous chapters.
…Blaise Pascal’s well-educated and eminently “reasonable” contemporaries pinned their hopes and lavished praise on reason precisely because they deemed it capable of cutting through unfounded beliefs and superstitious faith and arriving at true knowledge logically evaluated and clear-mindedly obtained. The problem, though, is that only someone wearing blinders can argue for this view of knowledge acquisition. While such careful, calculated thinking certainly takes place — scientists, for example, try very hard to emulate this process and hold their compatriots accountable to it — in everyday life the formation of the majority of our beliefs, even for the well-educated and reasonable, follows a less rigorous course.
Intuitive beliefs, the kind of basic beliefs that ought to be found in a belief box, are concrete, commonsense descriptions of the real world derived from perception and spontaneous, nonconscious inferences. Intuitive beliefs are intuitive both in the sense that they are products of innate cognitive mechanisms and in that you need not be aware of holding them, even less of the reasons why. Nonetheless, because intuitive beliefs arise from reliable perceptions and inferences, they are rigidly held. Examples of intuitive belief include the belief that when you are tired you should sleep, that you cannot walk through walls, and that books cannot swim. Intuitive beliefs are so obvious that they hardly seem like “beliefs” at all, yet they are. Countless such beliefs continually operate in the background of daily thought, supporting conscious reasoning and guiding behavior.
Reflective beliefs are what people normally understand by the word “belief”. Instead of being derived automatically and nonconsciously, reflective beliefs come from conscious, deliberate reasoning or from external sources of authority like parents, teachers and books. Reflective beliefs are usually explanatory and interpretative rather than descriptive. Reflective beliefs may or may not be fully understood or well grounded and, consequently, people’s commitment to them may vary widely, from loosely held notions to dogmatic convictions.
…Reflective beliefs are influenced by intuitive beliefs primarily because intuitive beliefs provide the default assumptions that underpin reflective beliefs. As an example, most of us believe that any two objects will fall to the ground at the same rate. This is a reflective belief concerning gravity that was taught to us and probably required a demonstration before it was accepted. What that reflective belief requires, though, is a previous understanding that objects in fact fall to the ground when released. This is an intuitive belief — derived from our intuitive physics — that all people share and that, interestingly, is neither itself in need of demonstration nor made an explicit part of the final reflective belief. Similarly, intuitive beliefs are routinely operating in the background of most of our reflective beliefs, supplying the host of default assumptions that make conscious reasoning possible but nevertheless go largely unnoticed.
…Pascal’s critics insist that people have good reasons for their beliefs, yet that is not the way belief generally works. Many of the “reasons” that we put forth to explain particular reflective beliefs are simply post-hoc justifications that had little to do with the belief’s actual formation. Furthermore, a good number of our reflective beliefs have no explicit reasons for support because none are called for. If intuitive beliefs get the job done, we don’t stop to draw up more reflective arguments. Explicit reasoning or elaboration only comes later. The same is true of religious belief. Lots of people can give explicit reasons for their belief in gods, but few if any of these explicit reasons were actually part of the mental process that formed the original belief. Religious belief, rather, arises from that mode of thinking correctly identified by Pascal as intuition.
Like progress, reason certainly exists and we can be grateful for it. But also like progress, its advance is neither uniform nor permanent — individuals, cultures and historical epochs do not become more “reasonable” in any steady, cumulative manner, and setbacks can occur in the right circumstances. On an individual level, we all know people who can be highly educated and intelligent experts in one area while being shockingly ignorant and stubbornly irrational in another, and behavioral economists have recently been mainstreaming the understanding of just how omnipresent this sort of selective, intermittent rationality is. It’s not any different on a grander stage, either — in the twentieth century, educated people prided themselves on no longer being the sort of ignorant, superstitious barbarians who put people to death for witchcraft, even as they prepared (one hundred years ago precisely as of this week) to slaughter each other by the millions for dimly understood reasons which seemed to take on a life of their own, outstripping the ability of their human creators to control them. Within a couple of decades, the honestly-conceived byproducts of scientific advance, eugenics and racial science, would lead directly to the loss of tens of millions more lives before being officially discarded as a regrettable detour into pseudoscience. Steven Pinker cites widespread literacy as an essential building block in the history of humanity’s supposed progress toward increased reason and empathy, but Timothy Snyder responds by pointing out the essential role literacy also played in the religious wars of post-Reformation Europe and in the spread of virulent nationalism in our own day.
There’s always a tradeoff, always a new set of problems and dilemmas accompanying every advance.
A detached observer surveying the sweep of human history from a vantage point in the stratosphere might find it reasonable to conclude that “reasons” are just the changing fashions we wear while getting down to the perennial killing which is truly the serious business of our existence as a species. Apologists for Communism desperately try to convince themselves that the ostensible reasons for their mass murders somehow represented an advance over the reasons people had committed mass murder in the past — killing reactionaries, class traitors and counter-revolutionaries was “better” than killing people for being of a different ethnic group, or killing them in the process of colonizing their land. But the idea that history is a dialectical process in which humans hate and kill each other for increasingly better and more valid reasons until they arrive at a final synthesis in which no one needs to be killed for anything ever again is the most delusional of myths. Humans will always differentiate themselves from one another no matter how often the field is leveled, and some of those differences will intensify into hatred and violence.
The moral of the story from this perspective is not at all about urging anyone to give up reason and embrace irrationality, or to forswear any attempts to improve their circumstances in practical ways. For that matter, I don’t believe humans are even capable of ceasing to use their minds to manipulate their environment to their liking. No, the goal is only to attempt to illuminate a sort of parallel track of human existence, one that has nothing to do with concrete progression within linear time. What Gray is saying about “religion at its best” could apply to philosophy or mystical spirituality as well, and it’s what I think Jerry Coyne misses in his eagerness to have the same old argument he’s always had about religion vs. science — there’s a certain type of wisdom that isn’t the same thing as intelligence or knowledge.
There’s a perennial question humans have been asking themselves as long as they’ve been recording their thoughts: How should we live? It’s not the kind of question that depends on amassing a collection of facts, or mastering intricate theories. We will never achieve complete understanding or control of all the countless variables that create the circumstances of our lives. We will never agree on what constitutes an ideal existence, let alone attain the god’s-eye perspective necessary to create and maintain it. Every improvement we make will also bring along unforeseen results, and the cycle will continue as long as we do. Realizing all that, one also realizes that the basics of how to live a good life do not become more complicated in tandem with our technology and our expertise. The virtues that ancient philosophers recommended are just as valid today; they haven’t been “improved” upon or rendered obsolete by material changes. As much as it offends our pride to hear it, we are not going to think our way beyond the basic dilemmas of the human condition.
Limit Your Imagination, Keep You Where They Must
A more legitimate literary objection to censorship is its implicit portrayal of a reader as the sort of person who jumps off a cliff when asked. Notions such as “obscenity” or “abasement before the west” make literary language a tool of subversion and ascribe to the novelist the hypnotist’s capacity for making a previously obedient or prudish member of the public throw stones or unzip. In censorship’s official, airbrushed view of the reading experience, dispositions are imposed, not reinforced. As J M Coetzee argued in Giving Offence: Essays on Censorship, “it is a feature of the paranoid logic of the censoring mentality that virtue, qua virtue, must be innocent, and therefore, unless protected, vulnerable to the wiles of vice.”
That paranoid logic is the pressure-relief valve that allows nominally liberal-minded people to blithely engage in their own form of censorship. Ambiguous art and unsettling concepts are fine for properly socialized, educated and civilized people like us, of course, but the rabble, well, I’m afraid they just can’t be trusted with them. Misanthrope that I am, though, I somehow have faith born from experience that the masses are not quite the impressionable blank slates that all these clucking mother hens would have you believe. And thus I can only shake my head sadly at Tauriq’s absurd logic here. You know, if we’re going to play this ridiculous “X degrees of separation” game, Nirvana is actually “responsible” for more sexual assaults than Robin Thicke. Why, it’s almost like there’s no clear, linear cause-and-effect relationship between the “message” of art and its effect on the audience.