“I think there are two Donald Trumps.” (Palm Beach, Florida, March 11, 2016)
“I don’t think there are two Donald Trumps. I think there’s one Donald Trump.” (Palm Beach, Florida, March 11, 2016)
In a recent posting of “The Stone,” a column devoted to philosophical issues in The New York Times, Michael P. Lynch notes Donald Trump’s disdain for logical consistency. “He not only doesn’t fear contradiction,” says Lynch of Trump. “He embraces it.” Lynch laments the disregard of reason’s dictates and conventions, fearing that it may lead to a loss of regard for truth. Subsequently, Politico.com ran a piece by Michael Kruse and Noah Weiland that consisted of a very long list of quotations by Trump in which he directly contradicts himself on a wide variety of topics.
Indeed, those of us who hold high the ideal of truth and rational discourse, and believe in reason’s power to lead us to what’s right and true, find ourselves shaking our heads over, not only Trump’s loose grasp of reality, but the willingness of millions of people to embrace him in spite of it. As we know, demagoguery often begets idol-worship. It can hardly escape our notice in this case that Trump’s followers display a manic, indeed almost religious, devotion to him.
Consider this. Scripture (I’m thinking specifically of the Bible, but the argument can be extended to other religious traditions) isn’t consistent. It makes claims that are patently false. It contradicts itself—slapping together the Old and New Testaments as one document unavoidably led to contradiction. The very idea of the monotheistic God, if rationally parsed, doesn’t make a whole lot of sense. But religion isn’t rational, though some have endeavored mightily to make it so.
People say religion is a matter of faith, but that’s not quite right (or it’s misleading). Faith is supposed to be the acceptance of some belief or claim in the absence of any evidence. But believers tend to pounce on any evidence they can find to justify their beliefs; it’s only when there’s a lack of evidence, or counter-evidence, that they chalk the whole thing up to faith. Rather, religion appeals to people’s emotions and to their intuitions as to how things ought to be, and not to their rational faculties.
Take, first, emotions, particularly hope and fear. We fear death. There’s no way around the fact of our mortality, and Western religion solves the problem by telling us that we don’t really die. We also have a deep desire to belong, and religion gives us that: community, but community bought at a price, the price of ostracizing others, the non-believers. We’re told to hate them, even murder them. (If you’re incredulous, have a look at, for example, Deuteronomy 13:12-16; or John 15:6.) But at that steep cost, we have a powerful feeling of belonging. We’re part of the tribe, the chosen people.
Second, think about our intuitions. We believe with all our being that our lives have purpose and meaning, and that there must be some grand truth to the universe and our existence, and religion gives us all of this. There’s a witness to our pain and suffering; a father to protect us, to listen to us; a powerful force to intervene on our behalf if we pray hard enough.
For a certain type of American, Trump plays the same role, fulfills the same needs, as religion does for others. These are people who feel disaffected, feel that no one is listening to them, feel that there’s less and less purpose to their lives. They’re nostalgic for the days when things were good, before those people started demanding more and more and taking it away from them, the hard-working, true Americans.
So Trump comes onto the scene and he gives them a voice. He appeals not to their reason, but to their emotions and their intuitions. He makes them feel like he hears them; he makes them believe that he’s listening. He’s going to fix all their problems. He encourages their hatred of, and violence towards, the others, those people (fill in the blank: Muslims, Mexican immigrants, blacks).
And here’s the thing: because he’s appealing to their emotions and intuitions, what he says doesn’t have to make sense, it doesn’t have to be consistent—in fact, it’s better if it doesn’t make sense, if it’s not consistent. Anything too carefully reasoned, too backed up with evidence, wouldn’t hit them in the gut, wouldn’t stoke the fires of their anger and frustration. Further, if he had any straightforward policy proposals worked out in detail, people could see how ridiculous they were. “Bar Muslims from entering the country!” “Deport millions of illegals!” “Build a wall on the border with Mexico!” If he offered detailed reasoning on any of these plans and how to bring them to fruition, it would be like giving precise, tangible descriptive characteristics to the divinity. “Let’s see, he’s five foot eight, has a receding hairline, and enjoys martinis.” The founders of religions, by contrast, tended to make claims so vague and ambiguous as to be meaningless, or statements that are nonsensical. “I believe it because it is absurd,” said Tertullian. The same is true of The Donald.
Besides, the more different things he says, contradictory though they may be, the more of the disaffected he can appeal to. Different Trump supporters, even though they may have wildly different ideas about what’s wrong and what’s right, can come together and say, “Man, he tells it like it is.” And they believe it!
Those who strive for clarity and consistency are open to refutation. Trump will have no truck with clarity and consistency, so he can’t be refuted. Good luck to the Democratic nominee in the upcoming Presidential debates.
I was recently interviewed by Ezra Zaid from BFM radio in Kuala Lumpur. He hosts an English-language show about comedy, “Finding the Funny,” and he’d run across the volume on Woody Allen and Philosophy I’d co-edited.
You can listen to the radio interview here.
In addition, you can read my essay from the volume, “God, Suicide, and the Meaning of Life in the Films of Woody Allen” here.
The recent death of David Bowie struck me rather hard. It’s not that I was the biggest fan. Don’t get me wrong, I loved his music. I’ve cherished The Ziggy Stardust album, for example, for decades. But I never saw him in concert. Sadly, a week before he died I signed up for updates on his website, so that I could learn of concert dates when they appeared; I knew he was releasing a new album, and I was determined finally to see him live.
No, his death was symbolic as much as anything.
I grew up in the seventies. I was in diapers or not long out of them when the trio of Hendrix, Morrison, and Joplin were lost to alcohol and drugs, so I could love their music and romanticize their lives and deaths as only a teenager could.
If those three and their kin were our roots (as far as I knew; I didn’t discover the blues until later), then the gods of my youth were the ones who survived the sixties (the Stones, Dylan, Zeppelin, Clapton, Neil Young), and those who followed in their trail (Aerosmith, Van Halen, Springsteen—and then, later, U2).
In a very real way, those of us growing up in the 70s experienced quite deeply what Nietzsche referred to as the death of God. All the old ways of understanding the world and of finding meaning and value were lost—the summer of love was over, and Vietnam, the Kennedy assassinations, Watergate had left the country shaken and us empty and doubtful. Certainly, the older generations still clung to their values; they went back to work and to church and held fast to whatever was left of the idea of the American dream.
That wasn’t an option for us—sure, we’d go to college and end up getting jobs, having careers, but it wasn’t like we believed in any of it.
I’m not saying that any of this was necessarily conscious, a fully-formed idea in our heads; certainly, in small town Ohio I didn’t go around spouting quotes from Also Sprach Zarathustra (okay, when I was in college I did). Rather, the feeling, the experience formed the background and context in which we grew up.
What filled the void for many of us was Rock music. To some, that might sound stupid. To those who experienced it, it’s a truism. Rock music was the closest thing we had to religion. It was uplifting. It made us strong, gave us community, and a vocabulary. It was the voice of our rebellion, our collective scream, a big “fuck you” to the world. It didn’t give us hope exactly; I’m not sure that was possible any longer. But it drove our lives, gave us spines, became the soundtrack to the stupidity and wonder that is youth.
(As Nietzsche says, “I would only believe in a god who knew how to dance.”)
At this point, I can’t speak for anyone else, but I passed through youth and into adulthood with a feeling that wasn’t formed into a conscious thought or articulated—else I’d have had to reject it as ridiculous—that somehow those icons would always be there. They were our gods, after all. And they lived on and grew older—but they were there when we needed them.
There were some exceptions to their ever-presence in our lives, of course. We lost John Bonham and Freddie Mercury way, way too soon, for example. In Freddie’s case, that was heartbreaking, but I was also coming of age and passing into adulthood in the shadow of AIDS—talk about messing up your early sexual life (we were all convinced we’d get infected the first time we screwed)—and so Freddie’s loss was devastating but understandable. He’d fulfilled the rock and roll imperative to live hard, die young, and leave a good-looking corpse.
But the others, well, they just had to be there for us when we needed them (not you, Phil Collins; you can go away).
I confess that I don’t listen to much Rock music any longer. I mostly listen to classic Jazz (Miles, Coltrane, Dizzy) and composers like Beethoven and Bach. There’s something comforting about knowing your icons have already turned into distant memories; there’s no danger of losing them.
So the loss of Bowie is symbolic; his death means the death of another god—one of the gods of our youth. They won’t always be here for us. (Nietzsche was right: “Gods, too, decompose.”) And if they die, it means we must also. That’s another truism, but I honestly didn’t need another reminder of my mortality.
Thank God we’ll always have Keith Richards.
Raymond Chandler was one of the greatest classic noir writers. He wrote The Big Sleep, Farewell, My Lovely, The Lady in the Lake, and The Long Goodbye, amongst others. Even if you’ve never read any of his work, you likely recognize at least some of those titles, since each one of them has been made into a movie, some of them several times.
The editor, Barry Day, has done a lovely job of allowing Chandler to tell his own story. Chandler did interviews and wrote about the process of writing, and he was also a prolific letter-writer. So Day mined these pieces, as well as Chandler’s fiction, and pulled from them a coherent narrative about Chandler’s life and his craft. Day organized the work into themes that become chapters: Writing, Philip Marlowe, Cops…and Crime, The City of Angels, and so on. And then Day offers running editorial comments that help pull the whole work together.
Chandler had a real love and knack for language, and his novels are told from the first person point of view of his immortal Private Detective, Philip Marlowe (who shall, in my mind, forever be associated with Bogart). Chandler blessed Marlowe with some unforgettable lines.
“It was a blonde. A blonde to make a bishop kick a hole in a stained-glass window.” Farewell, My Lovely.
“I was a blank man. I had no face, no meaning, no personality, hardly a name…I was a page from yesterday’s calendar crumpled at the bottom of the waste basket.” The Little Sister.
“You have to have your teeth clamped around Hollywood to keep from chewing on stray blondes.” The Big Sleep.
I’ll share from the book two stories from Chandler’s life that I hadn’t heard before.
Chandler did a fair amount of screenwriting in Hollywood. His first effort came when he was hired to collaborate with Billy Wilder on the screenplay for Double Indemnity, this despite the fact that Chandler had little respect for James M. Cain, who wrote the novel.
Chandler was a neophyte, and so didn’t have any familiarity with the form of a screenplay or with the craft of screenwriting, which is different from novel-writing. Further, he and Wilder were both strong-willed characters, so there was some head-butting going on. However, once Wilder started working with Chandler and realized how brilliant he was, things went a bit more smoothly.
Wilder said of Chandler: “He was a dilettante. He did not like the structure of a screenplay, wasn’t used to it. He was a mess but he could write a beautiful sentence. ‘There is nothing as empty as an empty swimming pool’. That is a great line.” (p. 128)
The second story has to do with the classic noir film The Blue Dahlia, the Alan Ladd/Veronica Lake movie, for which Chandler also wrote the screenplay. Ladd was “Paramount’s current hot property. The only problem was that the war was still on and Ladd had a firm call-up date from the army…The film could not afford to go over its rather tight shooting schedule.” (p. 140)
John Houseman, who later became an actor (in The Paper Chase, for example), was the producer of the film, and George Marshall was the director. Marshall was shooting scenes faster than Chandler could write them, sometimes leaving his film crew standing around with nothing to do.
Chandler had a drinking problem, even at this early point (1945). So he came to Houseman with a solution to their Ladd/film schedule issue: “the only way he could complete the task in time was to write while he was drunk.” Houseman reluctantly agreed, and “for the next several weeks there was round-the-clock limo transportation standing by, six secretaries working in shifts to take Chandler’s dictation, and a doctor on call to give him glucose injections in lieu of solid food.” (p. 144)
Rather remarkably, this strategy worked. The film was completed with six days to spare, Ladd made his Army date, the movie made a lot of money for the studio, and Chandler was nominated for his second Oscar.
This is an excellent read about a great writer.
In my humble opinion, one of the wackiest things about contemporary physics is the notion of indeterminacy, or the idea that (as a recent essay put it): “Reality Doesn’t Exist Until You Look at It.” This title is doubly silly, since it equates reality with what goes on at the subatomic level, and not with trees, dolphins, mountains, gerbils, Buicks, and non-fat yoghurt (the yoghurt definitely exists before you look at it, fyi). This was Schrödinger’s complaint with his famous cat thought-experiment (read here for the details).
For a long time I’d been calling this the “fallacy of deriving ontological conclusions from epistemological premises.” Ontology is the study of being; epistemology is the study of knowledge. In other words, one has premises concerning what one can or cannot know, and one derives a conclusion about the structure of reality from those premises. This is as illegitimate as deriving an “ought from an is,” as Hume so famously argued: the fallacy of deriving a normative conclusion–a claim about what ought to be–from some description of the way things are (read here).
Recently, and much to my delight, I discovered that this fallacy had already been noted and named the “Mind Projection Fallacy” by E. T. Jaynes in the 1980s. Jaynes coined the expression precisely to describe the mistake at work in quantum physics. Further, and even more to my delight, the fallacy names a second mistake as well: attributing aspects of one’s own mind and thinking to nature. Jaynes uses the fallacy to describe the error of attributing intentions to events in nature (the rain falls in order to feed the crops, as Aristotle puts it). That mistake is usually capped off by positing God (or gods) as the source of the intentions.
As Jaynes puts this first mistake:
For educated people today, the idea of directing intelligences willfully and consciously controlling every detail of events seems vastly more complicated than the idea of a machine running; but to primitive man (and even to the uneducated today) the opposite is true. For one who has no comprehension of physical law, but is aware of his own consciousness and volition, the natural question to ask is not: “What is causing it?”, but rather: “Who is causing it?”
The answer was to invent Gods with the same consciousness and volition as ourselves, but with the additional power of psychokinesis; one in control of the weather, one in control of the seas, and so on. (Jaynes, “Probability Theory as Logic“)
Jaynes sums up the fallacy, and describes the fallacy at work in quantum theory, thus:
Once one has grasped the idea, one sees the Mind Projection Fallacy everywhere; what we have been taught as deep wisdom, is stripped of its pretensions and seen to be instead a foolish non sequitur. The error occurs in two complementary forms, which we might indicate thus:
(A) (My own imagination) → (Real property of Nature)
(B) (My own ignorance) → (Nature is indeterminate)
Form (B) arose out of quantum theory; instead of covering up our ignorance with fanciful assumptions about reality, one accepts that ignorance but attributes it to Nature. Thus in the Copenhagen interpretation of quantum theory, whatever is left undetermined in a pure state is held to be unknown not only to us, but also to Nature herself.
FYI, this is known as the “Copenhagen interpretation” because it was favored and embraced by the Dane Niels Bohr and his cohort (whereas it made Einstein uncomfortable; and, as I noted, Schrödinger thought it was absurd). So I heartily agree with Jaynes that it’s a fallacy to a) note that (e.g.) we can’t determine something to be a wave or a particle, or determine both the position and the momentum of a subatomic particle; and then conclude b) that there is no way the thing actually is until observed. Again, that’s drawing an ontological conclusion from an epistemological premise.
The first form of the fallacy, (A), taking properties of my imagination to be real properties of nature, sounds rather like what’s called the “anthropomorphic fallacy,” also known as the “pathetic fallacy.” The fallacy was identified and named by John Ruskin in the mid-1800s. Ruskin was attacking the sentimentality of the poetry of his time, so the pathetic fallacy focuses largely on attributing emotions to nature and inanimate objects. I suspect that Jaynes coined his own fallacy to focus more on the attribution of intentions (and perhaps other cognitive processes besides emotions) to nature.
If I’m allowed a bit of latitude then, I’d like to suggest extending Jaynes’ fallacy and use it to describe the mistake made in attributing intentions to inanimate objects (and not simply to nature). In that case, it can be used to describe the mistake made by some philosophers of mind and cognitive scientists who claim that computers can think. Some of you know that I like to rant about this issue. I’ve posted on it before. You can read the post here.
A recent New York Times opinion piece, “A Crisis at the Edge of Physics,” raises the somewhat unexpected question of whether “physicists need empirical evidence to confirm their theories.” This is unexpected because empirical confirmation has been the foundation of the natural sciences since the beginnings of modern science. If someone’s theories and claims can’t be empirically tested and confirmed, then they have no right to be called science.
The question arises in the context of attempts to come up with a unified theory in theoretical physics, particularly in the wake of the recent discovery of the Higgs boson by researchers working with the Large Hadron Collider. “Predicted about 50 years ago,” say the authors of the essay, Adam Frank and Marcelo Gleiser, “the Higgs particle is the linchpin of what physicists call the “standard model” of particle physics, a powerful mathematical theory that accounts for all the fundamental entities in the quantum world…and all the known forces acting between them…”
But the standard model is a bit of a dead end, since it offers no means of uniting its vision of the world with Einstein’s theory of gravity. The favored theory to unite them is something called “supersymmetry.” No need to get bogged down in the details. The point is that to date supersymmetry remains unconfirmed. The collider has produced no evidence to validate its claims. Some physicists, then, want to hang on to the theory despite its lack of confirming evidence, claiming that the relevant particles may simply lie beyond the collider’s powers of detection, etc.
Here’s where the essay becomes interesting to me. The authors say at this point: “Implicit in such a maneuver is a philosophical question: How are we to determine whether a theory is true if it cannot be validated experimentally?” What, philosophy mentioned in an article about theoretical physics, and written, no less, by two physicists (Frank is a professor of astrophysics at the University of Rochester, and Gleiser is a professor of physics and astronomy at Dartmouth)?!
Forgive my sarcasm, but it’s true that many in the natural sciences are dismissive of philosophy, and I’m always a bit irritated when those in the natural sciences run up against philosophical questions without acknowledging them as such (and I’m rather bemused at some of the wild speculations that go off the deep end in theoretical physics about time travel, parallel universes, etc., since these are clearly metaphysical speculations). Anyway, Frank and Gleiser are correct: questions about knowledge and what makes a theory viable are epistemological questions, which makes them philosophical in nature.
A bit of explanation. Let’s take the tale about Newton and the apple. He sees objects heavier than air falling, being drawn to the earth, and he wonders why this happens. To answer the question he theorizes the universal law of gravitation: “any two bodies in the universe attract each other with a force that is directly proportional to the product of their masses and inversely proportional to the square of the distance between them.” This is a first-order question and answer about natural phenomena and falls squarely within the realm of the natural sciences. However, I can step back and ask certain second-order questions: “What counts as evidence for or against any particular theory?” “Under what conditions is a theory confirmed?” “Must a theory be falsifiable in principle in order to be valid?” When I ask such questions I’m engaging in epistemology, the study of knowledge, and so I’m properly doing philosophy.
Let me conclude with some discussion of David Hume, who’s been on my mind a lot recently. Hume was quite taken with the Newtonian “experimental method” of observation of natural phenomena and sought to employ it in his own philosophical researches. But he had sharp criticisms for Newton and some of Newton’s followers like John Locke and Samuel Clarke, when they left the realm of what was experiential and observable. All three were theists and sought to posit and argue for the existence of God in addition to (or perhaps in spite of) their rigorous thinking regarding the natural world.
Hume believed all our knowledge falls into one of two categories: “relations of ideas” (logical truths like 2 + 2 = 4 or “All triangles have three sides”) or “matters of fact” (empirical matters like “grass is green” or “today is Sunday,” which can be confirmed by observation). He argued that any claims that don’t fit into one of these categories must be spurious (if not outright meaningless). Perhaps in another post I can go into Hume’s argument for this position, and into the great philosophical importance of it, historically. He claims:
“If we take in our hand any volume; of divinity or school metaphysics, for instance; let us ask, Does it contain any abstract reasoning concerning quantity or number? No. Does it contain any experimental reasoning concerning matter of fact and existence? No. Commit it then to the flames: for it can contain nothing but sophistry and illusion.” (An Enquiry Concerning Human Understanding, Section XII)
Newtonian mechanics was replaced by the contemporary physics that served as the starting point for this piece. The claims of contemporary researchers and theorists concerning the indeterminacy of reality and the aforementioned time travel and parallel universes sound a good deal like the kind of metaphysical speculations for which Hume had such contempt. Further, when scientists claim that certain theories need not be confirmed by empirical evidence in order to be accepted, they sound like religious believers, who of course likewise say they need no evidence for their beliefs. Hume no doubt would add their work to the bonfire, since in his judgment it must contain nothing but “sophistry and illusion.”
There’s a heartfelt essay in today’s New York Times entitled “Why I Can’t Forgive Dylann Roof” by Roxane Gay. This follows from the very emotional scenes when the family members of the murdered South Carolina church-goers confronted the monster who killed their brothers, sisters, mothers, fathers, and by-and-large forgave him for his unforgivable deeds.
Gay herself admits to being a theist and says “I believe God is a God of love but cannot understand how that love is not powerful enough to save us from ourselves.” She goes on to turn the conversation (perhaps rightfully) into one about racism. “The call for forgiveness is a painfully familiar refrain when black people suffer,” she says. “White people embrace narratives about forgiveness so they can pretend the world is a fairer place than it actually is, and that racism is merely a vestige of a painful past instead of this indelible part of our present.”
She goes on to conclude:
What white people are really asking for when they demand forgiveness from a traumatized community is absolution. They want absolution from the racism that infects us all even though forgiveness cannot reconcile America’s racist sins. They want absolution from their silence in the face of all manner of racism, great and small. They want to believe it is possible to heal from such profound and malingering trauma because to face the openness of the wounds racism has created in our society is too much. I, for one, am done forgiving.
Given all that black Americans have had to endure, and continue to endure, this is an understandable reaction. The problem is that it rather misses the point of the expressions of forgiveness by the church-goers. Turning the other cheek is part of the Christian narrative, and so these devastated South Carolinians were acting perfectly in accord with that narrative. Humans do horrible things to one another, but in the end God will sort it all out.
So I’m reminded of the unforgettable scene in Dostoevsky’s The Brothers Karamazov called “Rebellion,” in which the “intellectual” brother Ivan confronts his pious brother Alyosha. Alyosha wants to toe the standard line of hope and forgiveness, and a merciful God who will ease all the suffering of innocents in some future state. Ivan, on the other hand, regales him with tales of the grossest barbarisms visited upon children.
He tells one particular tale to emphasize a point. An eight-year-old boy was throwing stones and hurt the paw of the favorite dog of an army officer. The officer had the boy held, and the next morning had the child stripped, and–in front of his mother–set him off running and sent the dogs after him. They caught him and tore him to pieces in front of her eyes.
Ivan concludes by telling Alyosha that for him the price of the suffering of innocents is too great.
“I want to forgive. I want to embrace. I don’t want more suffering. And if the sufferings of children go to swell the sum of sufferings which was necessary to pay for truth, then I protest that the truth is not worth such a price. I don’t want the mother to embrace the oppressor who threw her son to the dogs! She dare not forgive him! Let her forgive him for herself, if she will, let her forgive the torturer for the immeasurable suffering of her mother’s heart. But the sufferings of her tortured child she has no right to forgive; she dare not forgive the torturer, even if the child were to forgive him! And if that is so, if they dare not forgive, what becomes of harmony? Is there in the whole world a being who would have the right to forgive and could forgive? I don’t want harmony. From love for humanity I don’t want it. I would rather be left with the unavenged suffering. I would rather remain with my unavenged suffering and unsatisfied indignation, even if I were wrong. Besides, too high a price is asked for harmony; it’s beyond our means to pay so much to enter on it. And so I hasten to give back my entrance ticket, and if I am an honest man I am bound to give it back as soon as possible. And that I am doing. It’s not God that I don’t accept, Alyosha, only I most respectfully return him the ticket.” (The Brothers Karamazov)
What’s fascinating about this is that Ivan isn’t embracing atheism; he isn’t saying God doesn’t exist. Rather, he’s saying the familiar problem of evil is unsolvable and he’s placing justice higher than God. From this point of view, the Charleston parishioners are being perfectly consistent: They’re accepting that the suffering on earth will somehow be atoned for, contra Ivan. But, from this standpoint, Ms. Gay’s position is incoherent. You have to choose between a loving God and Justice. As Ivan notes, you can’t have both.
Those of you who follow me know that I’m fond of adding “and shit” to the end of quotes.
E.g., “Religion is the opiate of the masses, and shit.” (Marx)
This time I’ve focused exclusively on quotes from various philosophers.
I always add a comma before the “and shit” to mark the end of the actual quote, whether that comma is actually needed or not.
PHILOSOPHY AND SHIT!
“For once touched by love, everyone becomes a poet, and shit.” (Plato)
“Tyranny is the exercise of power beyond right, and shit.” (Locke)
“Obligation is thralldom, and shit.” (Hobbes)
“Nothingness haunts being, and shit.” (Sartre)
“Be a philosopher; but, amidst all your philosophy, be still a man, and shit.” (Hume)
“The good life is one inspired by love and guided by knowledge, and shit.” (Bertrand Russell)
“I would only believe in a god who knew how to dance, and shit.” (Nietzsche)
“Must not all things at the last be swallowed up in death, and shit?” (Plato)
“In the state of nature, Profit is the measure of Right, and shit.” (Hobbes)
“We can open our hearts to God, but only with Divine help, and shit.” (Aquinas)
“Everything that is possible demands to exist, and shit.” (Leibniz)
“If a man will begin with certainties, he shall end in doubts, and shit.” (Bacon)
“The owl of Minerva spreads its wings only with the falling of the dusk, and shit.” (Hegel)
“In all things of nature there is something of the marvelous, and shit.” (Aristotle)
“A serious and good philosophical work could be written consisting entirely of jokes, and shit.” (Wittgenstein)
“It makes me happy that men do not want at all to think the thought of death, and shit.” (Nietzsche)
“Terror is the primary principle of religion, and shit.” (Hume)
“All men by nature desire to know, and shit.” (Aristotle)
“Beasts that have deliberation must necessarily also have will, and shit.” (Hobbes)
“Truth will sooner come out from error than from confusion, and shit.” (Bacon)
Anyone who gets updates on my blog knows that I haven’t posted anything in a while, and I’m here to say that I’m not going to post anything for a while longer (except for this, of course). There are three reasons for this, two practical, and one ideological. The first, more mundane reason is that very few people read these posts (and I greatly appreciate the fact that some of you do); the second is that I’ve become insanely busy. I’m teaching four classes, one of them new; working on my anthology, Nietzsche and the Philosophers; and trying to write my own essay for that volume. In addition, I’m working on a new novel. Add to that the duties and obligations of both work and home, and I just don’t have the time or the energy to do any blogging.
The Mistake of Self-Publishing
The more serious reason has to do with my evolving attitude about a writer’s online presence. Let me begin that discussion by saying that, in my humble opinion, self-publishing is for the most part a big mistake. It encourages both those who aren’t ready to publish and those who just shouldn’t be published to glut the marketplace with material that’s mediocre at best, and often far worse. With regard to the former, instead of putting in the time and the hard work (I estimate about ten years, give or take)—and leaving behind those first, weak attempts that should’ve been aborted—these writers put them out there for all the world to see and hardly anyone to buy. It takes a great deal of time and a great many words written to be able to hear your writing, to hear your mistakes. So these would-be authors can’t yet hear how crappy their prose is, and haven’t yet figured out that their characters are shallow, etc. Instead of going through a kind of apprenticeship and putting in the effort to produce something good that either a literary agent or an independent publisher is willing to take a chance on, they self-publish. On the other hand, some of these folks just aren’t cut out to be writers; they’ll never have the chops. Their work is the equivalent of that sophomoric poetry or music we all produced in our early years. Now, instead of moldering in a shoe box at the back of the closet, the stuff gets tossed, with all the other detritus, into the online literary marketplace.
Let me pause to note that I’m sure there are good writers who, for whatever reason, self-publish. Perhaps they enjoy the absolute autonomy of the process; perhaps they haven’t yet found an agent or a publisher, whether independent or industry, who shares their vision of things. It’s a hard path, and I don’t mean at all to denigrate their efforts. I’m talking about the majority here.
An Online Presence
The phenomenon of self-publishing has led to step A: These would-be Hemingways take to social media to hawk their wares. They have no other way to market their writing, and online promoting is largely free (Twitter and Facebook) or at least cheap (a website, a blog). That then led to step B: The claim, now largely taken as Gospel, that an author must build an audience and promote him or herself, even before, and whether or not, he or she has published anything. Consequently, Twitter (for example) is full of users who call themselves writers, discuss the writing life, and post platitudes about writing, either without having published a word or after having self-published a substandard piece of work. (I’ve seen posts by people online calling themselves writers that made me wonder if they were even fully literate. No joke.)
It’s true that (at least some) agents say that an author needs to have some sort of online presence. Many won’t take a writer seriously if he or she has no track record at all or is publicly invisible. But that means putting up a website or blog, perhaps posting some short stories, seeing if you can get pieces of short fiction published in print or online journals. It doesn’t mean promoting oneself as a writer, pre-marketing works that don’t yet exist.
[Of course my whole discussion here is about fiction writers who blog; it’s not about non-fiction writers whose medium is a blog, which is now a perfectly legitimate platform for nonfiction work.]
If you couple the sometimes-adolescent need to express one’s feelings (in writing) with the sometimes-adolescent need for validation, and throw in the public format of the internet, you get the phenomena described above: a deluge of self-published crap and empty self-marketing—empty because there’s nothing yet to market. People want celebrity without actually having any talent and without having actually done anything interesting (and unfortunately, society and the media often foster that desire, rewarding some who are talentless and unproductive with fame and fortune). Some believe that if they post enough pictures of themselves and tweet and blog about every minute and mundane detail of their lives, those lives will somehow matter, despite the fact that they aren’t really living, just living vicariously on the internet. The Cartesian dictum, “I think, therefore I am,” is transformed in the 21st century to “I tweet, therefore I am.”
My Current Situation
I’ve written ten or eleven novels (not sure because I’ve lost count, and at least one was aborted in the middle), one of which was published by an independent press. I had two subsequent e-publications through what my agent at the time called “agent-assisted self-publishing,” which he concocted as a kind of hybrid notion of working through an agent and self-publishing. Really, I was just self-publishing and he was helping me do it. At that point I got on the bandwagon, joined Twitter, started a blog, updated my website. I had gone through the apprenticeship, so I wasn’t putting out crap, but nonetheless I did fall under the spell of the so-called common wisdom that I had to pre-build an audience for that next real publication.
Nothing came of all this. No one bought my e-novels, and I spent precious time composing blogs and supposedly building an online audience, when I should’ve been doing the actual work of a writer: writing and getting better at writing. It’s this realization that led me to quit blogging and to cut back on my online activities until I actually have something real to promote.
I refuse ever again to self-publish, and while I have nothing at all against independent publishers—there are many fine ones doing excellent work—I’m dedicating myself to the traditional route of industry publishing. And that means finding an agent who has real connections, which means producing novels that an agent believes he or she can sell. Yes, I hear the voices of the naysayers shouting that the industry is only interested in making money, in novels that have commercial value; they’re not interested in real art, so anyone who fits into that mold is selling out, etc. I respond: First, it’s quite possible to produce something worthwhile that is also sellable by industry standards (witness Chuck Palahniuk, Cormac McCarthy, and Philip Roth); second, Shakespeare, Dostoyevsky, and Joyce were artists (that word gets way overused these days); I’m a craftsman. I want to tell good stories that people will enjoy reading. It’s not selling out to craft those stories such that they’re marketable.
I’m pleased to say that four different agents at four different literary agencies in New York are currently reading one of my manuscripts. That doesn’t mean of course that any of them will end up representing me, but it is encouraging. (I had an offer from two independent presses to publish that same novel, but neither deal was quite what I wanted.) I’m also pleased to say that the new novel I’m working on kicks ass. I always think, when I finish a novel, “this one’s going to sell for sure!” But I have new and better reasons for thinking that’s really true of this particular story.
So, who knows, maybe soon I’ll be back blogging and promoting—because I’ll really have something to say.
In the spirit of the season (sort of), I’ve been posting tweets about Santa to reveal what a badass he is. Below you’ll find a selection of them. Some of these are more gangsterish than others. Anyway, I hope you find them funny. Enjoy!
Santa’s Bona Fides
Santa carries a .38 strapped to his ankle at all times.
Santa smokes menthols.
Santa heads the Yakuza.
Santa once beat an elf senseless with a whiskey bottle.
Santa fronts a death metal band called Reindeer Sandwich.
Santa once pistol-whipped Sinatra in front of all his pals.
Santa was one of the Watergate burglars.
Santa once bitch-slapped Snoop Dogg.
Santa always carries a shiv made from a toothbrush and a razorblade.
Santa came up with the riff for “Purple Haze.”
Santa water-boards the elves when they get out of line.
Santa bounty-hunts bail-jumpers.
Santa has to wear an electronic tracking bracelet around his ankle because he violated parole twice.
Santa hunts Salvation Army bell-ringers with a crossbow.
Santa won a split-decision 12-round fight against Ken Norton.
Santa runs a cock fighting ring in Tijuana.
Santa wrote Brando’s lines in Apocalypse Now.
Every year Santa gives the elves a box of magnum condoms as a stocking-stuffer gag gift.
Santa taught James Brown how to dance.
Santa collects protection money from storeowners in Flatbush.
Santa switched baby Jesus at birth with the kid in the next manger.
Santa gave Mia Wallace a foot massage.
Santa interrupted the Lincoln-Douglas debates three times with loud farts.
Santa took the gun *and* the cannolis.
For a price, Santa will ugly you up before you go to prison.