When in 1930 a reporter asked Mahatma Gandhi what he thought of modern civilization, the great religious leader and political philosopher replied, "That would be a good idea." Addressing the topic of prosody for 21st-century poets, one should probably say, first and foremost, that it would be a good idea. In recent generations, verse has witnessed interesting developments in imagery, rhetoric, and subject matter. But prosody—"the science of versification; that part of the study of language which deals with the forms of metrical composition," to cite the OED's definition—has largely disappeared from English-language poetry.
Today, I should like to speak about this disappearance and to suggest that all 21st-century poets, regardless of the modes they favor and write in, would benefit from a recovery of the science of versification and the forms of metrical composition.
Historically, versification involves the fusion of meter and rhythm. Meter refers to the fixed, abstract norm of the verse line; rhythm involves the fluid modulations of living speech. Meter is impersonal and unchanging; rhythm is personal and variable. Up until the 20th century, the education of poets entailed their learning how to harmonize their unique rhythms with regular metrical forms. To be sure, some poets—John Milton and Robert Frost are examples—delight in setting meter and rhythm in what Frost calls "strained relation." But even in such cases, the trick remains to square and combine the two elements, so that meter gives rhythm memorable shape and stability while, at the same time, rhythm animates meter with spirit and variety.
In 20th-century poetry, meter and rhythm not only experience "strained relation," but undergo a destructive divorce; and in the court settlement, rhythm gets the house, the car, and the condo in Aspen, while meter is left with the toaster oven and the kids. Free verse, a poetry of rhythm without meter, emerges and is so widely and rapidly adopted that by 1977 Stanley Kunitz observes in an interview with Antaeus: "Non-metrical verse has swept the field, so that there is no longer any real adversary from the metricians. The defining element of poetry is no longer whether it is metrical or non-metrical . . . [P]oetry is defined by a certain inflection of the voice rather than in terms of a particular prosodic practice."
If 20th-century poetic practice favors rhythm over meter, so does the poetic theory of the period. Critical discussion often characterizes free verse as adventurous and risk-taking, whereas metrical verse is described as restrictive and repressed. Indeed, The Oxford Companion to English Literature (5th ed., 1985), in its entry on "Metre," compares the medium to the garment used to pinion and confine people who are gravely disoriented or disturbed. "Verse in the 20th century," the Companion states in the one-sentence paragraph that concludes its entry, "has largely escaped the straitjacket of traditional metrics."
Many factors contribute to the elevation of rhythm and the depreciation of meter. I explore a number of these in Missing Measures. On this occasion, I will offer only two points on the subject.
First, to the extent that 20th-century poets cease to use meter, they cease to understand it. More specifically, many of them come to confuse metrical practice with metrical analysis. We see an influential manifestation of this confusion in Ezra Pound's dicta, "As regarding rhythm . . . compose in the sequence of the musical phrase, not in sequence of a metronome" and "Don't chop your stuff into separate iambs," and in his description of the iambic pentameter as "ti tum ti tum ti tum ti tum ti tum from which every departure is treated as an exception."
If we consider the iambic pentameter as a paradigm—as an abstract model of ten syllables alternating uniformly between light and heavy—Pound's description is accurate; and we poets should be grateful for his reminders to avoid rhythmical clunkiness. His comments are, however, misleading insofar as actual iambic practice is concerned. English does not consist of syllables that are all either Identically Weak or Identically Strong. Gradations of stress in spoken English are virtually infinite, and the stress we give a particular syllable may change from one occasion to another, depending upon the surrounding phonetic and verbal environment and upon the grammatical or rhetorical context. Moreover, metrical poets do not compose their lines one foot at a time. Rather, they write in larger phrases or clauses that fit their meter or different segments of it; and since any complete articulation has, as linguists inform us, one and only one primary stress, most of these larger phrases and clauses will feature syllables with different degrees of secondary, tertiary, or weak stress.
Consequently, in actual iambic verse, the fluctuation between weaker and stronger syllables is not absolute, but relative. Sometimes, it may be fairly pronounced. At other times, it will be quieter and subtler. We can grasp this point by examining the following lines by Richard Wilbur, Edgar Bowers, Jean Toomer, and Wendy Cope, all of which are metrically identical (in the sense of being conventional iambic pentameters), but each of which differs rhythmically from the other three (in the sense of having distinctive variations of speech contour):
Veranda, turret, balustraded stair
The rhetorician classifies my pain
And there, a field rat, startled, squealing bleeds
Who needs a bridge or dam? Who needs a ditch?
As these examples also indicate, rhythmical variety within metrical order is augmented by the number and placement of syntactical junctures within the lines. And other elements, such as enjambment—the carrying over of meaning from one line to the next, with little or no grammatical pause at the line end—supply metrical poems with additional rhythmical diversity.
Though Pound deserves the utmost respect for his critical abilities and concern with prosody, he blurs the distinction and relationship between rhythm and meter. It is fine to urge that rhythms of poems should move "in the sequence of the musical phrase, not in sequence of a metronome," but verse rhythm has always been a matter and a result of musical (or verbal) phrases. It is meter in the abstract that is metronomic. Likewise, Pound's succession of ti tums describes the theoretic norm of iambic pentameter rather than what occurs in real, living verse construction. Far from being "exceptions," continually and flexibly modulated lines have characterized English iambic verse from the time of Philip Sidney and Edmund Spenser. In fact, rhythmically speaking, the exception is the ti tumming line—the line that reproduces the metrical paradigm. Such lines occur very rarely, and to achieve them, the poet must resort not only to severe rhythmical repetition, but also to strict grammatical recurrence, as in "The room, the rug, the desk, the lamp, the pen" or "He laughs and skips and whoops and runs and hops."
Nevertheless, numerous critics and textbook writers have repeated the Poundian suggestion that the rhythm of iambic verse is inherently monotonous. And many poets have come to believe that to write metrically is to commit themselves to rigid verbal schematization. They imagine that to write in meter is to be confined to a single analytic abstraction rather than to be supported by a general pattern that permits and encourages innumerable individual realizations of it.
My second point is that the modern emphasis on personal rhythm at the expense of impersonal meter reflects an extension of Romantic aesthetics into versification.
This point needs qualification since the Romantic poets and the pioneers of free verse stand opposed, in a literal sense, on the meter question. William Wordsworth's preface to the second edition of Lyrical Ballads contains an eloquent explanation of the function of meter and a memorable defense of metrical composition; and as prosodists, the Romantic poets use traditional meter in ways that range from the unaffectedly adept (e.g., Wordsworth and Keats) to the virtuosic (e.g., Coleridge and Byron). In contrast, such early masters of free verse as Ford Madox Ford, T. E. Hulme, William Carlos Williams, Pound, D. H. Lawrence, H. D., and T. S. Eliot all to some degree break with traditional meter and, in some cases, go so far as to argue that it is obsolete or inappropriate to modern subject matter. As Hulme puts it in his "Lecture on Modern Poetry," modern verse "has become definitely and finally introspective. . . . Regular metre to this impressionist poetry is cramping, jangling, meaningless, and out of place."
In addition, many modernists condemn Romantic and Victorian poetic practice in general, regarding it as chronically prone to inflation and sentimentality. Especially illustrative in this respect is Ford, who juxtaposes, in his memoir Thus to Revisit, the vapidity of much nineteenth-century poetry with the freshness of imagistic vers libre. "The work is free," Ford says of the Imagists, "of the polysyllabic, honey-dripping and derivative adjectives that, distinguishing the works of most of their contemporaries, make nineteenth-century poetry as a whole seem greasy and ‘close,' like the air of a room."
Yet if we closely examine the romantic and modern viewpoints, continuities as well as disjunctions emerge. The pains Wordsworth takes, in his preface, to explain that his attack on neo-classical diction implies no censure of traditional metric indicates that he realizes that his advocacy for natural poetic expression could be turned against meter. By the same token, when modern poets inveigh against their predecessors, they are not, for the most part, focusing on the intellectual foundations of earlier practice. They are chiefly objecting, as Ford's comment indicates, to insipid diction or to the facile treatment of predictable subject matter. In this regard, we might recall William Butler Yeats's perceptive observation that Eliot was "the most revolutionary man in poetry during my lifetime, though his revolution was stylistic alone."
Even as modern poets overthrow nineteenth-century style, they adopt fundamental elements of romantic thought, and these provide crucial assistance in the development of free verse. For instance, we encounter, in many modern poets, the romantic doctrine that "organic" form is superior to "mechanical" form; and this doctrine comes to serve the modern experimental impulse insofar as rhythm is increasingly associated with organicism and meter with mechanicality. So, too, the romantic belief that music is the purest of the arts appears in such modern poet-critics as Pound and Eliot, who repeatedly (if somewhat vaguely) contend that musical structure can or should substitute, in poetry, for metrical structure. Poetic modernism is also informed by the Kantian idea that art is independent of pure and practical reason, and that external criteria are therefore less relevant to the creation of a poem than the poet's inner promptings and intuitions. As expounded in Kant's Critique of Judgment, this idea is not necessarily inimical to metrics or other artistic conventions, but it does reflect and support the general turn towards subjectivism in the arts that occurs in the late 18th and 19th century and that, in 20th–century poetry, finds expression in the free verse movement.
Something similar can be said about the Romantic preoccupations with such matters as self-expression, novelty, and spontaneity. Though they may originally co-exist with traditional approaches to poetic craft, they tend over time to drive a wedge between rhythm and meter and to draw poets to the former and alienate them from the latter. Likewise, there is initially nothing anti-metrical in the anxiety (evident as early as the 17th century, but increasingly acute during the Romantic period and the 19th century) that poetry and the arts are being progressively overshadowed by the sciences. Eventually, however, the desire to keep poetry culturally up-to-date expresses itself in a search for instrumental innovation on the model of science and in the development of non-metrical modes of verse characterized as "experimental" and promoted as having the potential to enable poets to achieve quasi-scientific breakthroughs and discoveries.
Without denying our modernity or post-modernity, we still live, in key respects, in the Romantic era. We are still trying to assimilate and defend its crucial virtues, such as its respect for individual men and women and its desire for a human community unconstrained by divisions of class and political tyranny. We are also still trying to struggle free of its darker currents, such as its brutal nationalisms and its sometimes unguarded cultivation of the irrational and sensational aspects of our nature and culture. And the future health of our poetry will probably depend, to a significant extent, on our ability to come to a clearer-sighted and more balanced understanding of the legacy and persistence of Romanticism than we have been able to achieve so far.
Now at the beginning of the 21st century, poets have good reasons to recover prosody. Though excellent poems have been written and continue to be written in free verse, a law of diminishing returns may have set in. If everybody in contemporary verse cultivates rhythm alone, poetry risks declining from an art to a mere activity—an anything-goes pursuit, with poets isolated in small inward-looking schools and composing more and more narrowly on the basis of self-expressive fiat. In such a climate, free verse itself will wither and die. Free verse can be truly free only if it has something to be free from.
If we poets can recover an appreciation of prosody, we may recover as well the sense that rhythm and meter are not necessarily opposed, but can be complementary partners in the poetic enterprise. We may discover that meter, far from restricting us, can encourage us to examine ideas and images, and ways of expressing them, from different angles and perspectives, and can thus help us explore our subjects more deeply or fully than we otherwise could. We may find, too, that meter can at times valuably caution us, in the manner of a resistantly honest friend or spouse, against hasty, ill-considered, or arbitrary speech. And we may realize that meter often has a magical, magnetic power to attract to our poems words and thoughts truer and better than those that normally come to mind.
Versification benefits from both rhythm and meter. Without rhythm, verse is lifeless. Without meter, verse risks sacrificing memorability, subtlety, force, and focus. If the experiment of the 20th century was to separate rhythm and meter, the challenge of the 21st century may be to re-connect them in a vital and fruitful way, so that poets again may, as Thom Gunn writes in "To Yvor Winters, 1955,"
. . . keep both Rule and Energy in view,
Much power in each, most in the balanced two.
Note: This essay, in a slightly different version, was presented at a panel on "Prosody for 21st-Century Poets" at the 2006 AWP Convention in Austin, Texas. Gandhi's comment about modern civilization is reported in E. F. Schumacher, Good Work (New York: Harper and Row, 1979), 62. Frost's remark about setting meter and rhythm in "strained relation" occurs in a letter that he wrote to John Cournos on July 8, 1914 and that appears in Robert Frost, Collected Poems, Prose, and Plays, edited by Richard Poirier and Mark Richardson (New York: Library of America, 1995), 680. Stanley Kunitz's discussion of the diffusion and triumph of free verse may be found in The Structure of Verse, rev. ed., edited by Harvey Gross (New York: Ecco, 1979), 262. Pound's remark about the metronome, and his warning against chopping verse into separate iambs, appears in Literary Essays of Ezra Pound, edited with an introduction by T. S. Eliot (New York: New Directions, 1968), 3, 6; Pound's analysis of iambic pentameter is in the "Treatise on Metre" section of his ABC of Reading (New York: New Directions, 1960), 203-04. Hulme's comments about modern poetry and meter appear in T. E. Hulme, Further Speculations, edited by Sam Hynes (Lincoln: University of Nebraska Press, 1962), 72, 74. For Ford's criticism of nineteenth-century verse, please see Ford Madox Hueffer, Thus to Revisit (New York: Dutton, 1921), 157. Yeats's observation about Eliot appears in W. B. Yeats, Essays and Introductions (New York: Macmillan, 1961), 499. The lines of verse cited in this essay may be found in Richard Wilbur, Collected Poems 1943-2004 (San Diego: Harcourt Brace, 2004), 7; Edgar Bowers, Collected Poems (New York: Knopf, 1997), 14; Jean Toomer, Cane, edited by Darwin T. Turner (New York: Norton, 1988), 5; Wendy Cope, Making Cocoa for Kingsley Amis (London: Faber and Faber, 1986), 13; and Thom Gunn, Collected Poems (New York: Farrar, Straus and Giroux, 1994), 70.
I address, in greater detail, matters discussed in this essay in Missing Measures: Modern Poetry and the Revolt against Meter (Fayetteville: University of Arkansas Press, 1990) and All the Fun's in How You Say a Thing: An Explanation of Meter and Versification (Athens, OH: Ohio University Press, 1999).
From a scientific perspective, the starting point must be different from that of traditional manuals, which are lists of dos and don'ts that are presented mechanically and often followed robotically. Many writers have been the victims of inept copyeditors who follow guidelines from style manuals unthinkingly, never understanding their rationale.
For example, everyone knows that scientists overuse the passive voice. It's one of the signatures of academese: "the experiment was performed" instead of "I performed the experiment." But if you follow the guideline, "Change every passive sentence into an active sentence," you don't improve the prose, because there's no way the passive construction could have survived in the English language for millennia if it hadn't served some purpose.
The problem with any given construction, like the passive voice, isn't that people use it, but that they use it too much or in the wrong circumstances. Active and passive sentences express the same underlying content (who did what to whom) while varying the topic, focus, and linear order of the participants, all of which have cognitive ramifications. The passive is a better construction than the active when the affected entity (the thing that has moved or changed) is the topic of the preceding discourse, and should therefore come early in the sentence to connect with what came before; when the affected entity is shorter or grammatically simpler than the agent of the action, so expressing it early relieves the reader's memory load; and when the agent is irrelevant to the story, and is best omitted altogether (which the passive, but not the active, allows you to do). To give good advice on how to write, you have to understand what the passive can accomplish, and therefore you should not blue-pencil every passive sentence into an active one (as one of my copyeditors once did).
Ironically, the aspect of writing that gets the most attention is the one that is least important to good style, and that is the rules of correct usage. Can you split an infinitive, that is, say, "to boldly go where no man has gone before," or must you say "to go boldly"? Can you use the so-called fused participle—"I approve of Sheila taking the job"—as opposed to "I approve of Sheila's taking the job" (with an apostrophe "s")? There are literally (yes, "literally") hundreds of traditional usage issues like these, and many are worth following. But many are not, and in general they are not the first things to concentrate on when we think about how to improve writing.
The first thing you should think about is the stance that you as a writer take when putting pen to paper or fingers to keyboard. Writing is cognitively unnatural. In ordinary conversation, we've got another person across from us. We can monitor the other person's facial expressions: Do they furrow their brow, or widen their eyes? We can respond when they break in and interrupt us. And unless you're addressing a stranger, you know the hearer's background: whether they're an adult or child, whether they're an expert in your field or not. When you're writing you have none of those advantages. You're casting your bread onto the waters, hoping that this invisible and unknowable audience will catch your drift.
The first thing to do in writing well—before worrying about split infinitives—is to consider what kind of situation you imagine yourself to be in. What are you simulating when you write, given that you're only pretending to use language in the ordinary way? That stance is the main thing that distinguishes clear vigorous writing from the mush we see in academese and medicalese and bureaucratese and corporatese.
The literary scholars Mark Turner and Francis-Noël Thomas have identified the stance that our best essayists and writers implicitly adopt, and that is a combination of vision and conversation. When you write you should pretend that you, the writer, see something in the world that's interesting, that you are directing the attention of your reader to that thing in the world, and that you are doing so by means of conversation.
That may sound obvious. But it's amazing how many of the bad habits of academese and legalese and so on come from flouting that model. Bad writers don't point to something in the world but are self-conscious about not seeming naïve about the pitfalls of their own enterprise. Their goal is not to show something to the reader but to prove that they are not a bad lawyer or a bad scientist or a bad academic. And so bad writing is cluttered with apologies and hedges and "somewhats" and reviews of the past activity of people in the same line of work as the writer, as opposed to concentrating on something in the world that the writer is trying to get someone else to see with their own eyes.
That's a starting point to becoming a good writer. Another key is to be an attentive reader. One of the things you appreciate when you do linguistics is that a language is a combination of two very different mechanisms: powerful rules, which can be applied algorithmically, and lexical irregularities, which must be memorized by brute force: in sum, words and rules.
All languages contain elegant, powerful, logical rules for combining words in such a way that the meaning of the combination can be deduced from the meanings of the words and the way they're arranged. If I say "the dog bit the man" or "the man bit the dog," you have two different images, because of the way those words are ordered by the rules of English grammar.
On the other hand, language has a massive amount of irregularity: idiosyncrasies, idioms, figures of speech, and other historical accidents that you couldn't possibly deduce from rules, because often they are fundamentally illogical. The past tense of "bring" is "brought," but the past tense of "ring" is "rang," and the past tense of "blink" is "blinked." No rule allows you to predict that; you need raw exposure to the language. That's also true for many rules of punctuation. If I talk about "Pat's leg," it's "Pat-apostrophe-s." But if I talk about "its leg," I can't use apostrophe-s; that would be illiterate. Why? Who knows? That's just the way English works. People who spell possessive "its" with an apostrophe are not being illogical; they're being too logical, while betraying the fact that they haven't paid close attention to details of the printed page.
So being a good writer depends not just on having mastered the logical rules of combination but on having absorbed tens or hundreds of thousands of constructions and idioms and irregularities from the printed page. The first step to being a good writer is to be a good reader: to read a lot, and to savor and reverse-engineer good prose wherever you find it. That is, to read a passage of writing and think to yourself, "How did the writer achieve that effect? What was their trick?" And to read a good sentence with a consciousness of what makes it so much fun to glide through.
Any handbook on writing today is going to be compared to Strunk & White's The Elements of Style, a lovely little book, filled with insight and charm, which I have read many times. But William Strunk, its original author, was born in 1869. This is a man who was born before the invention of the telephone, let alone the computer and the Internet and the smartphone. His sense of style was honed in the later decades of the 19th century!
We know that language changes. You and I don't speak the way people did in Shakespeare's era, or in Chaucer's. As valuable as The Elements of Style is (and it's tremendously valuable), it's got a lot of cockamamie advice, dated by the fact that its authors were born more than a hundred years ago. For example, they sternly warn, "Never use 'contact' as a verb. Don't say 'I'm going to contact him.' It's pretentious jargon, pompous and self-important. Indicate that you intend to 'telephone' someone or 'write them' or 'knock on their door.'" To a writer in the 21st century, this advice is bizarre. Not only is "to contact" thoroughly entrenched and unpretentious, but it's indispensable. Often it's extremely useful to be able to talk about getting in touch with someone when you don't care by what medium you're going to do it, and in those cases, "to contact" is the perfect verb. It may have been a neologism in Strunk and White's day, but all words start out as neologisms in their day. If you read The Elements of Style today, you have no way of appreciating that what grated on the ears of someone born in 1869 might be completely unexceptionable today.
The other problem is that The Elements of Style was composed before there existed a science of language and cognition. A lot of Strunk and White's advice depended completely on their gut reactions from a lifetime of practice as an English professor and critic, respectively. Today we can offer deeper advice, such as the syntactic and discourse functions of the passive voice—a construction which, by the way, Strunk & White couldn't even consistently identify, not having been trained in grammar.
Another advantage of modern linguistics and psycholinguistics is that it provides a way to think your way through a pseudo-controversy that was ginned up about 50 years ago between so-called prescriptivists and descriptivists. According to this fairy tale there are prescriptivists who prescribe how language ought to be used and there are descriptivists, mainly academic linguists, who describe how language in fact is used. In this story there is a war between them, with prescriptivist dictionaries competing with descriptivist dictionaries.
Inevitably my own writing manual is going to be called "descriptivist," because it questions a number of dumb rules that are routinely flouted by all the best writers and had no business being in stylebooks in the first place. These pseudo-rules violate the logic of English but get passed down as folklore from one style sheet to the next. But debunking stupid rules is not the same thing as denying the existence of rules, to say nothing of advice on writing. The Sense of Style is clearly prescriptive: it consists of 300 pages in which I boss the reader around.
This pseudo-controversy was created when Webster's Third International Dictionary was published in the early 1960s. Like all dictionaries, it paid attention to the way that language changes. If a dictionary didn't do that it would be useless: writers who consulted it would be guaranteed to be misunderstood. For example, there is an old prescriptive rule that says that "nauseous," which most people use to mean nauseated, cannot mean that. It must mean creating nausea, namely, "nauseating." You must write that a roller coaster ride was nauseous, or a violent movie was nauseous, not I got nauseous riding on the roller coaster or watching the movie. Nowadays, no one obeys this rule. If a dictionary were to stick by its guns and say it's an error to say that the movie made me nauseous, it would be a useless dictionary: it wouldn't be doing what a dictionary has to do. This has always been true of dictionaries.
But there's a myth that dictionaries work like the rulebook of Major League Baseball; they legislate what is correct. I can speak with some authority in saying that this is false. I am the Chair of the Usage Panel of The American Heritage Dictionary, which is allegedly the prescriptivist alternative to the descriptivist Webster's. But when I asked the editors how they decide what goes into the dictionary, they replied, "By paying attention to the way people use language."
Of course dictionary editors can't pay attention to the way everyone uses language, because people use language in different ways. When you write, you're writing for a virtual audience of well-read, literate fellow readers. And those are the people that we consult in deciding what goes into the dictionary, particularly in the usage notes that comment on controversies of usage, so that readers will know what to anticipate when they opt to obey or flout an alleged rule.
This entire approach is sometimes criticized by literary critics who are ignorant of the way that language works, and fantasize about a golden age in which dictionaries legislated usage. But language has always been a grassroots, bottom-up phenomenon. The controversy between "prescriptivists" and "descriptivists" is like the choice in "America: Love it or leave it" or "Nature versus Nurture"—a euphonious dichotomy that prevents you from thinking.
Many people get incensed about so-called errors of grammar which are perfectly unexceptionable. There was a controversy in the 1960s over the advertising slogan "Winston tastes good, like a cigarette should." The critics said it should be "as a cigarette should" and moaned about the decline of standards. A more recent example was an SAT question that asked students whether there was an error in "Toni Morrison's genius allows her to write novels that capture the African American condition." Supposedly the sentence is ungrammatical: you can't have "Toni Morrison's" as an antecedent to the pronoun "her." Now that is a complete myth: there was nothing wrong with the sentence.
Once a rumor about a grammatical error gets legs, it can proliferate like an urban legend about alligators in the sewers. Critics and self-appointed guardians of the language will claim that language is deteriorating because people violate the rule—which was never a rule in the first place. It's so much fun to be in high dudgeon over the decline of language and civilization that these critics don't stop to check the rulebooks and dictionaries to discover how great writers write or to learn the logic of the English language.
Poets and novelists often have a better feel for the language than the self-appointed guardians and the pop grammarians because for them language is a medium. It's a way of conveying ideas and moods with sounds. The most gifted writers—the Virginia Woolfs and H.G. Wellses and George Bernard Shaws and Herman Melvilles—routinely used words and constructions that the guardians insist are incorrect. And of course avant-garde writers such as Burroughs and Kerouac, and poets pushing the envelope or expanding the expressive possibilities of the language, will deliberately flout even the genuine rules that most people obey. But even non-avant-garde writers, writers in the traditional canon, write in ways that would be condemned as grammatical errors by many of the purists, sticklers, and mavens.
Another bit of psychology that can make anyone a better writer is to be aware of a phenomenon sometimes called The Curse of Knowledge. It goes by many names, and many psychologists have rediscovered versions of it, including defective Theory of Mind, egocentrism, hindsight bias, and false consensus. They're all versions of an infirmity afflicting every member of our species, namely that it's hard to imagine what it's like not to know something that you do know.
It's easiest to see it in children. In one famous experiment, a child comes into a room, opens a box of candy, finds pencils inside, and is surprised. Then you say to him, "Now Jason's going to come into the room. What does he think is in the box?" And the child will say "pencils." Of course, Jason has no way of knowing that the box had pencils, but the first child is projecting his own state of knowledge onto Jason, forgetting that other people may not know what he knows.
Now we laugh at the kids, but it's true of all of us. We as writers often use technical terms, abbreviations, and assumptions about typical experimental methods and about the questions we ask in our research that our readers have no way of knowing, because they haven't been through the same training that we have. Overcoming the curse of knowledge may be the single most important requirement in becoming a clear writer.
Contrary to the common accusation that academic writing is bad because professors are trying to bamboozle their audience with highfalutin gobbledygook, I don't think that most bad prose is deliberate. I think it is inept. It is a failure to get inside the head of your reader. We also know from psychology that simply trying harder to get inside the head of your reader is not the ideal way to do it. No matter how hard we try, we're at best okay, but not great, at anticipating another person's state of knowledge.
Instead, you have to ask. You've got to show people a draft. Even if you're writing for laypeople, your reviewers don't all have to be laypeople; a colleague is better than no one. I'm often astonished at things that I think are obvious that turn out to be not so obvious to other people.
Another implication of the curse of knowledge is that having an editor is a really good thing. Supposedly there are writers who can dash off a perfectly comprehensible, clear, and coherent essay without getting feedback from a typical reader, but most of us don't have that clairvoyance. We need someone to say "I don't understand this" or "What the hell are you talking about?" To say nothing of attention to the fine points of punctuation, grammar, sentence structure, and other ways in which a sophisticated copyeditor can add value to your written work.
How much of this advice comes from my experience as a writer and how much from my knowledge as a psycholinguist? Some of each. I often reflect on the psychology behind the thousands of decisions I make as a writer in the lifelong effort to improve my prose, and I often think about how to apply experiments on sentence comprehension and the history of words and the logic (and illogic) of grammar to the task of writing. I might think, "Aha, the reason I rewrote this sentence that way is the memory demands of subject versus object relative clauses."
This combination of science and letters is emblematic of what I hope to be the larger trend we spoke of earlier, namely the application of science, particularly psychology and cognitive science, to the traditional domains of the humanities. There's no aspect of human communication and cultural creation that can't benefit from a greater application of psychology and the other sciences of mind. We would have an exciting addition to literary studies, for example, if literary critics knew more about linguistics. Poetry analysts could apply phonology (the study of sound structure) and the cognitive psychology of metaphor. An analysis of plot in fiction could benefit from a greater understanding of the conflicts and confluences of ultimate interests in human social relationships. The genre of biography would be deepened by an understanding of the nature of human memory, particularly autobiographical memory. How much of the memory of our childhood is confabulated? Memory scientists have a lot to say about that. How much do we polish our image of ourselves in describing ourselves to others, and more importantly, in recollecting our own histories? Do we edit our memories in an Orwellian manner to make ourselves more coherent in retrospect? Syntax and semantics are relevant as well. How does a writer use the tense system of English to convey a sense of immediacy or historical distance?
In music the sciences of auditory and speech perception have much to contribute to understanding how musicians accomplish their effects. The visual arts could revive an old method of analysis going back to Ernst Gombrich and Rudolf Arnheim in collaboration with the psychologist Richard Gregory. Indeed, even the art itself in the 1920s was influenced by psychology, thanks in part to Gertrude Stein, who as an undergraduate student of William James did a wonderful thesis on divided attention, and then went to Paris and brought the psychology of perception to the attention of artists like Picasso and Braque. Gestalt psychology may have influenced Paul Klee and the expressionists. Since then we have lost that wonderful synergy between the science of visual perception and the creation of visual art.
Going beyond the arts, the social sciences, such as political science, could benefit from a greater understanding of human moral and social instincts, such as the psychology of dominance, the psychology of revenge and forgiveness, and the psychology of gratitude and social competition. All of them are relevant, for example, to international negotiations. We talk about one country being friendly to another or allying or competing, but countries themselves don't have feelings. It's the elites and leaders who do, and a lot of international politics is driven by the psychology of its leaders.
Even beyond applying the findings of psychology and cognitive science and social and affective neuroscience, it's the mindset of science that ought to be exported to cultural and intellectual life as a whole. That consists in increased skepticism and scrutiny about factual conventional wisdom: How much of what you think is true really is true if you go to the numbers? For me this has been a salient issue in analyzing violence, because the conventional wisdom is that we're living in extraordinarily violent times.
But if you take into account the psychology of risk perception, as pioneered by Daniel Kahneman, Amos Tversky, Paul Slovic, Gerd Gigerenzer, and others, you realize that the conventional wisdom is systematically distorted by the source of our information about the world, namely the news. News is about the stuff that happens; it's not about the stuff that doesn't happen. Human risk perception is affected by memorable examples, according to Tversky and Kahneman's availability heuristic. No matter what the rate of violence is objectively, there are always enough examples to fill the news. And since our perception of risk is influenced by memorable examples, we'll always think we're living in violent times. It's only when you apply the scientific mindset to world events, to political science and history, and try to count how many people are killed now as opposed to ten years ago, a hundred years ago, or a thousand years ago that you get an accurate picture about the state of the world and the direction that it's going, which is largely downward. That conclusion only came from applying an empirical mindset to the traditional subject matter of history and political science.
The other aspect of the scientific mindset that ought to be exported to the rest of intellectual life is the search for explanations. That is, not to just say that history is one damn thing after another, that stuff happens, and there's nothing we can do to explain why, but to relate phenomena to more basic or general phenomena … and to try to explain those phenomena with still more basic phenomena. We've repeatedly seen that happen in the sciences, where, for example, biological phenomena were explained in part at the level of molecules, which were explained by chemistry, which was explained by physics.
There's no reason that this process of explanation can't continue. Biology gives us a grasp of the brain, and human nature is a product of the organization of the brain, and societies unfold as they do because they consist of brains interacting with other brains and negotiating arrangements to coordinate their behavior, and so on.
Now I know that there is tremendous resistance to this idea, because it's confused with a boogeyman called "reductionism"—the fear that we must explain World War I in terms of genes or even elementary particles.
But explanation does not imply reduction. You reduce the building blocks of an explanation to more basic phenomena one level down, but you don't discard the explanation of the phenomenon itself. So World War I obviously is not going to be explained in terms of neuroscience. On the other hand, World War I could be explained in terms of the emotions of fear and dominance and prestige among leaders, which fell into a deadly combination at that moment in history. And instead of just saying, "Well, that's the way things are, and there's nothing more we can say about it," we can ask, "Why do people compete for prestige? Why do people have the kinds of fears that they do?"
The answer doesn't have to be, "Because I said so" or "Because that's the way it is." You can ask, "How does the psychology of fear work? How does the psychology of dominance work? How does the psychology of coalitions work?" Having done that, you get a deeper understanding of some of the causes of World War I. That doesn't mean you throw out the conventional history of World War I; it just means that you enrich it, you diversify it, you deepen it. A program of unifying the arts and humanities with the psychological sciences and ultimately the biological sciences promises tremendous increases of depth of understanding for all the fields.
I'm often asked, "Who are the leaders of this movement? Whose writings should we be reading and discussing?" But that misses the point. It's not about individual people. It's more revolutionary than just reading this, that or the other person. There has to be a change in mindset coming from both directions. It's not just a question of getting traditional scholars from the humanities and social sciences to start incorporating more science, to start thinking more like scientists. It's got to work the other direction as well. A lot of scientists really are philistines when it comes to history and political theory and philosophy. We need to break down the idea that there are these separate disciplines and modes of study.
In trying to figure out what would give us the deepest, most insightful, most informative understanding of the world and ourselves, we have to be aware of the turf battles: who gets the franchise for talking about what matters. That is one reason that there is a cadre of traditional intellectuals who have been hostile to science. I'm not talking about the climate deniers or the vaccine kooks but those who resent the idea that the discussion of what matters, of morality, of politics, of meaning, of purpose should be taken on by these philistines called scientists or social scientists. They act as if the franchise for these heavyweight topics has been given to critics and literary scholars and commentators on religion.
But we need not give credence to people who are simply protecting their turf. It's becoming increasingly clear over the decades and centuries that an understanding of science is central to our understanding of the deepest questions of who we are, where we came from, what matters. If you aren't aware of what science has to say about who we are and what we're like as a species, then you're going to be missing a lot of insight about human life. The fact that this upsets certain traditional bastions of commentary shouldn't matter. People always protect their turf.
That's why I'm reluctant to answer when I'm asked who are the people we should be reading, what names can we associate with this approach. It's not about people. It's about the ideas, and the ideas inevitably come piecemeal from many thinkers. The ideas are refined, exchanged, accumulated, and improved by a community of thinkers, each of whom will have a few good ideas and a lot of bad ideas. What we've been talking about is a direction that I hope the entire intellectual culture goes in. It's not about anointing some guru.
Another intellectual error we must be suspicious of is the ever-present tendency to demonize the younger generation and the direction in which culture and society are going. In every era there are commentators who say that the kids today are dumbing down the culture and taking human values with them. Today the accusations are often directed at anything having to do with the Web and other electronic technologies—as if the difference between being printed on dead trees and displayed as pixels on a screen is going to determine the content of ideas. We're always being told that young people suck: that they are illiterate and unreflective and un-thoughtful, all of which ignores the fact that every generation had that said about them by the older generation. Yet somehow civilization persists.
An appreciation of psychology can remind us that we as a species are prone to these bad habits. When we comment on the direction that intellectual life is going, we should learn to discount our own prejudices, our own natural inclination to say "I and my tribe are entitled to weigh in on profound issues, but members of some other guild or tribe or clique are not." And "My generation is the embodiment of wisdom and experience, and the younger generation is uncouth, illiterate, unwashed and uncivilized."
There is no "conflict between the sciences and humanities," or at least there shouldn't be. There should be no turf battle as to who gets to speak about what matters. What matters are ideas. We should seek the ideas that give us the deepest, richest, best-informed understanding of the human condition, regardless of which people or what discipline originates them. That has to include the sciences, but it can't come only from the sciences. The focus should be on ideas, not on people, disciplines, or academic traditions.