Pierre-Narcisse Guérin, Aeneas tells Dido the misfortunes of the Trojan city, 1815

In April 1981, The New York Times Book Review published an essay by the literary theorist Geoffrey Hartman that proposed a whole different model of academic criticism. Whereas traditional criticism served the primary text, aiming to clarify, elucidate, and otherwise expound on the original, Hartman argued that “new kinds of commentary” possess an “expressive force” and mark an “inventive feat, a ‘creative’ rather than a definitive answer.” Older critics drew meanings out of poems, passed them on to readers, and retired, he stated, meeting the “plainer functions” of “reviewing and explaining.” The emerging interpreters are artists of language and thought in their own right, not “handmaiden[s] to more ‘creative’ modes of thinking like poems or novels,” and they deserve equal “dignity” and “will have to be read closely.” They have forged a “literature of criticism,” Hartman announces at the end, answering definitively the essay’s title, “How Creative Should Literary Criticism Be?”

Thirty years later, Hartman’s conception sounds fatuous and overdone, but it aptly conveyed a force then advancing through literary studies. Hundreds of influential people claimed the same creativity in seminars and conferences, at the School of Criticism and Theory, and in the pages of Glyph, boundary 2, and other periodicals of High Theory. Outsiders may wonder how a collection of intelligent, learned professionals could have distinguished their practice with so much self-inflation, but the appeal was commanding. Individuals in other fields can hardly imagine how intoxicating it was to graduate students and assistant professors eager for approval and identity. To believe that those dreary months researching and writing a dissertation chapter would yield more than just a derivative exegesis of a poet more important and talented than they, to answer a query about their studies not with “Wordsworth’s ‘Tintern Abbey’ means . . . ” but “My reading of ‘Tintern Abbey’ interrogates . . . ,” to anticipate that their journal articles would collect readers of them, not assist readers of someone else . . . who could resist? It made our toil adventuresome and purposeful, and it convinced others that literature professors were up to something momentous and transformative (for better or worse).

For those younger scholars eager for political action, not personal confidence, the liberation of criticism from obedience to great literature provided equal inspiration. While Hartman, Jacques Derrida, Stanley Fish, and other notables continued to heed the classics, postcolonialists, feminists, neopragmatists, ecocritics, queer theorists, and political critics of various leftist kinds addressed literature not as texts to be studied, but as pretexts for an agenda. If charged with “politicization,” they could invoke the “creative” turn as epistemological cover. When a panelist promised a feminist reading of Wordsworth’s Lucy poems and another raised the common-sense question, “But do those poems really have feminist content?” the speaker could reply, “Your question assumes that the poems are fixed objects to which we must respond ‘properly,’ but as Derrida and Rorty have shown, that is a mystification, and it restricts us from doing the labor of social change.” Whatever change the speaker proceeded to envisage, the disciplinary adjustment was plain: The field had to stand on us, the professors, not on them, the poems, plays, novels.

It couldn’t work, of course. First of all, the actual products of literary theory and criticism disappointed that elevated conception, the thousands of readings tumbling forth in quarterlies and conferences through the 1980s and ’90s looking like a plodding industry running down, not an outburst of creativity. The rhetoric escalated—book blurbs and letters of recommendation echoed with “bound to revolutionize” and “radical rethinking” and “brilliant new approach”—but how many times could the breakthrough element be repeated before it turned into routine hype?

Second, to establish creativity as a disciplinary norm was to forget the basic truth that genuine creativity belongs only to the fortunate few. For every Lionel Trilling and Harold Bloom there have been 5,000 more or less competent, unexceptional scholars doing journeyman work. Asking them to do more only evoked the overwrought and immodest efforts that made The New Criterion’s reports on the MLA Convention so humorous and sad.

Finally, apart from meeting political dreams and personal needs, the killing of primary texts—more precisely, canceling the primacy of them—could prosper only if a particular transfer took place. As the professors substituted their own activity for Great Books, the prestige of Hamlet, “Because I could not stop for Death,” and Invisible Man couldn’t just go away. It had to fall upon them, the killer interpreters. That was the conviction—that the heritage of Dead White Males would lose its authority and the professors would gain it. The genius of Shakespeare would wane and the braininess of Judith Butler would soar. The transfer empowered them, and apparently they expected everyone but the retrograde elders to agree.

It was a fatal choice, this turn from canonical work to interpretative act, with damaging effects continuing today. We witnessed its disabling impact in a revealing episode last summer when the humanities became once more a topic of national conversation. Starting in June, a flurry of reports and commentaries appeared projecting a dim present and dark future for the fields. A report by the Commission on the Humanities and Social Sciences, a body formed by the American Academy of Arts & Sciences, showed the humanities losing funding, enrollment, and appreciation from politicians and business leaders, “a pattern that will have grave, long-term consequences for the nation.” One week earlier, Harvard issued a like warning under the banner “The Teaching of Arts and Humanities at Harvard College: Mapping the Future,” which examined enrollments on campus and found that, since the mid-twentieth century, degrees in the fields have plummeted from 36 percent of graduates to 20 percent, while entering students who aim to major in a humanities field have shifted to other areas at disproportionate rates. A June 22 statement in The New York Times by Verlyn Klinkenborg bore the title “The Decline and Fall of the English Major,” while Leon Wieseltier’s 2013 Commencement Address at Brandeis opened, “Has there ever been a moment in American life when the humanities were cherished less, and has there ever been a moment in American life when the humanities were needed more?”

The evidence they and others invoked was material, not ideological. As has been widely reported, foreign language programs have closed at several campuses around the country, tenure-track job openings are but a fraction of the number of Ph.D.’s on the market, research funding is down, the number of humanities courses in general education requirements is meager, barely one percent of four-year degrees are in any foreign language and less than four percent are in English.

In response, those who regretted the numbers offered arguments against them, joining numerous voices in recent years justifying the humanities as an essential course of study. The American Academy report praised the humanities because they help us manage a world undergoing profound change. In Not for Profit: Why Democracy Needs the Humanities (2010), Martha Nussbaum advocates urgently for the fields because they provide “skills that are needed to keep democracy alive.” Annie Murphy Paul’s June 3 article in Time magazine says it all in the title: “Reading Literature Makes Us Smarter and Nicer.” The Harvard study begins, “The Arts and Humanities teach us how to describe experience, how to evaluate it, and how to imagine its liberating transformation.” The leftist professor Henry Giroux links the decline of the humanities to the university’s receding identity as a “public good” whose leaders “address major social issues and problems.”

These statements and others on how the humanities foster critical thinking, cultivate Information Economy skills, help enact social change, resist utilitarianism in human affairs, etc., may be challenged in one aspect or another, but they are all reasonable and they pop up in education discussions all the time. Their commonplace status, however, shouldn’t obscure the fact that they share an extraordinary characteristic. It is a trait so simple and obvious, and so paradoxical, that one easily overlooks it, especially as these voices so earnestly endorse the humanities. The paradox is this: They affirm, extol, and sanctify the humanities, but they hardly ever mention any specific humanities content. The American Academy report terms the humanities “the keeper of the republic,” but the names Homer, Virgil, Dante, Shakespeare, Bernini, Leonardo, Gibbon, Austen, Beethoven, Monet, Twain, Frank Lloyd Wright, and Martha Graham never surface. In the Boston Globe (“Humanities: The Practical Degree,” June 21), Carlo Rotella claims that the humanities instill a “suite of talents” that include “assimilating and organizing large, complex bodies of information,” but he doesn’t tie that installation to any particular works of art. These pro-humanities documents drop a “Proust” and “Dickens” here and there, but little more. The works of the ages that fill actual humanities syllabi barely exist in these heartfelt defenses. Instead of highlighting assigned authors, artists, writings, and artworks, they signal what happens after the class ends: the moral, civic, and workplace outcomes.

In a word, the defenders rely on what the humanities do, not what they are. If you take humanities courses, they assure, you will become a good person, a critical thinker, a skilled worker, a cosmopolitan citizen. What matters is how grads today think and act, not what Swift wrote, Kant thought, or O’Keeffe painted. No doubt, all of the defenders love particular novels and films, symphonies and paintings, but those objects play no role in their best defense. Ironically, the approach resembles the very utilitarianism the defenders despise, the conversion of liberal education into a set of instruments for producing selected mentalities and capabilities.

What an odd angle, and an ineffectual one. Think of it from the perspective of two individuals whose decisions directly affect the humanities, one of them a twenty-year-old sophomore picking classes for spring term, the other a sixty-year-old state legislator on a committee setting the year’s higher education budget. If the sophomore avoids humanities courses, she hurts enrollment numbers for the fields, a factor in how a dean allocates resources across departments. If the politician discerns no palpable gain from humanities instruction, he will steer funds to technical colleges and vocational programs. What will change their minds?

Certainly not these guarantees:

• The reason we need the humanities is because we’re human. That’s enough. (Adam Gopnik, “Why Teach English?,” The New Yorker, Aug. 27)

• In the complex, globalized world we are moving toward, it will obviously benefit American undergraduates to know something of other civilizations, past and present. (Christina Paxson, “The Economic Case for the Humanities,” New Republic, Aug. 20)

• A fully balanced curriculum—including the humanities, social sciences, and natural sciences—provides opportunities for integrative thinking and imagination, for creativity and discovery, and for good citizenship. (the American Academy report)

• Our students are preparing to act adroitly in a global environment; they are also preparing to flourish in an austere job market. The Arts and Humanities are essential on both interrelated fronts. (the Harvard report)

Neither of our two figures will reject these points, but the points won’t sway them, either. The advantages they promise are too vague and deferred (“to know something of other civilizations,” “opportunities for integrative thinking,” “act adroitly,” “we’re human”), especially in contrast to other options (“major in speech therapy and become a speech therapist—there’s a shortage!”). Besides, social science fields claim the same insights, such as the anthropologist who rejoins, “And we don’t study what it means to be human?!” Hard scientists, too, might add, “You want critical thinking? Learn the scientific method!”

Tepid and half-credible, these fuzzy encouragements sound ever more vain and dispirited the more they circulate. They exhort the public to appreciate the humanities, but, with the grounds so abstract and promissory, the appeal falls flat. The failure comes down to bad marketing. The defenders misconstrue their audience. They think that support for the humanities will stand on the anticipation of a job skill, a civic sense, or moral self-improvement, but these future benefits are insufficient to youths worried about debt, politicians about revenue, and employers about workplace needs. No, students enroll and politicians fund and donors donate for a different reason, because they care about the humanities themselves, and they care about them because they’ve had a compelling exposure to a specific work. They may admit the moral, practical, and civic effects of humanities coursework, but that level of commitment can’t compete with other pressures such as manufacturers in a state telling the governor and college presidents that they need more grads with industrial skills. Whenever they do override those pressures, their devotion springs from an experience that lingers. People back the humanities with their feet and pocketbooks because they savor Monet’s seascapes, thrill when Frederick Douglass resolves to fight Mr. Covey, and relax after work with Kind of Blue or Don Giovanni. They had an 11th-grade English teacher who made Elizabeth Bennet and Henry V come alive, or they recall a month in Rome amid the Pantheon, St. Peter’s, the Trevi Fountain, and Apollo and Daphne as a high point of their college days.

Their attachment pinpoints for the defenders a winning tactic: Underscore the object. To attract the undergraduate who focuses narrowly on career and the CEO with $10 million to give, advocates should realize, don’t wax righteous or pragmatic about the humanities as a bloc or as an instrument. Rather, show them vivid images of architecture in Washington, D.C.; recount Captain Ahab on the quarter-deck enlisting the crew in his obsession, or Dido’s reaction once she learns her beloved Aeneas has slipped away in the night, or Satan in the Garden eyeing Adam and Eve, tormented by their innocence and plotting their ruin; stage the avid sadism of Regan and Goneril or the banter of Algernon and Ernest; or run the final scene when the Tramp, just out of prison, turns to face the blind flower girl, now cured, who clasps his hand, grimaces at the sight of him, and mutters, “Yes, I can see now.” These are the materials of inspiration, and they are the highest card the humanists can play.

My former boss Dana Gioia understood it well. As Chairman of the National Endowment for the Arts (2003–09), he was obligated to use the bully pulpit and summon local and national, public and private support for museums, orchestras, and after-school arts programs. It was a delicate task, partly because of the suspicion conservatives still harbored toward an agency at the center of the Culture Wars ten years earlier, and partly because saying the wrong thing could jeopardize the annual request for funding from Congress. In the early 2000s, as No Child Left Behind pressed schools to cut arts, theater, dance, and music programs, organizations such as Americans for the Arts offered standard reasons for arts education, including the commercial value of arts investments, better reading and math scores by kids in schools with music instruction, and behavioral improvements for kids in theater programs. Gioia recited them dutifully, but relied at critical times on another one: direct exposure. When he conceived a national initiative called Shakespeare in American Communities with a large in-school component, he might have presented it to Members of Congress in testimony backed by the usual moral and economic corollaries. But instead, he hosted an event on Capitol Hill for Members and invited 5th-graders from Rafe Esquith’s legendary Shakespeare program in Los Angeles to show up in Elizabethan garb and perform scenes and soliloquies for them.

The event proved the point. The kids acted splendidly, and a few Members themselves grabbed a costume and declaimed lines, reenacting their own school days and drama club. The politicians had heard every rationale for cultural programs before, but the call of “We few, we happy few, we band of brothers” they could not withstand. Gioia got the funding—and heaps of good will, too.

Exposure works better than explanation, participation better than entreaty. The humanities defenders mistakenly try to persuade and coerce when they should intrigue, excite, fascinate, and inspire. Why they neglect this no-brainer option, why they lay down their strongest weapons, is a mystery only if we forget the turn from primary texts decades earlier. That turn marks a deeper source of renunciation than multiculturalism, which doesn’t quite explain it, even though ’80s- and ’90s-era contempt for “Eurocentrism” certainly sped the takedown of primary texts inaugurated in the ’70s. The problem with multiculturalist accounts is that the defenders could easily serve multiculturalism by citing great works by women and by writers and artists of color, insisting, for instance, “We must study the humanities so that we can preserve the superb tradition of African American writing,” then adding striking and noble passages from Phillis Wheatley, Douglass, Up from Slavery and Souls of Black Folk, Hurston’s fiction, and Baldwin’s prose. But they don’t.

Behold another dead-end for adversarial culture, a crazy situation in which professionals profess their materials only in a critical mode, never in an approving one. When circumstances solicit them to espouse their own content, the defenders hail ancillary effects instead, and the eyes of their audience glaze over. Do they fear that a stirring rendition of Hector in the dirt, bleeding to death while Achilles pledges to defile his corpse, would meet only a shrug (“So this is Achilles”) or, worse, charges of naïve hermeneutics and Western-Civ imperialism? Do they really think that fuzzy estimates of mental and moral rewards will boost enrollments and resources? Whatever they expect, they suffer an alienated condition. Chemistry professors esteem chemistry, psychology professors psychology . . . and their conviction serves them in the campus marketplace and the public sphere.

But not humanities professors. They have sacrificed the great tradition that was their raison d’être to a vain belief about themselves. They exchanged their meal ticket for a moment of counter-cultural roguishness and pseudo-admission to the company of creators. They were theoretically astute and politically progressive, but institutionally stupid. When anti-humanist forces arose—careerism in undergraduates, utilitarianism among administrators—they didn’t mount Dryden-Pope-Swift or Michelangelo-Raphael-Titian campaigns to maintain pipelines. They complained about anti-intellectualism and corporatization. When a curriculum committee recommended reducing humanities general education requirements, none of them rose to declare, “Have you read Augustine on stealing the pears? Let me detail it for you. . . . Is there any better rumination on peer pressure and sin? NO, AND EVERY STUDENT ON THIS CAMPUS SHOULD READ IT!” They couldn’t, and the saddest fact of all is that many of them applauded the expulsion of core humanities content.

Still, it isn’t too late, and the great words and images haven’t lost their power, only the purveyors of them. In one brief section, the Harvard report breaks the gag order with eight humorous sentences on Aristophanes’ Clouds (423 BC), and everything quickens. It’s funny and sharp, and it yields exactly the effect that every other paragraph misses: It makes you want more. If we wish to maintain the humanities against hostile or indifferent attitudes, we should capitalize on this advantage, but to do so the insiders must first alter their self-conception. We need a disciplinary adjustment, a reversal of the creative turn Hartman broadcast thirty years ago. Critical schools rise and fall, academic trends shuffle at an accelerating pace, celebrities are made and forgotten, but Hegel’s master-slave sequence, Wordsworth’s “and, oh,/ The difference to me!” O’Keeffe’s blossoms, Parker’s riffs . . . they endure and they impress no matter how much the professors unmask, demystify, politicize, and otherwise play with them. We are far enough removed and in bad enough shape to judge the denial of the greatness and priority of High Art a terrible miscalculation. Unless the professoriate reasserts its subservient role and foregrounds actual genius, all the solemn committee reports and importunate op-eds in the world won’t slow the steady deterioration of the fields.