
The New Criterion



December 2003

“All sail, no anchor”: architecture after modernism

by Michael J. Lewis

On American modernism in architecture. The fourth of our series “Lengthened shadows: America and Its Institutions in the Twenty-first Century.”

When architecture gets a hall of fame, it needs to find a niche for a certain amiable rogue I will refer to as Palladio of the Wastepaper Basket. He made his mark during the 1960s at Yale's school of architecture. There it is the monthly task of students to design a hypothetical building, for which they make a model out of cardboard and foam core in a notoriously time-consuming operation. The process culminates in the crit, the stressful and often prickly review session in which visiting critics inspect the models and question the students, unfailingly finding the weak points of both. It is often the case that a verbally nimble student makes a better impression than an inarticulate designer, even one with a better design.

Our Palladio of the Wastepaper Basket found it more congenial to talk than cut cardboard. He prowled the halls of the architecture school at night, ransacking the trash for old models that had been discarded by their makers when the concept failed to stick. These he dusted off and submitted as his own work, often for a different class and a different assignment. A model of a circular drive-in restaurant might do duty as a church, for example, or—shifting the scale—a one-room beach house. Students watched their jetsam float back into the classroom, but no one took it amiss. It seemed to fit the anarchic spirit of the times. But it also fits the spirit of our time, perhaps in a deeper way, when the words that fly around architecture seem fundamentally independent of it, and have precious little to do with its vital essence.

Fifty years ago there was no such split: architecture and the words written about it stood in harmonious concord. This is seldom true in art, where theory and practice normally race wildly after one another. But modern architecture reconciled the opposites of book and building to an astonishing degree and achieved a surpassing programmatic unity. Modernism's founding figures, especially Le Corbusier and Walter Gropius, were themselves theorists, and both wrote and designed. And the principles carefully spelled out in Le Corbusier's Vers une architecture (1923)—the open plan, the elimination of ornament, the rational use of modern materials—lent themselves to programmatic abstraction in a way that those of neoclassicism or the Baroque could not. This was more than mere intellectual consistency. It was the broad and comprehensive unity of the Modern Movement itself, of which architecture was but one lobe, and in which painting and sculpture, music and poetry aspired to the same radical emancipation from traditional structures of form and authority.

We take for granted the conquest of America by this high modernism, as something that was in the course of things inevitable, but in many ways it was quite alien to the American experience. A man of the 1920s would have scoffed at the notion that America needed a radical modernism, for America already was modern. Jazz, cinema, the radio (a national craze since 1925), and the affordable automobile were all quintessentially modern phenomena. So was the fanciful Art Deco skyscraper: the Chrysler Building with its winged chrome sentinels or the Empire State Building, whose soaring spire was to be a mooring mast for zeppelins, yet another modern phenomenon.

But this American modernism was organic rather than programmatic, and was accomplished without benefit of theory and manifesto; it was modernism without program. Its achievements were made possible by modern engineering and science, manufactured by the modern assembly line, and disseminated through mass-marketing (likewise modern), without central coordination. Its most striking aspect was its unpremeditated, even inadvertent character. Some of its accomplishments were not even the product of conscious thought, let alone theory. For example, the new skyscrapers: Prior to 1916, skyscrapers were permitted to rise without limit, to blacken the streets below and to cast their neighbors to the north into permanent shadow. In that year New York City passed a remarkable zoning ordinance that sought to protect property rights by restricting the height and bulk of skyscrapers. At each increment of height, a building was required to step back a prescribed number of feet (depending on the width of the street below); on only one quarter of its footprint could it rise without constraint. The result was the characteristic chiseled silhouette of the modern skyscraper (which fortuitously resembled ancient Mayan pyramids, then being studied, and brought about a charmingly incongruous fad for Mayan ornament). Here was an absolutely new architectural form, as distinctively American as the colonial saltbox, whose principal formal element—its forceful and expressive contour—was not an artistic invention at all, but a hard-earned armistice line between the demands of real estate and of legislation. Such was America's non-programmatic modernism.

The modernism simultaneously arising in Europe followed an entirely different trajectory: there it was both more urgent, in the wake of the First World War, and more centralized. To a continent mutilated by four years of cruel trench warfare, during which it had built little, the urgency was understandable. And the centralization was the normal state of affairs in Europe, where architecture and industrial policy had always been a matter of state patronage, regularized in the state architectural bureaucracies that had arisen during the seventeenth and eighteenth centuries. The socialist or paternalist model of patronage on which modernism depended was the natural successor to this. Le Corbusier's audacious and daft proposal to demolish medieval Paris and replace it with serried ranks of rational towers differed from the urban interventions of Napoleon or Baron Haussmann only in degree.

In America, where architecture was historically a matter of private taste and commercial patronage, this model had but little applicability. Americans were hardly ignorant of European innovations but partook of them as consumers rather than believers, much as American painters had once discovered French Impressionism, stripped it of its social and theoretical content, and brought it home as a lighthearted and diverting fashion. When George Howe adapted the flat roof and abstract cubical massing of European modernism, it was for a fashionable suburban house. Most American architects were already moving in the general direction of their modernist counterparts, toward simplified volumes and flowing space. They needed no ukase from Le Corbusier prohibiting the cornice; the cornice had already fallen into disrepute by the early 1920s.

So long as capitalism thundered along, and the Twenties still roared, Americans had only passing interest in Europe's programmatic modernism. All this changed in October 1929. The Great Depression ravaged no American institution, the stock market excepted, as much as it did architecture. The violent, almost instantaneous contraction in building activity was stupefying. A single example from T. E. Tallmadge's 1936 Story of Architecture in America shows its magnitude. In 1927 the city of Chicago approved 12,025 building permits, for a total value of nearly $353,000,000. In 1932 the figure was only 467 permits, valued at $4,000,000—barely one percent of its pre-Depression activity. The figures for New York are similar. The Empire State Building, begun in 1929 and finished in 1931, was the end of the line. In short order, almost the entirety of the American architectural profession was unemployed. As private and commercial patronage stagnated, the American architectural world took on European lines. The state became by default the principal architectural patron; the buildings being built were schools, hospitals, community centers, and—above all—social housing projects, not skyscrapers or suburban manors. The whole world of Twenties eclecticism, of faience-tiled movie palaces, neo-Mediterranean villas, and moderne stores, had vanished, along with the capitalist patronage that had sustained them. Under these changed circumstances, European modernism looked different indeed.

The new state of affairs was captured in The International Style, the catalogue to the celebrated 1932 exhibition at the Museum of Modern Art, curated by Philip Johnson and Henry-Russell Hitchcock. In it the buildings that had been heralded a few years earlier as the very embodiment of American might and sophistication were subjected to malicious mockery. No longer bold acrobatics in the sky, they were now the vulgar acts of crass commercial clients. In the book's preface, Alfred H. Barr, the founding director of MoMA, did not try to mask his contempt: "We are asked to take seriously the architectural taste of real estate speculators, renting agents, and mortgage brokers." Of course it was easy to scoff at capitalism in 1932, when it lay prostrate. Only after their owners were penniless did the jaunty forms of Art Deco skyscrapers begin to look ludicrous. The Empire State Building was discredited not by the fanciful zeppelin mooring mast on its summit but by the apple vendors huddling around its base.

It is one of history's fateful coincidences that at the very moment of the collapse of America's architectural patronage, a stream of refugees began to make their way from Hitler's Germany to the United States. These included the key figures of the Bauhaus, Walter Gropius and Ludwig Mies van der Rohe, who found plum positions in America's schools of architecture and design, the former at Harvard and the latter at the Illinois Institute of Technology (IIT). To a remarkable extent, American conditions now recapitulated those of Europe in the 1920s: a dire economic crisis, enlightened government patronage, and an activist cadre of influential modernists inspired by a sense of mission. Of course the prostrate condition of commerce would not last, but, when it recovered, America's architectural culture was permanently changed.

In terms of absolute numbers, the European émigrés were negligible, but they exercised a wildly disproportionate influence. And if the great body of American architects was indifferent or hostile, they were now jobless and therefore irrelevant. In short order, other schools of architecture followed the lead of Harvard, and wherever they landed, modernist deans conducted swift and efficient purges of dissenters. A few anecdotes suffice: at Washington University, St. Louis, the modernist dean Buford Pickens opened his Saturday newspaper to see a rendering of a neo-Georgian office tower by one of his instructors—rank heresy; the man was fired on the spot. At the University of Pennsylvania, the chief instructor in design was Harry Sternfeld, the 1914 winner of the prestigious Beaux-Arts Institute's Paris Prize and the designer of Philadelphia's most imaginative Art Deco and moderne buildings. Unable to fire so prominent a figure, the new modernist dean relegated him to teaching "professional practice," an act on the order of requiring Cézanne to teach the care and cleaning of paint brushes. Every school has similar stories. And each school eliminated or drastically truncated its courses in the history of architecture, which had once been the mainstay of architectural literacy and the chief source of raw material for design.

These purges call to mind the political dimension of high modernism. Architecture was but one of many modernisms, each drawing strength from the others, but all were deeply conscious of the greatest modernism of all, the Russian Revolution, which always lay in the background. Modernism came to America at a time when world revolution still seemed distinctly possible, if not imminent. [1] And it came surrounded by that mightiest of accompaniments, the bold and silencing aura of inevitability. If the Russian Revolution could topple those principal pillars of western culture—religion, capitalism, class structure, the family itself—how much easier was it to sweep away music's tonality, poetry's meter, or architecture's gabled roof? Of all the arguments that modernism could bring in favor of its spacious claims, historical inevitability was the one that brooked no reply.

But even as modernism established itself in America, the cultural vanguard of the West was gradually losing faith in the Soviet Union—a prolonged process that stretched from the purge trials of 1937 to the invasion of Hungary in 1956—and the Modern Movement as a whole shed its presumption of historical inevitability. A bundle of separate and discrete modernisms, literary, musical, and so forth, remained, still omnipotent in the institutional compounds in which they were ensconced but without the bracing sense of being part of a collective movement that was reshaping the whole of civilization.

Ironically, this happened just as American modernism was enjoying its most conspicuous public successes, in the 1950s. America's commercial patronage had long since rebounded and made its peace with modernism; in fact, America's greatest monuments of modernism are those raised by corporations: the Seagram Building, Lever House, the TWA Terminal, the headquarters of Pittsburgh Plate Glass. Here was the high water mark of modernism, when modern architecture was endorsed equally by American business, government, and the academy. Thus Gropius could design both the Pan Am Building in New York and the American embassy in Athens, even as he presided over Harvard University's Graduate School of Design. By then there was no longer any appreciable dissent. For a brief and startling moment, the architectural avant garde and establishment had merged and become identical.

Passing in turn through depression, world war, and prosperity, modernism's hegemony and overweening moral authority lasted but a generation. If one had to pick a death date, the point at which the collapse of modernism became irrevocable, one could do worse than 1968—a year when so many things seemed to go wrong at once. Since then no movement has enjoyed anything remotely comparable to its prestige and authority. In fact, the last point on which the dissenters to modernism concurred before they fragmented into a welter of contending tendencies and movements was the fact of modernism's failure. Here was the last consensus, the last fixed navigational point on the horizon (the same role played in the pictorial arts by Clement Greenberg's formalism, which fell from favor at this same moment: in a trackless ocean, one navigates from the well-documented shipwreck).

The critique of modernism had both a theoretical and an empirical wing. The theoretical assault took modern architecture to task for its reductivism, its mad attempt to encompass the tempest of human experience within the hard matrix of rational form. Empirical critics had it easier; they only had to look, and soon assembled a vast roster of exhibits: malevolent social housing projects, totalitarian megastructures, and cities disfigured by blundering urban renewal campaigns. Jane Jacobs, that perceptive observer of urban life, produced the most damning manifesto of all, The Death and Life of Great American Cities (1961), a book with a startling international resonance, particularly in Germany. Other thoughtful observers of public life and culture, notably William H. Whyte, followed in her path.

The most deadly challenge of all, however, came from within the ranks of modernism itself, and bore the imprimatur of the Museum of Modern Art, the keeper of the modernist flame. This was Robert Venturi's Complexity and Contradiction in Architecture (1966), a book that did much to extinguish that flame. As with Gorbachev's perestroika, there are some paths of reform that, once launched, soon acquire their own willful momentum. Complexity and Contradiction, an incendiary manifesto effectively disguised as an interminable art history lecture, used examples as diverse as Michelangelo and Levittown to show that the formal clarity of modernism—its chilly, ruthless idealism—represented a fatal diminution of architecture's expressive power: its infinite capacity to challenge the intellect, to bewilder, inform, and delight.

Venturi drew his examples from across the whole of architectural history and geography, reproduced in tiny postage-stamp images. Having been banished from schools of architecture, history was now permitted in once more, although under a sort of probation—not as style (a formal and coherent system of design) nor as archaeology (forms derived from certain historical and cultural conditions), but simply as a great schoolhouse of graphic ideas, whose origin and authorship were ultimately irrelevant.

And so, one by one, the separate pillars of modernism were kicked away during the course of the 1960s. But to demolish an edifice is one thing and to raise one quite another, and here the critical faculty proved mightier than the creative. While the past few decades have given rise to a dazzling pageant of styles and movements, none of them has come remotely close to replacing the role played by mid-century modernism.

Of course, not all modernists practiced the rarefied, haiku-lite minimalism of Mies van der Rohe, and one faction, the New Brutalists, sought to restore architecture's lost physicality. This they achieved with concrete, a material as modern as steel but with the visceral properties of weight, texture, and bulk. Beginning in England in the mid-1950s, and soon reaching the United States, the New Brutalists produced muscularly textured buildings clad in rippling corrugated concrete whose surfaces were coarse enough to draw blood. It took only a few conspicuous examples—Yale's Art and Architecture Building by Paul Rudolph and Boston City Hall by Kallmann, McKinnell, and Knowles—for the public to recoil violently from Brutalism and its claim that its focus on architecture's palpably physical nature was a gesture of humanism.

Others handled concrete more adroitly and serenely. The Philadelphia architect (and mentor to Venturi) Louis I. Kahn formulated a stately, substantial modernism, constructed of masonry, and evoking the weight and permanence of the great buildings of the past. At their best, his works have the tragic dignity of archaeological ruins. Kahn was a late bloomer; after having been schooled in the early 1920s according to the old Beaux-Arts system with its insistence on strong axes and geometric order, he cast this off to embrace the flowing space and dematerialized planes of modernism. Still, he never quite fit in as a mainstream modernist. Only in the 1960s did he find his voice, when he reconciled modernism with the classical principles he had encountered as a student and during subsequent travel to Europe. In such masterworks as the Kimbell Art Museum and the Salk Institute at La Jolla, he triumphantly rehabilitated those respective victims of Mies and Le Corbusier: the wall and the room.

The quiet introspective character of Kahn's work lent itself to civic commissions, but not to houses or commercial buildings. Moreover its essential seriousness made it suspect in the eyes of a generation wearied of modernism's self-consciously heroic stance, and ultimately Kahn's legacy has been more inspirational than practical. Another virtuoso of concrete was Eero Saarinen, of Cranbrook, whose great TWA Terminal was itself a diagram of flight, concrete frozen into great liquid arcs, exploiting a type of architectural energy that had not been tapped since the Baroque: torque. But this veered close to expressionism, which as a highly personal idiom was ill-suited to the formation of a collective language. Saarinen's forms are still reprised whenever a building—a hockey rink, say—needs to look like a snail or conch, but his influence essentially ended with his premature death in 1961. (Nonetheless the writhing forms of Frank Gehry, a student when Saarinen was in his glory, betray a lingering influence.)

But these were late modernisms, not anti-modernisms. They still accepted the central imperative of modernism, that a radically new age required radically new forms. Ultimately it is skepticism or derision toward this claim that distinguishes the non-modernist, or postmodernist, from the modernist. Even when the postmodernists otherwise used modernist materials, detailing, and planning—these lessons of modernism were not easily unlearned—it was their anti-heroic stance that indelibly marked them. And it was in Robert Venturi that anti-heroism found its anti-hero.

Venturi's buildings in and around Philadelphia put his doctrine of complexity and contradiction into practice. His first major work, Guild House (1962), managed the hat trick of violating every norm of high modernism—it had a monumental arched entrance, a clear hierarchy of major and minor spaces, ambiguous and multivalent architectural elements—and did so within the context of a modest brick home for aged Quakers. Even the dull brick wall at the rear was radical, an explicit reference to the undistinguished factory architecture of the neighborhood. This was a revolutionary nod to context, that most humble of architectural criteria, and one which the International Style, with its universal aspirations, had swept from consideration.

A few years later, Venturi's celebrated Mother's House rehabilitated the gable and the chimney, evoking the great Shingle Style architecture of the nineteenth century, and opening the gates to a deluge of historical references, all playing across the plane of the façade in lively calligraphic animation. In place of the laconic Cartesian order of Mies, here was an architecture that not only spoke to the viewer but did so volubly.

Venturi's book and buildings helped set in motion the movement that came to be known as Postmodernism, although he himself has rejected the term. With its stance of permissive eclecticism, openness to history, and sense of responsibility to making the city more humane and livable, the movement proved enormously attractive. Venturi was soon followed by Charles Moore, Michael Graves, Robert A. M. Stern, and others, the high-spirited doyens of postmodernism. Even Philip Johnson joined in with his AT&T Building in New York, a skyscraper with a broken pediment at the skyline, penance perhaps for the abuse he heaped on Art Deco skyscrapers during his youth. For a brief moment around 1980 it seemed as if postmodernism offered a worthy and viable successor to modernism, which might knit the raveled sleave of history, reconnecting the orphaned present to the great channel of the past. There was a great optimism and idealism about this, and those who were in schools of architecture, here or abroad, at that time will recall the giddy hopes that then blazed up around postmodernism.

Although it was not immediately apparent, a certain paradox afflicted the Venturi rebellion. On the one hand, it was wonderfully whimsical. Its overflowing bounty of irreverence seemed essential, its laughter the only sure way of deflating the prim custodians of modernism. This was the era of Pop, after all, of "make love not war," and Venturi's aphorisms had the pithy punchiness of protest slogans like "Less is a Bore," "ugly and ordinary," "Main Street is almost all right." After architecture's long afternoon of Teutonic severity, this whimsy came as a much needed tonic. (It is difficult now to imagine how heavy the authoritarian hand rested on schools of architecture. A friend designed a schoolhouse in the 1960s in which each classroom was expressed by a graceful curved bay; her dean drew a large X across them, warning her, "Modern architects do not use curves.")

On the other hand, the movement also invited a certain amount of intellectual pretension. The chimney and roof gable of Venturi's Mother's House, for example, were no mere revival of traditional architectural motifs, no Shingle Style revival, but were primal elements of the house, signs of fire and shelter; they could be read as components of a complex system of signs and symbols, the study of which is semiotics. Through a strange quirk of timing, universities in the 1970s and 1980s were awash in the fashionable study of semiotics, an offshoot of linguistics, and it was inevitable that postmodern architecture was studied less for its specific cultural content than as a system of language. To many, it seemed, this was as ponderous a philosophical system as the one just toppled, and without the added attraction of imminent utopia.

But while postmodernism turned pretentious in the academy, it turned goofy in the streets. By the mid-1980s it had become a kind of architectural zoot suit (to use Richard Etlin's phrase), distinguished by cut-out windows, red granite stripes, and a liberal share of pastel walls. Many of the examples in Venturi's Complexity and Contradiction were drawn from America's anonymous roadside architecture, and now postmodernism returned the roadside the favor, finding in the billboards and fast-food joints of the highway strip a fountain of jaunty commercial imagery.

This process of borrowing motifs from such a diverse range of sources—from Michelangelo to McDonald's—produced a certain leveling, both literally and figuratively. After all, Complexity and Contradiction treated architecture in essentially two-dimensional terms, reproducing diagrams of palace and church façades schematically to show their wonderfully complex geometry. The buildings inspired by Venturi's book likewise tended to the two-dimensional and schematic, their façades serving as mere billboards. The act of designing a building came to resemble the work of a graphic designer rather than a sculptor. For commercial buildings, which need to swagger and strut and catch the viewer's eye, this did not matter much, but as complete architecture—fully realized in its tactile, plastic, and sculptural dimensions—it fell short. It is ironic that this movement that faulted modernism for impoverishing architecture did a good bit of impoverishing itself.

As this occurred, and as postmodernism became tainted by its association with the crassness of commerce, it was quickly disavowed by the leading schools of architecture. By the mid-1980s, postmodern buildings were seen as the architectural manifestation of the "decade of greed," of "Ronald Reagan's America," and prominent commercial architects such as Michael Graves and Helmut Jahn were scorned for their frivolity and envied for their success. Their unabashedly commercial practices were contrasted unfavorably with those earlier modernists who had specialized in social housing projects. That it was precisely the disastrous failure of these same social housing projects that had helped bring about postmodernism in the first place was conveniently forgotten.

To some extent, the charge of frivolity was justified. Although postmodernists cheerfully ransacked the past for usable forms, much like Palladio of the Wastepaper Basket, they did so without reference to those forms' original meanings, or the historical circumstances of their birth. Their cut-out porticos did not represent, for example, the majesty of Roman law, and their great curved atria, such as Charles Moore's Piazza d'Italia in New Orleans, had no connection to Counter-Reformation theology. (The examples were invariably taken from classicism; Gothic architecture was not included in the general rehabilitation of historical styles, perhaps because the playful sensibility of postmodernism worked better against a strict formal system such as classicism.)

If postmodernism was playful, it was not insincere, for it never pretended to seek a revival of the historical styles. [2] The forms it quoted and paraphrased it held at a distance, with the wry detachment of the Pop artist, not recreations but "ironic references." As a foundation on which to revive a style, let alone bring another one into existence, irony is no Gibraltar. But the creation of an authentic new style was never in the cards. The great architectural styles of the past, Roman or Gothic, or modernism for that matter, arose with a new structural principle or new social order—and usually both. Postmodernism offered neither, beyond what modernism had already created. The postmodern revolution tended to touch only the visible surface of the building, the thin cladding atop the steel frame, which exchanged its glass curtain wall for a festive skin of stripes, cutouts, and polished stone panels. For all its lofty rhetoric, it amounted to little more than a thorough-going fashion makeover, giving modernism a new suit of clothes and the fixed knowing wink it has worn ever since.

The ultimate meaning of every building is that conveyed by the society that produces it. Of all the arts, architecture is most fully a social act. The making of a novel, a symphony, or a painting occurs in private, but a building is the product of a complex collaboration between designer, builder, and client, involving the expenditure of capital and the insertion of permanent objects in the social space of the community. And whether it represents Communism or Christianity, Roman civitas or the Greek polis, every building in the end is the concrete manifestation of a belief system. For an architecture without a belief system is but a mechanical art, differing from plumbing only in its complexity and in being subject to certain cyclical oscillations of fashion.

Postmodernism, for all its charms, was not a belief system; modernism, for all its faults, was—and in many ways an attractive one. It certainly had the merit of completeness. It took upon itself the responsibility of providing for everything, its purview reaching from a building's smallest detail to the most spacious aspect of a city. This imperative to transform architecture to meet the demands of the modern world, functional and psychological, gave it an enormous sense of responsibility, which it did not shirk. Nothing like this belief system has emerged since the collapse of modernism, and certainly not the belief system that preceded it: the liberal and modern version of western culture as it emerged from the nineteenth century. Without a confident and self-assured civilization to give it direction from without, architecture was left to fend for itself from within. In place of a comprehensive belief system, it came up with that internal surrogate: theory.

During the 1970s, history made its way back into the academy, but now in the form of a professor of history and theory, often of continental pedigree. It was his task to lend the content-free and relatively straightforward process of teaching students to design a certain gravitas (although in some cases, levitas might be closer to the mark). His was a tall order: to fill the void left by the heroic imperative of modernism, or of belief in western civilization itself, and to do so with nothing more than some speculation about the fundamental ambiguity of signs and symbols. Of course, some theorists have aimed at goals higher than cant. Those who have, like David Leatherbarrow at the University of Pennsylvania and Karsten Harries at Yale, tend to teach architecture from the point of view of decorum, as a cultured art with a long accumulated tradition of rules and customs, with which any aspiring designer must be acquainted. It is by no means the worst approach to teaching young architects: if one does not believe in something, one can at least be literate.

Of course architecture students are not foolish, and they quickly observe that the intellectual and theoretical models in these courses are usually superfluous to the process of learning how to design (although in a pinch they might provide a flashy bit of window dressing in a crit). Those who avidly study theory are typically those who aspire to teach it themselves, often foreign students who will later teach in their home countries. Students during the 1990s came to have far more pressing concerns than rarefied continental philosophy, such as the harnessing of the computer and CAD (computer-aided design) in the design process. The architectural student at the end of the century was spending far less time reading treatises and writing essays than his predecessor even a decade earlier.

One of the chief reasons for the indifference to architectural theory is that theory is irrelevant to the element of competition—one might almost say Darwinian struggle—that animates architecture in a capitalist society. It is regrettable that theory has not confronted the fact of competition foursquare, for it is the cauldron of America's most interesting architecture. It is there that new technologies are forged and new forms introduced; these are the most conspicuous buildings, the ones with the largest budgets on the most prominent sites. Of course jostling, aggressive competition roils the suburban street as well, where architecture serves the exaggerated descriptive role it always assumes in a nation with an elastic class system. While European architecture has a long tradition as a polite and urbane art, working within a settled social order, the American building invariably seeks to differentiate and distinguish itself. The polite ensemble is not an American strength.

For these reasons, America's bouts of greatest innovation and originality invariably coincide with the most overheated financial booms: the post-Civil War boom produced the aggressive Victorian generation, including Frank Furness and those inventors of the commercial skyscraper, Louis Sullivan and John Wellborn Root; the 1920s boom created that peerless designer of skyscrapers, Raymond Hood; and even the much lamented 1980s brought Michael Graves and Charles Moore to the fore. The decades of "greed" have been kind to architecture.

While this was proceeding apace, one group of architects clung stalwartly to the great modernist cause. These were the Whites, so designated in the early 1970s for their uncompromising quest for purity (in contrast to the omnivorous eclectics of postmodernism, who were termed the Grays). Emulating Le Corbusier's Villa Savoye and Villa Stein, they took as their point of departure the cube, whose sides they unfolded and bent as if held on invisible hinges, in an increasingly complex process of geometric manipulation and axial rotation. Richard Meier and Peter Eisenman are the best known of the Whites, the latter for the emphatically rectilinear houses he has built since 1967, which he numbered sequentially (a cheeky act in a field where houses are invariably designated by the names of their clients), indicating that they were part of an experimental series.

The Whites were often congratulated for their "intellectual rigor," as if their whirling axes and colliding grids were the hard-won solution to a vexing mathematical problem. In actuality, the only variable being satisfied was the architect's own eye. At bottom arbitrary, these bent axes and fractured parallelepipeds were not Euclidean proofs but spoofs of them. Clients found all this terribly perplexing, but it looked quite earnest, a serious virtue if the alternative was a lighthearted or vacuous commercial architecture. And when a billion-dollar investment was at stake, as it was with the mighty Getty Museum in Los Angeles, it is easy to see that mock seriousness would trump playful irony—and a modernist like Meier would win the commission.

Several years ago I debated with some colleagues about what our students really ought to know about American history, their own culture. I suggested that an ability to distinguish the War of 1812 from the Spanish-American War (a not uncommon mistake) was essential knowledge. A colleague disagreed, saying the Stonewall Riots were essential knowledge to her. But a more senior colleague was wiser than us both. Our duty, she said with startling concision, was not to impart a prescribed canon of core knowledge to our students, in the belief that it was necessary for cultural literacy or good citizenship; our duty was to teach the current hot research topics in graduate programs.

One appreciates the honesty, but also the insight. In the absence of a collective belief in culture and civilization—in which politics, art, and science are not merely detached and free-floating specializations, but also participants in an overarching hierarchical structure that relates them to things beyond themselves—no canon is possible. The specialist knows no obligation to a wider structure of things, but merely to the guild-like rules and disciplines of his own conventicle. And something like this has also happened in the past few decades to architecture, whose forms are less likely to refer to the society in which they sit than to architecture itself—its processes, preoccupations, identity. In short, it has spiraled in upon itself.

This was certainly the path of the Whites. As their geometry grew more intricate, and their axial dissections more obsessive, their buildings soon came apart literally. And perhaps literarily, for it was the French philosophical school of deconstruction, with its doctrine about the infinite elasticity of text, that lent its name to the new movement that emerged from the Whites. Here a metaphor about architecture was borrowed by philosophy and then returned to architecture, punning on the predilection of Deconstructionist buildings to fragment and disintegrate visibly.

Perhaps the first building to come apart was Frank Gehry's own house in Santa Monica, which conspicuously unpeeled itself in 1978. Even more unruly was Eisenman's Wexner Center for the Arts in Columbus, Ohio (1983–1989), with its hanging column supporting nothing, its brick arch cleanly sliced in mid-curve, and its flight of stairs leading to a dead end. Here was architecture's irrational dream of itself, as Piranesi might have imagined it, although Piranesi never had the impulse to try to build his fantasies, or the capacity to coax a client into paying for them. And thus the Whites moved inexorably from upholding the ordered rationality of early modernism to a geometry of stupefying irrationality.

If Deconstruction strove mightily to challenge the architectural establishment, it failed. Since weathering the fall of modernism, that establishment has been more or less impervious to shock or scandal. In the schools and in the streets, architects of wildly differing stylistic orientations worked happily alongside one another; nothing was mutually exclusive. Festive postmodern towers rose cheek by jowl beside sleek black cubes with dark tinted windows, much as Gothic and Art Deco skyscrapers huddled gregariously in the 1920s. It was true that there was no single unified style that uniquely expressed the Zeitgeist, but it was not missed. Architecture had settled into an anti-heroic stance, which frowned on absolutes and definitives and endorsed pluralism, grazing omnivorously and pleasantly from the bouquet of offerings to the mild background drone of theory. So architecture ambled on, chartless but content, into the morning of September 11, 2001.

It is difficult to imagine a place in the world less suited for theory than the aching chasm of Ground Zero. Theory by nature is speculative and provisional, and the World Trade Center site needed much stronger medicine than speculation or ironic detachment. And yet in the months following the attacks, as New York began to consider rebuilding the towers of the destroyed World Trade Center, it became clear that contemporary architecture had little else to offer. The rapidly organized exhibition of proposals at Max Protetch Gallery in the late fall of 2001 was a warning of things to come, an anarchic pageant of Pop Art gestures in the Claes Oldenburg mode, or mournful earthen mounds.

To the acute embarrassment of the architectural establishment, the attacks of September 11 had an explicit national content. The International Style had effectively abolished national identity as a criterion in architecture, a state of affairs not changed by modernism's failure; certainly postmodernism, which blithely appropriated every symbol without regard for its original context, had no nationalist content. Yet the hijackers who plunged their jets into the World Trade Center towers attacked them not in their architectural or even financial capacity but in their role as emblems of American identity—a role the towers have since assumed, indelibly, in retrospect.

To replace these towers without taking this into account simply would not do. Here was a commission of clear and unmistakable national importance, perhaps the most important in half a century or more. But here, alas, all the factors that had transformed the nature of American architecture since the fall of modernism converged catastrophically: the freewheeling and noncommittal eclecticism; the two-tiered system of celebrity architects and mere practitioners; and the abdication by society at large of fundamental questions about the meaning of architecture to the architectural establishment itself. The result was an incoherent program and the selection of a wildly inappropriate architect.

One of the unappreciated benefits of having a unified style is that the process of competition and comparison serves as a constant spur to improvement and correction. In the absence of such a yardstick, the intelligent and well-meaning building committee has no recourse but to turn to the architectural establishment for guidance. It is not surprising that the winner was that impeccably credentialed product of the establishment, fifty-seven-year-old Daniel Libeskind.

And Libeskind's credentials were indeed impeccable. His career had been spent largely as a professor of architecture; he began to receive commissions only in the past decade, and almost entirely from institutional clients. His principal work, the Jewish Museum in Berlin, was presumably a qualification for the World Trade Center project. After all, until then it was perhaps the world's most visible object consecrated to grief and mourning. This was a Deconstructionist essay, a compressed and bleak procession through a series of irrational and disorienting gestures. In terms of invoking chaos and irrationality, as a mighty spasm of nihilistic despair, it worked well enough. But nihilism is not the proper tone at Ground Zero, especially when the events set in train on September 11, 2001, are still unfolding turbulently around us.

It is a truism in architecture that when the client is weak or divided, the architect fills the void. This is the case at the World Trade Center, where the lease-holder Larry Silverstein was nominally the client, but where his power was offset by the Lower Manhattan Development Corporation, a body finely tuned to public pressure. They faced two possible courses of action for Ground Zero, both of them honorable. On the one hand, one might rebuild the towers (or something like them) as quickly as possible as a mighty symbol of defiance and determination. On the other hand, one might plausibly argue that this land had forever been rendered "hallowed ground," akin to the battlefields of Gettysburg or Omaha Beach. To choose between these opposite and irreconcilable options was the responsibility of the client, but this was not done. Instead the decision was relegated to Libeskind, who has tried to have it both ways.

Libeskind placed a mantle of commercial buildings, splintered and fractured in his characteristic way, to the north of the site, leaving the pit of Ground Zero open, its rough concrete slurry wall still intact as a visible vestige of the attack and the subsequent heartrending clearing operation. He was not commissioned to design a formal memorial—for this a separate competition was conducted during the summer of 2003, its results not yet announced as of this writing—but in fact he designed several, including the 1,776-foot tower that was to rise over the site and the so-called "Wedge of Light" at the northern edge of the pit, which was to be bathed in sunlight on each anniversary during the precise duration of the attack and the collapse of the buildings. The result is a design of tragic incoherence, simultaneously swaggering and weeping, unable to reconcile its display of commercial vigor at the skyline with a ghoulish obsession with the yawning pit below.

Shortly after Libeskind won the commission, some of his published poetry surfaced, in which nihilism and scatological imagery play a conspicuous part. One oft-quoted line runs: "America turns its mass-produced urine antennae toward Caesar's arrogant ganglion, while history is advocated by utopians as a substitute for defecating." This is absurdist, stream-of-consciousness poetry, not to be taken literally. But the train of thought has its own implicit logic. The references to America, to mass production, and to arrogance are not random: they are all of a piece, and suggest a vision of his adopted homeland rather different from that of the grateful immigrant.

For this Libeskind should not be blamed. Anyone casting more than a cursory look at his résumé would have noticed that he was fundamentally a philosophical prankster, a creature of the theoretical compound in which he has thrived, but constitutionally unable to relate that compound to things larger than itself—like America. And when he has tried, as with his memorial tower, the result has been kitsch, the simplistic symbolism of its 1,776 feet exposing his non-Euclidean geometry as simple arithmetic. With his willingness to plunge into bathos and kitsch, he proved himself another Palladio of the Wastepaper Basket.

This is the paradox of American architecture today—overwhelming technical prowess and boldness of swagger and gesture, joined to a fundamental uncertainty about the ultimate meaning of things. But this is nothing new. American architecture remains a provincial enterprise, haunted by its psychological dependence on Europe. Although technically unsurpassed and quivering with commercial vitality, it lacks that natural and unaffected sense of self that characterizes the architectural traditions of Europe. Hence the persistent American willingness to abandon its own lines of development and radically reorient itself almost overnight to European currents. The massive realignment toward European modernism in the 1930s was not a unique event. Something similar happened in the 1890s, when America abandoned the promising direction of its Chicago-style skyscrapers, Richardsonian Romanesque civic buildings, and Shingle Style houses—each a creation of great originality and merit—and tossed them all overboard in a national fad for French classicism, the academic version taught at the École des Beaux-Arts in Paris. An entire living architectural culture was displaced in a matter of a few years by the doctrine of Paris. The Beaux-Arts revival produced much that was great, including most of our finest civic buildings, but it had about it a certain cold archaeology. Long-lived Frank Lloyd Wright twice experienced the abrupt and convulsive abandonment of indigenous American lines of architectural development, in his twenties during the 1890s and in his sixties during the 1930s; it is no wonder that he was so dyspeptic in later years.

Such is the fragility of the American architectural tradition. The same ability to respond instantly and cleverly to commercial pressures, uninhibited by tradition or academic institutions, is the source of a chronic rootlessness—as Macaulay said of the English constitution: all sail, no anchor. Before modernism, the meaning of architecture was provided by settled traditions of decorum and custom, and above all by the belief that architecture was a high and noble art that did more than fill the mundane need for space and shelter, making civilization itself visible in permanent form. Until we can believe this again, we cannot reassemble the broken pieces at Ground Zero, much less restore the void at the center of architecture. We seem able to make objects that proclaim "we suffer," "we mourn," or "we repent"—but not ones that convey the simple assertion "we are good." And a culture that cannot say that will build no pyramids or cathedrals, and in the end not even a very good house.

  1. This is not to say that the early modernists were necessarily Communists, although many were, at the Bauhaus and elsewhere. After all, Mies van der Rohe blithely submitted projects for Nazi buildings before eventually fleeing Germany following the close of the Bauhaus in 1933. But this is nothing more than garden-variety artistic opportunism, the same sort that permitted the painter David to serve in turn the ancien régime, the French Revolution, and Napoleon.
  2. In fact, within the postmodernist camp only one quixotic band has used classicism with a straight face. These were the new classicists, also known affectionately as the young fogeys, including such architects as Thomas Gordon Smith, who made the school of architecture at Notre Dame a center of modern classicism. With the support of Henry Hope Reed and his organization Classical America, they have played an important role in championing classicism as a valuable cultural heritage rather than a set of stuffy, if amusing, forms to make sport of.


Michael J. Lewis's latest book is American Art & Architecture (Thames & Hudson).


This article originally appeared in The New Criterion, Volume 22 December 2003, on page 4

Copyright © 2014 The New Criterion
