It is almost trite to observe that a free society cannot exist without the rule of law. But on the matter of liberty, law is, at best, highly overrated. In fact, it can be downright pernicious.

As Kevin Williamson has argued, real liberty is evolutionary. Free societies are dynamic, efficient, and innovative. Law, by contrast, can become the paralyzing debris that de Tocqueville predicted might someday cover the surface of modern democratic society. It is the “network of small complicated rules, minute and uniform, through which the most original minds and the most energetic characters cannot penetrate, to rise above the crowd.”

If the modern welfare state softens, bends, and usurps the will of man, law is the mechanism by which—to borrow again from de Tocqueville—it “compresses, enervates, extinguishes, and stupefies a people, till each nation is reduced to nothing better than a flock of timid and industrious animals, of which the government is the shepherd.”

To be sure, law is a pillar of liberty—there can be no liberty without it. But law can never be the foundation of a free society—that role is reserved for culture. And if law is to enable freedom rather than impede it, it must be a very particular kind of law: the kind that regulates government—that reins in state authority. When law empowers government, when its tendency is to become our willful master rather than our defense against oppression, such law is the enemy of freedom.

Of course, in some contexts, law’s hostility to liberty is precisely what we intend. The most obvious example is the criminal law. Our penal statutes are designed to curb the liberty of wrongdoers so that responsible citizens can thrive. It is our common law heritage that an accused is presumed innocent until the state proves beyond a reasonable doubt that he has violated a criminal statute. Once such proof is established, though, liberty is withdrawn for a period of time commensurate with the severity of the offense.

This is as it should be, provided that the statutes in question—enacted by lawmakers who are politically accountable to the governed—truly reflect the society’s sense of serious wrong. Malum in se, which Daniel Hannan may agree with me is nearly as easy to translate as Magna Carta, is the category of acts that are inherently evil: such offenses as murder, theft, and bearing false witness, which are prohibited in virtually every civilization. In a free society, the understanding that such transgressions will be justly punished is essential. It is the basis of ordered liberty, that delicate balance between unfettered license and civic obligation that any society must have to be truly free—to flourish.

But then there is malum prohibitum, the category of acts that are not innately wrong but are proscribed because our lawmakers choose to prohibit them in the service of societal interests thought to be compelling. To take a concrete example, consider the “structuring” of cash transactions: say, breaking a large deposit of currency into several smaller deposits, all in amounts of less than $10,000. (While there are aggressive money-laundering laws in the U.K., the English do not have a cash-transaction reporting requirement as we have in the U.S. and Australia.)

It is not that there is anything inherently wrong with depositing $9,900, but the society sees the distribution of harmful narcotics—an enterprise that, in the U.S., is typically conducted in cash—as a grave problem. So the law imposes a reporting mandate on cash transactions exceeding $10,000. An intention to conceal prodigious cash income, a telltale sign of drug dealing, is inferred from deposits just under the reporting threshold.

Obviously, there is a limitless array of legitimate reasons to engage in cash transactions in amounts slightly under $10,000. Moreover, one can imagine many perfectly innocent reasons for parceling these dealings into smaller amounts. If the money is legally possessed and disposed of, the libertarian would understandably say that the relevant transactions are none of the government’s business.

Yet lawmakers reason that a mere reporting requirement—describing the transaction and the ownership of the currency—is an acceptably small infringement on the individual’s liberty, and is not an undue record-keeping burden on financial institutions. The ledger, it is said, tips in favor of society’s compelling interest in discouraging and rooting out serious criminal enterprises. This may be a reasonable cost/benefit analysis, but the fact that we engage in such analyses to justify regulating a great deal of innocent conduct is a heavy burden on liberty.

To take another example, one involving even less serious criminality, albeit conduct that it has become common to punish with astounding severity, American law bars “insider trading” in securities. This, of course, involves purchases or sales of stock and other commercial paper by corporate officials who, because of their position, are privy to business information prior to its being learned by the rest of the market—information that will affect the price of the company’s shares.

Inherently, there is nothing wrong with trading on inside information. In fact, a strong argument can be made that such trading benefits other market participants, signaling what the best-informed traders think of a corporation’s prospects. But lawmakers, answerable to voters, have assessed that the society’s interest in access to valuable information is trumped by concerns for “fairness”—portrayed in this instance as a “level playing field” in which all traders can theoretically be put on an equal informational footing and no insider may exploit his advantage for profit.

It is in the malum prohibitum category that we begin to perceive the profound tension between law and liberty. Is there, after all, really a “level playing field” in stock trading? Is the point really to level the playing field, or are the aggressive prosecutions and harsh sentences more like a morality play—an indication of hostility toward profit in the unpopular financial sector, a demonstration that we can be just as tough on rich white crooks as on minority offenders?

Is it really better for market participants to be deprived of valuable information? If the CEO wants to dump his shares, isn’t that something you’d like to know before buying in?

Furthermore, what is the limiting principle? The law is always supposed to have one, so that a person of average intelligence is on notice of what is prohibited. If “fairness” is the goal, if the level-playing field means everyone must be equally blind, why stop at the corporate insider? Why not rope in his brother-in-law, his golfing buddies, and his broker, to all of whom he may have said something or other that provided a leg up over other investors?

In fact, crusading prosecutors often endeavor to push this envelope, to broaden the universe of “insiders.” Having been a prosecutor for nearly twenty years, I can tell you that envelope-pushing is an occupational hazard.

When Congress writes criminal statutes, it typically uses language that casts a net wider than the narrow misbehavior that provoked it to act in the first place, essentially delegating its law-making power to the executive on the assumption that prosecutors will exercise their discretion with reason and restraint. Alongside the sheer volume of statutes, this criminalization creep has become a feature of our system.

It is an increasingly perilous enemy of liberty. It is one thing when a law prohibiting serious crime sweeps more broadly than the lawmaker intended; it is quite another when, in the malum prohibitum realm, inevitable overreach extends the law beyond conduct whose criminalization was dubious in the first place.

In any event, populist rationales like “fairness” and “social justice” are a slippery slope—whether they are invoked to justify aggressive use of the law or, increasingly during the Obama administration, the non-enforcement of law, which is effectively the imperial executive repeal of legislative enactments.

During the 2008 U.S. presidential campaign, Senator Barack Obama famously called for higher tax rates, particularly for “the rich”—a descriptor that, in Obama’s estimation, turns out to apply to an alarming number of people who are far from wealthy. It was pointed out to the candidate that lower tax rates often result in higher revenue collection by the Treasury, meaning that the higher rates he championed would actually result in less money available for redistribution to the social welfare programs he holds dear. But Obama said he would raise the rates anyway . . . as a matter of “fairness.”

The same Obama now endorses a racially discriminatory policy in the enforcement of our civil rights laws: If the victim in a voter intimidation case is white and the aggressor is black, the case is not prosecuted. Although the laws are racially neutral on their face, protecting all Americans, Obama’s Justice Department appointees theorize that they are not so much law as narrative, penance for the indelible stain of racism.

Analogous reasoning informs the burgeoning field of “hate crime.” Not long ago, we prosecuted criminal acts; we did not obsess over their motivation, since the law’s role in a free society is to promote order, not regulate thought. But now, acts are investigated, or not, based on the class of victim, and on whether the acts provide our grievance professionals with grounds to agitate against the society’s alleged racism, sexism, homophobia, Islamophobia, and the like. The point is to justify ever more law designed to socialize us, not protect our liberty.

Similarly violative of the Constitution’s command that government provide equal protection under the law is the Obama administration’s waiver policy. The president purports to have the power to grant immunity for prospective violations of law. This is a novel privilege generally extended to cronies and important electoral constituencies. Compliance, for them, would be prohibitively expensive or would provide a pre-election window into the punitive consequences of Obamacare—the president’s signature “achievement,” yet one whose most dreadful provisions are specifically designed not to kick in until after his reelection has been secured.

In addition, the administration has sued the state of Arizona for attempting to enforce the federal immigration laws. Not for flouting the law, but for honoring it.

Until four years ago, the dispositive legal question when state law collided with Leviathan’s statutes was whether Congress had intended to preempt the states from enacting contradictory law. Under Obama, by contrast, a state that embraces Congress’s law is nonetheless penalized for failing to defer to the executive’s policy of non-enforcement—i.e., his policy of encouraging illegal immigration.

This is the increasingly common role of law: not sentry of liberty but agent of social change. It is still called “the rule of law,” but it is really the rule of lawyers, the tool by which agenda-driven ideologues sculpt a society, rather than unleash its potential. Why are they succeeding today where they have failed in the past?

The most significant explanation lies in the centrality of culture. We are law-obsessed in the West. This is especially so in the United States, where we venerate our Constitution as if it were the cause, rather than the effect, of our freedom. We seem to have forgotten a central truth Paul Johnson brought into such stark relief in his magisterial history of the American people: Americans were a distinct cultural phenomenon for well over a century before shots rang out at Lexington and Concord—long before there was a Constitution and a federal “rule of law.” Our commitment to individual liberty, free markets, and limited government shaped our law and our politics—not the other way around.

For two generations, however, we have ceded to progressives Western society’s most influential institutions: the universities, the arts, the press, the government and its ever-metastasizing, ever-less-accountable bureaucracy. Culturally, progressives are a different breed. They are social engineers, not societal energizers; consequently, they see the traditional rights to be free from government demands and to have government restricted to its expressly enumerated powers as nuisances, not necessities.

For the modern Left, in particular, the individual’s freedom is a relic of a bygone time, when life was supposedly simpler and dominated by sexist, slave-holding white men of a colonialist bent. In contrast to the Right’s emphasis on liberty, focusing on what the state cannot do to you, the Left’s métier is rights, what the state must do for you—which is to say, what the state must compel you to give to me.

Law is the compulsive device by which such schemes are carried out. But, again, law is enacted by politically accountable officials—usually, we can vote the bums out. These social-engineering schemes take root only because society has become habituated to them. In the main, habituation is a function of culture, not law.

The late, vastly underappreciated political scientist Jacob Leib Talmon coined the phrase “totalitarian democracy” to describe the form of “political Messianism” that infected free societies during the twentieth century. It was based, he asserted, on “the assumption of a sole and exclusive truth in politics.” The progressive would have his truth transformed into society’s “absolute collective purpose.” The notion of liberty is thus turned on its head: Freedom becomes submission to this exclusive truth, this collective purpose. Law becomes the mechanism by which dissenters are trained and disciplined until the need for coercion fades away—because alternatives have been eliminated.

Now what does that sound like? For anyone paying attention during the last two years, it sounds like the “Arab Spring”: the ascendancy of Islamic supremacism, which is Middle Eastern Islam’s sole and exclusive truth. Islamic supremacism is implemented through sharia, classical Islam’s legal code. More accurately, sharia is Islam’s totalitarian framework for how human life is to be lived, with emphasis on the collective (the ummah) and instruction on all matters great and small, from political, financial, and military affairs down to hygiene and relations between the sexes.

Of course, the Islamist wants his rule of law, too—but for the purpose of inhibiting the population, not unleashing its potential. And his brand of sharia is being imposed because upwards of two-thirds of the population wants it, as they have told us in poll after poll and now election after election.

There is perhaps no better reflection of this dynamic, of culture’s dominance over law, than life in Afghanistan under its new, “Made in the U.S.A.” constitution. When it entered into force in January 2004, the State Department, the document’s shadow author, was ecstatic. The constitution was rife with Western law: sonorous paeans to universal freedom and equality. Liberty itself was portrayed as the “inviolable,” “natural right” of all human beings. Zalmay Khalilzad, then the top U.S. emissary in the region, cooed that the new constitution “set forth parallel commitments to Islam and to human rights.”

“Parallel” was an interesting choice of words. As the State Department knew, Afghans would not tolerate odes to nondiscrimination and the banning of “punishment contrary to human integrity” unless the constitution made the obvious explicit: These Western ideals would be subordinate to Islamic principles.

The new constitution extolled the virtues of “rightful jehad” (also known as jihad) in its very first sentence. Its first article declared a sovereign “Islamic Republic”; its second established Islam as the official “religion of the state”; and its third announced to the world that, within Afghanistan, “no law can be contrary to the beliefs and provisions of the sacred religion of Islam.” It mandated the promotion of Islamic education, and required that Islamic traditions be honored in family formation and child-rearing. It inscribed supremacist scripture on the national flag; dictated that all public ministers swear “to obey and safeguard the provisions of the sacred religion of Islam”; and permitted the judiciary, in lieu of any civil legal training, to be schooled exclusively in sharia.

Did the State Department figure a few tropes about human rights law would change the Afghans over time? That seems unlikely: The constitution expressly provided that one of its terms could never be altered: “The provisions of adherence to the fundamentals of the sacred religion of Islam and the regime of the Islamic Republic.” Within a few months, Abdul Rahman, a Christian who had converted from Islam years earlier, was imprisoned and put on trial for the capital offense of apostatizing from Islam.

Had the stakes not been life and death, had Western coalition forces not sacrificed blood and treasure only to usher in a constitution under which the Taliban itself could have governed without changing a comma, it might have been amusing to watch State Department officials and European ministers wring their hands as if there had been some terrible misunderstanding, as if someone in Afghanistan’s sharia-steeped judiciary must have forgotten that, as a U.S. government spokesman spluttered, “freedom of worship [and] freedom of expression . . . are bedrock principles of democracy . . . that are enshrined in the Afghan constitution.”

Only after some frenzied arm-twisting did Hamid Karzai, the Afghan president dependent on the West for his survival, have Abdul Rahman quietly whisked out of the country before the death sentence could be executed. The oldest lie in the book was used to justify his removal: the defendant was pronounced non compos mentis—the rabid public would accept no other explanation for overlooking a conversion away from Islam.

The embarrassing episode resulted in exactly zero movement to repeal the apostasy law in favor of the constitution’s ostensible safeguarding of freedom of conscience. Five years later, Sa’id Musa, an Afghan Red Cross worker, also had to be smuggled out of the country during a death-penalty case over his conversion to Christianity.

And just last year, Karzai’s office announced that he had magnanimously commuted the prison sentence of a nineteen-year-old woman serving a twelve-year term imposed after she was convicted of having sex out of wedlock—with a relative who had raped her. Karzai’s rationale for the pardon? The woman had cured her indiscretion by agreeing to marry the rapist, whose child she had borne during her jail term. In reporting the story, the Associated Press noted in passing that “about half of the 300 to 400 women jailed in Afghanistan are imprisoned for so-called ‘moral crimes’ such as sex outside marriage, or running away from their husbands.”

Contrast immutable Afghanistan with wavering America. For the promotion of liberty, it is hard to imagine a better law than the First Amendment: specifically, its unvarnished command that Congress “shall make no law . . . abridging the freedom of speech.” Yet, the Obama administration has been colluding for nearly four years with the Organization of Islamic Cooperation to codify a global speech-suppression standard, drawn from sharia, that would make it unlawful to engage in speech inciting mere “hostility to religion.”

In effect, the resolution would prohibit critical inquiry into the supremacist ideology, rooted in fundamentalist Islam, that openly calls for the destruction of the West. It would make it illegal for us to defend ourselves—the most vital natural right of a free people.

Such stories could fill a book. But for present purposes, the point is that law follows culture, not the other way around. If a culture is authoritarian or becomes authoritarian, its law will rein in the public, not the government. It becomes an instrument of oppression, not a pillar of liberty.