A number of writers on the left and center-left have recently expressed alarm at cancel culture. In some respects, their concerns match those of conservatives who’ve railed against campus deplatforming, compelled speech, censorship, trigger warnings, safe spaces, micro-aggressions and victimhood culture. These malignancies of student coddling, reflecting years of helicopter parenting and depression-induced cognitive distortions, serve many on the right as prima facie evidence that political correctness run amok is threatening the cherished liberal values of free inquiry and the tolerance of dissent. Some critics see cancel culture as the predictable outcome of the postmodern takeover of the universities that began mandating diversity and inclusion in the 1980s and now seeks to impose campus culture on wider society. Some opponents trace the rise of postmodernism to Neo-Marxism or Marxism itself. Some even accuse the US Democratic Party of staffing postmodern Marxist indoctrination campuses.
There are elements of truth within all these accusations. As George Packer has recently argued, “certain commissars with large followings patrol the precincts of social media and punish thought criminals, but most progressives assent without difficulty to the stifling consensus of the moment and the intolerance it breeds—not out of fear, but because they want to be counted on the side of justice.” As we adapt to a pandemic without foreseeable end, these cultural questions matter even more than they do in normal times—as the effects of decisions made during historical crises tend to persist for generations.
For many on the left, neoliberalism is the root cause of economic inequality, the rise of demagogues, persisting racial disparities and our unfolding climate catastrophe. Some on the right share their concerns about our prevailing economic regime. Neoliberalism may also have helped create the conditions for cancel culture, Social Justice Warriors, the postmodern Neo-Marxists and the so-called radical left who, some claim, have taken over not only the Democratic Party but mainstream media and the Internet.
The term neoliberalism has become fraught. Some regard it as a pejorative hurled by those opposed to globalized capitalism, while others simply use it as John H. McWhorter does here. But, whatever we call it, we’ve all been immersed in deregulation, privatization, free trade, deficit-driven austerity and globalized financialization. As Luke Savage observes, since the 1980s, neoliberalism has become “a feature of our collective existence, so indelible many now seem unable to recall a time before it existed, let alone conceive a future that goes beyond it.”
Universities Were Already Wounded Before Postmodernism
Though the university has traditionally been a modernist project, guided by the values of the Enlightenment, it’s striking how readily poststructuralist and postmodernist perspectives came to predominate, particularly in the faculties of literature, education, social sciences and law. Gary Aylesworth defines postmodernism as: “a set of critical, strategic and rhetorical practices employing concepts such as difference, repetition, the trace, the simulacrum and hyperreality to destabilize other concepts such as presence, identity, historical progress, epistemic certainty and the univocity of meaning.” One needn’t be a conspiracy theorist or suggest apocalyptic implications to acknowledge that a philosophical/ideological shift has occurred within some academic fields and that this is reflected in the general language and culture of universities.
Since most university professors consider themselves leftists, and postmodernist writers condemn institutional domination over the marginalized, many have assumed that this explains academia’s embrace of both politically correct censorship and the prioritizing of group-level equity and inclusion over traditional academic discipline. The extent of postmodern academia’s influence on the culture wars is debatable, however, and evidence that professors are drawn to postmodernism’s supposed neo-Marxism seems equivocal at best. In practice, most postmodern-ish faculty merely strike radical poses, while the few actual Marxists left warn of the damage wrought by postmodernism to the entire academic enterprise. While some campus radicals of the 1960s went to graduate school and joined the academy, many more did not. So it fell to their more compliant, savvier, younger siblings to become the professors at elite institutions who saw theory and ideals as their path to tenure. Thus, tagging postmodernists as neo-Marxist obscures their far greater influence over mainstream progressives, who came of age during a new world order, when liberals began to settle for gestures and symbols in lieu of material outcomes and for social instead of income equality.
The role of liberal academics in the adoption of the inclusive administrative policies of the 1990s was probably far less consequential than has usually been assumed. To suggest that university administrations are motivated by fealty to their professors’ political ideology, let alone to their intellectual commitments, one must ignore forty years of the steady neutering of faculty power over institutional policies and priorities. Like employees everywhere, many—perhaps most—professors complain of their administrators’ preoccupation with bottom-line realities. The funding of public universities prior to the 1970s was comparatively generous, but that business model was dismantled decades ago. The rollbacks of government support for education began during Ronald Reagan’s governorship, when he famously argued that “the state should not fund intellectual curiosity.” Soon after Reagan’s 1981 inauguration, New Right think tanks advocated slashing federal education funding by more than 90%. The Democratic Party limited the bloodletting to about 50 percent, but this forever monetized the culture of US educational institutions.
The more relevant relationship since has thus not been between college administrators and their “cost centers” (aka faculty) but between universities as businesses and their primary revenue streams: extra-mural grants and contracts, corporate and individual donors and tuition-paying students. The ideal of the university as a community of scholars free enough from worldly concerns to devote themselves to the development of knowledge was effectively dashed as students were transformed into debt-burdened consumers of education as a product for sale.
Postmodernism Will Serve You Now
The universities’ need to fund themselves required steady tuition and fee hikes as well as ballooning student loan debt. However sympathetic most professors may have been to postmodernism, only a few of those who espoused such a view were well positioned to appeal to the biases, preferences and demands of student consumers. The postmodern approach, which celebrates difference, naturally blossomed in the post-1960s and spread easily within the humanities by the mid-1980s. These perspectives initially sought to inform area studies programs, but eventually came to dictate degree requirements, diversity mandates, speech codes, safe spaces, trigger warnings, deplatformings, call-outs and now cancellings. Students had to shoulder an ever larger share of university budgets and, as a result, understandably expected to be flattered, catered to and even coddled.
Sensing an opportunity, a coterie of 1980s–90s faculty used postmodernist posturing to help them acquire positions in a dwindling academic marketplace. Sociologist Michèle Lamont argues that Jacques Derrida, for example, achieved prominence “by targeting his work to a large cultural public rather than to a shrinking group of academic philosophers,” while “in America, professional institutions and journals played a central role” in the diffusion of postmodernism. Christopher Lord has recently shown how Derrida’s signature incomprehensibility allowed nascent area studies departments to camouflage bold, unchallengeable assertions in “poetic nonsense,” thus obviating the need for traditional scholarship. As Lord reminds us, the primary enthusiasm for postmodernism was not found within philosophy departments, but in the safe spaces of Women’s Studies and Comparative Literature, freed from the confines of classical disciplines and exempted from dead white male standards. This gambit was called out at the time by Camille Paglia, who saw the importation of obscure French postmodernist thought as mere pretentious gloss used to burnish résumés. Noam Chomsky later criticized the cult-like insularity of the postmodernists, their irrelevance to lived experience and their abandonment of the oppressed to demagogues, even while they are, according to Chomsky, “quick to tell us that they are far more radical than thou.” Meanwhile, the careerism that Paglia and Chomsky derided had become a necessity, especially within the humanities and social sciences, due to prevailing economic realities.
There’s a cruel irony in the fact that classical liberals are now accusing postmodern socialists of single-handedly destroying the liberal values of the academy, years after the economic deconstruction of the Reagan revolution (and similar movements outside the US) turned the marketplace of ideas into a literal marketplace, where nakedly ambitious and savvy ideologues thrive, and a precarious “gig academy,” in which temporary staff, without job security, do most of the teaching and research. As in the corporations they now emulate and cater to, nearly everyone holding up the academic pyramid is micromanaged, overworked and undermined by an ever-expanding number of administrators, chosen for their business school knowledge, rather than their academic degrees. To miss the neoliberal productivity paradigm underlying academia is to be a fish unaware of the surrounding water.
From a wider perspective, then, the upheavals within modern universities are forgotten casualties of neoliberalism. Educational institutions on both sides of the pond suffered irreversible consequences from fiscal policies that upended previous social democratic traditions. Public education may have been one of the first beasts to be starved, but the neoliberal revolution was just getting started.
The Shift to Progressive Neoliberalism
Neoliberal economics began in the early twentieth century as the renewal and refinement of classical liberalism associated with Ludwig von Mises and Friedrich Hayek, and gained prestige following the First World War, but it was marginalized in the wake of Franklin D. Roosevelt’s triumphant social liberalism and Keynesian welfare capitalism. The neoliberals agitated against every supposedly socialist program from the New Deal to Medicare in vain, until the early 1970s, when they returned to center stage with the neoclassical monetarism of Hayek and Milton Friedman. Decades before postmodernists began exploiting gutted university budgets, the new monetarist neoliberals of the right (who would incite that gutting) sold themselves to governments as counter-revolutionary slayers of stagflation.
Less than a decade later in the US (and similarly in the UK and other social democracies), progressives on the left, alarmed by the rise of Reagan and his effective discrediting of the term liberal, began to adopt the previously right-leaning neoliberal label and much of the neoliberal philosophy, to distinguish themselves from bleeding heart profligates. Their goal was to signal acceptance of political reality, while maintaining a responsible social conscience. As McWhorter recalls, Charles Peters “helped usher in the new flavor of the word, as well as its reception from the left, with his aggressive ‘manifesto.’” Seemingly skeptical of the harsher laissez-faire of classical liberalism and trickle-down economics, Peters’ Washington Monthly captured the zeitgeist’s growing distrust of unions and adulation of entrepreneurs, while also providing some of the most astute progressive opposition to Reaganism: his was a socially conscious but responsible liberalism, which took critics of the welfare state seriously. This viewpoint was eventually adopted as the guiding philosophy of the Democratic Leadership Council (DLC) and later provided both the strategy and priorities of the Bill Clinton presidency. Initially, the Democrats wielded the growing federal deficit as a political cudgel to attack Reaganomics. They then proclaimed their prudent preoccupation with deficits and debt, making common cause with Pete Peterson cultists on the right, insisting that buying guns on credit is fine, but butter must be paid for. Influenced by the five-millennia-old deficit myth, the neoliberals prioritized monetary discipline: from Barack Obama’s grand bargain and EU troika austerity to Nancy Pelosi’s pay-go. As David Graeber explains in Debt: The First Five Thousand Years, “it is only in the current era … that we … see the creation of the first effective planetary administrative system largely in order to protect the interests of creditors.”
The DLC quickly pivoted from Peters’ noblesse oblige vision of neoliberal progressivism to an austerity-focused, meritocracy-fetishizing corporatism, while dog-whistling to Reagan Democrats. The vaunted triangulations assumed to have ensured Clinton’s political success were presented as the only way to protect liberal values—effectively rebranded as anti-discrimination and support for abortion rights—at the necessary cost of embracing conservative economics. By that time, as Gregg Easterbrook has noted, DC was firmly in the grip of the conservative American Enterprise Institute (AEI), Heritage and Cato think tanks, which, together, “routed a generation of assumptions about government” as an “intellectual competitor for the university system,” which rendered it “dependent on not offending corporate patrons.” Just five years after Reagan’s election, Easterbrook was bemoaning the fact that “conservative thinking has not only claimed the presidency; it has spread throughout our political and intellectual life and stands poised to become the dominant strain in American public policy.” He did not yet anticipate the long-term consequences of bipartisan policies that would guide American-style capitalism and the end of history, both of which were enabled by the great moderation.
This new Democratic Party approach was also a reaction to an earlier electoral upheaval. Democratic social liberals of the previous era who had taken up the principled fight for black civil rights had done so at the cost of the votes of their historical white working class base. In the 1968 presidential election, avowed segregationist George Wallace siphoned off a sizable proportion of former Democratic voters, who would form the basis of the Republican Southern strategy. The Democrats were thus left with a far more meager economics-oriented base of support. By McGovern’s 1972 rout, the Democrats all but owned the issue of minority rights. To unite their supporters, both parties began to foreground cultural, gender, ethnic and sexual rights and value signaling. By the mid-1980s, centrists from both parties from Gary Hart to Joe Biden seemed eager to bury Keynes and let Hayek and Friedman prevail. A bipartisan economic consensus thus coalesced to reduce the welfare state, globalize and deregulate capital, and facilitate monopolistic rent seeking. This process was initiated by Jimmy Carter in the late 70s and championed under Reagan and Margaret Thatcher in the 80s, but it took Bill Clinton to both end welfare as we know it and finally repeal Depression-era financial constraints in the 90s. As Nancy Fraser observes, the marriage of neoliberalism and identity politics begat the “oxymoronic” progressive neoliberalism of the New Democrats and the “less coherent” reactionary neoliberalism of the Republican Party (who have now mostly acquiesced to Trump’s reactionary populism). Fraser’s quadripartite distinctions between right and left neoliberals and left and right populists clarify this process.
Rights over Resources
Under neoliberalism, the wealth generated by the productivity gains of the post-World War II era began to shift sharply towards the very top of society, leaving the previously stabilizing middle and working classes of the west behind and bloating the investor sector. This was the result of government policies that encouraged the off-shoring of jobs and profits while shrinking manufacturing employment, and provoked resentment from those who felt they had lost both economic security and dignity. The gradual shift in liberal focus from labor rights to civil and immigrant rights mollified the professional-managerial class, but it also suggested misplaced priorities to many of those who were treading water economically. While US defenders of neoliberal politics label proposals such as universal health care “ponies” we “can’t afford,” reactionary populists like Trump exploit working class grievances by proclaiming solidarity with the working class against a rights-obsessed liberal class for whom rights seem actionable in ways that economic policies have ceased to be. The broadly felt chronic inability to effect actual political or economic change invited ideological and moral battles to fill the void.
For nearly thirty years, progressive neoliberals have evinced a rhetorically compassionate yet fiscally inexpensive identity-centric political correctness that lets us eat diversity, as group-based rights are mainly realized by modifying civil codes and administrative policies through legislation and litigation. Put simply, liberals have been increasing people’s rights rather than their resources. This practice can increase political polarization. By 2016, voter disappointment in Obama’s tepid challenge to the economic status quo had increased and the Democratic Party seemed once again to be offering empty rhetoric instead of fiscal support for middle and working class voters—all while highlighting the needs of the marginalized. Wage earners rightly feel economically vulnerable. Liberals who focus on social and identity rights are often at a rhetorical disadvantage, since rights are viewed as acutely personal and this inspires those who feel that their own rights are being neglected to counterattack.
Deregulated and Consolidated Media
Social media has been widely blamed for exacerbating the resulting polarization. The Internet has plainly devolved into a miasmic forum where good faith arguments are routinely buried in 24/7 avalanches of weaponized memes. Yet legacy media have also been mining cultural conflict all along. Originally, broadcast TV and radio offered curated, broadly centrist and consensual, public interest focused news and opinion with the occasional heated debate over contentious issues—though fringe conservative talk radio was also available. Soon after Reagan’s FCC repealed the nearly 40-year-old fairness doctrine in 1987, polarization could be explicitly marketed via a newly unshackled medium. Rush Limbaugh’s approach to commentary was an early example of narrow-casting. Clinton’s deregulation of all media in the Telecommunications Act of 1996 further warped the tenuous lines between the dominant networks’ news and entertainment divisions and led to greater media consolidation. As universities were forced to adapt to austerity budgets by succumbing to corporate prerogatives, nightly news executives began to sell news as just another commodity. And, as journalist Matt Taibbi argues in Hate, Inc., the end of the Cold War removed the central conflict upon which mainstream news had been focused, making deliberate widening of the fissures of American identity necessary to keep viewers glued to their screens. By the end of the 1990s—well before social media, click-bait and memes—ideological possession was presented as cable TV’s freedom of choice. Fox News and Rush Limbaugh on the one hand and Jon Stewart and MSNBC on the other soon became the two highly lucrative echo chambers that later evolved into the silos of Facebook, Twitter and YouTube. The deregulated infotainment of the early 2000s yielded incentivized market-segment positioning, manufactured outrage over peripheral issues, nuance-free straw-manning and tribal-themed reality TV shows.
Goaded by the media that stood to profit from these conflicts, being politically correct came to characterize left-wing identity, and being anti-liberal came to characterize right-wing identity. Almost all mainstream politicians adopted the same categories.
Cost-Free Identity and the Woke Currency of Cancellation
And so, as liberal and conservative politicians stopped substantively engaging in economic questions, group identity, group rights and value signaling became the primary bases of outreach and ideological loyalty for politicians and media alike. This bunkering escalated in social media with its attention economy, which instigated tribalism—as we saw during the cynical 2016 election campaign. Politicians on both the right and the left now pander to the rage of easily triggered voters, leaving fundamental economic policies mostly unchallenged and even unarticulated.
By 2018, after the corporate titans who’d been monetizing identity for generations were caught with their pants down, they quickly rehabilitated themselves by marketing woke capital. As writer Ross Douthat has observed,
corporate activism on social issues isn’t in tension with corporate self-interest on tax policy and corporate stinginess in paychecks. Rather [it] … increasingly exists to protect the self-interest and the stinginess—to justify the ways of CEOs to cultural power brokers, so that those same power brokers will leave them alone (and forgive their support for Trump’s economic agenda) in realms that matter more to the corporate bottom line.
Heralding both the arrival and true beneficiaries of cancel culture, he saw that “corporate interests themselves stand to lose little from these polarizing trends. Their wokeness buys them cover when liberalism is in power, and any backlash only helps prop up a GOP that has their back when it comes time to write our tax laws.” Substitute corporate interests for progressive neoliberals and it’s easy to see why progressive populists worry that the eventual victims of cancel culture may be the roughly 80% of us who’ve been harmed by neoliberalism’s Pareto-distributed world.
Even as postmodern historicism washes over the US, our economic context—miserly federal support during the worst economic conditions since the 1930s—has been ignored. Predictably, just as we’re told that bread is too costly, the identity-shaming circus arrives, channeling the quarantined energy of powerlessness, economic precarity and social isolation into sideshows of cancellation. Rather than support the millions of households facing sustained financial crisis, our political leaders’ clear priority has been to help the financial and corporate sectors. Now that those morally irresponsible sectors have received their financial backing, Congress remains deadlocked—no one any of the politicians personally know will be impacted by the loss of already barely sufficient supports. In the meantime, their patrons, blinded by quarterly-report myopia, persist in selling a dangerously false dichotomy between an economy that must supposedly be reopened and a virus whose spread is shrinking most markets.
Last spring’s Democratic Party panic at the electoral success of democratic socialists (who channel Roosevelt more than Marx) led them to bring their primary season to an abrupt conclusion so that, as Joe Biden put it, “nothing will fundamentally change” and neoliberalism will remain the status quo. Having defeated the most credible progressive populist threat to neoliberalism in generations, the Democratic Party may once again use the touchstones of identity, race and gender to dismiss calls for serious economic debate as reductionist and insufficiently woke. If neoliberalism’s central role in the market-driven evolution of cancel culture were more widely recognized, its censorious morality could be seen as just another tool to stoke pathological division among the economically disenfranchised and politically addicted. Until we begin to change the framework that divides the few haves from the many who have barely enough, the ersatz politics of identity will remain an effective diversion.
Uprooting Cancel Culture
Encouragingly, there is some common ground between non-ideologues on both right and left regarding the dangers of self-righteous identitarianism, the importance of democratic and pluralist principles and the economic needs of the broader majority. When Mark Fisher voiced his despair at the counterproductive call-out culture of left-wing Twitter in 2013, he was denounced by his colleagues. Now similar views have been expressed by the late Michael Brooks and by several other writers on the left.
Those who deny that cancel culture exists may be motivated by a wish to discredit its right-wing critics, but they fail to consider who benefits from the discord they sow. Meanwhile, centrist liberals who ally themselves with conservatives in order to argue that totalitarian excess is a problem unique to the left may further a century-old reactionary right-wing agenda. And, at the same time, a well-meaning but iatrogenic postmodernism is informing the worldviews of both neoliberal and populist progressives. To save the baby from this toxic bathwater, we should recognize the validity of a variety of viewpoints, as Ken Wilber, Robert Wright and other integral thinkers have been attempting to do.
To solve this problem will require an honest and self-aware politics that advocates for material objectives while acknowledging differences of opinion and negotiating between different values. If those within mainstream institutions can recognize the negative outcomes of the neoliberal project, the political left should be humble enough to join forces with them. Now more than ever, we need empathetic and spirited conversation that prioritizes shared purposes and commitments and is characterized by comity and nuance, in order to lower the rhetorical heat within all our political silos.