The term “neoliberalism” is notoriously difficult to define. It is social as well as political and economic, it is related to capitalism but not identical to it, it is at times paradoxical in nature, and it manifests in seemingly disparate ways: the rise of public figures like Donald Trump, our growing cultural dependency on technology, the three million people in the United States who self-identify as “doomsday preppers” (Kelly 95), the deterioration of public school systems, and the onslaught of wellness apps designed to keep us sane are all instances of neoliberalism in action, even if these examples might not seem related to one another at first glance. The cultural shift towards neoliberalism can be traced back to the end of World War II, but the watershed moment in which it transitioned more concretely from theory to praxis came during the years 1978 to 1980. In the introduction to his book A Brief History of Neoliberalism, David Harvey claims that historians will look upon these years as “a revolutionary turning-point in the world’s social and economic history” (16) because a combination of world events changed the global economy: Deng Xiaoping took the first steps towards liberalizing a communist-ruled economy in China, Margaret Thatcher was elected Prime Minister of Britain in 1979, and Ronald Reagan was elected President of the United States in 1980.
The chain of events leading up to the election of Ronald Reagan began at the end of World War II, when politicians and economists sought to put a new system in place that would prevent the conditions that caused the war from recurring. The new system, known as embedded liberalism, was a market process that surrounded entrepreneurial and corporate activities with a web of social and political constraints, regulating the economy (40-1). Although embedded liberalism delivered “high rates of economic growth in the advanced capitalist countries during the 1950s and 60s” (41), by the end of the 1960s it began to break down as serious signs of a crisis of capital accumulation, such as rising unemployment and accelerating inflation, became apparent (43). Initially, the response to the failure of embedded liberalism was to increase regulation and deepen state control (44), but by the end of the 1970s the tide had turned completely, and the majority of people supported the liberation of corporate and business power.
The shift in popular opinion during the 1970s from a regulated to an unregulated economy marks the beginning of the neoliberal paradigm. There were various reasons for this shift, but two stand out. First, during the 1970s, communist and socialist parties were gaining ground in much of Europe, threatening economic elites and ruling classes everywhere (48-9). Second, the upper classes in the United States lost a great deal of wealth when economic growth collapsed during the 1970s (49-51), and they had to change the economy quickly if they wanted to protect their assets, leading the French economists Gérard Duménil and Dominique Lévy to argue that neoliberalism was, from the very beginning, “a project to achieve the restoration of class power” (52). In A Brief History of Neoliberalism, David Harvey agrees, arguing that the intention of neoliberalism was to “re-establish the conditions for capital accumulation and to restore the power of economic elites” (59).
During the stagflation of the 1970s, corporations and business elites flocked together in search of an alternative path to power, forming alliances that were “based on class rather than particular interests” (133) and giving way to a dominant class constituency that would unite behind the Republican Party in order to advance its own interests. However, the emergent business class knew that supporting the Republican Party would not be enough to win elections; it also needed a way to get the majority of Americans to support its prerogatives. In A Brief History of Neoliberalism, Harvey outlines a major means by which both the business class and the Republican Party shifted the climate of opinion in favor of neoliberalism by the end of the 1970s: common sense.
Common sense, as theorized by Antonio Gramsci, describes a way of thinking that is “constructed out of longstanding practices of cultural socialization often rooted deep in regional or national traditions” (108). It can be profoundly misleading because it hides real problems under cultural prejudices. In the United States, the term “freedom” carries this kind of power. It resonates so deeply within the American psyche that elites can use it to ignite fear and thereby justify certain actions. During the social changes of the 1960s, people were shifting away from traditional American values in favor of a new kind of “freedom” that valued the liberties of those who had been historically oppressed: women’s rights were growing in importance, the Civil Rights movement was underway, and questions about sexuality were being raised. These various concerns about individual liberties converged on a common enemy: powerful corporations in alliance with an interventionist state (114). The failure of the United States government to address social issues and respond adequately to diversity troubled many Americans at the time, creating an opportunity for capitalists to intervene and offer the neoliberal paradigm as an appealing alternative to the oppressive nature of Big Brother. Corporations and businesses were also widely resented by Americans, but by harnessing the power of individual freedom and turning it against the interventionist and regulatory practices of the state (115-6), they were able to present an economic model that claimed to liberate individual freedoms as a reasonable path forward.
By the end of the 1970s, not only was the global economy poised for neoliberalization, but so were the vast majority of Americans, leading to the election of Ronald Reagan to the Presidency of the United States. Reagan’s victory over Jimmy Carter was crucial because it led to the deregulation of the economy, a reduction in corporate taxes, and attacks on trade unions and professional power (68-9), resulting in a dramatically different economy from that of the prior decades. The Reagan administration was able to garner support for its neoliberal agenda because it promoted the new market as a “way to foster competition and innovation,” when it was actually “a vehicle for the consolidation of monopoly power” (75) that would benefit a few while disenfranchising the rest. Consider, for example, that “the Federal minimum wage, which stood on a par with the poverty level in 1980, had fallen to 30 per cent below that level by 1990” (74). In addition, “[c]orporate taxes were reduced dramatically, and the top personal tax rate was reduced from 70 to 28 per cent in what was billed as ‘the largest tax cut in history’” (75). The 1980s was a pivotal decade for the United States because the federal government was not only shifting its attention away from individual liberties to focus on the deregulation and freedom of corporations, but was actually framing this shift as beneficial to individuals, despite clear evidence to the contrary. Higher education was getting more expensive, people needed to work longer hours for lower wages, and federal programs were being privatized, but most importantly, the individual was “free”—free to choose, free to fail, free to succeed—but at what cost? What did deregulation mean for individual people?
In 1979, Michel Foucault gave a series of lectures, known collectively as The Birth of Biopolitics, at the Collège de France. These lectures give a prescient account of a new form of governmentality that was seeping into the bowels of democracy, altering the ways in which political reason and practice were executed. They address, to a large extent, the impact of neoliberalism on the individual. On March 14, 1979, Foucault gave a lecture in which he discusses the emergence of neoliberalism in the West and its relationship to liberalism, especially in the United States. He clarifies that liberalism was introduced to European countries during the Enlightenment as an economic and political choice confined to the historical context of those governments, whereas in the United States, liberalism was woven into the inception of the country’s independence, establishing itself as an essential component of individual freedom and agency, which eventually led to economic consequences, like the rise of homo oeconomicus, the individual who exists to serve the “financialization of everything” (Harvey 92). The foundation of homo oeconomicus was laid when the Founding Fathers took “political ideals of human dignity and individual freedom as fundamental” (25), planting the seed for the “American Dream” and the belief that anyone can accomplish anything if they work hard enough and use their resources both efficiently and systematically to maximize their gains. This led to the idea that the individual is a “machine” that can always be improved upon, with the end goal of creating more capital.
Human capital, an economic theory that had historically been overlooked by economists, was introduced to the field in 1971, when Theodore Schultz published a book called Investment in Human Capital (Foucault 220). Until then, economists had thought about the economy through the lens of three variables: land, capital, and labor. Labor, moreover, was considered only in terms of how many hours laborers worked, rather than the laborers themselves being treated as forms of capital. As Foucault clarifies, a laborer does not sell his labor; he sells his labor power, and the “work performed by the worker is work that creates a value, part of which is extorted from him” (221). The worker is at once the object of the supply and demand of labor and an active economic subject, creating an important connection between investment in human capital and income itself. Foucault claims that “an income is quite simply the product or return on a capital” and that capital is “everything that in one way or another can be a source of future income” (224), which means that the labor one invests in oneself is capital because it can be converted into a future form of income that can then be reinvested in the individual to produce more capital, creating what Foucault calls a “machine, but a machine which cannot be separated from the worker himself” (224).
During the 1970s and 80s, economists, policymakers, and consumers were beginning to understand that the skills, knowledge, and experience—known collectively as symbolic capital—that an individual possesses can be converted into economic capital. During the 1980s, policymakers argued that globalization and deregulation benefited individuals because they could more easily develop their symbolic capital by, for example, pursuing higher education, developing productive social connections, or learning other languages; this shift, however, carried an important consequence. In addition to the “freedom” to develop one’s symbolic capital, individuals were also expected to be responsible for their value, or lack thereof. On one hand, individuals were investing more resources into developing their personal skills and ability to perform well, turning themselves into productive entrepreneurial machines; on the other hand, individuals who did not develop these skills, who fell behind, or who had no access to the necessary resources were often framed as indolent and personally responsible for their failures, creating a fervor among Americans to be methodical workers who carefully stepped through hoop after hoop to achieve power and status. Since the 1980s, the obsession with efficiency and productivity in the United States has grown to the point that people are willing to sacrifice their sanity and happiness in its pursuit, conveniently providing social programs and movements, such as the mindfulness movement, with swaths of desperate and anxious people to whom they can cater their services (at a cost, of course). The connection between personal responsibility and neoliberalism is a convenient tool for the institutions themselves because it suggests that the discomfort and unhappiness experienced by people who participate in the system is not a fault of the system itself but a fault of the individuals.
The emergence of neoliberalism in the United States has been a coercive and oftentimes paradoxical process, with the result that many Americans understand neither the term nor the role they play in its execution. Since the 1980s, neoliberalism has changed shape, but the most dire consequence of its continued existence has been its normalization. That people take out massive loans to pursue higher education, that it is normal to check work email after leaving the office, and that people track almost all of their daily activities, including their sleep, to maximize productivity all demonstrate how insidious and normalized neoliberalism has become in the lives of Americans. To change the process, however, would require people to relinquish the “value” they have accumulated from the system, making it feel as if there is no option but to keep playing the game and hope it gets better.
Foucault, Michel. The Birth of Biopolitics: Lectures at the Collège de France, 1978-79, edited by Michel Senellart, translated by Graham Burchell, Palgrave Macmillan, 2004.
Harvey, David. A Brief History of Neoliberalism. Oxford University Press, 2005. E-book.