The first half of the 20th century saw extraordinary changes in the ways Americans produced, procured, cooked, and ate food. Exploding food production easily outstripped population growth in this era as intensive plant and animal breeding, the booming use of synthetic fertilizers and pesticides, and technological advances in farm equipment all resulted in dramatically greater yields on American farms. At the same time, a rapidly growing transportation network of refrigerated ships, railroads, and trucks hugely expanded the reach of different food crops and increased the variety of foods consumers across the country could buy, even as food imports from other countries soared. Meanwhile, new technologies, such as mechanical refrigeration, reliable industrial canning, and, by the end of the era, frozen foods, subtly encouraged Americans to eat less locally and seasonally than ever before. Yet as American food became more abundant and more affordable, diminishing want and suffering, it also contributed to new problems, especially rising body weights and mounting rates of cardiac disease.
American taste preferences themselves changed throughout the era as more people came to expect stronger flavors, grew accustomed to the taste of industrially processed foods, and sampled so-called “foreign” foods, which played an enormous role in defining 20th-century American cuisine. Food marketing exploded, and food companies invested ever greater sums in print and radio advertising and eye-catching packaging. At home, a range of appliances made cooking easier, and modern grocery stores and increasing car ownership made it possible for Americans to shop for food less frequently. Home economics provided Americans, especially girls and women, with newly scientific and managerial approaches to cooking and home management, and Americans as a whole increasingly approached food through the lens of science. Virtually all areas related to food saw fundamental shifts in the first half of the 20th century, from agriculture to industrial processing, from nutrition science to weight-loss culture, from marketing to transportation, and from kitchen technology to cuisine. Not everything about food changed in this era, but the rapid pace of change magnified the sense of transformation for the many Americans who experienced it.
Article
Joshua L. Rosenbloom
The United States economy underwent major transformations between American independence and the Civil War, driven by rapid population growth, the development of manufacturing, the onset of modern economic growth, increasing urbanization, the rapid spread of settlement into the trans-Appalachian west, and the rise of European immigration. These decades were also characterized by increasing sectional conflict between free and slave states that culminated in 1861 in Southern secession from the Union and a bloody and destructive Civil War. Labor markets were central to each of these developments, directing the reallocation of labor between sectors and regions, channeling a growing population into productive employment, and shaping the growing North–South division within the country. Put differently, labor markets influenced the pace and character of economic development in the antebellum United States. On the one hand, the responsiveness of labor markets to economic shocks helped promote economic growth; on the other, imperfections in labor market responses to these shocks significantly affected the character and development of the national economy.
Article
Utopia—the term derived from Thomas More’s 1516 volume by that name—always suggested a place that was both non-existent, a product of the imagination usually depicted fictionally as far distant in time or space, and better than the real and familiar world. In modern times, it has served as a mode of anti-capitalist critique and also, despite its supposed “unreality,” as a disposition joined to actual social movements for dramatic reform. Utopian alternatives to American capitalism, both in the sense of literary works projecting visions of ideal social relations and in real efforts to establish viable communitarian settlements, have long been a significant part of the nation’s cultural and political history. In the 1840s, American followers of the French “utopian socialist” Charles Fourier established dozens of communities based at least in part on Fourier’s principles, and those principles filtered down to the world’s most influential modern utopian novel, Edward Bellamy’s Looking Backward of 1888. Utopian community-building and the writing of anti-capitalist utopian texts surged and declined in successive waves from the 19th to the 21st century, and while the recent surges have never equaled the impact of Fourierism or Bellamy, the appeal of the utopian imagination has surfaced again since the Great Recession of 2008, which provoked new doubts about the viability or justice of capitalist economic and social relations.
Article
Daniel Scroop
Antimonopoly, meaning opposition to the exclusive or near-exclusive control of an industry or business by one or a very few firms, played a relatively muted role in the history of the post-1945 era, certainly compared to some earlier periods in American history. However, the subject of antimonopoly is important because it sheds light on changing attitudes toward concentrated power, corporations, and the federal government in the United States after World War II.
Paradoxically, as antimonopoly declined as a grass-roots force in American politics, the technical, expert-driven field of antitrust enjoyed a golden age. From the 1940s to the 1960s, antitrust operated on principles that were broadly in line with those that inspired its creation in the late 19th and early 20th centuries, acknowledging the special contribution small-business owners made to US democratic culture. In these years, antimonopoly remained sufficiently potent as a political force to sustain the careers of national-level politicians such as congressmen Wright Patman and Estes Kefauver and to inform the opinions of Supreme Court justices such as Hugo Black and William O. Douglas. Antimonopoly and consumer politics overlapped in this period. From the mid-1960s onward, Ralph Nader repeatedly tapped antimonopoly ideas in his writings and consumer activism, skillfully exploiting popular anxieties about concentrated economic power. At the same time, as part of the United States’ rise to global hegemony, officials in the federal government’s Antitrust Division exported antitrust overseas, building it into the political, economic, and legal architecture of the postwar world.
Beginning in the 1940s, conservative lawyers and economists launched a counterattack against the conception of antitrust elaborated in the progressive era. By making consumer welfare—understood in terms of low prices and market efficiency—the determining factor in antitrust cases, they made a major intellectual and political contribution to the rightward thrust of US politics in the 1970s and 1980s. Robert Bork’s The Antitrust Paradox, published in 1978, popularized and signaled the ascendancy of this new approach.
In the 1980s and 1990s antimonopoly drifted to the margin of political debate. Fear of big government now loomed larger in US politics than the specter of monopoly or of corporate domination. In the late 20th century, Americans, more often than not, directed their antipathy toward concentrated power in its public, rather than its private, forms. This fundamental shift in the political landscape accounts in large part for the overall decline of antimonopoly—a venerable American political tradition—in the period 1945 to 2000.
Article
Daniel Clark
Since the introduction of “Fordism” in the early 1910s, which emphasized technological improvements and maximizing productive efficiency, US autoworkers have struggled with repetitive, exhausting, often dangerous jobs. Yet beginning with Ford’s Five Dollar Day, introduced in 1914, auto jobs have also provided higher pay than most other wage work, attracting hundreds of thousands of people, especially to Detroit, Michigan, through the 1920s, and again from World War II until the mid-1950s. Successful unionization campaigns by the United Auto Workers (UAW) in the 1930s and early 1940s resulted in contracts that guaranteed particular wage increases, reduced the power of foremen, and created a process for resolving workplace conflicts. In the late 1940s and early 1950s UAW president Walter Reuther negotiated generous medical benefits and pensions for autoworkers. The volatility of the auto industry, however, often brought layoffs that undermined economic security. By the 1950s overproduction and automation contributed heavily to instability for autoworkers. The UAW officially supported racial and gender equality, but realities in auto plants and the makeup of union leadership often belied those principles. Beginning in the 1970s US autoworkers faced disruptions caused by high oil prices, foreign competition, and outsourcing to Mexico. Contract concessions at unionized plants began in the late 1970s and continued into the 2000s. By the end of the 20th century, many American autoworkers did not belong to the UAW because they were employed by foreign automakers, who built factories in the United States and successfully opposed unionization. For good reason, autoworkers who survived the industry’s turbulence and were able to retire with guaranteed pensions and medical care look back fondly on all that they gained from working in the industry under UAW contracts. 
Countless others left auto work permanently, and often reluctantly, through periodic mass layoffs and the continuous loss of jobs to automation.
Article
David Schley
Baltimore, Maryland, rose to prominence in the late 18th century as a hub for the Atlantic wheat trade. A slave city in a slave state, Baltimore was home to the largest free Black community in antebellum America. Nineteenth-century Baltimore saw trend-setting experiments in railroading as well as frequent episodes of collective violence that earned the city the nickname “Mobtown”; one such riot, in 1861, led to the first bloodshed of the Civil War. After the war, Baltimore’s African American community waged organized campaigns to realize civil rights. Residential segregation—both de jure and de facto—posed a particular challenge. Initiatives in Baltimore such as a short-lived segregation ordinance and racial covenants in property deeds helped establish associations between race and property values that shaped federal housing policy during the New Deal. The African American population grew during World War II and strained against the limited housing available to them, prompting often-effective protests against segregation. Nonetheless, suburbanization, deindustrialization, and redlining have left the city with challenging legacies to confront.
Article
Sharon Ann Murphy
In creating a new nation, the United States also had to create a financial system from scratch. During the period from the Revolution to the Civil War, the country experimented with numerous options. Although the Constitution deliberately banned the issuance of paper money by either Congress or the states, states indirectly reclaimed this power by incorporating state-chartered banks with the ability to print banknotes. These provided Americans with a medium of exchange to facilitate trade and an expansionary money supply to meet the economic needs of a growing nation. The federal government likewise entered into the world of money and finance with the incorporation of the First and Second Banks of the United States. Not only did critics challenge the constitutionality of these banks, but contemporaries likewise debated whether any banking institutions promoted the economic welfare of the nation or if they instead introduced unnecessary instability into the economy. These debates became particularly heated during moments of crisis. Periods of war, including the Revolutionary War, the War of 1812, and the Civil War, highlighted the necessity of a robust financial system to support the military effort, while periods of economic panic such as the Panic of 1819, the Panics of 1837 and 1839, and the Panic of 1857 drew attention to the weaknesses inherent in this decentralized, largely unregulated system. Although Andrew Jackson succeeded in destroying the Second Bank of the United States during the Bank War, state-chartered commercial banks, savings banks, and investment banks still multiplied rapidly throughout the period. Numerous states introduced regulations intended to control the worst excesses of these banks, but the most comprehensive legislation occurred with the federal government’s Civil War-era Banking Acts, which created the first uniform currency for the nation.
Article
Chloe E. Taft
Bethlehem, Pennsylvania, a city of seventy-five thousand people in the Lehigh Valley, was settled on the traditional homelands of the Lenape in 1741 as a Moravian religious settlement. The Moravian community on the North Side of the Lehigh River was closed to outsiders and was characterized by orderly stone buildings and a communitarian economy. The settlement opened and expanded on the South Side of the river as an industrial epicenter beginning in the mid-19th century and was ultimately home to the headquarters of the Bethlehem Steel Corporation. By the late 1930s, the city’s 1,800-acre steel plant was ramping up to peak production with employment of more than thirty thousand. When Bethlehem Steel began a long, slow decline after 1950 until the plant’s closure in 1998, Bethlehem evolved into an archetype of the postindustrial city, drawing on its long history through heritage tourism and an increasingly diversified economy in healthcare, education, and distribution, among other sectors. The city’s population has roots in multiple waves of migration—the Germanic Moravians in the 18th century, throngs of European immigrants who arrived in the late 19th and early 20th centuries, and a Latino/a population that grew after World War II to represent an increasingly large share of residents. The city’s landscape, culture, and economy are imbued with a multifaceted history that is both deeply local and reflective of the city’s position since its founding as an important node in regional and global networks.
Article
Gavin Benke
“Corporate social responsibility” is a term that first began to circulate widely in the late 1960s and early 1970s. Though it may seem to be a straightforward concept, the phrase can imply a range of activities, from minority hiring initiatives and environmentally sound operations, to funding local nonprofits and cultural institutions. The idea appeared to have developed amid increasing demands made of corporations by a number of different groups, such as the consumer movement. However, American business managers engaged in many of these practices well before that phrase was coined. As far back as the early 19th century, merchants and business owners envisioned a larger societal role. However, broader political, social, and economic developments, from the rise of Gilded Age corporations to the onset of the Cold War, significantly influenced understandings of business social responsibility. Likewise, different managers and corporations have had different motives for embracing social responsibility initiatives. Some embraced social responsibility rhetoric as a public relations tool. Others saw the concept as a way to prevent government regulation. Still others undertook social responsibility efforts because they fit well with their own socially progressive ethos. Though the terms and understandings of a business’s social responsibilities have shifted over time, the basic idea has been a perennial feature of commercial life in the United States.
Article
David Blanke
The relationship between the car and the city remains complex and involves numerous private and public forces, innovations in technology, global economic fluctuations, and shifting cultural attitudes that only rarely consider the efficiency of the automobile as a long-term solution to urban transit. The advantages of privacy, speed, ease of access, and personal enjoyment that led many to first embrace the automobile were soon shared and accentuated by transit planners as the surest means to realize the long-held ideals of urban beautification, efficiency, and accessible suburbanization. The remarkable gains in productivity provided by industrial capitalism brought these dreams within reach and individual car ownership became the norm for most American families by the middle of the 20th century. Ironically, the success in creating such a “car country” produced the conditions that again congested traffic, raised questions about the quality of urban (and now suburban) living, and further distanced the nation from alternative transit options. The “hidden costs” of postwar automotive dependency in the United States became more apparent in the late 1960s, leading to federal legislation compelling manufacturers and transit professionals to address the long-standing inefficiencies of the car. This most recent phase coincides with a broader reappraisal of life in the city and a growing recognition of the material limits to mass automobility.
Article
Emily Remus
The central business district, often referred to as the “downtown,” was the economic nucleus of the American city in the 19th and 20th centuries. It stood at the core of urban commercial life, if not always the geographic center of the metropolis. Here was where the greatest number of offices, banks, stores, and service institutions were concentrated—and where land values and building heights reached their peaks. The central business district was also the most easily accessible point in a city, the place where public transit lines intersected and brought together masses of commuters from outlying as well as nearby neighborhoods. In the downtown, laborers, capitalists, shoppers, and tourists mingled together on bustling streets and sidewalks. Not all occupants enjoyed equal influence in the central business district. Still, as historian Jon C. Teaford explained in his classic study of American cities, the downtown was “the one bit of turf common to all,” the space where “the diverse ethnic, economic, and social strains of urban life were bound together, working, spending, speculating, and investing.”
The central business district was not a static place. Boundaries shifted, expanding and contracting as the city grew and the economy evolved. So too did the primary land uses. Initially a multifunctional space where retail, wholesale, manufacturing, and financial institutions crowded together, the central business district became increasingly segmented along commercial lines in the 19th century. By the early 20th century, rising real estate prices and traffic congestion drove most manufacturing and processing operations to the periphery. Remaining behind in the city center were the bulk of the nation’s offices, stores, and service institutions. As suburban growth accelerated in the mid-20th century, many of these businesses also vacated the downtown, following the flow of middle-class, white families. Competition with the suburbs drained the central business district of much of its commercial vitality in the second half of the 20th century. It also inspired a variety of downtown revitalization schemes that tended to reinforce inequalities of race and class.
Article
Daniel Pope
Nuclear power in the United States has had an uneven history and faces an uncertain future. Promising in the 1950s electricity “too cheap to meter,” nuclear power has failed to come close to that goal, although it has carved out approximately a 20 percent share of American electrical output. Two decades after World War II, General Electric and Westinghouse offered electric utilities completed “turnkey” plants at a fixed cost, hoping these “loss leaders” would create a demand for further projects. During the 1970s the industry boomed, but it also brought forth a large-scale protest movement. Since then, partly because of that movement and because of the drama of the 1979 Three Mile Island accident, nuclear power has plateaued, with only one reactor completed since 1995.
Several factors account for the failed promise of nuclear energy. Civilian power has never fully shaken its military ancestry or its connotations of weaponry and warfare. American reactor designs borrowed from nuclear submarines. Concerns about weapons proliferation stymied industry hopes for breeder reactors that would produce plutonium as a byproduct. Federal regulatory agencies dealing with civilian nuclear energy also have military roles. Those connections have provided some advantages to the industry, but they have also generated fears. Not surprisingly, the “anti-nukes” movement of the 1970s and 1980s was closely bound to movements for peace and disarmament.
The industry’s disappointments must also be understood in a wider energy context. Nuclear grew rapidly in the late 1960s and 1970s as domestic petroleum output shrank and environmental objections to coal came to the fore. At the same time, however, slowing economic growth and an emphasis on energy efficiency reduced demand for new power output. In the 21st century, new reactor designs and the perils of fossil-fuel-caused global warming have once again raised hopes for nuclear, but natural gas and renewables now compete favorably against new nuclear projects.
Economic factors have been the main reason that nuclear has stalled in the last forty years. Highly capital intensive, nuclear projects have all too often taken too long to build and cost far more than initially forecast. The lack of standard plant designs, the need for expensive safety and security measures, and the inherent complexity of nuclear technology have all contributed to nuclear power’s inability to make its case on cost persuasively. Nevertheless, nuclear power may survive and even thrive if the nation commits to curtailing fossil fuel use or if, as the Trump administration proposes, it opts for subsidies to keep reactors operating.
Article
Frederick Rowe Davis
The history of DDT and pesticides in America is overshadowed by four broad myths. The first myth suggests that DDT was the first insecticide deployed widely by American farmers. The second holds that DDT was the most toxic pesticide to wildlife and humans alike. The third assumes that Rachel Carson’s Silent Spring (1962) was an exposé of the problems of DDT rather than a broad indictment of American dependency on chemical insecticides. The fourth and final myth reassures Americans that the ban on DDT late in 1972 resolved the pesticide paradox in America. Over the course of the 20th century, agricultural chemists developed insecticides from plants with insecticidal properties (“botanical” insecticides) and from a range of chemicals including heavy metals such as lead and arsenic, chlorinated hydrocarbons like DDT, and organophosphates like parathion. All of the synthetic insecticides carried profound unintended consequences for landscapes and wildlife alike. More recently, chemists have returned to nature and developed chemical analogs of the botanical insecticides, first with the synthetic pyrethroids and now with the neonicotinoids. Despite their recent introduction, neonicotinoids have become widely used in agriculture, and there are suspicions that these chemicals contribute to declines in bees and grassland birds.
Article
Michael K. Rosenow
In the broader field of thanatology, scholars investigate rituals of dying, attitudes toward death, evolving trajectories of life expectancy, and more. Applying a lens of social class means studying similar themes but focusing on the men, women, and children who worked for wages in the United States. Working people were more likely to die from workplace accidents, occupational diseases, or episodes of work-related violence. In most periods of American history, it was more dangerous to be a wage worker than to be a soldier. The battleground was not just the shop floor but also the terrain of labor relations. American labor history has been filled with violent encounters between workers asserting their views of economic justice and employers defending their private property rights. These clashes frequently turned deadly. Labor unions and working-class communities extended an ethos of mutualism and solidarity from the union halls and picket lines to memorial services and gravesites. They lauded martyrs to movements for human dignity and erected monuments to honor the fallen. Aspects of ethnicity, race, and gender added layers of meaning that intersected with and refracted through individuals’ economic positions. Workers’ encounters with death and the way they made sense of loss and sacrifice in some ways overlapped with Americans from other social classes in terms of religious custom, ritual practice, and material consumption. Their experiences were not entirely unique but diverged in significant ways.
Article
The process of urban deindustrialization has been long and uneven. Even the terms “deindustrial” and “postindustrial” are contested; most cities continue to host manufacturing on some scale. After World War II, however, cities that depended on manufacturing for their lifeblood increasingly diversified their economies in the face of larger global, political, and demographic transformations. Manufacturing centers in New England, the Mid-Atlantic, and the Midwestern United States were soon identified as belonging to “the American Rust Belt.” Steel manufacturers, automakers, and other industrial behemoths that were once mainstays of city life closed their doors as factories and workers followed economic and social incentives to leave urban cores for the suburbs, the South, or foreign countries. Remaining industrial production became increasingly automated, resulting in significant declines in the number of factory jobs. Metropolitan officials faced with declining populations and tax bases responded by adapting their assets—in terms of workforce, location, or culture—to new economies, including warehousing and distribution, finance, health care, tourism, leisure industries like casinos, and privatized enterprises such as prisons. Faced with declining federal funding for renewal, they focused on leveraging private investment for redevelopment. Deindustrializing cities marketed themselves as destinations with convention centers, stadiums, and festival marketplaces, seeking to lure visitors and a “creative class” of new residents. While some postindustrial cities became success stories of reinvention, others struggled. They entertained options to “rightsize” by shrinking their municipal footprints, adapted vacant lots for urban agriculture, or attracted voyeurs to gaze at their industrial ruins.
Whether industrial cities faced a slow transformation or the shock of multiple factory closures within a few years, the impact of these economic shifts and urban planning interventions both amplified old inequalities and created new ones.
Article
Christopher W. Calvo
The conspicuous timing of the publication of Adam Smith’s The Wealth of Nations and America’s Declaration of Independence, separated by only a few months in 1776, has attracted a great deal of historical attention. America’s revolution was in large part motivated by the desire to break free from British mercantilism and embrace the principles, both material and ideological, found in Smith’s work. From 1776 to the present day, the preponderance of capitalism in American economic history and the influence of The Wealth of Nations in American intellectual culture have contributed to the conventional wisdom that America and Smith enjoy a special relationship. After all, no nation has pursued the tenets of Smithian-inspired capitalism (mainly free and competitive markets, a commitment to private property, and the pursuit of self-interest and profit) more consistently than the United States.
The shadow of Smith’s The Wealth of Nations looms large over America. But a closer look at American economic thought and practice demonstrates that Smith’s authority was not as dominant as the popular history assumes. Although most Americans accepted Smith’s work as the foundational text in political economy and extracted from it the cardinal principles of intellectual capitalism, its core values were twisted, turned, and fused together in contorted, sometimes contradictory fashions. American economic thought also reflects the widespread belief that the nation would trace an exceptional course, distinct from the Old World, and therefore necessitating a political economy suited to American traditions and expectations. Hybrid capitalist ideologies, although rooted in Smithian-inspired liberalism, developed within a dynamic domestic discourse that embraced ideological diversity and competing paradigms, exactly the kind expected from a new nation trying to understand its economic past, establish its present, and project its future.
Likewise, American policymakers crafted legislation that brought the national economy both closer to and further from the Smithian ideal. Hybrid intellectual capitalism—a compounded ideological approach that antebellum American economic thinkers deployed to help rationalize the nation’s economic development—imitated the nation’s emergent hybrid material capitalism. Labor, commodity, and capital markets assumed amalgamated forms, combining, for instance, slave and free labor, private and public enterprises, and open and protected markets. Americans constructed different types of capitalism, reflecting a preference for mixtures of practical thought and policy that rarely conformed to strict ideological models. Historians of American economic thought and practice study capitalism as an evolutionary, dynamic institution with manifestations in traditional, expected corners, but historians also find capitalism demonstrated in unorthodox ways and practiced in obscure corners of market society that blended capitalist with non-capitalist experiences. In the 21st century, the benefits of incorporating conventional economic analysis with political, social, and cultural narratives are widely recognized. This has helped broaden scholars’ understanding of what exactly constitutes capitalism. And in doing so, the malleability of American economic thought and practice is put on full display, improving scholars’ appreciation for what remains the most significant material development in world history.
Article
Aaron Slater
Identifying and analyzing a unified system called the “economy of colonial British America” presents a number of challenges. The regions that came to constitute Britain’s North American empire developed according to a variety of factors, including climate and environment, relations with Native peoples, international competition and conflict, internal English/British politics, and the social system and cultural outlook of the various groups that settled each colony. Nevertheless, while there was great diversity in the socioeconomic organization across colonial British America, a few generalizations can be made. First, each region initially focused economic activity on some form of export-oriented production that tied it to the metropole: New England specialized in timber, fish, and shipping services; the Middle Colonies in furs, grains, and foodstuffs; the Chesapeake in tobacco; the South in rice, indigo, and hides; and the West Indies in sugar. Second, the maturation of the export-driven economy in each colony eventually spurred the development of an internal economy directed toward providing the ancillary goods and services necessary to promote the export trade. Third, despite variations within and across colonies, colonial British America underwent more rapid economic expansion over the course of the 17th and 18th centuries than did its European counterparts, to the point that, on the eve of the American Revolution, white settlers in British America enjoyed one of the highest living standards in the world.
A final commonality that all the regions shared was that this robust economic growth spurred an almost insatiable demand for land and labor. With the exception of the West Indies, where the Spanish had largely exterminated the Native inhabitants by the time the English arrived, frontier warfare was ubiquitous across British America, as land-hungry settlers invaded Indian territory and expropriated Native lands. The labor problem, while also ubiquitous, showed much greater regional variation. The New England and Middle colonies largely supplied their labor needs through a combination of family immigration, natural increase, and the importation of bound European workers known as indentured servants. The Chesapeake, Carolina, and West Indian colonies, on the other hand, developed “slave societies,” where captive peoples of African descent were imported in huge numbers and forced to serve as enslaved laborers on colonial plantations. Despite these differences, it should be emphasized that, by the outbreak of the American Revolution, the institution of slavery had, to a greater or lesser extent, insinuated itself into the economy of every British American colony. The expropriation of land from Indians and labor from enslaved Africans thus shaped the economic history of all the colonies of British America.
Article
Judge Glock
Despite almost three decades of strong and stable growth after World War II, the US economy, like the economies of many developed nations, faced new headwinds and challenges after 1970. Although the United States eventually overcame many of them, and its economy continues to be one of the most dynamic in the world, it could not recover its mid-century economic miracle of rapid and broad-based growth.
There are three major ways the US economy changed in this period. First, the US economy endured and eventually conquered the problem of high inflation, even as it instituted new policies that prioritized price stability over the so-called “Keynesian” goal of full employment. Although these new policies led to over two decades of moderate inflation and stable growth, the 2008 financial crisis challenged the post-Keynesian consensus and led to new demands for government intervention in downturns.
Second, the government’s overall influence on the economy increased dramatically. Although the government deregulated several sectors in the 1970s and 1980s, such as transportation and banking, it also created new types of social and environmental regulation that were more pervasive. And although it occasionally cut spending, on the whole government spending increased substantially in this period, until it reached about 35 percent of the economy.
Third, the US economy became more open to the world, and it imported more manufactured goods, even as it became more based on “intangible” products and on services rather than on manufacturing. These shifts created new economic winners and losers. Some institutions that thrived in the older economy, such as unions, which once represented over a third of the workforce, became shadows of their former selves. The new service economy also created more gains for highly educated workers and for investors in quickly growing businesses, while blue-collar workers’ wages stagnated, at least in relative terms.
Most of the trends that affected the US economy in this period were long-standing and continued over decades. Major national and international crises in this period, from the end of the Cold War, to the first Gulf War in 1991, to the September 11 attacks of 2001, seemed to have only a mild or transient impact on the economy. Two events that were of lasting importance were, first, the United States leaving the gold standard in 1971, which led to high inflation in the short term and more stable monetary policy over the long term; and second, the 2008 financial crisis, which seemed to permanently decrease American economic output even while it increased political battles about the involvement of government in the economy.
The US economy at the beginning of the third decade of the 21st century was richer than it had ever been, and remained in many respects the envy of the world. But widening income gaps meant many Americans felt left behind in this new economy, and led some to worry that the stability and predictability of the old economy had been lost.
Article
Christoph Nitschke and Mark Rose
U.S. history is marked by frequent and often devastating financial crises. They have coincided with business cycle downturns, but they have been rooted in the political design of markets. Financial crises have also drawn from changes in the underpinning cultures, knowledge systems, and ideologies of marketplace transactions. The United States’ political and economic development spawned, guided, and modified general factors in crisis causation. Broadly viewed, the reasons for financial crises have been recurrent in their form but historically specific in their configuration: causation has always revolved around relatively sudden reversals of investor perceptions of commercial growth, stock market gains, monetary availability, currency stability, and political predictability. The United States’ 19th-century financial crises, which happened in rapid succession, are best described as disturbances tied to market making, nation building, and empire creation. Ongoing changes in America’s financial system aided rapid national growth through the efficient distribution of credit to a spatially and organizationally changing economy. But complex political processes—whether Western expansion, the development of incorporation laws, or the nation’s foreign relations—also underlay the easy availability of credit. The relationship between systemic instability and politically enacted ideas and ideals of economic growth was then mirrored in the 20th century. Following the “Golden Age” of crash-free capitalism in the two decades after the Second World War, the recurrence of financial crises in American history coincided with the dominance of the market in statecraft. Banking and other crises were a product of political economy. The Global Financial Crisis of 2007–2008 not only once again changed the regulatory environment in an attempt to correct past mistakes, but also considerably broadened the discussion of financial crises as an academic topic.
Article
Gabriella M. Petrick
This is an advance summary of a forthcoming article in the Oxford Research Encyclopedia of American History.
American food in the twentieth and twenty-first centuries is characterized by abundance. Unlike the hardscrabble existence of many earlier Americans, the “Golden Age of Agriculture” brought the bounty produced in fields across the United States to both consumers and producers. While the “Golden Age” technically ended as World War I began, larger quantities of relatively inexpensive food became the norm for most Americans as more fresh foods, rather than staple crops, made their way to urban centers and rising real wages made it easier to purchase these comestibles.
The application of science and technology to food production, from the field to the kitchen cabinet, or even more crucially the refrigerator by the mid-1930s, reflects the changing demographics and affluence of American society as much as it does the inventiveness of scientists and entrepreneurs. Perhaps the single most important symbol of overabundance in the United States is the postwar Green Revolution. The vast increase in agricultural production based on improved agronomics provoked both praise and criticism, as exemplified by Time magazine’s critique of Rachel Carson’s Silent Spring in September 1962 or, more recently, the politics of genetically modified foods.
Reflecting what occurred at the turn of the twentieth century, food production, politics, and policy at the turn of the twenty-first century have become a proxy for larger ideological agendas and the fractured nature of class in the United States. Battles over the following issues speak to which Americans have access to affordable, nutritious food: organic versus conventional farming, antibiotic use in meat production, dissemination of food stamps, contraction of farm subsidies, the rapid growth of “dollar stores,” alternative diets (organic, vegetarian, vegan, paleo, etc.), and, perhaps most ubiquitous of all, the “obesity epidemic.” These arguments carry moral and ethical weight, as each side deems some foods and diets virtuous and others corrupting. While Americans have long held a variety of food ideologies that meld health, politics, and morality, exemplified by Sylvester Graham and John Harvey Kellogg in the nineteenth and early twentieth centuries, among others, newer constructions of these ideologies reflect concerns over the environment, rural Americans, climate change, self-determination, and the role of government in individual lives. In other words, food can be used as a lens to understand larger issues in American society while at the same time allowing historians to explore the intimate details of everyday life.