Helen Zoe Veit
The first half of the 20th century saw extraordinary changes in the ways Americans produced, procured, cooked, and ate food. Exploding food production easily outstripped population growth in this era as intensive plant and animal breeding, the booming use of synthetic fertilizers and pesticides, and technological advances in farm equipment all resulted in dramatically greater yields on American farms. At the same time, a rapidly growing transportation network of refrigerated ships, railroads, and trucks hugely expanded the reach of different food crops and increased the variety of foods consumers across the country could buy, even as food imports from other countries soared. Meanwhile, new technologies, such as mechanical refrigeration, reliable industrial canning, and, by the end of the era, frozen foods, subtly encouraged Americans to eat less locally and seasonally than ever before. Yet as American food became more abundant and more affordable, diminishing want and suffering, it also contributed to new problems, especially rising body weights and mounting rates of cardiac disease.
American taste preferences themselves changed throughout the era as more people came to expect stronger flavors, grew accustomed to the taste of industrially processed foods, and sampled so-called “foreign” foods, which played an enormous role in defining 20th-century American cuisine. Food marketing exploded, and food companies invested ever greater sums in print and radio advertising and eye-catching packaging. At home, a range of appliances made cooking easier, and modern grocery stores and increasing car ownership made it possible for Americans to food shop less frequently. Home economics provided Americans, especially girls and women, with newly scientific and managerial approaches to cooking and home management, and Americans as a whole increasingly approached food through the lens of science. Virtually all areas related to food saw fundamental shifts in the first half of the 20th century, from agriculture to industrial processing, from nutrition science to weight-loss culture, from marketing to transportation, and from kitchen technology to cuisine. Not everything about food changed in this era, but the rapid pace of change likely heightened the sense of transformation for the many Americans who lived through it.
The relationship between the car and the city remains complex and involves numerous private and public forces, innovations in technology, global economic fluctuations, and shifting cultural attitudes that only rarely consider the efficiency of the automobile as a long-term solution to urban transit. The advantages of privacy, speed, ease of access, and personal enjoyment that led many to first embrace the automobile were soon shared and accentuated by transit planners as the surest means to realize the long-held ideals of urban beautification, efficiency, and accessible suburbanization. The remarkable gains in productivity provided by industrial capitalism brought these dreams within reach and individual car ownership became the norm for most American families by the middle of the 20th century. Ironically, the success in creating such a “car country” produced the conditions that again congested traffic, raised questions about the quality of urban (and now suburban) living, and further distanced the nation from alternative transit options. The “hidden costs” of postwar automotive dependency in the United States became more apparent in the late 1960s, leading to federal legislation compelling manufacturers and transit professionals to address the long-standing inefficiencies of the car. This most recent phase coincides with a broader reappraisal of life in the city and a growing recognition of the material limits to mass automobility.
Gabriella M. Petrick
This is an advance summary of a forthcoming article in the Oxford Research Encyclopedia of American History.
American food in the twentieth and twenty-first centuries is characterized by abundance. Unlike the hardscrabble existence of many earlier Americans, the “Golden Age of Agriculture” brought the bounty produced in fields across the United States to both consumers and producers. While the “Golden Age” technically ended as World War I began, larger quantities of relatively inexpensive food became the norm for most Americans as more fresh foods, rather than staple crops, made their way to urban centers and rising real wages made it easier to purchase these comestibles.
The application of science and technology to food production from the field to the kitchen cabinet, or even more crucially the refrigerator by the mid-1930s, reflects the changing demographics and affluence of American society as much as it does the inventiveness of scientists and entrepreneurs. Perhaps the single most important symbol of overabundance in the United States is the postwar Green Revolution. The vast increase in agricultural production based on improved agronomics provoked both praise and criticism, as exemplified by Time magazine’s critique of Rachel Carson’s Silent Spring in September 1962 or, more recently, the politics of genetically modified foods.
Echoing developments at the turn of the twentieth century, food production, politics, and policy at the turn of the twenty-first century have become a proxy for larger ideological agendas and the fractured nature of class in the United States. Battles over the following issues speak to which Americans have access to affordable, nutritious food: organic versus conventional farming, antibiotic use in meat production, dissemination of food stamps, contraction of farm subsidies, the rapid growth of “dollar stores,” alternative diets (organic, vegetarian, vegan, paleo, etc.), and, perhaps most ubiquitous of all, the “obesity epidemic.” These arguments carry moral and ethical weight, as each side deems some foods and diets virtuous and others corrupting. While Americans have long held a variety of food ideologies that meld health, politics, and morality, exemplified by Sylvester Graham and John Harvey Kellogg in the nineteenth and early twentieth centuries, among others, newer constructions of these ideologies reflect concerns over the environment, rural Americans, climate change, self-determination, and the role of government in individual lives. In other words, food can be used as a lens to understand larger issues in American society while at the same time allowing historians to explore the intimate details of everyday life.
Benjamin C. Waterhouse
Political lobbying has always played a key role in American governance, but the concept of paid influence peddling has been marked by a persistent tension throughout the country’s history. On the one hand, lobbying represents a democratic process by which citizens maintain open access to government. On the other, the outsized clout of certain groups engenders corruption and perpetuates inequality. The practice of lobbying itself has reflected broader social, political, and economic changes, particularly in the scope of state power and the scale of business organization. During the Gilded Age, associational activity flourished and lobbying became increasingly the province of organized trade associations. By the early 20th century, a wide range of political reforms worked to counter the political influence of corporations. Even after the Great Depression and New Deal recast the administrative and regulatory role of the federal government, business associations remained the primary vehicle through which corporations and their designated lobbyists influenced government policy. By the 1970s, corporate lobbyists had become more effective and better organized, and trade associations spurred a broad-based political mobilization of business. Business lobbying expanded in the latter decades of the 20th century; while the number of companies with a lobbying presence leveled off in the 1980s and 1990s, the number of lobbyists per company increased steadily and corporate lobbyists grew increasingly professionalized. A series of high-profile political scandals involving lobbyists in 2005 and 2006 sparked another effort at regulation. Yet despite popular disapproval of lobbying and distaste for politicians, efforts to substantially curtail the activities of lobbyists and trade associations did not achieve significant success.
Wendy L. Wall
The New Deal generally refers to a set of domestic policies implemented by the administration of Franklin Delano Roosevelt in response to the crisis of the Great Depression. Propelled by that economic cataclysm, Roosevelt and his New Dealers pushed through legislation that regulated the banking and securities industries, provided relief for the unemployed, aided farmers, electrified rural areas, promoted conservation, built national infrastructure, regulated wages and hours, and bolstered the power of unions. The Tennessee Valley Authority prevented floods and brought electricity and economic progress to seven states in one of the most impoverished parts of the nation. The Works Progress Administration offered jobs to millions of unemployed Americans and launched an unprecedented federal venture into the arena of culture. By providing social insurance to the elderly and unemployed, the Social Security Act laid the foundation for the U.S. welfare state.
The benefits of the New Deal were not equitably distributed. Many New Deal programs—farm subsidies, work relief projects, social insurance, and labor protection programs—discriminated against racial minorities and women, while benefiting white men disproportionately. Nevertheless, women achieved symbolic breakthroughs, and African Americans benefited more from Roosevelt’s policies than they had under any administration since Abraham Lincoln’s. The New Deal did not end the Depression—only World War II did that—but it did spur economic recovery. It also helped to make American capitalism less volatile by extending federal regulation into new areas of the economy.
Although the New Deal most often refers to policies and programs put in place between 1933 and 1938, some scholars have used the term more expansively to encompass later domestic legislation or U.S. actions abroad that seemed animated by the same values and impulses—above all, a desire to make individuals more secure and a belief in institutional solutions to long-standing problems. In order to pass his legislative agenda, Roosevelt drew many Catholic and Jewish immigrants, industrial workers, and African Americans into the Democratic Party. Together with white Southerners, these groups formed what became known as the “New Deal coalition.” This unlikely political alliance endured long after Roosevelt’s death, supporting the Democratic Party and a “liberal” agenda for nearly half a century. When the coalition finally cracked in 1980, historians looked back on this extended epoch as reflecting a “New Deal order.”
Between passage of the National Banking Acts near the end of the US Civil War and the outbreak of the Great War and the implementation of the Federal Reserve System in 1914, a large, vibrant financial system based on the gold standard and composed of markets and intermediaries supported the rapid growth and development of the American economy. Markets included over-the-counter markets and formal exchanges for financial securities, including bills of exchange (foreign currencies), cash (short-term debt), debt (corporate and government bonds), and equities (ownership shares in corporations), initial issuance of which increasingly fell to investment banks. Intermediaries included various types of insurers (marine, fire, and life, plus myriad specialists like accident and wind insurers) and true depository institutions, which included trust companies, mutual and stock savings banks, and state- and federally-chartered commercial banks. Nominal depository institutions also operated, and included building and loan associations and, eventually, credit unions and Morris Plan and other industrial banks. Non-depository lenders included finance and mortgage companies, provident loan societies, pawn brokers, and sundry other small loan brokers. Each type of “bank,” broadly construed, catered to customers differentiated by their credit characteristics, gender, race/ethnicity, country of birth, religion, and/or socioeconomic class, had distinctive balance sheets and loan application and other operating procedures, and reacted differently to the three major postbellum financial crises in 1873, 1893, and 1907.
President Abraham Lincoln signed the law that established the Department of Agriculture in 1862 and in 1889, President Grover Cleveland signed the law that raised the Department to Cabinet status. Thus, by 1900 the US Department of Agriculture had been established for nearly four decades, had been a Cabinet-level department for one, and was recognized as a rising star among agricultural science institutions. Over the first half of the next century, the USDA would grow beyond its scientific research roots to assume a role in supporting rural and farm life more broadly, with a presence that reached across the nation. The Department acquired regulatory responsibilities in plant and animal health and food safety and quality, added research in farm management and agricultural economics, provided extension services to reach farms and rural communities in all regions, and created conservation and forestry programs to protect natural resources and prevent soil erosion and flooding across the geographical diversity of rural America. The Department gained additional responsibility for delivering credit, price supports, supply management, and rural rehabilitation programs during the severe economic depression that disrupted the agricultural economy and rural life from 1920 to 1940, while building efficient systems for encouraging production and facilitating distribution of food during the crises of World War I and World War II that bounded those decades. In the process, the Department became a pioneer in developing the regulatory state as well as in piloting programs and bureaucratic systems that empowered cooperative leadership at the federal, state, and local levels and democratic participation in implementing programs in local communities.
Paul V. Murphy
Americans grappled with the implications of industrialization, technological progress, urbanization, and mass immigration with startling vigor and creativity in the 1920s even as large numbers kept their eyes as much on the past as on the future. American industrial engineers and managers were global leaders in mass production, and millions of citizens consumed factory-made products, including electric refrigerators and vacuum cleaners, technological marvels like radios and phonographs, and that most revolutionary of mass-produced durables, the automobile. They flocked to commercial amusements (movies, sporting events, amusement parks) and absorbed mass culture in their homes, through the radio and commercial recordings. In the major cities, skyscrapers drew Americans upward while thousands of new miles of roads scattered them across the country. Even while embracing the dynamism of modernity, Americans repudiated many of the progressive impulses of the preceding era. The transition from war to peace in 1919 and 1920 was tumultuous, marked by class conflict, a massive strike wave, economic crisis, and political repression. Exhausted by reform, war, and social experimentation, millions of Americans recoiled from central planning and federal power and sought determinedly to bypass traditional politics in the 1920s. This did not mean a retreat from active and engaged citizenship; Americans fought bitterly over racial equality, immigration, religion, morals, Prohibition, economic justice, and politics. In a greatly divided nation, citizens experimented with new forms of nationalism, cultural identity, and social order that could be alternatively exclusive and pluralistic. Whether repressive or tolerant, such efforts held the promise of unity amid diversity; even those in the throes of reaction sought new ways of integration. The result was a nation at odds with itself, embracing modernity, sometimes heedlessly, while seeking desperately to retain a grip on the past.
Between 1880 and 1929, industrialization and urbanization expanded in the United States faster than ever before. Industrialization, meaning manufacturing in factory settings using machines plus a labor force with unique, divided tasks to increase production, stimulated urbanization, meaning the growth of cities in both population and physical size. During this period, urbanization spread out into the countryside and up into the sky, thanks to new methods of building taller buildings. Concentrating people in small areas accelerated economic activity, thereby producing more industrial growth. Industrialization and urbanization thus reinforced one another, each speeding growth beyond what either would have produced alone.
Industrialization and urbanization affected Americans everywhere, but especially in the Northeast and Midwest. Technological developments in construction, transportation, and illumination, all connected to industrialization, changed cities forever, most immediately those north of Washington, DC and east of Kansas City. Cities themselves fostered new kinds of industrial activity on large and small scales. Cities were also the places where businessmen raised the capital needed to industrialize the rest of the United States. Later changes in production and transportation made urbanization less acute by making it possible for people to buy cars and live farther from downtown in new suburban areas after World War II ended.
Laura Phillips Sawyer
The key pieces of antitrust legislation in the United States—the Sherman Antitrust Act of 1890 and the Clayton Act of 1914—contain broad language that has afforded the courts wide latitude in interpreting and enforcing the law. This article chronicles the judiciary’s shifting interpretations of antitrust law and policy over the past 125 years. It argues that jurists, law enforcement agencies, and private litigants have revised their approaches to antitrust to accommodate economic shocks, technological developments, and predominant economic wisdom. Over time an economic logic that prioritizes lowest consumer prices as a signal of allocative efficiency—known as the consumer welfare standard—has replaced the older political objectives of antitrust, such as protecting independent proprietors or small businesses, or reducing wealth transfers from consumers to producers. However, a new group of progressive activists has again called for revamping antitrust so as to revive enforcement against dominant firms, especially in digital markets, and to refocus attention on the political effects of antitrust law and policy. This shift suggests that antitrust may remain a contested field for scholarly and popular debate.