1–15 of 15 Results for:

  • Cultural History
  • Economic History

Article

American Food, Cooking, and Nutrition, 1900–1945  

Helen Zoe Veit

The first half of the 20th century saw extraordinary changes in the ways Americans produced, procured, cooked, and ate food. Exploding food production easily outstripped population growth in this era as intensive plant and animal breeding, the booming use of synthetic fertilizers and pesticides, and technological advances in farm equipment all resulted in dramatically greater yields on American farms. At the same time, a rapidly growing transportation network of refrigerated ships, railroads, and trucks hugely expanded the reach of different food crops and increased the variety of foods consumers across the country could buy, even as food imports from other countries soared. Meanwhile, new technologies, such as mechanical refrigeration, reliable industrial canning, and, by the end of the era, frozen foods, subtly encouraged Americans to eat less locally and seasonally than ever before. Yet as American food became more abundant and more affordable, diminishing want and suffering, it also contributed to new problems, especially rising body weights and mounting rates of cardiac disease. American taste preferences themselves changed throughout the era as more people came to expect stronger flavors, grew accustomed to the taste of industrially processed foods, and sampled so-called “foreign” foods, which played an enormous role in defining 20th-century American cuisine. Food marketing exploded, and food companies invested ever greater sums in print and radio advertising and eye-catching packaging. At home, a range of appliances made cooking easier, and modern grocery stores and increasing car ownership made it possible for Americans to shop for food less frequently. Home economics provided Americans, especially girls and women, with newly scientific and managerial approaches to cooking and home management, and Americans as a whole increasingly approached food through the lens of science. Virtually all areas related to food saw fundamental shifts in the first half of the 20th century, from agriculture to industrial processing, from nutrition science to weight-loss culture, from marketing to transportation, and from kitchen technology to cuisine. Not everything about food changed in this era, but the rapid pace of change probably heightened the sense of transformation for the many Americans who experienced it.

Article

Anti-capitalist Thought and Utopian Alternatives in America  

Howard Brick

Utopia—the term derived from Thomas More’s 1516 volume by that name—always suggested a place that was both non-existent, a product of the imagination usually depicted fictionally as far distant in time or space, and better than the real and familiar world. In modern times, it has served as a mode of anti-capitalist critique and also, despite its supposed “unreality,” as a disposition joined to actual social movements for dramatic reform. Utopian alternatives to American capitalism, both in the sense of literary works projecting visions of ideal social relations and in real efforts to establish viable communitarian settlements, have long been a significant part of the nation’s cultural and political history. In the 1840s, American followers of the French “utopian socialist” Charles Fourier established dozens of communities based at least in part on Fourier’s principles, and those principles filtered down to the world’s most influential modern utopian novel, Edward Bellamy’s Looking Backward of 1888. Utopian community-building and the writing of anti-capitalist utopian texts surged and declined in successive waves from the 19th to the 21st century, and while the recent surges have never equaled the impact of Fourierism or Bellamy, the appeal of the utopian imagination has again surfaced, since the Great Recession of 2008 provoked new doubts about the viability or justice of capitalist economic and social relations.

Article

Bethlehem, Pennsylvania  

Chloe E. Taft

Bethlehem, Pennsylvania, a city of seventy-five thousand people in the Lehigh Valley, was settled on the traditional homelands of the Lenape in 1741 as a Moravian religious settlement. The Moravian community on the North Side of the Lehigh River was closed to outsiders and was characterized by orderly stone buildings and a communitarian economy. The settlement opened and expanded on the South Side of the river as an industrial epicenter beginning in the mid-19th century and was ultimately home to the headquarters of the Bethlehem Steel Corporation. By the late 1930s, the city’s 1,800-acre steel plant was ramping up to peak production with employment of more than thirty thousand. As Bethlehem Steel began a long, slow decline after 1950, culminating in the plant’s closure in 1998, Bethlehem evolved into an archetype of a postindustrial city drawing on its long history of heritage tourism and an increasingly diversified economy in healthcare, education, and distribution, among other sectors. The city’s population has roots in multiple waves of migration—the Germanic Moravians in the 18th century, throngs of European immigrants who arrived in the late 19th and early 20th centuries, and a Latino/a population that grew after World War II to represent an increasingly large share of residents. The city’s landscape, culture, and economy are imbued with a multifaceted history that is both deeply local and reflective of the city’s position since its founding as an important node in regional and global networks.

Article

Death and Dying in the Working Class  

Michael K. Rosenow

In the broader field of thanatology, scholars investigate rituals of dying, attitudes toward death, evolving trajectories of life expectancy, and more. Applying a lens of social class means studying similar themes but focusing on the men, women, and children who worked for wages in the United States. Working people were more likely to die from workplace accidents, occupational diseases, or episodes of work-related violence. In most periods of American history, it was more dangerous to be a wage worker than it was to be a soldier. The battleground was not just the shop floor but also the terrain of labor relations. American labor history has been filled with violent encounters between workers asserting their views of economic justice and employers defending their private property rights. These clashes frequently turned deadly. Labor unions and working-class communities extended an ethos of mutualism and solidarity from the union halls and picket lines to memorial services and gravesites. They lauded martyrs to movements for human dignity and erected monuments to honor the fallen. Aspects of ethnicity, race, and gender added layers of meaning that intersected with and refracted through individuals’ economic positions. Workers’ encounters with death and the way they made sense of loss and sacrifice in some ways overlapped with those of Americans from other social classes in terms of religious custom, ritual practice, and material consumption. Their experiences were not entirely unique but diverged in significant ways.

Article

Deindustrialization and the Postindustrial City, 1950–Present  

Chloe E. Taft

The process of urban deindustrialization has been long and uneven. Even the terms “deindustrial” and “postindustrial” are contested; most cities continue to host manufacturing on some scale. After World War II, however, cities that depended on manufacturing for their lifeblood increasingly diversified their economies in the face of larger global, political, and demographic transformations. Manufacturing centers in New England, the Mid-Atlantic, and the Midwest were soon identified as belonging to “the American Rust Belt.” Steel manufacturers, automakers, and other industrial behemoths that were once mainstays of city life closed their doors as factories and workers followed economic and social incentives to leave urban cores for the suburbs, the South, or foreign countries. Remaining industrial production became increasingly automated, resulting in significant declines in the number of factory jobs. Metropolitan officials faced with declining populations and tax bases responded by adapting their assets—in terms of workforce, location, or culture—to new economies, including warehousing and distribution, finance, health care, tourism, leisure industries like casinos, and privatized enterprises such as prisons. Faced with declining federal funding for renewal, they focused on leveraging private investment for redevelopment. Deindustrializing cities marketed themselves as destinations with convention centers, stadiums, and festival marketplaces, seeking to lure visitors and a “creative class” of new residents. While some postindustrial cities became success stories of reinvention, others struggled. They entertained options to “rightsize” by shrinking their municipal footprints, adapted vacant lots for urban agriculture, or attracted voyeurs to gaze at their industrial ruins. Whether industrial cities faced a slow transformation or the shock of multiple factory closures within a few years, the impact of these economic shifts and urban planning interventions both amplified old inequalities and created new ones.

Article

The Department Store  

Traci Parker

Department stores were the epicenter of American consumption and modernity from the late 19th century through the 20th. Between 1846 and 1860 store merchants and commercial impresarios remade dry goods stores and small apparel shops into department stores—downtown emporiums that departmentalized their vast inventories and offered copious services and amenities. Their ascendance corresponded with increased urbanization, immigration, industrialization, and the mass production of machine-made wares. Urbanization and industrialization also helped to birth a new White middle class who were eager to spend their money on material comforts and leisure activities. And department stores provided them with a place where they could do so. Stores sold shoppers an astounding array of high-quality, stylish merchandise including clothing, furniture, radios, sporting equipment, musical instruments, luggage, silverware, china, and books. They also provided an array of services and amenities, including public telephones, postal services, shopping assistance, free delivery, telephone-order and mail-order departments, barber shops, hair salons, hospitals and dental offices, radio departments, shoe-shining stands, wedding gift registries and wedding secretary services, tearooms, and restaurants. Stores enthroned consumption as the route to democracy and citizenship, inviting everybody—regardless of race, gender, age, and class—to enter, browse, and purchase material goods. They were major employers of white-collar workers and functioned as a new public space for women as workers and consumers. The 20th century brought rapid and significant changes and challenges. Department stores weathered economic crises; two world wars; new and intense competition from neighborhood, chain, and discount stores; and labor and civil rights protests that threatened to damage their image and displace them as the nation’s top retailers. They experienced cutbacks, consolidated services, and declining sales during the Great Depression, played an essential role in the war effort, and contended with the Emergency Price Control Act, administered by the Office of Price Administration, during the Second World War. In the postwar era, they opened branch locations in suburban neighborhoods where their preferred clientele—the White middle class—now resided, and they shaped the development and proliferation of shopping centers, hastening the decline of downtown shopping as a result. The last three decades of the 20th century witnessed a wave of department store closures, mergers, and acquisitions because of changing consumer behaviors, shifts in the retail landscape, and evolving market dynamics. Department stores would continue to suffer into the 21st century as online retailing exploded.

Article

Financial Crises in American History  

Christoph Nitschke and Mark Rose

U.S. history is marked by frequent and often devastating financial crises. They have coincided with business cycle downturns, but they have been rooted in the political design of markets. Financial crises have also drawn from changes in the underpinning cultures, knowledge systems, and ideologies of marketplace transactions. The United States’ political and economic development spawned, guided, and modified general factors in crisis causation. Broadly viewed, the reasons for financial crises have been recurrent in their form but historically specific in their configuration: causation has always revolved around relatively sudden reversals of investor perceptions of commercial growth, stock market gains, monetary availability, currency stability, and political predictability. The United States’ 19th-century financial crises, which happened in rapid succession, are best described as disturbances tied to market making, nation building, and empire creation. Ongoing changes in America’s financial system aided rapid national growth through the efficient distribution of credit to a spatially and organizationally changing economy. But complex political processes—whether Western expansion, the development of incorporation laws, or the nation’s foreign relations—also underlay the easy availability of credit. The relationship between systemic instability and ideas and ideals of economic growth, politically enacted, was then mirrored in the 20th century. Following the “Golden Age” of crash-free capitalism in the two decades after the Second World War, the recurrence of financial crises in American history coincided with the dominance of the market in statecraft. Banking and other crises were a product of political economy. The Global Financial Crisis of 2007–2008 not only once again changed the regulatory environment in an attempt to correct past mistakes, but also considerably broadened the discussion of financial crises as an academic topic.

Article

Food and Agriculture in the 20th and 21st Centuries  

Gabriella M. Petrick

This is an advance summary of a forthcoming article in the Oxford Research Encyclopedia of American History. American food in the twentieth and twenty-first centuries is characterized by abundance. Unlike the hardscrabble existence of many earlier Americans, the “Golden Age of Agriculture” brought the bounty produced in fields across the United States to both consumers and producers. While the “Golden Age” technically ended as World War I began, larger quantities of relatively inexpensive food became the norm for most Americans as more fresh foods, rather than staple crops, made their way to urban centers and rising real wages made it easier to purchase these comestibles. The application of science and technology to food production from the field to the kitchen cabinet, or, even more crucially, the refrigerator by the mid-1930s, reflects the changing demographics and affluence of American society as much as it does the inventiveness of scientists and entrepreneurs. Perhaps the single most important symbol of overabundance in the United States is the postwar Green Revolution. The vast increase in agricultural production based on improved agronomics provoked both praise and criticism, as exemplified by Time magazine’s critique of Rachel Carson’s Silent Spring in September 1962 or, more recently, the politics of genetically modified foods. Echoing what occurred at the turn of the twentieth century, food production, politics, and policy at the turn of the twenty-first century have become a proxy for larger ideological agendas and the fractured nature of class in the United States. Battles over the following issues speak to which Americans have access to affordable, nutritious food: organic versus conventional farming, antibiotic use in meat production, dissemination of food stamps, contraction of farm subsidies, the rapid growth of “dollar stores,” alternative diets (organic, vegetarian, vegan, paleo, etc.), and, perhaps most ubiquitous of all, the “obesity epidemic.” These arguments carry moral and ethical values, as each side deems some foods and diets virtuous and others corrupting. While Americans have long held a variety of food ideologies that meld health, politics, and morality, exemplified by Sylvester Graham and John Harvey Kellogg in the nineteenth and early twentieth centuries, among others, newer constructions of these ideologies reflect concerns over the environment, rural Americans, climate change, self-determination, and the role of government in individual lives. In other words, food can be used as a lens to understand larger issues in American society while at the same time allowing historians to explore the intimate details of everyday life.

Article

The History of Route 66  

Stephen Mandrgoc and David Dunaway

During its existence from 1926 to its formal decommissioning in 1985, US Highway 66, or Route 66, came to occupy a special place in the American imagination. For a half-century and more, it symbolized American individualism, travel, and the freedom of the open road amid the transformative rise of America’s automobile culture. Route 66 was an essential connection between the Midwest and the West for American commercial, military, and civilian transportation. It chained together small towns and cities across the nation as America’s “Main Street.” Following the path of older trails and railroads, Route 66 hosted travelers in many different eras: the adventurous motorist in his Ford Model A in the 1920s, the Arkies and Okies desperate for a new start in California in the 1930s, trucks carrying wartime soldiers and supplies in the 1940s, and postwar tourists and travelers from the 1950s onward. By its nature, it brought together diverse cultures of different regions, introducing Americans to the “others” that were their regional neighbors, and exposing travelers to new arts, music, foods, and traditions. It became firmly embedded in pop culture through songs, books, television, and advertisements for its attractions as America’s most famous road. Travel on Highway 66 steadily declined with the development of controlled-access interstate highways in the 1960s and 1970s. The towns and cities it connected and the many businesses and attractions dependent on its traffic and tourism protested the removal of the highway designation by the US Department of Transportation in 1985, but their efforts failed. Nonetheless, revivalists who treasured the old road worked to preserve the road sections and attractions that remained, founded a wide variety of organizations, and donated to museums and libraries to preserve Route 66 ephemera. In the early 21st century, Route 66 is an international icon of America, traveled by fans from all over the world.

Article

Milwaukee  

Amanda I. Seligman

Milwaukee means “the good land” in Anishinaabemowin, the language group of the Indigenous people who have lived in the region since the 17th century. Milwaukee is nestled between a subcontinental divide and the western shoreline of Lake Michigan. Some 10,000 years ago, the retreating Wisconsin glacier shaped the region’s topography: relatively flat land near Lake Michigan and rolling hills in the Kettle Moraine area north and west of the city’s site. The Milwaukee, Kinnickinnic, and Menomonee rivers converge in the city. The water made the land fertile and defined its promise as a transportation hub. Milwaukee grew from this rich potential into the largest and most diverse city in Wisconsin. During the 19th century, it transformed from a collection of Indigenous villages with a Metis trading post into an industrial powerhouse specializing in heavy manufacturing and brewing. European immigrants (especially from Germany and Poland) and migrants from the eastern United States staffed Milwaukee’s businesses and settled the region with farming hamlets and suburban municipalities. By the early 20th century, Milwaukee consistently ranked among the top twenty US cities by population. But because the city’s area was so compact, it was also one of the most densely populated. For half of the 20th century, Socialists governed Milwaukee. Unusually among Midwestern cities, Milwaukee’s Socialists waged a campaign to annex surrounding areas, leading to a wave of defensive suburban incorporations after World War II. In the second half of the 20th century, the third wave of the Great Migration brought large numbers of African Americans to Milwaukee’s North Side, and Mexican Americans settled permanently on the South Side. At the same time, the city and its industrial suburbs began to shed manufacturing jobs, and the white population declined. Although the suburbs maintained separate governance, Milwaukee and its surrounding counties (Ozaukee, Washington, and Waukesha) grew into an interconnected metropolitan whole. In the 21st century, the city’s population stabilized at a bit under 600,000 residents, and local government spearheaded downtown revitalization efforts. Approximately one million people lived in the surrounding counties.

Article

The New Deal  

Wendy L. Wall

The New Deal generally refers to a set of domestic policies implemented by the administration of Franklin Delano Roosevelt in response to the crisis of the Great Depression. Propelled by that economic cataclysm, Roosevelt and his New Dealers pushed through legislation that regulated the banking and securities industries, provided relief for the unemployed, aided farmers, electrified rural areas, promoted conservation, built national infrastructure, regulated wages and hours, and bolstered the power of unions. The Tennessee Valley Authority prevented floods and brought electricity and economic progress to seven states in one of the most impoverished parts of the nation. The Works Progress Administration offered jobs to millions of unemployed Americans and launched an unprecedented federal venture into the arena of culture. By providing social insurance to the elderly and unemployed, the Social Security Act laid the foundation for the U.S. welfare state. The benefits of the New Deal were not equitably distributed. Many New Deal programs—farm subsidies, work relief projects, social insurance, and labor protection programs—discriminated against racial minorities and women, while disproportionately benefiting white men. Nevertheless, women achieved symbolic breakthroughs, and African Americans benefited more from Roosevelt’s policies than they had from any administration since Abraham Lincoln’s. The New Deal did not end the Depression—only World War II did that—but it did spur economic recovery. It also helped to make American capitalism less volatile by extending federal regulation into new areas of the economy. Although the New Deal most often refers to policies and programs put in place between 1933 and 1938, some scholars have used the term more expansively to encompass later domestic legislation or U.S. actions abroad that seemed animated by the same values and impulses—above all, a desire to make individuals more secure and a belief in institutional solutions to long-standing problems. In order to pass his legislative agenda, Roosevelt drew many Catholic and Jewish immigrants, industrial workers, and African Americans into the Democratic Party. Together with white Southerners, these groups formed what became known as the “New Deal coalition.” This unlikely political alliance endured long after Roosevelt’s death, supporting the Democratic Party and a “liberal” agenda for nearly half a century. When the coalition finally cracked in 1980, historians looked back on this extended epoch as reflecting a “New Deal order.”

Article

Professional Team Sports in the United States  

Steven A. Riess

Professional sports teams are athletic organizations comprising talented, expert players hired by club owners whose revenues originally derived from admission fees charged to spectators attending games in enclosed ballparks or indoor arenas. Teams are usually members of a league that schedules a championship season, although independent teams also can arrange their own contests. The first professional baseball teams emerged in the East and Midwest in the 1860s, most notably the all-salaried undefeated Cincinnati Red Stockings of 1869. The first league was the haphazardly organized National Association of Professional Base Ball Players (1871), supplanted five years later by the more profit-oriented National League (NL), which set up strict rules for franchise locations, financing, and management–employee relations (including a reserve clause in 1879, which bound players to their original employer), and barred African Americans after 1884. Once the NL prospered, rival major leagues also sprang up, notably the American Association in 1882 and the American League in 1901. Major League Baseball (MLB) became a model for the professionalization of football, basketball, and hockey, which all had short-lived professional leagues around the turn of the century. The National Football League and the National Hockey League of the 1920s were underfinanced regional operations, and their teams often went out of business, while the National Basketball Association was not even organized until 1949. Professional team sports gained considerable popularity after World War II. The leagues dealt with such problems as franchise relocations and nationwide expansion, conflicts with interlopers, limiting player salaries, and racial integration. The NFL became the most successful operation by securing rich national television contracts, supplanting baseball as the national pastime in the 1970s. All these leagues became lucrative investments. With the rise of “free agency,” professional team athletes became extremely well paid, currently averaging more than $2 million a year.

Article

Skyscrapers and Tall Buildings  

Elihu Rubin

The tall building—the most popular and conspicuous emblem of the modern American city—stands as an index of economic activity, civic aspirations, and urban development. Enmeshed in the history of American business practices and the maturation of corporate capitalism, the skyscraper is also a cultural icon that performs genuine symbolic functions. Whether viewed individually or arrayed in a “skyline,” tall buildings invite a focus on their spectacular or superlative aspects. Their patrons have searched for the architectural symbols that would project a positive public image, yet the height and massing of skyscrapers were determined as much by prosaic financial calculations as by symbolic pretense. Historically, the production of tall buildings was linked to the broader flux of economic cycles, access to capital, land values, and regulatory frameworks that curbed the self-interests of individual builders in favor of public goods such as light and air. The tall building looms large for urban geographers seeking to chart the shifting terrain of the business district and for social historians of the city who examine the skyscraper’s gendered spaces and labor relations. If tall buildings provide one index of the urban and regional economy, they are also economic activities in and of themselves and thus linked to the growth of professions required to plan, finance, design, construct, market, and manage these mammoth collective objects—and all have vied for control over the ultimate result. Practitioners have debated the tall building’s external expression as the design challenge of the façade became more acute with the advent of the curtain wall attached to a steel frame, eventually dematerializing entirely into sheets of reflective glass. The tall building also reflects prevailing paradigms in urban design, from the retail arcades of 19th-century skyscrapers to the blank plazas of postwar corporate modernism.

Article

The Counterculture of the 1960s and 1970s  

Blake Slonecker

In the decade after 1965, radicals responded to the alienating features of America’s technocratic society by developing alternative cultures that emphasized authenticity, individualism, and community. The counterculture emerged from a handful of 1950s bohemian enclaves, most notably the Beat subcultures in the Bay Area and Greenwich Village. But new influences shaped an eclectic and decentralized counterculture after 1965, first in San Francisco’s Haight-Ashbury district, then in urban areas and college towns, and, by the 1970s, on communes and in myriad counter-institutions. The psychedelic drug cultures around Timothy Leary and Ken Kesey gave rise to a mystical bent in some branches of the counterculture and influenced counterculture style in countless ways: acid rock redefined popular music; tie dye, long hair, repurposed clothes, and hip argot established a new style; and sexual mores loosened. Yet the counterculture’s reactionary elements were strong. In many counterculture communities, gender roles mirrored those of mainstream society, and aggressive male sexuality inhibited feminist spins on the sexual revolution. Entrepreneurs and corporate America refashioned the counterculture aesthetic into a marketable commodity, ignoring the counterculture’s incisive critique of capitalism. Yet the counterculture became the basis of authentic “right livelihoods” for others. Meanwhile, the politics of the counterculture defy ready categorization. The popular imagination often conflates hippies with radical peace activists. But New Leftists frequently excoriated the counterculture for rejecting political engagement in favor of hedonistic escapism or libertarian individualism. Both views miss the most important political aspects of the counterculture, which centered on the embodiment of a decentralized anarchist bent, expressed in the formation of counter-institutions like underground newspapers, urban and rural communes, head shops, and food co-ops. As the counterculture faded after 1975, its legacies became apparent in the redefinition of the American family, the advent of the personal computer, an increasing ecological and culinary consciousness, and the marijuana legalization movement.

Article

The United States in the 1920s  

Paul V. Murphy

Americans grappled with the implications of industrialization, technological progress, urbanization, and mass immigration with startling vigor and creativity in the 1920s even as large numbers kept their eyes as much on the past as on the future. American industrial engineers and managers were global leaders in mass production, and millions of citizens consumed factory-made products, including electric refrigerators and vacuum cleaners, technological marvels like radios and phonographs, and that most revolutionary of mass-produced durables, the automobile. They flocked to commercial amusements (movies, sporting events, amusement parks) and absorbed mass culture in their homes, through the radio and commercial recordings. In the major cities, skyscrapers drew Americans upward while thousands of new miles of roads scattered them across the country. Even while embracing the dynamism of modernity, Americans repudiated many of the progressive impulses of the preceding era. The transition from war to peace in 1919 and 1920 was tumultuous, marked by class conflict, a massive strike wave, economic crisis, and political repression. Exhausted by reform, war, and social experimentation, millions of Americans recoiled from central planning and federal power and sought determinedly to bypass traditional politics in the 1920s. This did not mean a retreat from active and engaged citizenship; Americans fought bitterly over racial equality, immigration, religion, morals, Prohibition, economic justice, and politics. In a deeply divided nation, citizens experimented with new forms of nationalism, cultural identity, and social order that could be alternately exclusive and pluralistic. Whether repressive or tolerant, such efforts held the promise of unity amid diversity; even those in the throes of reaction sought new ways of integration. The result was a nation at odds with itself, embracing modernity, sometimes heedlessly, while seeking desperately to retain a grip on the past.