61-66 of 66 Results for: Economic History

Article

The New Deal  

Wendy L. Wall

The New Deal generally refers to a set of domestic policies implemented by the administration of Franklin Delano Roosevelt in response to the crisis of the Great Depression. Propelled by that economic cataclysm, Roosevelt and his New Dealers pushed through legislation that regulated the banking and securities industries, provided relief for the unemployed, aided farmers, electrified rural areas, promoted conservation, built national infrastructure, regulated wages and hours, and bolstered the power of unions. The Tennessee Valley Authority prevented floods and brought electricity and economic progress to seven states in one of the most impoverished parts of the nation. The Works Progress Administration offered jobs to millions of unemployed Americans and launched an unprecedented federal venture into the arena of culture. By providing social insurance to the elderly and unemployed, the Social Security Act laid the foundation for the U.S. welfare state. The benefits of the New Deal were not equitably distributed. Many New Deal programs—farm subsidies, work relief projects, social insurance, and labor protection programs—discriminated against racial minorities and women while disproportionately benefiting white men. Nevertheless, women achieved symbolic breakthroughs, and African Americans benefited more from Roosevelt’s policies than they had from any administration since Abraham Lincoln’s. The New Deal did not end the Depression—only World War II did that—but it did spur economic recovery. It also helped to make American capitalism less volatile by extending federal regulation into new areas of the economy. Although the New Deal most often refers to policies and programs put in place between 1933 and 1938, some scholars have used the term more expansively to encompass later domestic legislation or U.S. actions abroad that seemed animated by the same values and impulses—above all, a desire to make individuals more secure and a belief in institutional solutions to long-standing problems. In order to pass his legislative agenda, Roosevelt drew many Catholic and Jewish immigrants, industrial workers, and African Americans into the Democratic Party. Together with white Southerners, these groups formed what became known as the “New Deal coalition.” This unlikely political alliance endured long after Roosevelt’s death, supporting the Democratic Party and a “liberal” agenda for nearly half a century. When the coalition finally cracked in 1980, historians looked back on this extended epoch as reflecting a “New Deal order.”

Article

Foreign Trade Policy from the Revolution to World War I  

Marc-William Palen

Economic nationalism tended to dominate U.S. foreign trade policy throughout the long 19th century, from the end of the American Revolution to the beginning of World War I, owing to a pervasive American sense of economic and geopolitical insecurity and American fear of hostile powers, especially the British but also the French and Spanish and even the Barbary States. Following the U.S. Civil War, leading U.S. protectionist politicians sought to curtail European trade policies and to create a U.S.-dominated customs union in the Western Hemisphere. American proponents of trade liberalization increasingly found themselves outnumbered in the halls of Congress, as the “American System” of economic nationalism grew in popularity alongside the perceived need for foreign markets. Protectionist advocates in the United States viewed the American System as a panacea that promised not only to provide the federal government with revenue but also to artificially insulate American infant industries from undue foreign-market competition through high protective tariffs and subsidies, and to retaliate against real and perceived threats to U.S. trade. Throughout this period, the United States underwent a great struggle over foreign trade policy. By the late 19th century, the era’s boom-and-bust global economic system led to a growing perception that the United States needed more access to foreign markets as an outlet for the country’s surplus goods and capital. But whether the United States would obtain foreign market access through free trade or through protectionism became the subject of a great debate over the proper course of U.S. foreign trade policy. By the time the United States acquired a colonial empire from the Spanish in 1898, this debate over U.S. foreign trade policy had effectively merged into debates over the course of U.S. imperial expansion. The country’s more expansionist-minded economic nationalists came out on top. The overwhelming 1896 victory of William McKinley—the Republican Party’s “Napoleon of Protection”—marked the beginning of a substantial expansion of U.S. foreign trade through a mixture of protectionism and imperialism in the years leading up to World War I.

Article

Fur Trades  

Carolyn Podruchny and Stacy Nation-Knapper

From the 15th century to the present, the trade in animal fur has been an economic venture with far-reaching consequences for both North Americans and Europeans (a category that here includes North Americans of European descent). One of the earliest forms of exchange between Europeans and North Americans, the trade in fur was about the garment business, global and local politics, social and cultural interaction, hunting, ecology, colonialism, gendered labor, kinship networks, and religion. European fashion, specifically the desire for hats that marked male status, was a primary driver of the global fur-trade economy until the late 19th century, while the European desire for marten, fox, and other luxury furs to make and trim clothing constituted a secondary part of the trade. Other animal hides, including deer and bison, provided sturdy leather from which belts for the machines of the early Industrial Era were cut. European cloth, especially cotton and wool, became central to the trade for Indigenous peoples, who sought materials that were lighter and dried faster than skin clothing. The many perspectives on the fur trade included those of the European men and Indigenous men and women actually conducting the trade; the Indigenous male and female trappers; the European trappers; the European men and women producing trade goods; the Indigenous “middlemen” (men and women) who conducted their own fur trade to benefit from European trade companies; the laborers hauling the furs and trade goods; all those who built, managed, and sustained trading posts located along waterways and trails across North America; and the Europeans who manufactured and purchased the products made of fur and the trade goods desired by Indigenous peoples. As early as the 17th century, European empires used fur-trade monopolies to establish colonies in North America; later, fur-trading companies brought imperial trading systems inland, while Indigenous peoples drew Europeans into their own patterns of trade and power. By the 19th century, the fur trade covered most of the continent, and its networks of business, alliances, and families, along with the founding of new communities, gave rise to new peoples, including the Métis, descended from the mixing of Europeans and Indigenous peoples. Trading territories, monopolies, and alliances with Indigenous peoples shaped how European concepts of statehood played out in the making of European-descended nation-states and in the development of treaties with Indigenous peoples. The fur trade flourished in northern climes until well into the 20th century, after which economic development, resource exploitation, changes in fashion, and politics in North America and Europe limited its scope and scale. Many Indigenous people continue to hunt and trap animals today and have fought in the courts for Indigenous rights to resources, land, and sovereignty.

Article

Industrialization and Urbanization in the United States, 1880–1929  

Jonathan Rees

Between 1880 and 1929, industrialization and urbanization expanded in the United States faster than ever before. Industrialization, meaning manufacturing in factory settings with machines and a labor force divided into specialized tasks to increase production, stimulated urbanization, meaning the growth of cities in both population and physical size. During this period, urbanization spread out into the countryside and up into the sky, thanks to new methods of constructing taller buildings. Concentrating people in small areas accelerated economic activity, thereby producing more industrial growth. Industrialization and urbanization thus reinforced one another, augmenting the speed with which such growth would have otherwise occurred. Industrialization and urbanization affected Americans everywhere, but especially in the Northeast and Midwest. Technological developments in construction, transportation, and illumination, all connected to industrialization, changed cities forever, most immediately those north of Washington, DC, and east of Kansas City. Cities themselves fostered new kinds of industrial activity on large and small scales. Cities were also the places where businessmen raised the capital needed to industrialize the rest of the United States. Later changes in production and transportation made urbanization less acute by making it possible for people to buy cars and live farther from downtown, in new suburban areas, after World War II ended.

Article

Smuggling in Early America  

Christian J. Koot

Smuggling was a regular feature of the economy of colonial British America in the 17th and 18th centuries. Though the very nature of illicit commerce means that the extent of this trade is incalculable, a wide variety of British and colonial sources testify to the ability of merchants to trade where they pleased and to avoid paying duties in the process. Together, admiralty proceedings, merchant correspondence and account books, customs reports, and petitions demonstrate that illicit trade enriched individuals and allowed settlers to shape their colonies’ development. Smuggling arose in resistance to British economic and political control. British authorities attempted to harness the trade of their Atlantic colonies through a series of laws that restricted overseas commerce (often referred to as the Navigation Acts). This legislation created the opportunity for illicit trade by raising the costs of legal trade. Hampered by insufficient resources, thousands of miles of coastline, and complicit local officials, British customs agents could not prevent smuggling. Economic self-interest and the pursuit of profit certainly motivated smugglers, but because it was tied to a larger transatlantic debate about the proper balance between regulation and free trade, smuggling was also a political act. Through smuggling, colonists rejected what they saw as capricious regulations designed to enrich Britain at their expense.

Article

Food and Agriculture in the 20th and 21st Centuries  

Gabriella M. Petrick

This is an advance summary of a forthcoming article in the Oxford Research Encyclopedia of American History. American food in the twentieth and twenty-first centuries is characterized by abundance. Unlike the hardscrabble existence of many earlier Americans, the “Golden Age of Agriculture” brought the bounty produced in fields across the United States to both consumers and producers. While the “Golden Age” technically ended as World War I began, larger quantities of relatively inexpensive food became the norm for most Americans as more fresh foods, rather than staple crops, made their way to urban centers and rising real wages made it easier to purchase these comestibles. The application of science and technology to food production, from the field to the kitchen cabinet or, even more crucially, the refrigerator by the mid-1930s, reflects the changing demographics and affluence of American society as much as it does the inventiveness of scientists and entrepreneurs. Perhaps the single most important symbol of overabundance in the United States is the postwar Green Revolution. The vast increase in agricultural production, based on improved agronomics, provoked both praise and criticism, as exemplified by Time magazine’s critique of Rachel Carson’s Silent Spring in September 1962 or, more recently, by the politics of genetically modified foods. Echoing what occurred at the turn of the twentieth century, food production, politics, and policy at the turn of the twenty-first century have become a proxy for larger ideological agendas and the fractured nature of class in the United States. Battles over the following issues speak to which Americans have access to affordable, nutritious food: organic versus conventional farming, antibiotic use in meat production, dissemination of food stamps, contraction of farm subsidies, the rapid growth of “dollar stores,” alternative diets (organic, vegetarian, vegan, paleo, etc.), and, perhaps most ubiquitous of all, the “obesity epidemic.” These arguments carry moral and ethical weight, as each side deems some foods and diets virtuous and others corrupting. While Americans have long held a variety of food ideologies that meld health, politics, and morality, exemplified by Sylvester Graham and John Harvey Kellogg in the nineteenth and early twentieth centuries, among others, newer constructions of these ideologies reflect concerns over the environment, rural Americans, climate change, self-determination, and the role of government in individual lives. In other words, food can be used as a lens to understand larger issues in American society while at the same time allowing historians to explore the intimate details of everyday life.