Employers began organizing with one another to reduce the power of organized labor in the late 19th and early 20th centuries. Irritated by strikes, boycotts, and unions’ desire to achieve exclusive bargaining rights, employers demanded the right to establish open shops, workplaces that promoted individualism over collectivism. Rather than recognize closed or union shops, employers demanded the right to hire and fire whomever they wanted, irrespective of union status. They established an open-shop movement, which was led by local, national, and trade-based employers’ associations. Some formed more inclusive “citizens’ associations,” which included clergymen, lawyers, judges, academics, and employers. Throughout the 20th century’s first three decades, this movement succeeded in busting unions, breaking strikes, and blacklisting labor activists. It united large numbers of employers and was mostly successful. The movement faced its biggest challenges in the 1930s, when a liberal political climate legitimized unions and collective bargaining. But employers never stopped organizing and fighting, and they continued to undermine the labor movement in the following decades by invoking the phrase “right-to-work,” insisting that individual laborers must enjoy freedom from so-called union bosses and compulsory unionism. Numerous states, responding to pressure from organized employers, began passing “right-to-work” laws, which made union organizing more difficult because workers were not obligated to join unions or pay their “fair share” of dues to them. The multi-decade employer-led anti-union movement succeeded in fighting organized labor at the point of production, in politics, and in public relations.
Energy systems have played a significant role in U.S. history; some scholars claim that they have determined a number of other developments. From the colonial period to the present, Americans have shifted from depending largely on wood and their own bodies, as well as the labor of draft animals; to harnessing water power; to building steam engines; to extracting fossil fuels—first coal and then oil; to distributing electrical power through a grid. Each shift has been accompanied by a number of other striking changes, especially in the modern period associated with fossil fuels. By the late 19th century, in part thanks to new energy systems, Americans were embracing industrialization, urbanization, consumerism, and, in a common contemporary phrase, “the annihilation of space and time.” Today, in the era of climate change, the focus tends to be on the production or supply side of energy systems, but a historical perspective reminds us to consider the consumption or demand side as well. Just as important as the striking of oil in Beaumont, Texas, in 1901, was the development of new assumptions about how much energy people needed to sustain their lives and how much work they could be expected to do. Clearly, Americans are still grappling with the question of whether their society’s heavy investment in coal- and petroleum-based energy systems has been worthwhile.
John M. Dixon
The Enlightenment, a complex cultural phenomenon that lasted approximately from the late seventeenth century until the early nineteenth century, contained a dynamic mix of contrary beliefs and epistemologies. Its intellectual coherence arguably came from its distinctive historical sensibility, which was rooted in the notion that advances in the natural sciences had gifted humankind with an exceptional opportunity in the eighteenth century for self-improvement and societal progress. That unifying historical outlook was flexible and adaptable. Consequently, many aspects of the Enlightenment were left open to negotiation at local and transnational levels. They were debated by the philosophes who met in Europe’s coffeehouses, salons, and scientific societies. Equally, they were contested outside of Europe through innumerable cross-cultural exchanges as well as via long-distance intellectual interactions.
America—whether it is understood expansively as the two full continents and neighboring islands within the Western Hemisphere or, in a more limited way, as the territory that now constitutes the United States—played an especially prominent role in the Enlightenment. The New World’s abundance of plants, animals, and indigenous peoples fascinated early modern natural historians and social theorists, stimulated scientific activity, and challenged traditional beliefs. By the eighteenth century, the Western Hemisphere was an important site for empirical science and also for the intersection of different cultures of knowledge. At the same time, European conceptions of the New World as an undeveloped region inhabited by primitive savages problematized Enlightenment theories of universal progress. Comparisons of Native Americans to Africans, Asians, and Europeans led to speculation about the existence of separate human species or races. Similarly, the prevalence and profitability of American slavery fueled new and increasingly scientific conceptions of race. Eighteenth-century analyses of human differences complicated contemporary assertions that all men possessed basic natural rights. Toward the end of the eighteenth century, the American Revolution focused international attention on man’s innate entitlement to life, liberty, and happiness. Yet, in a manner that typified the contradictions and paradoxes of the Enlightenment, the founders of the United States opted to preserve slavery and social inequality after winning political freedom from Britain.
Robert R. Gioielli
By the late 19th century, American cities like Chicago and New York were marvels of the industrializing world. The shock urbanization of the previous quarter century, however, brought on a host of environmental problems. Skies were acrid with coal smoke, streams ran fetid with raw sewage, disease outbreaks were common, and parks and green space were rare. In response to these hazards, particular groups of urban residents organized a series of activist movements to reform public and private policies and practices, from the 1890s until the end of the 20th century. Those environmental burdens were never felt equally, with the working class, the poor, immigrants, and minorities bearing an overwhelming share of the city’s toxic load. By the 1930s, many of the Progressive era reform efforts were finally bearing fruit. Air pollution was regulated, access to clean water improved, and even America’s smallest cities built robust networks of urban parks. But despite this invigoration of the public sphere, after World War II, for many the solution to the challenges of a dense modern city was a private choice: suburbanization. Rather than continue to work to reform and reimagine the city, they chose to leave it, retreating to the verdant (and pollution free) greenfields at the city’s edge. These moves, encouraged and subsidized by local and federal policies, provided healthier environments for the mostly white, middle-class suburbanites, but created a new set of environmental problems for the poor, working-class, and minority residents they left behind. Drained of resources and capital, cities struggled to maintain aging infrastructure and regulate remaining industry, and then exacerbated these problems with destructive urban renewal and highway construction projects.
These remaining urban residents responded with a dynamic series of activist movements that emerged out of the social and community activism of the 1960s and presaged the contemporary environmental justice movement.
The development of nuclear technology had a profound influence on the global environment following the Second World War, with ramifications for scientific research, the modern environmental movement, and conceptualizations of pollution more broadly. Government sponsorship of studies on nuclear fallout and waste dramatically reconfigured the field of ecology, leading to the widespread adoption of the ecosystem concept and new understandings of food webs as well as biogeochemical cycles. These scientific endeavors of the atomic age came to play a key role in the formation of environmental research to address a variety of pollution problems in industrialized countries. Concern about invisible radiation served as a foundation for new ways of thinking about chemical risks for activists like Rachel Carson and Barry Commoner as well as many scientists, government officials, and the broader public. Their reservations were not unwarranted, as nuclear weapons and waste resulted in radioactive contamination of the environment around nuclear-testing sites and especially fuel-production facilities. Scholars date the start of the “Anthropocene” period, during which human activity began to have substantial effects on the environment, variously from the beginning of human farming roughly 8,000 years ago to the emergence of industrialism in the 19th century. But all agree that the advent of nuclear weapons and power has dramatically changed the potential for environmental alterations. Our ongoing attempts to harness the benefits of the atomic age while lessening its negative impacts will need to confront the substantial environmental and public-health issues that have plagued nuclear technology since its inception.
David S. Jones
Few developments in human history match the demographic consequences of the arrival of Europeans in the Americas. Between 1500 and 1900 the human populations of the Americas were transformed. Countless American Indians died as Europeans established themselves, and imported Africans as slaves, in the Americas. Much of the mortality came from epidemics that swept through Indian country. The historical record is full of dramatic stories of smallpox, measles, influenza, and acute contagious diseases striking American Indian communities, causing untold suffering and facilitating European conquest. Some scholars have gone so far as to invoke the irresistible power of natural selection to explain what happened. They argue that the long isolation of Native Americans from other human populations left them uniquely susceptible to the Eurasian pathogens that accompanied European explorers and settlers; nothing could have been done to prevent the inevitable decimation of American Indians. The reality, however, is more complex. Scientists have not found convincing evidence that American Indians had a genetic susceptibility to infectious diseases. Meanwhile, it is clear that the conditions of life before and after colonization could have left Indians vulnerable to a host of diseases. Many American populations had been struggling to subsist, with declining populations, before Europeans arrived; the chaos, warfare, and demoralization that accompanied colonization made things worse. Seen from this perspective, the devastating mortality was not the result of the forces of evolution and natural selection but rather stemmed from social, economic, and political forces at work during encounter and colonization. Getting the story correct is essential. American Indians in the United States, and indigenous populations worldwide, still suffer dire health inequalities.
Although smallpox is gone and many of the old infections are well controlled, new diseases have risen to prominence, especially heart disease, diabetes, cancer, substance abuse, and mental illness. The stories we tell about the history of epidemics in Indian country influence the policies we pursue to alleviate them today.
The Equal Rights Amendment (ERA), designed to enshrine in the Constitution of the United States a guarantee of equal rights to women and men, has had a long and volatile history. When first introduced in Congress in 1923, three years after ratification of the woman suffrage amendment to the US Constitution, the ERA faced fierce opposition from the majority of former suffragists. These progressive women activists opposed the ERA because it threatened hard-won protective labor legislation for wage-earning women. A half century later, however, the amendment enjoyed such broad support that it was passed by the requisite two-thirds of Congress and, in 1972, sent to the states for ratification. Unexpectedly, virulent opposition emerged during the ratification process, not among progressive women this time but among conservatives, whose savvy organizing prevented ratification by a 1982 deadline. Many scholars contend that despite the failure of ratification, equal rights thinking so triumphed in the courts and legislatures by the 1990s that a “de facto ERA” was in place. Some feminists, distrustful of reversible court decisions and repealable legislation, continued to agitate for the ERA; others voiced doubt that ERA would achieve substantive equality for women. Because support for an ERA noticeably revived in the 2010s, this history remains very much in progress.
Ethnicity is a concept employed to understand the social, cultural, and political processes whereby immigrants and their children cease to be “foreign” and yet retain practices and networks that connect them, at least imaginatively, with places of origin. From an early juncture in American history, ethnic neighborhoods were an important part of such processes. Magnets for new arrivals, city neighborhoods both emerged from and reinforced connections among people of common origins. Among the first notable immigrant neighborhoods in American cities were those composed of people from the German-speaking states of Europe. In the second half of the 19th century, American cities grew rapidly and millions of immigrants arrived in the country from a wider array of origins; neighborhoods such as New York’s Jewish Lower East Side and San Francisco’s Chinatown supported dense and institutionally complex ethnic networks. In the middle decades of the 20th century, immigration waned as a result of legislative restriction, economic depression, and war. Many former immigrant neighborhoods emptied of residents as cities divided along racial lines and “white ethnics” dispersed to the suburbs. However, some ethnic enclaves endured, while others emerged after the resumption of mass immigration in the 1960s. By the turn of the 21st century ethnic neighborhoods were once again an important facet of American urban life, although they took new forms within the reconfigured geography and economy of a suburbanized nation.
Kelly N. Fong
The Sacramento Delta is an agricultural region in northern California with deep historic significance for Asian Americans. Asian American laborers were instrumental to the development of the Sacramento Delta, transforming the swampy peat bog into one of the richest agricultural areas in California. Beginning in the mid-19th century, Chinese laborers constructed levees, dikes, and ditches along the Sacramento and San Joaquin Rivers before breaking the fertile soil to grow fruit and vegetables including pears and asparagus. Asian Americans maintained both a permanent and a transient presence in the Sacramento Delta, working on farms as migrant farm laborers, permanent farmworkers, and overseers, and, in the small delta towns such as Isleton that emerged, as merchants, restaurant operators, boardinghouse operators, and other business owners catering to the local community.
Cody R. Melcher and Michael Goldfield
The failure of labor unions to succeed in the American South, largely because national unions proved unable or unwilling to confront white supremacy head on, offers an important key to understanding post–World War II American politics, especially the rise of the civil rights movement. Looking at the 1930s and 1940s, it is clear that the failure was not the result of a cultural aversion to collective action on the part of white workers in the South, as several histories have suggested, but rather stemmed from the refusal of the conservative leadership in the Congress of Industrial Organizations (CIO) to organize an otherwise militant southern workforce composed of both whites and Blacks. These lost opportunities, especially among southern woodworkers and textile workers, contrast sharply with successful interracial union drives among southern coal miners and steelworkers, especially in Alabama. Counterfactual examples of potentially durable civil rights unionism illustrate how the labor movement could have affected the civil rights movement and transformed politics had the South been unionized.
N. Bruce Duthu
United States law recognizes American Indian tribes as distinct political bodies with powers of self-government. Their status as sovereign entities predates the formation of the United States and they are enumerated in the U.S. Constitution as among the subjects (along with foreign nations and the several states) with whom Congress may engage in formal relations. And yet, despite this long-standing recognition, federal Indian law remains curiously ambivalent, even conflicted, about the legal and political status of Indian tribes within the U.S. constitutional structure. On the one hand, tribes are recognized as sovereign bodies with powers of self-government within their lands. On the other, long-standing precedents of the Supreme Court maintain that Congress possesses plenary power over Indian tribes, with authority to modify or even eliminate their powers of self-government. These two propositions are in tension with one another and are at the root of the challenges faced by political leaders and academics alike in trying to understand and accommodate the tribal rights to self-government. The body of laws that make up the field of federal Indian law include select provisions of the U.S. Constitution (notably the so-called Indian Commerce Clause), treaties between the United States and various Indian tribes, congressional statutes, executive orders, regulations, and a complex and rich body of court decisions dating back to the nation’s formative years. The noted legal scholar Felix Cohen brought much-needed coherence and order to this legal landscape in the 1940s when he led a team of scholars within the Office of the Solicitor in the Department of the Interior to produce a handbook on federal Indian law. The revised edition of Cohen’s Handbook of Federal Indian Law is still regarded as the seminal treatise in the field. Critically, however, this rich body of law only hints at the real story in federal Indian law. 
The laws themselves serve as historical and moral markers in the ongoing clash between indigenous and nonindigenous societies and cultures still seeking to establish systems of peaceful coexistence in shared territories. It is a story about the limits of legal pluralism and the willingness of a dominant society and nation to acknowledge and honor its promises to the first inhabitants and first sovereigns.
Alison L. LaCroix
Federalism refers to the constitutional and political structure of the United States of America, according to which political power is divided among multiple levels of government: the national level of government (also referred to as the “federal” or “general” government) and that of the states. It is a multilayered system of government that reserves some powers to component entities while also establishing an overarching level of government with a specified domain of authority. The structures of federalism are set forth in the Constitution of the United States, although some related ideas and practices predated the founding period and others have developed since. The balance between federal and state power has shifted throughout U.S. history, with assertions of broad national power meeting challenges from supporters of states’ rights and state sovereignty. Federalism is a fundamental value of the American political system, and it has been a controversial political and legal question since the founding period.
Adam J. Hodges
The first Red Scare, which occurred in 1919–1920, emerged out of longer clashes in the United States over the processes of industrialization, immigration, and urbanization as well as escalating conflict over the development of a labor movement challenging elite control of the economy. More immediately, the suppression of dissent during World War I and shock over a revolution in Russia that energized anti-capitalist radicals spurred further confrontations during an ill-planned postwar demobilization of the armed forces and economy.
A general strike in Seattle in February 1919 that grew out of wartime grievances among shipbuilders raised the specter of Bolshevik insurrection in the United States. National press attention fanned the flames and continued to do so throughout the year. In fact, 1919 became a record strike year. Massive coal and steel walkouts in the fall shook the industrial economy, while a work stoppage by Boston police became a national sensation and spread fears of a revolutionary breakdown in public order. Ultimately, however, much of the union militancy of the war era was crushed by the end of 1919 and the labor movement entered a period of retrenchment after 1922 that lasted until the 1930s.
Fall 1919 witnessed the creation of two competing Communist parties in the United States after months of press focus on bombs, riots, and strikes. Federal anti-radical investigative operations, which had grown enormously during World War I and continued into 1919, peaked in the so-called “Palmer Raids” of November 1919 and January 1920, named for US Attorney General A. Mitchell Palmer, who authorized them. The excesses of the Department of Justice and the decline of labor militancy caused a shift in press and public attention in 1920, though another Red Scare would escalate after World War II, with important continuities between the two.
Gabriella M. Petrick
This is an advance summary of a forthcoming article in the Oxford Research Encyclopedia of American History.
American food in the twentieth and twenty-first centuries is characterized by abundance. Unlike the hardscrabble existence of many earlier Americans, the “Golden Age of Agriculture” brought the bounty produced in fields across the United States to both consumers and producers. While the “Golden Age” technically ended as World War I began, larger quantities of relatively inexpensive food became the norm for most Americans as more fresh foods, rather than staple crops, made their way to urban centers and rising real wages made it easier to purchase these comestibles.
The application of science and technology to food production from the field to the kitchen cabinet, or even more crucially the refrigerator by the mid-1930s, reflects the changing demographics and affluence of American society as much as it does the inventiveness of scientists and entrepreneurs. Perhaps the single most important symbol of overabundance in the United States is the postwar Green Revolution. The vast increase in agricultural production based on improved agronomics provoked both praise and criticism, as exemplified by Time magazine’s critique of Rachel Carson’s Silent Spring in September 1962 or, more recently, the politics of genetically modified foods.
Reflecting what occurred at the turn of the twentieth century, food production, politics, and policy at the turn of the twenty-first century have become a proxy for larger ideological agendas and the fractured nature of class in the United States. Battles over the following issues speak to which Americans have access to affordable, nutritious food: organic versus conventional farming, antibiotic use in meat production, dissemination of food stamps, contraction of farm subsidies, the rapid growth of “dollar stores,” alternative diets (organic, vegetarian, vegan, paleo, etc.), and, perhaps most ubiquitous of all, the “obesity epidemic.” These arguments carry moral and ethical values, as each side deems some foods and diets virtuous and others corrupting. While Americans have long held a variety of food ideologies that meld health, politics, and morality, exemplified by Sylvester Graham and John Harvey Kellogg in the nineteenth and early twentieth centuries, among others, newer constructions of these ideologies reflect concerns over the environment, rural Americans, climate change, self-determination, and the role of government in individual lives. In other words, food can be used as a lens to understand larger issues in American society while at the same time allowing historians to explore the intimate details of everyday life.
Cindy R. Lobel
Over the course of the 19th century, American cities developed from small seaports and trading posts to large metropolises. Not surprisingly, foodways and other areas of daily life changed accordingly. In 1800, the dietary habits of urban Americans were similar to those of the colonial period. Food provisioning was very local. Farmers, hunters, fishermen, and dairymen from a few miles away brought food by rowboats and ferryboats and by horse carts to centralized public markets within established cities. Dietary options were seasonal as well as regional. Few public dining options existed outside of taverns, which offered lodging as well as food. Most Americans, even in urban areas, ate their meals at home, which in many cases were attached to their workshops, countinghouses, and offices.
These patterns changed significantly over the course of the 19th century, thanks largely to demographic changes and technological developments. By the turn of the 20th century, urban Americans relied on a food-supply system that was highly centralized and in the throes of industrialization. Cities developed complex restaurant sectors, and majority immigrant populations dramatically shaped and reshaped cosmopolitan food cultures. Furthermore, with growing populations, lax regulation, and corrupt political practices in many cities, issues arose periodically concerning the safety of the food supply. In sum, the roots of today’s urban food systems were laid down over the course of the 19th century.
Changing foodways, the consumption and production of food, access to food, and debates over food shaped the nature of American cities in the 20th century. As American cities transformed from centers of industrialization at the start of the century to post-industrial societies at the end of the 20th century, food cultures in urban America shifted in response to the ever-changing urban environment. Cities remained centers of food culture, diversity, and food reform despite these shifts.
Growing populations and waves of immigration changed the nature of food cultures throughout the United States in the 20th century. These changes were significant, all contributing to an evolving sense of American food culture. For urban denizens, however, food choice and availability were dictated and shaped by a variety of powerful social factors, including class, race, ethnicity, gender, and laboring status. While cities possessed an abundance of food and a variety of places in which to consume it, fresh food often remained difficult for the urban poor to obtain as the 20th century ended.
As markets expanded from 1900 to 1950, regional geography became a less important factor in determining what types of foods were available. In the second half of the 20th century, even global geography became less important to food choices. Citrus fruit from the West Coast was readily available in northeastern markets near the start of the century, and off-season fruits and vegetables from South America filled shelves in grocery stores by the end of the 20th century. Urban Americans became further disconnected from their food sources, but this dislocation spurred counter-movements that embraced ideas of local, seasonal foods and a rethinking of the city’s relationship with its food sources.
Humans have utilized American forests for a wide variety of uses from the pre-Columbian period to the present. Native Americans heavily shaped forests to serve their needs, helping to create fire ecologies in many forests. English settlers harvested these forests for trade, to clear land, and for domestic purposes. The arrival of the Industrial Revolution in the early 19th century rapidly expanded the rate of logging. By the Civil War, many areas of the Northeast were logged out. Post–Civil War forests in the Great Lakes states, the South, and then the Pacific Northwest fell with increasing speed to feed the insatiable demands of the American economy, facilitated by rapid technological innovation that allowed for growing cuts. By the late 19th century, growing concerns about the future of American timber supplies spurred the conservation movement, personified by forester Gifford Pinchot and the creation of the U.S. Forest Service with Pinchot as its head in 1905. After World War II, the Forest Service worked closely with the timber industry to cut wide swaths of the nation’s last virgin forests. These gargantuan harvests led to the growth of the environmental movement. Beginning in the 1970s, environmentalists began to use legal means to halt logging in the ancient forests, and the listing of the northern spotted owl under the Endangered Species Act was the final blow to most logging on Forest Service lands in the Northwest. Yet not only does the timber industry remain a major employer in forested parts of the nation today, but alternative forest economies have also developed around more sustainable industries such as tourism.
According to the First Amendment of the US Constitution, Congress is barred from abridging the freedom of the press (“Congress shall make no law . . . abridging the freedom of speech, or of the press”). In practice, the history of press freedom is far more complicated than this simple constitutional right suggests. Over time, the meaning of the First Amendment has changed greatly. The Supreme Court largely ignored the First Amendment until the 20th century, leaving the scope of press freedom to state courts and legislatures. Since World War I, jurisprudence has greatly expanded the types of publication protected from government interference. The press now has broad rights to publish criticism of public officials, salacious material, private information, national security secrets, and much else. To understand the shifting history of press freedom, however, it is important to understand not only the expansion of formal constitutional rights but also how those rights have been shaped by such factors as economic transformations in the newspaper industry, the evolution of professional standards in the press, and the broader political and cultural relations between politicians and the press.
Alexander B. Haskell
Bacon’s Rebellion (1676–1677) was an uprising in the Virginia colony that its participants experienced as both a civil breakdown and a period of intense cosmic disorder. Although Thomas Hobbes had introduced his theory of state sovereignty a quarter century earlier, the secularizing connotations of his highly naturalized conceptualization of power had yet to make major inroads on a post-Reformation culture that was only gradually shifting from Renaissance providentialism to Enlightenment rationalism. Instead, the period witnessed a complicated interplay of providential beliefs and Hobbist doctrines. In the aftermath of the English civil war (1642–1651), this mingling of ideologies had prompted the Puritans’ own experimentation with Hobbes’s ideas, often in tandem with a Platonic spiritualism that was quite at odds with Hobbes’s own philosophical skepticism. The Restoration of 1660 had given an additional boost to Hobbism as his ideas won a number of prominent adherents in Charles II’s government.
The intermingling of providentialism and Hobbism gave Bacon’s Rebellion its particular aura of heightened drama and frightening uncertainty. In the months before the uprising, the outbreak of a war on the colony’s frontier with the Doeg and Susquehannock peoples elicited fears in the frontier counties of a momentous showdown between faithful planters and God’s enemies. In contrast, Governor Sir William Berkeley’s establishmentarian Protestantism encouraged him to see the frontiersmen’s vigilantism as impious, and the government’s more measured response to the conflict as inherently godlier because tied to time-tested hierarchies and institutions. Greatly complicating this already confusing scene, the colony also confronted a further destabilizing force in the form of the new Hobbist politics emerging from the other side of the ocean. In addition to a number of alarming policies emanating from Charles II’s court in the 1670s that sought to enhance the English state’s supremacy over the colonies, Hobbes’s doctrines also informed the young Nathaniel Bacon Jr.’s stated rationale for leading frontiersmen against local Indian communities without Berkeley’s authorization. Drawing on the Hobbes-influenced civil war-era writings of his relation the Presbyterian lawyer Nathaniel Bacon, the younger Bacon made the protection of the colony’s Christian brotherhood a moral priority that outweighed even the preservation of existing civil relations and public institutions.
While Berkeley’s antagonism toward this Hobbesian argument led him to lash out forcefully against Bacon as a singularly great threat to Virginia’s commonwealth, it was ordinary Virginians who most consequentially resisted Bacon’s strange doctrines. Yet a division persisted. Whereas the interior counties firmly rejected Bacon’s Hobbism in favor of the colony’s more traditional bonds to God and king, the frontier counties remained more open to a Hobbesian politics that promised their protection.
Carolyn Podruchny and Stacy Nation-Knapper
From the 15th century to the present, the trade in animal fur has been an economic venture with far-reaching consequences for both North Americans and Europeans (in which North Americans of European descent are included). One of the earliest forms of exchange between Europeans and North Americans, the trade in fur was about the garment business, global and local politics, social and cultural interaction, hunting, ecology, colonialism, gendered labor, kinship networks, and religion. European fashion, specifically the desire for hats that marked male status, was a primary driver of the global fur-trade economy until the late 19th century, while European desires for marten, fox, and other luxury furs to make and trim clothing comprised a secondary part of the trade. Other animal hides, including deer and bison, provided sturdy leather from which belts for the machines of the early Industrial Era were cut. European cloth, especially cotton and wool, became central to the trade for Indigenous peoples, who sought materials that were lighter and dried faster than skin clothing. The fur trade encompassed multiple perspectives: the European men and Indigenous men and women actually conducting the trade; the Indigenous and European trappers; the European men and women producing trade goods; Indigenous “middlemen” (men and women) who conducted their own fur trade to benefit from European trade companies; laborers hauling the furs and trade goods; all those who built, managed, and sustained trading posts located along waterways and trails across North America; and those Europeans who manufactured and purchased the products made of fur and the trade goods desired by Indigenous peoples. As early as the 17th century, European empires used fur-trade monopolies to establish colonies in North America, and later fur-trading companies brought imperial trading systems inland, while Indigenous peoples drew Europeans into their own patterns of trade and power.
By the 19th century, the fur trade covered most of the continent; its networks of business, alliances, and families, and the founding of new communities, gave rise to new peoples, including the Métis, who were descended from the mixing of European and Indigenous peoples. Trading territories, monopolies, and alliances with Indigenous peoples shaped how European concepts of statehood played out in the making of European-descended nation-states and in the development of treaties with Indigenous peoples. The fur trade flourished in northern climes until well into the 20th century, after which time economic development, resource exploitation, changes in fashion, and politics in North America and Europe limited its scope and scale. Many Indigenous people continue to hunt and trap animals today and have fought in courts for Indigenous rights to resources, land, and sovereignty.