Domestic work was, until 1940, the largest category of women’s paid labor. Despite the number of women who performed domestic labor for pay, the wages and working conditions were often poor. Workers labored long hours for low pay and were largely left out of state labor regulations. The association of domestic work with women’s traditional household labor, defined as a “labor of love” rather than as real work, and its centrality to southern slavery, have contributed to its low status. As a result, domestic work has long been structured by class, racial, and gendered hierarchies. Nevertheless, domestic workers have time and again done their best to resist these conditions. Although traditional collective bargaining techniques did not always translate to the domestic labor market, workers found various collective and individual methods to insist on higher wages and demand occupational respect, ranging from quitting to “pan-toting” to forming unions.
Lawrence J. McAndrews
Americans almost universally agree on the importance of education to the success of individuals and the strength of the nation. Yet they have long differed over the proper mission of government in overseeing their schools. Before 1945, these debates largely occurred at the local and state levels. Since 1945, as education has become an increasingly national and international concern, the federal government has played a larger role in the nation’s schools. As Americans gradually have come to accept a greater federal presence in elementary and secondary schools, however, members of Congress and presidents from both major parties have continued to argue over the scope and substance of the federal role. From 1945 to 1965, these arguments centered on the quest for equity between rich and poor public school pupils and between public and nonpublic school students. From 1965 to 1989, national lawmakers devoted much of their attention to the goal of excellence in public education. From 1989 to the present, they have quarreled over how best to attain equity and excellence at the same time.
Employers began organizing to reduce the power of organized labor in the late 19th and early 20th centuries. Irritated by strikes, boycotts, and the unions’ desire to achieve exclusive bargaining rights, employers demanded the right to establish open shops—workplaces that promoted individualism over collectivism. Rather than recognize closed or union shops, employers demanded the right to hire and fire whomever they wanted, irrespective of union status. They established an open-shop movement, which was led by local, national, and trade-based employers. Some formed more inclusive “citizens’ associations,” which included clergymen, lawyers, judges, academics, and employers. Throughout the first three decades of the 20th century, this movement united large numbers of employers and was largely successful in busting unions, breaking strikes, and blacklisting labor activists. The movement faced its biggest challenges in the 1930s, when a liberal political climate legitimized unions and collective bargaining. But employers never stopped organizing and fighting, and they continued to undermine the labor movement in the following decades by invoking the phrase “right to work,” insisting that individual laborers must enjoy freedom from the so-called union bosses and compulsory unionism. Numerous states, responding to pressure from organized employers, began passing “right to work” laws, which made union organizing more difficult as workers were not obligated to join unions or pay their “fair share” of dues. The multi-decade employer-led anti-union movement succeeded in fighting organized labor at the point of production, in politics, and in public relations.
Energy systems have played a significant role in U.S. history; some scholars claim that they have determined a number of other developments. From the colonial period to the present, Americans have shifted from depending largely on wood and their own bodies, as well as the labor of draft animals; to harnessing water power; to building steam engines; to extracting fossil fuels—first coal and then oil; to distributing electrical power through a grid. Each shift has been accompanied by a number of other striking changes, especially in the modern period associated with fossil fuels. By the late 19th century, in part thanks to new energy systems, Americans were embracing industrialization, urbanization, consumerism, and, in a common contemporary phrase, “the annihilation of space and time.” Today, in the era of climate change, the focus tends to be on the production or supply side of energy systems, but a historical perspective reminds us to consider the consumption or demand side as well. Just as important as the striking of oil in Beaumont, Texas, in 1901, was the development of new assumptions about how much energy people needed to sustain their lives and how much work they could be expected to do. Clearly, Americans are still grappling with the question of whether their society’s heavy investment in coal- and petroleum-based energy systems has been worthwhile.
John M. Dixon
The Enlightenment, a complex cultural phenomenon that lasted approximately from the late seventeenth century until the early nineteenth century, contained a dynamic mix of contrary beliefs and epistemologies. Its intellectual coherence arguably came from its distinctive historical sensibility, which was rooted in the notion that advances in the natural sciences had gifted humankind with an exceptional opportunity in the eighteenth century for self-improvement and societal progress. That unifying historical outlook was flexible and adaptable. Consequently, many aspects of the Enlightenment were left open to negotiation at local and transnational levels. They were debated by the philosophes who met in Europe’s coffeehouses, salons, and scientific societies. Equally, they were contested outside of Europe through innumerable cross-cultural exchanges as well as via long-distance intellectual interactions.
America—whether it is understood expansively as the two full continents and neighboring islands within the Western Hemisphere or, in a more limited way, as the territory that now constitutes the United States—played an especially prominent role in the Enlightenment. The New World’s abundance of plants, animals, and indigenous peoples fascinated early modern natural historians and social theorists, stimulated scientific activity, and challenged traditional beliefs. By the eighteenth century, the Western Hemisphere was an important site for empirical science and also for the intersection of different cultures of knowledge. At the same time, European conceptions of the New World as an undeveloped region inhabited by primitive savages problematized Enlightenment theories of universal progress. Comparisons of Native Americans to Africans, Asians, and Europeans led to speculation about the existence of separate human species or races. Similarly, the prevalence and profitability of American slavery fueled new and increasingly scientific conceptions of race. Eighteenth-century analyses of human differences complicated contemporary assertions that all men possessed basic natural rights. Toward the end of the eighteenth century, the American Revolution focused international attention on man’s innate entitlement to life, liberty, and happiness. Yet, in a manner that typified the contradictions and paradoxes of the Enlightenment, the founders of the United States opted to preserve slavery and social inequality after winning political freedom from Britain.
The development of nuclear technology had a profound influence on the global environment following the Second World War, with ramifications for scientific research, the modern environmental movement, and conceptualizations of pollution more broadly. Government sponsorship of studies on nuclear fallout and waste dramatically reconfigured the field of ecology, leading to the widespread adoption of the ecosystem concept and new understandings of food webs as well as biogeochemical cycles. These scientific endeavors of the atomic age came to play a key role in the formation of environmental research to address a variety of pollution problems in industrialized countries. Concern about invisible radiation served as a foundation for new ways of thinking about chemical risks for activists like Rachel Carson and Barry Commoner as well as many scientists, government officials, and the broader public. Their reservations were not unwarranted, as nuclear weapons and waste resulted in radioactive contamination of the environment around nuclear-testing sites and especially fuel-production facilities. Scholars date the start of the “Anthropocene” period, during which human activity began to have substantial effects on the environment, variously from the beginning of human farming roughly 8,000 years ago to the emergence of industrialism in the 19th century. But all agree that the advent of nuclear weapons and power has dramatically changed the potential for environmental alterations. Our ongoing attempts to harness the benefits of the atomic age while lessening its negative impacts will need to confront the substantial environmental and public-health issues that have plagued nuclear technology since its inception.
David S. Jones
Few developments in human history match the demographic consequences of the arrival of Europeans in the Americas. Between 1500 and 1900 the human populations of the Americas were transformed. Countless American Indians died as Europeans established themselves, and imported Africans as slaves, in the Americas. Much of the mortality came from epidemics that swept through Indian country. The historical record is full of dramatic stories of smallpox, measles, influenza, and acute contagious diseases striking American Indian communities, causing untold suffering and facilitating European conquest. Some scholars have gone so far as to invoke the irresistible power of natural selection to explain what happened. They argue that the long isolation of Native Americans from other human populations left them uniquely susceptible to the Eurasian pathogens that accompanied European explorers and settlers; nothing could have been done to prevent the inevitable decimation of American Indians. The reality, however, is more complex. Scientists have not found convincing evidence that American Indians had a genetic susceptibility to infectious diseases. Meanwhile, it is clear that the conditions of life before and after colonization could have left Indians vulnerable to a host of diseases. Many American populations had been struggling to subsist, with declining populations, before Europeans arrived; the chaos, warfare, and demoralization that accompanied colonization made things worse. Seen from this perspective, the devastating mortality was not the result of the forces of evolution and natural selection but rather stemmed from social, economic, and political forces at work during encounter and colonization. Getting the story correct is essential. American Indians in the United States, and indigenous populations worldwide, still suffer dire health inequalities.
Although smallpox is gone and many of the old infections are well controlled, new diseases have risen to prominence, especially heart disease, diabetes, cancer, substance abuse, and mental illness. The stories we tell about the history of epidemics in Indian country influence the policies we pursue to alleviate them today.
Kelly N. Fong
The Sacramento Delta is an agricultural region in northern California with deep historic significance for Asian Americans. Asian American laborers were instrumental to the development of the Sacramento Delta, transforming the swampy peat bog into one of the richest agricultural areas in California. Beginning in the mid-19th century, Chinese laborers constructed levees, dikes, and ditches along the Sacramento and San Joaquin Rivers before breaking the fertile soil to grow fruits and vegetables, including pears and asparagus. Asian Americans maintained both permanent and transient presences in the Sacramento Delta: on farms as migrant farm laborers, permanent farmworkers, and overseers, and in the small delta towns that emerged, such as Isleton, as merchants, restaurant operators, boardinghouse operators, and other business owners catering to the local community.
N. Bruce Duthu
United States law recognizes American Indian tribes as distinct political bodies with powers of self-government. Their status as sovereign entities predates the formation of the United States, and they are enumerated in the U.S. Constitution as among the subjects (along with foreign nations and the several states) with whom Congress may engage in formal relations. And yet, despite this long-standing recognition, federal Indian law remains curiously ambivalent, even conflicted, about the legal and political status of Indian tribes within the U.S. constitutional structure. On the one hand, tribes are recognized as sovereign bodies with powers of self-government within their lands. On the other, long-standing precedents of the Supreme Court maintain that Congress possesses plenary power over Indian tribes, with authority to modify or even eliminate their powers of self-government. These two propositions are in tension with one another and are at the root of the challenges faced by political leaders and academics alike in trying to understand and accommodate tribal rights to self-government. The body of law that makes up the field of federal Indian law includes select provisions of the U.S. Constitution (notably the so-called Indian Commerce Clause), treaties between the United States and various Indian tribes, congressional statutes, executive orders, regulations, and a complex and rich body of court decisions dating back to the nation’s formative years. The noted legal scholar Felix Cohen brought much-needed coherence and order to this legal landscape in the 1940s when he led a team of scholars within the Office of the Solicitor in the Department of the Interior to produce a handbook on federal Indian law. The revised edition of Cohen’s Handbook of Federal Indian Law is still regarded as the seminal treatise in the field. Critically, however, this rich body of law only hints at the real story in federal Indian law.
The laws themselves serve as historical and moral markers in the ongoing clash between indigenous and nonindigenous societies and cultures still seeking to establish systems of peaceful coexistence in shared territories. It is a story about the limits of legal pluralism and the willingness of a dominant society and nation to acknowledge and honor its promises to the first inhabitants and first sovereigns.
Alison L. LaCroix
Federalism refers to the constitutional and political structure of the United States of America, according to which political power is divided among multiple levels of government: the national level of government (also referred to as the “federal” or “general” government) and that of the states. It is a multilayered system of government that reserves some powers to component entities while also establishing an overarching level of government with a specified domain of authority. The structures of federalism are set forth in the Constitution of the United States, although some related ideas and practices predated the founding period and others have developed since. The balance between federal and state power has shifted throughout U.S. history, with assertions of broad national power meeting challenges from supporters of states’ rights and state sovereignty. Federalism is a fundamental value of the American political system, and it has been a controversial political and legal question since the founding period.