
Article

Peter Cole

The history of dockworkers in America is as fascinating and important as it is unfamiliar. Those who worked along the shore loading and unloading ships played an invaluable role in an industry central to both the U.S. and global economies as well as the making of the nation. For centuries, their work remained largely the same, involving brute manual labor in gangs; starting in the 1960s, however, their work was entirely remade due to technological transformation. Dockworkers possess a long history of militancy, resulting in dramatic improvements in their economic and workplace conditions. Today, nearly all are unionists, but dockworkers in ports along the Atlantic and Gulf coasts belong to the International Longshoremen’s Association (ILA), while the International Longshore and Warehouse Union (ILWU) represents them in Pacific Coast ports as well as in Hawaii and Alaska (along with British Columbia and Panama). In the mid-1930s, the ILA and ILWU became bitter rivals and remain so. This feud, which has cooled slightly since its outset, can be explained by differences in leadership, ideology, and tactics: the ILA has been more craft-based, “patriotic,” and mainstream, while the ILWU, especially during its first few decades, was quite left wing and committed to fighting for racial equality. The existence of two unions complicates this story; in most countries, dockworkers belong to a single union. Similarly, America’s massive economy and physical size mean that there are literally dozens of ports (again, unlike many other countries), making generalizations harder. Unfortunately, popular culture depictions of dockworkers inculcate unfair and incorrect notions that all dockworkers are involved with organized crime. Nevertheless, due to decades of militancy, strikes, and unionism, dockworkers in 21st-century America are—while far fewer in number—very well paid and still do important work, literally making world trade possible in an era when 90 percent of goods move by ship for at least part of their journey to market.

Article

Domestic work was, until 1940, the largest category of women’s paid labor. Despite the number of women who performed domestic labor for pay, the wages and working conditions were often poor. Workers labored long hours for low pay and were largely left out of state labor regulations. The association of domestic work with women’s traditional household labor, defined as a “labor of love” rather than as real work, and its centrality to southern slavery, have contributed to its low status. As a result, domestic work has long been structured by class, racial, and gendered hierarchies. Nevertheless, domestic workers have time and again done their best to resist these conditions. Although traditional collective bargaining techniques did not always translate to the domestic labor market, workers found various collective and individual methods to insist on higher wages and demand occupational respect, ranging from quitting to “pan-toting” to forming unions.

Article

The use of illicit drugs in US cities led to the development of important subcultures with shared practices, codes, discourses, and values. From the 19th century onward, American city dwellers have indulged in opiates, cocaine, amphetamines, cannabis, lysergic acid diethylamide (LSD), crack, and 3,4-methylenedioxymethamphetamine (also known as MDMA or ecstasy). The population density of metropolitan America contributed to the spread of substance use and the rise of communities that centered their lives on drug consumption. In the history of urban drug use, opiates have outlasted all the other drugs and have naturally attracted the bulk of scholarly attention. The nature and identity of these illicit subcultures usually depended on the pharmacology of the drugs and the setting in which they were used. Addictive substances like heroin and amphetamines certainly led to the rise of crime in certain urban areas, but by the same token many urban Americans managed to integrate their addiction into their everyday lives. The more complex pharmacology of psychedelic drugs like LSD in turn gave birth to rich subcultures that resist easy classifications. Most drugs began their careers as medical marvels that were accepted as products of modernity and often used by the middle class or medical practitioners. Race, age, and class prejudice, together with the association of drugs with visible subcultures perceived to pose a threat to the moral fabric of society, can partly explain their subsequent bans.

Article

Probably no American president was more thoroughly versed in matters of national security and foreign policy before entering office than Dwight David Eisenhower. As a young military officer, Eisenhower served stateside in World War I and then in Panama and the Philippines in the interwar years. On assignments in Washington and Manila, he worked on war plans, gaining an understanding that national security entailed economic and psychological factors in addition to manpower and materiel. In World War II, he commanded Allied forces in the European Theatre of Operations and honed his skills in coalition building and diplomacy. After the war, he oversaw the German occupation and then became Army Chief of Staff as the nation hastily demobilized. At the onset of the Cold War, Eisenhower embraced President Harry S. Truman’s containment doctrine and participated in the discussions leading to the 1947 National Security Act establishing the Central Intelligence Agency, the National Security Council, and the Department of Defense. After briefly retiring from the military, Eisenhower twice returned to public service at the behest of President Truman to assume the temporary chairmanship of the Joint Chiefs of Staff and then, following the outbreak of the Korean War, to become the first Supreme Allied Commander, Europe, charged with transforming the North Atlantic Treaty Organization into a viable military force. These experiences colored Eisenhower’s foreign policy views, which in turn led him to seek the presidency. He viewed the Cold War as a long-term proposition and worried that Truman’s military buildup would overtax finite American resources. He sought a coherent strategic concept that would be sustainable over the long haul without adversely affecting the free enterprise system and American democratic institutions. He also worried that Republican Party leaders were dangerously insular. As president, he pursued the New Look policy, a cost-effective strategy of containment that relied increasingly on nuclear forces over more expensive conventional ones, sustained existing regional alliances and developed new ones, sought an orderly process of decolonization under Western guidance, resorted to covert operations to safeguard vital interests, and employed psychological warfare in the battle with communism for world opinion, particularly in the so-called Third World. His foreign policy laid the basis for what would become the overall American strategy for the duration of the Cold War. The legacy of that policy, however, was decidedly mixed. Eisenhower avoided the disaster of global war, but technological innovations did not produce the fiscal savings that he had envisioned. The NATO alliance expanded and mostly stood firm, but other alliances were more problematic. Decolonization rarely proceeded as smoothly as envisioned and caused conflict with European allies. Covert operations had long-term negative consequences. In Southeast Asia and Cuba, the Eisenhower administration’s policies bequeathed a poisoned chalice for succeeding administrations.

Article

From 1775 to 1815, empire served as the most pressing foreign relations problem for the United States. Would the new nation successfully break free from the British Empire? What would an American empire look like? How would it be treated by other empires? And could Americans hold their own against European superpowers? These questions dominated the United States’ first few decades of existence and shaped its interactions with American Indian, Haitian, Spanish, British, and French peoples. The US government—first the Continental Congress, then the Confederation Congress, and finally the federal administration under the new Constitution—grappled with five key issues. First, they sought international recognition of their independence and negotiated trade deals during the Revolutionary War to support the war effort. Second, they obtained access to the Mississippi River and Port of New Orleans from Spain and France to facilitate trade and western settlement. Third, they contended with ongoing conflict with Indian nations over white settlement on Indian lands and demands from white communities for border security. Fourth, they defined and protected American neutrality, negotiated a trade policy that required European recognition of American independence, and denied recognition to Haiti. Lastly, they fought a quasi-war with France and a real war with Great Britain in 1812.

Article

Gregory F. Domber

American policy makers have rarely elevated Eastern Europe to the pinnacle of American grand strategy. The United States’ and Eastern Europe’s histories, however, are intertwined through the exchange of people and shared experiences. In the Age of Revolution, Eastern Europeans traveled to the United States to fight for the same causes they championed at home: to break from imperial control and expand the rights of man. At the end of the 19th century, “New Immigrants” from Eastern Europe streamed into America’s expanding cities. When countries in the region moved to the forefront of American concerns during specific crises, Eastern European interests were regularly deemed secondary to larger American geopolitical interests. This holds true for the settlement of World War I, the conclusion of World War II, and the entirety of the Cold War. Overall, including Eastern Europeans and Eastern Europe in the history of the United States provides essential nuance and texture to broader patterns in American relations and more often than not provides evidence of the limitations of American power as it is altered by competing powers and local conditions.

Article

The conspicuous timing of the publication of Adam Smith’s The Wealth of Nations and America’s Declaration of Independence, separated by only a few months in 1776, has attracted a great deal of historical attention. America’s revolution was in large part motivated by the desire to break free from British mercantilism and engage the principles, both material and ideological, found in Smith’s work. From 1776 to the present day, the preponderance of capitalism in American economic history and the influence of The Wealth of Nations in American intellectual culture have contributed to the conventional wisdom that America and Smith enjoy a special relationship. After all, no nation has consistently pursued the tenets of Smithian-inspired capitalism, namely free and competitive markets, a commitment to private property, and the pursuit of self-interest and profit, more than the United States. The shadow of Smith’s The Wealth of Nations looms large over America. But a closer look at American economic thought and practice demonstrates that Smith’s authority was not as dominant as the popular history assumes. Although most Americans accepted Smith’s work as the foundational text in political economy and extracted from it the cardinal principles of intellectual capitalism, its core values were twisted, turned, and fused together in contorted, sometimes contradictory fashions. American economic thought also reflects the widespread belief that the nation would trace an exceptional course, distinct from the Old World, and therefore necessitating a political economy suited to American traditions and expectations. Hybrid capitalist ideologies, although rooted in Smithian-inspired liberalism, developed within a dynamic domestic discourse that embraced ideological diversity and competing paradigms, exactly the kind expected from a new nation trying to understand its economic past, establish its present, and project its future. Likewise, American policymakers crafted legislation that brought the national economy both closer to and further from the Smithian ideal. Hybrid intellectual capitalism—a compounded ideological approach that antebellum American economic thinkers deployed to help rationalize the nation’s economic development—imitated the nation’s emergent hybrid material capitalism. Labor, commodity, and capital markets assumed amalgamated forms, combining, for instance, slave and free labor, private and public enterprises, and open and protected markets. Americans constructed different types of capitalism, reflecting a preference for mixtures of practical thought and policy that rarely conformed to strict ideological models. Historians of American economic thought and practice study capitalism as an evolutionary, dynamic institution with manifestations in traditional, expected corners, but historians also find capitalism demonstrated in unorthodox ways and practiced in obscure corners of market society that blended capitalist with non-capitalist experiences. In the 21st century, the benefits of incorporating conventional economic analysis with political, social, and cultural narratives are widely recognized. This has helped broaden scholars’ understanding of what exactly constitutes capitalism. And in doing so, the malleability of American economic thought and practice is put on full display, improving scholars’ appreciation for what remains the most significant material development in world history.

Article

Judge Glock

Despite almost three decades of strong and stable growth after World War II, the US economy, like the economies of many developed nations, faced new headwinds and challenges after 1970. Although the United States eventually overcame many of them, and its economy continues to be one of the most dynamic in the world, it could not recover its mid-century economic miracle of rapid and broad-based economic growth. There are three major ways the US economy changed in this period. First, the US economy endured and eventually conquered the problem of high inflation, even as it instituted new policies that prioritized price stability over the so-called “Keynesian” goal of full employment. Although these new policies led to over two decades of moderate inflation and stable growth, the 2008 financial crisis challenged the post-Keynesian consensus and led to new demands for government intervention in downturns. Second, the government’s overall influence on the economy increased dramatically. Although the government deregulated several sectors in the 1970s and 1980s, such as transportation and banking, it also created new types of social and environmental regulation that were more pervasive. And although it occasionally cut spending, on the whole government spending increased substantially in this period, until it reached about 35 percent of the economy. Third, the US economy became more open to the world, and it imported more manufactured goods, even as it became more based on “intangible” products and on services rather than on manufacturing. These shifts created new economic winners and losers. Some institutions that thrived in the older economy, such as unions, which once comprised over a third of the workforce, became shadows of their former selves. The new service economy also created more gains for highly educated workers and for investors in quickly growing businesses, while blue-collar workers’ wages stagnated, at least in relative terms. Most of the trends that affected the US economy in this period were long-standing and continued over decades. Major national and international crises in this period, from the end of the Cold War, to the first Gulf War in 1991, to the September 11 attacks of 2001, seemed to have only a mild or transient impact on the economy. Two events that were of lasting importance were, first, the United States leaving the gold standard in 1971, which led to high inflation in the short term and more stable monetary policy over the long term; and second, the 2008 financial crisis, which seemed to permanently decrease American economic output even while it increased political battles about the involvement of government in the economy. The US economy at the beginning of the third decade of the 21st century was richer than it had ever been, and remained in many respects the envy of the world. But widening income gaps meant many Americans felt left behind in this new economy, and led some to worry that the stability and predictability of the old economy had been lost.

Article

Americans almost universally agree on the importance of education to the success of individuals and the strength of the nation. Yet they have long differed over the proper mission of government in overseeing their schools. Before 1945, these debates largely occurred at the local and state levels. Since 1945, as education has become an increasingly national and international concern, the federal government has played a larger role in the nation’s schools. As Americans gradually have come to accept a greater federal presence in elementary and secondary schools, however, members of Congress and presidents from both major parties have continued to argue over the scope and substance of the federal role. From 1945 to 1965, these arguments centered on the quest for equity between rich and poor public school pupils and between public and nonpublic school students. From 1965 to 1989, national lawmakers devoted much of their attention to the goal of excellence in public education. From 1989 to the present, they have quarreled over how best to attain equity and excellence at the same time.

Article

Vincent J. Cannato

The Ellis Island Immigration Station, located in New York Harbor, opened in 1892 and closed in 1954. During peak years from the 1890s until the 1920s, the station processed an estimated twelve million immigrants. Roughly 75 percent of all immigrants arriving in America during this period passed through Ellis Island. The station was run by the federal Immigration Service and represented a new era of federal control over immigration. Officials at Ellis Island were tasked with regulating the flow of immigration by enforcing a growing body of federal laws that barred various categories of “undesirable” immigrants. As the number of immigrants coming to America increased, so did the size of the inspection facility. In 1907, Ellis Island processed more than one million immigrants. The quota laws of the 1920s slowed immigration considerably and the rise of the visa system meant that Ellis Island no longer served as the primary immigrant inspection facility. For the next three decades, Ellis Island mostly served as a detention center for those ordered deported from the country. After Ellis Island closed in 1954, the facility fell into disrepair. During a period of low immigration and a national emphasis on assimilation, the immigrant inspection station was forgotten by most Americans. With a revival of interest in ethnicity in the 1970s, Ellis Island attracted more attention, especially from the descendants of immigrants who entered the country through its doors. In the 1980s, large-scale fundraising for the restoration of the neighboring Statue of Liberty led to a similar effort to restore part of Ellis Island. In 1990, the Main Building was reopened to the public as an immigration museum under the National Park Service. Ellis Island has evolved into an iconic national monument with deep meaning for the descendants of the immigrants who arrived there, as well as a contested symbol to other Americans grappling with the realities of contemporary immigration.

Article

Employers began organizing with one another to reduce the power of organized labor in the late 19th and early 20th centuries. Irritated by strikes, boycotts, and unions’ desire to achieve exclusive bargaining rights, employers demanded the right to establish open shops, workplaces that promoted individualism over collectivism. Rather than recognize closed or union shops, employers demanded the right to hire and fire whomever they wanted, irrespective of union status. They established an open-shop movement, which was led by local, national, and trade-based employers. Some formed more inclusive “citizens’ associations,” which included clergymen, lawyers, judges, academics, and employers. Throughout the 20th century’s first three decades, this movement succeeded in busting unions, breaking strikes, and blacklisting labor activists. It united large numbers of employers and was mostly successful. The movement faced its biggest challenges in the 1930s, when a liberal political climate legitimized unions and collective bargaining. But employers never stopped organizing and fighting, and they continued to undermine the labor movement in the following decades by invoking the phrase “right-to-work,” insisting that individual laborers must enjoy freedom from so-called union bosses and compulsory unionism. Numerous states, responding to pressure from organized employers, began passing “right-to-work” laws, which made union organizing more difficult because workers were not obligated to join unions or pay their “fair share” of dues to them. The multi-decade employer-led anti-union movement succeeded in fighting organized labor at the point of production, in politics, and in public relations.

Article

Energy systems have played a significant role in U.S. history; some scholars claim that they have determined a number of other developments. From the colonial period to the present, Americans have shifted from depending largely on wood and their own bodies, as well as the labor of draft animals; to harnessing water power; to building steam engines; to extracting fossil fuels—first coal and then oil; to distributing electrical power through a grid. Each shift has been accompanied by a number of other striking changes, especially in the modern period associated with fossil fuels. By the late 19th century, in part thanks to new energy systems, Americans were embracing industrialization, urbanization, consumerism, and, in a common contemporary phrase, “the annihilation of space and time.” Today, in the era of climate change, the focus tends to be on the production or supply side of energy systems, but a historical perspective reminds us to consider the consumption or demand side as well. Just as important as the striking of oil in Beaumont, Texas, in 1901, was the development of new assumptions about how much energy people needed to sustain their lives and how much work they could be expected to do. Clearly, Americans are still grappling with the question of whether their society’s heavy investment in coal- and petroleum-based energy systems has been worthwhile.

Article

The Enlightenment, a complex cultural phenomenon that lasted approximately from the late seventeenth century until the early nineteenth century, contained a dynamic mix of contrary beliefs and epistemologies. Its intellectual coherence arguably came from its distinctive historical sensibility, which was rooted in the notion that advances in the natural sciences had gifted humankind with an exceptional opportunity in the eighteenth century for self-improvement and societal progress. That unifying historical outlook was flexible and adaptable. Consequently, many aspects of the Enlightenment were left open to negotiation at local and transnational levels. They were debated by the philosophes who met in Europe’s coffeehouses, salons, and scientific societies. Equally, they were contested outside of Europe through innumerable cross-cultural exchanges as well as via long-distance intellectual interactions. America—whether it is understood expansively as the two full continents and neighboring islands within the Western Hemisphere or, in a more limited way, as the territory that now constitutes the United States—played an especially prominent role in the Enlightenment. The New World’s abundance of plants, animals, and indigenous peoples fascinated early modern natural historians and social theorists, stimulated scientific activity, and challenged traditional beliefs. By the eighteenth century, the Western Hemisphere was an important site for empirical science and also for the intersection of different cultures of knowledge. At the same time, European conceptions of the New World as an undeveloped region inhabited by primitive savages problematized Enlightenment theories of universal progress. Comparisons of Native Americans to Africans, Asians, and Europeans led to speculation about the existence of separate human species or races. Similarly, the prevalence and profitability of American slavery fueled new and increasingly scientific conceptions of race. Eighteenth-century analyses of human differences complicated contemporary assertions that all men possessed basic natural rights. Toward the end of the eighteenth century, the American Revolution focused international attention on man’s innate entitlement to life, liberty, and happiness. Yet, in a manner that typified the contradictions and paradoxes of the Enlightenment, the founders of the United States opted to preserve slavery and social inequality after winning political freedom from Britain.

Article

By the late 19th century, American cities like Chicago and New York were marvels of the industrializing world. The shock urbanization of the previous quarter century, however, brought on a host of environmental problems. Skies were acrid with coal smoke, and streams ran fetid with raw sewage. Disease outbreaks were common, while parks and green space were rare. From the 1890s until the end of the 20th century, particular groups of urban residents responded to these hazards with a series of activist movements to reform public and private policies and practices. Those environmental burdens were never felt equally, with the working class, poor, immigrants, and minorities bearing an overwhelming share of the city’s toxic load. By the 1930s, many of the Progressive era reform efforts were finally bearing fruit. Air pollution was regulated, access to clean water improved, and even America’s smallest cities built robust networks of urban parks. But despite this invigoration of the public sphere, after World War II, for many the solution to the challenges of a dense modern city was a private choice: suburbanization. Rather than continue to work to reform and reimagine the city, they chose to leave it, retreating to the verdant (and pollution-free) greenfields at the city’s edge. These moves, encouraged and subsidized by local and federal policies, provided healthier environments for the mostly white, middle-class suburbanites, but created a new set of environmental problems for the poor, working-class, and minority residents they left behind. Drained of resources and capital, cities struggled to maintain aging infrastructure and regulate remaining industry, and then exacerbated problems with destructive urban renewal and highway construction projects. These remaining urban residents responded with a dynamic series of activist movements that emerged out of the social and community activism of the 1960s and presaged the contemporary environmental justice movement.

Article

Rachel Rothschild

The development of nuclear technology had a profound influence on the global environment following the Second World War, with ramifications for scientific research, the modern environmental movement, and conceptualizations of pollution more broadly. Government sponsorship of studies on nuclear fallout and waste dramatically reconfigured the field of ecology, leading to the widespread adoption of the ecosystem concept and new understandings of food webs as well as biogeochemical cycles. These scientific endeavors of the atomic age came to play a key role in the formation of environmental research to address a variety of pollution problems in industrialized countries. Concern about invisible radiation served as a foundation for new ways of thinking about chemical risks for activists like Rachel Carson and Barry Commoner as well as many scientists, government officials, and the broader public. Their reservations were not unwarranted, as nuclear weapons and waste resulted in radioactive contamination of the environment around nuclear-testing sites and especially fuel-production facilities. Scholars date the start of the “Anthropocene” period, during which human activity began to have substantial effects on the environment, variously from the beginning of human farming roughly 8,000 years ago to the emergence of industrialism in the 19th century. But all agree that the advent of nuclear weapons and power has dramatically changed the potential for environmental alterations. Our ongoing attempts to harness the benefits of the atomic age while lessening its negative impacts will need to confront the substantial environmental and public-health issues that have plagued nuclear technology since its inception.

Article

David S. Jones

Few developments in human history match the demographic consequences of the arrival of Europeans in the Americas. Between 1500 and 1900 the human populations of the Americas were transformed. Countless American Indians died as Europeans established themselves, and imported Africans as slaves, in the Americas. Much of the mortality came from epidemics that swept through Indian country. The historical record is full of dramatic stories of smallpox, measles, influenza, and acute contagious diseases striking American Indian communities, causing untold suffering and facilitating European conquest. Some scholars have gone so far as to invoke the irresistible power of natural selection to explain what happened. They argue that the long isolation of Native Americans from other human populations left them uniquely susceptible to the Eurasian pathogens that accompanied European explorers and settlers; nothing could have been done to prevent the inevitable decimation of American Indians. The reality, however, is more complex. Scientists have not found convincing evidence that American Indians had a genetic susceptibility to infectious diseases. Meanwhile, it is clear that the conditions of life before and after colonization could have left Indians vulnerable to a host of diseases. Many American populations had been struggling to subsist, with declining populations, before Europeans arrived; the chaos, warfare, and demoralization that accompanied colonization made things worse. Seen from this perspective, the devastating mortality was not the result of the forces of evolution and natural selection but rather stemmed from social, economic, and political forces at work during encounter and colonization. Getting the story correct is essential. American Indians in the United States, and indigenous populations worldwide, still suffer dire health inequalities. Although smallpox is gone and many of the old infections are well controlled, new diseases have risen to prominence, especially heart disease, diabetes, cancer, substance abuse, and mental illness. The stories we tell about the history of epidemics in Indian country influence the policies we pursue to alleviate them today.

Article

The Equal Rights Amendment (ERA), designed to enshrine in the Constitution of the United States a guarantee of equal rights to women and men, has had a long and volatile history. When first introduced in Congress in 1923, three years after ratification of the woman suffrage amendment to the US Constitution, the ERA faced fierce opposition from the majority of former suffragists. These progressive women activists opposed the ERA because it threatened hard-won protective labor legislation for wage-earning women. A half century later, however, the amendment enjoyed such broad support that it was passed by the requisite two-thirds of Congress and, in 1972, sent to the states for ratification. Unexpectedly, virulent opposition emerged during the ratification process, not among progressive women this time but among conservatives, whose savvy organizing prevented ratification by a 1982 deadline. Many scholars contend that despite the failure of ratification, equal rights thinking so triumphed in the courts and legislatures by the 1990s that a “de facto ERA” was in place. Some feminists, distrustful of reversible court decisions and repealable legislation, continued to agitate for the ERA; others voiced doubt that ERA would achieve substantive equality for women. Because support for an ERA noticeably revived in the 2010s, this history remains very much in progress.

Article

Jordan Stanger-Ross

Ethnicity is a concept employed to understand the social, cultural, and political processes whereby immigrants and their children cease to be “foreign” and yet retain practices and networks that connect them, at least imaginatively, with places of origin. From an early juncture in American history, ethnic neighborhoods were an important part of such processes. Magnets for new arrivals, city neighborhoods both emerged from and reinforced connections among people of common origins. Among the first notable immigrant neighborhoods in American cities were those composed of people from the German-speaking states of Europe. In the second half of the 19th century, American cities grew rapidly and millions of immigrants arrived in the country from a wider array of origins; neighborhoods such as New York’s Jewish Lower East Side and San Francisco’s Chinatown supported dense and institutionally complex ethnic networks. In the middle decades of the 20th century, immigration waned as a result of legislative restriction, economic depression, and war. Many former immigrant neighborhoods emptied of residents as cities divided along racial lines and “white ethnics” dispersed to the suburbs. However, some ethnic enclaves endured, while others emerged after the resumption of mass immigration in the 1960s. By the turn of the 21st century ethnic neighborhoods were once again an important facet of American urban life, although they took new forms within the reconfigured geography and economy of a suburbanized nation.

Article

The Sacramento Delta is an agricultural region in northern California with deep historic significance for Asian Americans. Asian American laborers were instrumental to the development of the Sacramento Delta, transforming the swampy peat bog into one of the richest agricultural areas in California. Beginning in the mid-19th century, Chinese laborers constructed levees, dikes, and ditches along the Sacramento and San Joaquin Rivers before breaking the fertile soil to grow fruit and vegetables including pears and asparagus. Asian Americans maintained both a permanent and a transient presence in the Sacramento Delta, working on farms as migrant farm laborers, permanent farmworkers, and overseers, and in the small delta towns that emerged, such as Isleton, as merchants, restaurant operators, boardinghouse operators, and other business owners catering to the local community.

Article

Cody R. Melcher and Michael Goldfield

The failure of labor unions to succeed in the American South, largely because national unions proved unable or unwilling to confront white supremacy head on, offers an important key to understanding post–World War II American politics, especially the rise of the civil rights movement. Looking at the 1930s and 1940s, it is clear that the failure was not the result of a cultural aversion to collective action on the part of white workers in the South, as several histories have suggested, but rather stemmed from the refusal of the conservative leadership in the Congress of Industrial Organizations (CIO) to organize an otherwise militant southern workforce composed of both whites and Blacks. These lost opportunities, especially among southern woodworkers and textile workers, contrast sharply with successful interracial union drives among southern coal miners and steelworkers, especially in Alabama. Counterfactual examples of potentially durable civil rights unionism illustrate how the labor movement could have affected the civil rights movement and transformed politics had the South been unionized.