In the broader field of thanatology, scholars investigate rituals of dying, attitudes toward death, evolving trajectories of life expectancy, and more. Applying a lens of social class means studying similar themes but focusing on the men, women, and children who worked for wages in the United States. Working people were more likely to die from workplace accidents, occupational diseases, or episodes of work-related violence. In most periods of American history, it was more dangerous to be a wage worker than it was to be a soldier. The battleground was not just the shop floor but also the terrain of labor relations. American labor history has been filled with violent encounters between workers asserting their views of economic justice and employers defending their private property rights. These clashes frequently turned deadly. Labor unions and working-class communities extended an ethos of mutualism and solidarity from the union halls and picket lines to memorial services and gravesites. They lauded martyrs to movements for human dignity and erected monuments to honor the fallen. Aspects of ethnicity, race, and gender added layers of meaning that intersected with and refracted through individuals’ economic positions. Workers’ encounters with death and the ways they made sense of loss and sacrifice overlapped in some respects with those of Americans from other social classes in terms of religious custom, ritual practice, and material consumption. Their experiences were not entirely distinct, but they diverged in significant ways.
Article
Erik R. Seeman
Death is universal yet is experienced in culturally specific ways. Because of this, when individuals in colonial North America encountered others from different cultural backgrounds, they were curious about how unfamiliar mortuary practices resembled and differed from their own. This curiosity spawned communication across cultural boundaries. The resulting knowledge sometimes facilitated peaceful relations between groups, while at other times it helped one group dominate another.
Colonial North Americans endured disastrously high mortality rates caused by disease, warfare, and labor exploitation. At the same time, death was central to the religions of all residents: Indians, Africans, and Europeans. Deathways thus offer an unmatched way to understand the colonial encounter from the participants’ perspectives.
Article
Jason C. Parker
The decolonization of the European overseas empires had its intellectual roots early in the modern era, but its culmination occurred during the Cold War that loomed large in post-1945 international history. This culmination thus coincided with the American rise to superpower status and presented the United States with a dilemma. While philosophically sympathetic to the aspirations of anticolonial nationalist movements abroad, the United States’ vastly greater postwar global security burdens made it averse to the instability that decolonization might bring and that communists might exploit. This fear, and the need to share those burdens with European allies who were themselves still colonial landlords, led Washington to proceed cautiously. The three “waves” of the decolonization process—medium-sized in the late 1940s, large in the half-decade around 1960, and small in the mid-1970s—prompted the American use of a variety of tools and techniques to influence how it unfolded.
Prior to independence, this influence was usually channeled through the metropolitan authority then winding down. After independence, Washington continued and often expanded the use of these tools, in most cases on a bilateral basis. In some theaters, such as Korea, Vietnam, and the Congo, through the use of certain of these tools, notably covert espionage or overt military operations, Cold War dynamics enveloped, intensified, and repossessed local decolonization struggles. In most theaters, other tools, such as traditional or public diplomacy or economic or technical development aid, kept the Cold War in the background as a local transition unfolded. In all cases, the overriding American imperative was to minimize instability and neutralize actors on the ground who could invite communist gains.
Article
The process of urban deindustrialization has been long and uneven. Even the terms “deindustrial” and “postindustrial” are contested; most cities continue to host manufacturing on some scale. After World War II, however, cities that depended on manufacturing for their lifeblood increasingly diversified their economies in the face of larger global, political, and demographic transformations. Manufacturing centers in New England, the Mid-Atlantic, and the midwestern United States were soon identified as belonging to “the American Rust Belt.” Steel manufacturers, automakers, and other industrial behemoths that were once mainstays of city life closed their doors as factories and workers followed economic and social incentives to leave urban cores for the suburbs, the South, or foreign countries. Remaining industrial production became increasingly automated, resulting in significant declines in the number of factory jobs. Metropolitan officials faced with declining populations and tax bases responded by adapting their assets—in terms of workforce, location, or culture—to new economies, including warehousing and distribution, finance, health care, tourism, leisure industries like casinos, and privatized enterprises such as prisons. Faced with declining federal funding for renewal, they focused on leveraging private investment for redevelopment. Deindustrializing cities marketed themselves as destinations with convention centers, stadiums, and festival marketplaces, seeking to lure visitors and a “creative class” of new residents. While some postindustrial cities became success stories of reinvention, others struggled. They entertained options to “rightsize” by shrinking their municipal footprints, adapted vacant lots for urban agriculture, or attracted voyeurs to gaze at their industrial ruins. Whether industrial cities faced a slow transformation or the shock of multiple factory closures within a few years, the impact of these economic shifts and urban planning interventions both amplified old inequalities and created new ones.
Article
Jeffrey Bloodworth
Will Rogers understood the confounding nature of the Democratic Party. In noting that “Democrats never agree on anything, that’s why they’re Democrats,” the Oklahoma humorist highlighted a consistent theme in the party’s more than 200-year history: division. The political party of the underdog and ethnic, racial, and social minorities has always lacked the cultural cohesion that the Federalists, Whigs, and Republicans possessed. As a result, the main currents of Democratic Party foreign policy elude simple categorization. Muddying any efforts at classification are the dramatically disparate eras in which Democrats conducted foreign policy over two centuries.
Like that of other major American political parties, the Democrats’ foreign policy was animated by a messianic theme balanced against national and constituent interests. Thinking themselves a “chosen people,” the Revolutionary generation believed their experiment foreshadowed a new global order with universal appeal. As representatives of God’s new Israel, the Founders made their new nation’s messianic relationship to the international system essential to its identity. Shunning established foreign policy practices, they founded a style of American diplomacy that combined idealism with pragmatism. Democrats, along with nearly every other major political party, have followed the Founders’ example but in a manner particular to the party’s history, constituents, and circumstance.
The foreign policy connective tissue of the Democratic Party has been its particular expression of the Founders’ messianic mission, interpreted through its ever-evolving cast of disparate constituent groups. In pursuit of this mission, 19th-century Democratic foreign policy favored territorial and commercial expansion to safeguard the republican experiment. In the 20th and 21st centuries, Democrats globalized these sentiments and sought a world conducive to democracy’s survival. But consistency is scarcely the hallmark of Democratic foreign policy. Driven by its disparate constituent groups and domestic politics, the party has deployed diverse foreign policy strategies across an array of historical circumstances. The sum total of Democratic foreign policy is, at times, a contradictory amalgam of diverse constituencies responding to the issues of the moment in a combination of self-interest and democratic idealism.
Article
Peter Cole
The history of dockworkers in America is as fascinating and important as it is unfamiliar. Those who worked along the shore loading and unloading ships played an invaluable role in an industry central to both the U.S. and global economies as well as the making of the nation. For centuries, their work remained largely the same, involving brute manual labor in gangs; starting in the 1960s, however, their work was entirely remade due to technological transformation. Dockworkers possess a long history of militancy, resulting in dramatic improvements in their economic and workplace conditions. Today, nearly all are unionists, but dockworkers in ports along the Atlantic and Gulf coasts belong to the International Longshoremen’s Association (ILA), while the International Longshore and Warehouse Union (ILWU) represents them in Pacific Coast ports as well as in Hawaii and Alaska (along with British Columbia and Panama). In the mid-1930s, the ILA and ILWU became bitter rivals and remain so. This feud, which has cooled slightly since its outset, can be explained by differences in leadership, ideology, and tactics, with the ILA more craft-based, “patriotic,” and mainstream and the ILWU quite left wing, especially during its first few decades, and committed to fighting for racial equality. The existence of two unions complicates this story; in most countries, dockworkers belong to a single union. Similarly, America’s massive economy and physical size mean that there are literally dozens of ports (again, unlike many other countries), making generalizations harder. Unfortunately, popular culture depictions of dockworkers inculcate unfair and incorrect notions that all dockworkers are involved with organized crime. Nevertheless, due to decades of militancy, strikes, and unionism, dockworkers in 21st-century America are—while far fewer in number—very well paid and still do important work, literally making world trade possible in an era when 90 percent of goods move by ship for at least part of their journey to market.
Article
Vanessa May
Domestic work was, until 1940, the largest category of women’s paid labor. Despite the number of women who performed domestic labor for pay, the wages and working conditions were often poor. Workers labored long hours for low pay and were largely left out of state labor regulations. The association of domestic work with women’s traditional household labor, defined as a “labor of love” rather than as real work, and its centrality to southern slavery, have contributed to its low status. As a result, domestic work has long been structured by class, racial, and gendered hierarchies. Nevertheless, domestic workers have time and again done their best to resist these conditions. Although traditional collective bargaining techniques did not always translate to the domestic labor market, workers found various collective and individual methods to insist on higher wages and demand occupational respect, ranging from quitting to “pan-toting” to forming unions.
Article
Megan Threlkeld
The issue of compulsory military service has been contested in the United States since before its founding. In a nation characterized by both liberalism and republicanism, there is an inherent tension between the idea that individuals should be able to determine their own destiny and the idea that all citizens have a duty to serve their country. Prior to the 20th century, conscription occurred mainly on the level of local militias, first in the British colonies and later in individual states. It was during the Civil War that the first federal drafts were instituted, both in the Union and the Confederacy. In the North, the draft was unpopular and largely ineffective. Congress revived national conscription when the United States entered World War I and established the Selective Service System to oversee the process. That draft ended when U.S. belligerency ended in 1918. The first peacetime draft was implemented in 1940; with the exception of one year, it remained in effect until 1973. Its most controversial days came during the Vietnam War, when thousands of people across the country demonstrated against it and, in some cases, outright refused to be inducted. The draft stopped with the end of the war, but in 1980, Congress reinstated compulsory Selective Service registration. More than two decades into the 21st century, male citizens and immigrant noncitizens are still required to register within thirty days of their eighteenth birthday.
The very idea of “selective service” is ambiguous. It is selective because not everyone is conscripted, but it is compulsory because one can be prosecuted for failing to register or to comply with orders of draft boards. Especially during the Cold War, one of the system’s main functions was not to procure soldiers but to identify and exempt from service those men best suited for other endeavors framed as national service: higher education, careers in science and engineering, and even supporting families. That fact, combined with the decentralized nature of the Selective Service System itself, left the process vulnerable to the prejudices of local draft boards and meant that those most likely to be drafted were poor and nonwhite.
Article
Chris Elcock
The use of illicit drugs in US cities led to the development of important subcultures with shared practices, codes, discourses, and values. From the 19th century onward, American city dwellers have indulged in opiates, cocaine, amphetamines, cannabis, lysergic acid diethylamide (LSD), crack, and 3,4-Methylenedioxymethamphetamine (also known as MDMA or ecstasy). The population density of metropolitan America contributed to the spread of substance use and the rise of communities that centered their lives on drug consumption. In the history of urban drug use, opiates have outlasted all the other drugs and have naturally attracted the bulk of scholarly attention.
The nature and identity of these illicit subcultures usually depended on the pharmacology of the drugs and the setting in which they were used. Addictive substances like heroin and amphetamines certainly led to the rise of crime in certain urban areas, but by the same token many urban Americans managed to integrate their addiction into their everyday lives. The more complex pharmacology of psychedelic drugs like LSD in turn gave birth to rich subcultures that resist easy classification. Most drugs began their careers as medical marvels that were accepted as products of modernity and often used by the middle class or medical practitioners. Race, age, and class prejudice, and the association of drugs with visible subcultures perceived to pose a threat to the moral fabric of society, can partly explain their subsequent bans.
Article
Richard V. Damms
Probably no American president was more thoroughly versed in matters of national security and foreign policy before entering office than Dwight David Eisenhower. As a young military officer, Eisenhower served stateside in World War I and then in Panama and the Philippines in the interwar years. On assignments in Washington and Manila, he worked on war plans, gaining an understanding that national security entailed economic and psychological factors in addition to manpower and materiel. In World War II, he commanded Allied forces in the European Theatre of Operations and honed his skills in coalition building and diplomacy. After the war, he oversaw the German occupation and then became Army Chief of Staff as the nation hastily demobilized. At the onset of the Cold War, Eisenhower embraced President Harry S. Truman’s containment doctrine and participated in the discussions leading to the 1947 National Security Act establishing the Central Intelligence Agency, the National Security Council, and the Department of Defense. After briefly retiring from the military, Eisenhower twice returned to public service at the behest of President Truman to assume the temporary chairmanship of the Joint Chiefs of Staff and then, following the outbreak of the Korean War, to become the first Supreme Allied Commander, Europe, charged with transforming the North Atlantic Treaty Organization into a viable military force.
These experiences colored Eisenhower’s foreign policy views, which in turn led him to seek the presidency. He viewed the Cold War as a long-term proposition and worried that Truman’s military buildup would overtax finite American resources. He sought a coherent strategic concept that would be sustainable over the long haul without adversely affecting the free enterprise system and American democratic institutions. He also worried that Republican Party leaders were dangerously insular. As president, his New Look policy pursued a cost-effective strategy of containment by means of increased reliance on nuclear forces over more expensive conventional ones, sustained existing regional alliances and developed new ones, sought an orderly process of decolonization under Western guidance, resorted to covert operations to safeguard vital interests, and employed psychological warfare in the battle with communism for world opinion, particularly in the so-called Third World. His foreign policy laid the basis for what would become the overall American strategy for the duration of the Cold War. The legacy of that policy, however, was decidedly mixed. Eisenhower avoided the disaster of global war, but technological innovations did not produce the fiscal savings that he had envisioned. The NATO alliance expanded and mostly stood firm, but other alliances were more problematic. Decolonization rarely proceeded as smoothly as envisioned and caused conflict with European allies. Covert operations had long-term negative consequences. In Southeast Asia and Cuba, the Eisenhower administration’s policies bequeathed a poisoned chalice for succeeding administrations.
Article
Lindsay M. Chervinsky
From 1775 to 1815, empire served as the most pressing foreign relations problem for the United States. Would the new nation successfully break free from the British Empire? What would an American empire look like? How would it be treated by other empires? And could Americans hold their own against European superpowers? These questions dominated the United States’ first few decades of existence and shaped its interactions with American Indian, Haitian, Spanish, British, and French peoples. The US government—first the Continental Congress, then the Confederation Congress, and finally the federal administration under the new Constitution—grappled with five key issues. First, they sought international recognition of their independence and negotiated trade deals during the Revolutionary War to support the war effort. Second, they obtained access to the Mississippi River and the Port of New Orleans from Spain and France to facilitate trade and western settlement. Third, they grappled with ongoing conflict with Indian nations over white settlement on Indian lands and demands from white communities for border security. Fourth, they defined and protected American neutrality, negotiated a trade policy that required European recognition of American independence, and denied recognition to Haiti. Lastly, they fought a quasi-war with France and a real war with Great Britain in 1812.
Article
Sally Hadden
Slave law in early America may be found in the formal written laws created in metropolitan places such as Paris or Madrid as well as locally within English colonies such as Barbados or South Carolina. These written laws constitute only one portion of the known law governing slave behavior, for individual masters created their own rules to restrict enslaved persons. These master-made rules of conduct almost never appear in print and were conveyed most often through oral transmission. Such vernacular laws provide another element of the limitations all enslaved people experienced in the colonial period. Those without literacy, including Native Americans or illiterate settlers, nonetheless had rules to limit slave behavior, even if they remained unwritten. Customary law, Bible precepts, and Islamic law all provided bases for understanding the rules that bound unfree persons. Most colonial law mandated barbaric punishments for slave crime, though these were sometimes commuted to banishment. Spanish and French codes and local ordinances did not always agree on how slaves should be treated.
The numerous laws found in English colonies, sometimes wrongly denominated as codes, spread widely as individuals migrated; the number and variety of such laws make comprehensive transimperial comparisons challenging. Laws might occasionally ban keeping slaves or trading in them, but most such laws were ignored. Slave courts typically operated in arbitrary, capricious ways that assumed slave guilt and accepted weak evidence to prove it. Runaways might, if they joined strong maroon communities (bands of runaways living together), end up enforcing the laws against slave flight, much as slave catchers and slave patrols did. Laws to prevent manumission by a master frequently required the posting of bonds to prevent those freed from becoming a financial burden on their communities. Later manumission laws often mandated the physical departure of those freed, creating emotional turmoil for the newly emancipated.
Article
Gregory F. Domber
American policy makers have rarely elevated Eastern Europe to the pinnacle of American grand strategy. The United States’ and Eastern Europe’s histories, however, are intertwined through the exchange of people and shared experiences. In the Age of Revolution, Eastern Europeans traveled to the United States to fight for the same causes they championed at home: to break from imperial control and expand the rights of man. At the end of the 19th century, “New Immigrants” from Eastern Europe streamed into America’s expanding cities. When countries in the region have moved to the forefront of American concerns during specific crises, Eastern European interests were regularly deemed secondary to larger American geopolitical interests. This holds true for the settlement of World War I, the conclusion of World War II, and the entirety of the Cold War. Overall, including Eastern Europeans and Eastern Europe in the history of the United States provides essential nuance and texture to broader patterns in American relations and more often than not provides evidence of the limitations of American power as it is altered by competing powers and local conditions.
Article
Christopher W. Calvo
The conspicuous timing of the publication of Adam Smith’s The Wealth of Nations and America’s Declaration of Independence, separated by only a few months in 1776, has attracted a great deal of historical attention. America’s revolution was in large part motivated by the desire to break free from British mercantilism and engage the principles, both material and ideological, found in Smith’s work. From 1776 to the present day, the preponderance of capitalism in American economic history and the influence of The Wealth of Nations in American intellectual culture have contributed to the conventional wisdom that America and Smith enjoy a special relationship. After all, no nation has consistently pursued the tenets of Smithian-inspired capitalism, namely free and competitive markets, a commitment to private property, and the pursuit of self-interest and profit, more than the United States.
The shadow of Smith’s The Wealth of Nations looms large over America. But a closer look at American economic thought and practice demonstrates that Smith’s authority was not as dominant as the popular history assumes. Although most Americans accepted Smith’s work as the foundational text in political economy and extracted from it the cardinal principles of intellectual capitalism, its core values were twisted, turned, and fused together in contorted, sometimes contradictory fashions. American economic thought also reflects the widespread belief that the nation would trace an exceptional course, distinct from the Old World, and therefore necessitating a political economy suited to American traditions and expectations. Hybrid capitalist ideologies, although rooted in Smithian-inspired liberalism, developed within a dynamic domestic discourse that embraced ideological diversity and competing paradigms, exactly the kind expected from a new nation trying to understand its economic past, establish its present, and project its future.
Likewise, American policymakers crafted legislation that brought the national economy both closer to and further from the Smithian ideal. Hybrid intellectual capitalism—a compounded ideological approach that antebellum American economic thinkers deployed to help rationalize the nation’s economic development—imitated the nation’s emergent hybrid material capitalism. Labor, commodity, and capital markets assumed amalgamated forms, combining, for instance, slave and free labor, private and public enterprises, and open and protected markets. Americans constructed different types of capitalism, reflecting a preference for mixtures of practical thought and policy that rarely conformed to strict ideological models. Historians of American economic thought and practice study capitalism as an evolutionary, dynamic institution with manifestations in traditional, expected corners, but historians also find capitalism demonstrated in unorthodox ways and practiced in obscure corners of market society that blended capitalist with non-capitalist experiences. In the 21st century, the benefits of incorporating conventional economic analysis with political, social, and cultural narratives are widely recognized. This has helped broaden scholars’ understanding of what exactly constitutes capitalism. And in doing so, the malleability of American economic thought and practice is put on full display, improving scholars’ appreciation for what remains the most significant material development in world history.
Article
Aaron Slater
Identifying and analyzing a unified system called the “economy of colonial British America” presents a number of challenges. The regions that came to constitute Britain’s North American empire developed according to a variety of factors, including climate and environment, relations with Native peoples, international competition and conflict, internal English/British politics, and the social system and cultural outlook of the various groups that settled each colony. Nevertheless, while there was great diversity in the socioeconomic organization across colonial British America, a few generalizations can be made. First, each region initially focused economic activity on some form of export-oriented production that tied it to the metropole. New England specialized in timber, fish, and shipping services, the Middle Colonies in furs, grains, and foodstuffs, the Chesapeake in tobacco, the South in rice, indigo, and hides, and the West Indies in sugar. Second, the maturation of the export-driven economy in each colony eventually spurred the development of an internal economy directed toward providing the ancillary goods and services necessary to promote the export trade. Third, despite variations within and across colonies, colonial British America underwent more rapid economic expansion over the course of the 17th and 18th centuries than did its European counterparts, to the point that, on the eve of the American Revolution, white settlers in British America enjoyed one of the highest living standards in the world.
A final commonality that all the regions shared was that this robust economic growth spurred an almost insatiable demand for land and labor. With the exception of the West Indies, where the Spanish had largely exterminated the Native inhabitants by the time the English arrived, frontier warfare was ubiquitous across British America, as land-hungry settlers invaded Indian territory and expropriated their lands. The labor problem, while also ubiquitous, showed much greater regional variation. The New England and the Middle colonies largely supplied their labor needs through a combination of family immigration, natural increase, and the importation of bound European workers known as indentured servants. The Chesapeake, Carolina, and West Indian colonies, on the other hand, developed “slave societies,” where captive peoples of African descent were imported in huge numbers and forced to serve as enslaved laborers on colonial plantations. Despite these differences, it should be emphasized that, by the outbreak of the American Revolution, the institution of slavery had, to a greater or lesser extent, insinuated itself into the economy of every British American colony. The expropriation of land from Indians and labor from enslaved Africans thus shaped the economic history of all the colonies of British America.
Article
Judge Glock
Despite almost three decades of strong and stable growth after World War II, the US economy, like the economies of many developed nations, faced new headwinds and challenges after 1970. Although the United States eventually overcame many of these challenges, and its economy continues to be one of the most dynamic in the world, it could not recover its mid-century miracle of rapid and broad-based economic growth.
There are three major ways the US economy changed in this period. First, the US economy endured and eventually conquered the problem of high inflation, even as it instituted new policies that prioritized price stability over the so-called “Keynesian” goal of full employment. Although these new policies led to over two decades of moderate inflation and stable growth, the 2008 financial crisis challenged the post-Keynesian consensus and led to new demands for government intervention in downturns.
Second, the government’s overall influence on the economy increased dramatically. Although the government deregulated several sectors in the 1970s and 1980s, such as transportation and banking, it also created new types of social and environmental regulation that were more pervasive. And although it occasionally cut spending, on the whole government spending increased substantially in this period, until it reached about 35 percent of the economy.
Third, the US economy became more open to the world, and it imported more manufactured goods, even as it became more based on “intangible” products and on services rather than on manufacturing. These shifts created new economic winners and losers. Some institutions that thrived in the older economy, such as unions, which once comprised over a third of the workforce, became shadows of their former selves. The new service economy also created more gains for highly educated workers and for investors in quickly growing businesses, while blue-collar workers’ wages stagnated, at least in relative terms.
Most of the trends that affected the US economy in this period were long-standing and continued over decades. Major national and international crises in this period, from the end of the Cold War, to the first Gulf War in 1991, to the September 11 attacks of 2001, seemed to have only a mild or transient impact on the economy. Two events that were of lasting importance were, first, the United States leaving the gold standard in 1971, which led to high inflation in the short term and more stable monetary policy over the long term; and second, the 2008 financial crisis, which seemed to permanently decrease American economic output even while it increased political battles about the involvement of government in the economy.
The US economy at the beginning of the third decade of the 21st century was richer than it had ever been, and remained in many respects the envy of the world. But widening income gaps meant many Americans felt left behind in this new economy, and led some to worry that the stability and predictability of the old economy had been lost.
Article
Lawrence J. McAndrews
Americans almost universally agree on the importance of education to the success of individuals and the strength of the nation. Yet they have long differed over the proper mission of government in overseeing their schools. Before 1945, these debates largely occurred at the local and state levels. Since 1945, as education has become an increasingly national and international concern, the federal government has played a larger role in the nation’s schools. As Americans gradually have come to accept a greater federal presence in elementary and secondary schools, however, members of Congress and presidents from both major parties have continued to argue over the scope and substance of the federal role. From 1945 to 1965, these arguments centered on the quest for equity between rich and poor public school pupils and between public and nonpublic school students. From 1965 to 1989, national lawmakers devoted much of their attention to the goal of excellence in public education. From 1989 to the present, they have quarreled over how best to attain equity and excellence at the same time.
Article
Catherine O'Donnell
Elizabeth Bayley Seton, canonized in 1975, was the first native-born US citizen to be made a Roman Catholic saint. Seton founded the Sisters of Charity of St. Joseph, the first vowed community of Catholic women religious created in the United States. Seton’s sainthood marked the culmination of a role she first served during her life: a respectable, benevolent face for a church whose local leaders were eager to demonstrate its compatibility with American culture. Seton’s founding of the American Sisters of Charity was a more practical achievement and one that shaped the Catholic Church in the United States in tangible ways. Starting in 1809, when Seton began a school and vowed community in Emmitsburg, Maryland, the Sisters of Charity expanded throughout the United States, eventually running hundreds of schools and orphanages and offering both a spiritual home and a career path for women who chose it. Seton’s life is significant both for what it reveals about her era and for her distinctive achievements. Her prominence led to the preservation of decades of correspondence and spiritual writings. Through them it is possible to see with unusual clarity the ways in which the Age of Revolutions and the rise of Napoleon variously disrupted, reinvigorated, and transformed Catholic traditions; to observe the possibilities and constraints Catholicism offered a spiritually ambitious woman; and to witness changes in the relationship between Protestants and Catholics in the United States. Finally, Seton’s rich archive also renders visible one woman’s experience of intellectual inquiry, marriage, widowhood, motherhood, spiritual ambition, and female friendship.
Article
Vincent J. Cannato
The Ellis Island Immigration Station, located in New York Harbor, opened in 1892 and closed in 1954. During peak years from the 1890s until the 1920s, the station processed an estimated twelve million immigrants. Roughly 75 percent of all immigrants arriving in America during this period passed through Ellis Island. The station was run by the federal Immigration Service and represented a new era of federal control over immigration. Officials at Ellis Island were tasked with regulating the flow of immigration by enforcing a growing body of federal laws that barred various categories of “undesirable” immigrants. As the number of immigrants coming to America increased, so did the size of the inspection facility. In 1907, Ellis Island processed more than one million immigrants. The quota laws of the 1920s slowed immigration considerably and the rise of the visa system meant that Ellis Island no longer served as the primary immigrant inspection facility. For the next three decades, Ellis Island mostly served as a detention center for those ordered deported from the country.
After Ellis Island closed in 1954, the facility fell into disrepair. During a period of low immigration and a national emphasis on assimilation, the immigrant inspection station was forgotten by most Americans. With a revival of interest in ethnicity in the 1970s, Ellis Island attracted more attention, especially from the descendants of immigrants who entered the country through its doors. In the 1980s, large-scale fundraising for the restoration of the neighboring Statue of Liberty led to a similar effort to restore part of Ellis Island. In 1990, the Main Building was reopened to the public as an immigration museum under the National Park Service. Ellis Island has evolved into an iconic national monument with deep meaning for the descendants of the immigrants who arrived there, as well as a contested symbol to other Americans grappling with the realities of contemporary immigration.
Article
Employers began organizing with one another to reduce the power of organized labor in the late 19th and early 20th centuries. Irritated by strikes, boycotts, and unions’ desire to achieve exclusive bargaining rights, employers demanded the right to establish open shops, workplaces that promoted individualism over collectivism. Rather than recognize closed or union shops, employers demanded the right to hire and fire whomever they wanted, irrespective of union status. They established an open-shop movement, which was led by local, national, and trade-based employers’ associations. Some formed more inclusive “citizens’ associations,” which included clergymen, lawyers, judges, academics, and employers. Throughout the 20th century’s first three decades, this movement succeeded in busting unions, breaking strikes, and blacklisting labor activists. It united large numbers of employers and was mostly successful. The movement faced its biggest challenges in the 1930s, when a liberal political climate legitimized unions and collective bargaining. But employers never stopped organizing and fighting, and they continued to undermine the labor movement in the following decades by invoking the phrase “right-to-work,” insisting that individual laborers must enjoy freedom from so-called union bosses and compulsory unionism. Numerous states, responding to pressure from organized employers, began passing “right-to-work” laws, which made union organizing more difficult because workers were not obligated to join unions or pay their “fair share” of dues to them. The multi-decade employer-led anti-union movement succeeded in fighting organized labor at the point of production, in politics, and in public relations.