The development of military arms harnessing nuclear energy for mass destruction has inspired continual efforts to control them. Since 1945, the United States, the Soviet Union, the United Kingdom, France, the People’s Republic of China (PRC), Israel, India, Pakistan, North Korea, and South Africa have acquired control over these powerful weapons, though Pretoria dismantled its small cache beginning in 1989 and Russia consolidated the former Soviet arsenal by 1996. Throughout this period, Washington sought to limit its nuclear forces in tandem with those of Moscow, prevent new states from fielding them, discourage their military use, and even envision their eventual abolition.
Scholars disagree about what explains the United States’ distinct approach to nuclear arms control. The history of U.S. nuclear policy treats intellectual theories and cultural attitudes alongside technical advances and strategic implications. The central debate is one of structure versus agency: whether the weapons’ sheer power, or historical actors’ attitudes toward that power, drove nuclear arms control. Among those who emphasize human agency, there are two further disagreements: (1) the relative influence of domestic protest, culture, and politics; and (2) whether U.S. nuclear arms control aimed first at securing the peace by regulating global nuclear forces or at bolstering American influence in the world.
The intensity of nuclear arms control efforts tended to rise or fall with the likelihood of nuclear war. Harry Truman’s faith in the country’s monopoly on nuclear weapons caused him to sabotage early initiatives, while Dwight Eisenhower’s belief in nuclear deterrence led in a similar direction. Fears of a U.S.-Soviet thermonuclear exchange mounted in the late 1950s, stoked by atmospheric nuclear testing and widespread radioactive fallout, which stirred protest movements and diplomatic initiatives. The spread of nuclear weapons to new states motivated U.S. presidents (John Kennedy in the vanguard) to mount a concerted campaign against “proliferation,” climaxing with the 1968 Treaty on the Non-Proliferation of Nuclear Weapons (NPT). Richard Nixon was exceptional. His reasons for signing the Strategic Arms Limitation Treaty (SALT I) and the Anti-Ballistic Missile (ABM) Treaty with Moscow in 1972 were strategic: to buttress the country’s geopolitical position as U.S. armed forces withdrew from Southeast Asia. The rise of protest movements and Soviet economic difficulties after Ronald Reagan entered the Oval Office brought about two more landmark U.S.-Soviet accords—the 1987 Intermediate-Range Nuclear Forces (INF) Treaty and the 1991 Strategic Arms Reduction Treaty (START)—the first occasions on which the superpowers eliminated nuclear weapons through treaty. The country’s attention swung to proliferation after the Soviet collapse in December 1991, as failed states, regional disputes, and non-state actors grew more prominent. Although controversies over Iraq, North Korea, and Iran’s nuclear programs have since erupted, Washington and Moscow continued to reduce their arsenals and refine their nuclear doctrines even as President Barack Obama proclaimed his support for a nuclear-free world.
Christopher J. Castañeda
The modern oil industry began in 1859 with Edwin Drake’s discovery of oil at Titusville, Pennsylvania. Since then, this dynamic industry has experienced dramatic episodes of growth, aggressive competition for market share, various forms of corporate organization and cartel-like agreements, and governmental efforts at regulation and control, as well as monopoly, mergers, and consolidation. The history of the oil industry reflects its capital-intensive nature. Immense sums of money are spent on oil discovery, production, and refining projects. Marketing, transportation, and distribution systems likewise require enormous amounts of financing and logistical planning. Although oil is often produced in conjunction with, or in wells pressurized by, natural gas, the oil industry is distinct from the related natural gas industry. Since its origins in the mid-19th century, the oil industry has developed an industrial structure that emphasizes scale and scope to maximize profits. Profits can be huge, which attracts entrepreneurial efforts on individual, corporate, and national scales. By the late 20th and early 21st centuries, the oil industry had begun confronting questions about long-term viability, combined with an increasingly influential environmental movement that seeks to reduce fossil fuel consumption and prevent toxic wastes and by-products from polluting human, animal, and natural habitats.
The relationship between organized labor and the civil rights movement proceeded along two tracks. At work, the two groups were adversaries, as civil rights groups criticized employment discrimination by the unions. But in politics, they allied. Unions and civil rights organizations partnered to support liberal legislation and to oppose conservative southern Democrats, who were as militant in opposing unions as they were fervent in supporting white supremacy.
At work, unions dithered in their efforts to root out employment discrimination. Their initial enthusiasm for Title VII of the 1964 Civil Rights Act, which outlawed employment discrimination, waned the more the new law violated foundational union practices by infringing on the principle of seniority, emphasizing the rights of the individual over the group, and inserting the courts into the workplace. The two souls of postwar liberalism—labor solidarity represented by unions and racial justice represented by the civil rights movement—were in conflict at work.
Although the unions and civil rights activists were adversaries over employment discrimination, they united in trying to register southern blacks to vote. Black enfranchisement would end the South’s exceptionalism and the veto it exercised over liberal legislation in Congress. But the two souls of liberalism that were at odds over the meaning of fairness at work would also diverge at the ballot box. As white workers began to defect from the Democratic Party, the political coalition of black and white workers that union leaders had hoped to build was undermined from below. The divergence between the two souls of liberalism in the 1960s—economic justice represented by unions and racial justice represented by civil rights—helps explain the resurgence of conservatism that followed.
Jessica M. Chapman
The origins of the Vietnam War can be traced to France’s colonization of Indochina in the late 1880s. The Viet Minh, led by Ho Chi Minh, emerged as the dominant anti-colonial movement by the end of World War II, though Viet Minh leaders encountered difficulties as they tried to consolidate their power on the eve of the First Indochina War against France. While that war was, initially, a war of decolonization, it became a central battleground of the Cold War by 1950. The lines of future conflict were drawn that year when the People’s Republic of China and the Soviet Union recognized and provided aid to the Democratic Republic of Vietnam in Hanoi, followed almost immediately by Washington’s recognition of the State of Vietnam in Saigon. From that point on, American involvement in Vietnam was most often explained in terms of the Domino Theory, articulated by President Dwight D. Eisenhower on the eve of the Geneva Conference of 1954. The Franco-Viet Minh ceasefire reached at Geneva divided Vietnam in two at the 17th parallel, with countrywide reunification elections slated for the summer of 1956. However, the United States and its client, Ngo Dinh Diem, refused to participate in talks preparatory to those elections, preferring instead to build South Vietnam as a non-communist bastion. While the Vietnamese communist party, known as the Vietnam Workers’ Party in Hanoi, initially hoped to reunify the country by peaceful means, it concluded by 1959 that violent revolution would be necessary to bring down the “American imperialists and their lackeys.” In 1960, the party formed the National Liberation Front for South Vietnam and, following Diem’s assassination in 1963, passed a resolution to wage all-out war in the south in an effort to claim victory before the United States committed combat troops. After President John F. Kennedy took office in 1961, he responded to deteriorating conditions in South Vietnam by militarizing the American commitment, though he stopped short of introducing dedicated ground troops. After Diem and Kennedy were assassinated in quick succession in November 1963, Lyndon Baines Johnson took office determined to avoid defeat in Vietnam, but hoping to prevent the issue from interfering with his domestic political agenda. As the situation in South Vietnam became more dire, LBJ found himself unable to maintain the middle-of-the-road approach that Kennedy had pursued. Forced to choose between escalation and withdrawal, he chose the former in March 1965 by launching a sustained campaign of aerial bombardment, coupled with the introduction of the first officially designated U.S. combat forces to Vietnam.
Michael E. Donoghue
The United States’ construction and operation of the Panama Canal began as an idea and developed into a reality after prolonged diplomatic machinations to acquire the rights to build the waterway. Once the canal was excavated, a century-long struggle ensued to hold it in the face of Panamanian nationalism. Washington used considerable negotiation and finally gunboat diplomacy to achieve its acquisition of the Canal. The construction of the channel proved a titanic effort with large regional, global, and cultural ramifications. The importance of the Canal as a geostrategic and economic asset was magnified during the two world wars. But rising Panamanian frustration over the U.S. creation of a state-within-a-state via the Canal Zone, one with a discriminatory racial structure, fomented a local movement to wrest control of the Canal from the Americans. The explosion of the 1964 anti-American uprising drove this process forward toward the 1977 Carter-Torrijos treaties that established a blueprint for eventual U.S. retreat and transfer of the channel to Panama at the century’s end. But before that historic handover, the Noriega crisis and the 1989 U.S. invasion nearly upended the projected U.S. withdrawal from the management and control of the Canal.
Early historians emphasized high politics, economics, and military considerations in the U.S. acquisition of the Canal. They concentrated on high-status actors, economic indices, and major political contingencies in establishing the U.S. colonial order on the isthmus. Panamanian scholars brought a legalistic and nationalist critique, stressing that Washington did not create Panama and that local voices have largely been ignored in the grand narrative of the Canal as a great act of progressive civilization. More recent U.S. scholarship has focused on American imperialism in Panama and on the role of race, culture, labor, and gender in shaping the U.S. presence, the structure of the Canal Zone, and Panamanian resistance to its occupation. Historical memory, globalization, representation, and the Canal’s place in notions of U.S. empire have also figured more prominently in recent scholarly examinations of this relationship. Contemporary research on the Panama Canal has been supported by numerous archives in the United States and Panama, as well as a variety of newspapers, magazines, novels, and films.
The creation and evolution of urban parks is in some ways a familiar story, especially given the attention that Frederick Law Olmsted’s work has commanded since the early 1970s. Following the success of Central Park, cities across the United States began building parks to meet the recreational needs of residents, and during the second half of the 19th century, Olmsted and his partners designed major parks or park systems in thirty cities. Yet, even that story is incomplete. To be sure, Olmsted believed that every city should have a large rural park as an alternative to the density of building and crowding of the modern metropolis, a place to provide for an “unbending of the faculties,” a process of recuperation from the stresses and strains of urban life. But, even in the mid-1860s he sought to create alternative spaces for other types of recreation. Olmsted and his partner Calvert Vaux successfully persuaded the Prospect Park commission, in Brooklyn, New York, to acquire land for a parade ground south of the park as a place for military musters and athletics; moreover, in 1868 they prepared a plan for a park system in Buffalo, New York, that consisted of three parks, linked by parkways, that served different functions and provided for different forms of recreation. As the decades progressed, Olmsted became a champion of parks designed for active recreation; gymnasiums for women as well as men, especially in working-class areas of cities; and playgrounds for small children. He did so in part to relieve pressure on the large landscape parks to accommodate uses he believed would be inappropriate, but also because he recognized the legitimate demands for new forms of recreation. In later years, other park designers and administrators would similarly add facilities for active recreation, though sometimes in ways that compromised what Olmsted considered the primary purpose of a public park. Urban parks are, in important ways, a microcosm of the nation’s cities. 
Battles over location, financing, political patronage, and use have been a constant. Through it all, parks have evolved to meet the changing recreational needs of residents. And, as dominant a figure as Olmsted has been, this is a story that antedates his professional career and that includes the many voices that have shaped public parks in U.S. cities in the 20th century.
Peace activism in the United States between 1945 and the 2010s focused mostly on opposition to U.S. foreign policy, efforts to strengthen and foster international cooperation, and support for nuclear nonproliferation and arms control. The onset of the Cold War between the United States and the Soviet Union marginalized a reviving postwar American peace movement that had emerged from concerns about atomic and nuclear power and about a worldwide nationalist politics that everywhere seemed to foster conflict, not peace. Still, peace activism continued to evolve in dynamic ways and to influence domestic politics and international relations.
Most significantly, peace activists pioneered the use of Gandhian nonviolence in the United States and provided critical assistance to the African American civil rights movement, led the postwar antinuclear campaign, played a major role in the movement against the war in Vietnam, helped to move the liberal establishment (briefly) toward a more dovish foreign policy in the early 1970s, and helped to shape the political culture of American radicalism. Despite these achievements, the peace movement never regained the political legitimacy and prestige it held in the years before World War II, and it struggled with internal divisions about ideology, priorities, and tactics.
Peace activist histories in the 20th century tended to emphasize organizational or biographical approaches that sometimes carried hagiographic overtones. More recently, historians have applied the methods of cultural history, examining the role of religion, gender, and race in structuring peace activism. The transnational and global turn in the historical discipline has also begun to make inroads in peace scholarship. These are promising new directions because they situate peace activism within larger historical and cultural developments and relate peace history to broader historiographical debates and trends.
In the years after the Civil War, Polish immigrants became an important part of the American working class. They actively participated in the labor movement and played key roles in various industrial strikes ranging from the 1877 Railroad Strike through the rise of the CIO and the post-1945 era of prosperity. Over time, the Polish American working class became acculturated and left its largely immigrant past behind while maintaining itself as an ethnic community. It also witnessed a good deal of upward mobility, especially over several generations. This ethnic community, however, continued to be refreshed with immigrants throughout the 20th century.
As with the larger American working class, Polish American workers were hard hit by changes in the industrial structure of the United States. Deindustrialization turned the industrial centers of much of the Polish American community into the Rust Belt. Despite this radical history, many reacted by turning toward conservative causes in the late 20th and early 21st centuries.
The reproductive experiences of women and girls in the 20th-century United States followed historical patterns shaped by the politics of race and class. Laws and policies governing reproduction generally regarded white women as legitimate reproducers and potentially fit mothers and defined women of color as unfit for reproduction and motherhood; regulations provided for rewards and punishments accordingly. In addition, public policy and public rhetoric defined “population control” as the solution to a variety of social and political problems in the United States, including poverty, immigration, the “quality” of the population, environmental degradation, and “overpopulation.” Throughout the century, nonetheless, women, communities of color, and impoverished persons challenged official efforts, at times reducing or even eliminating barriers to reproductive freedom and community survival.
Between 1900 and 1930, decades marked by increasing urbanization, industrialization, and immigration, eugenic fears of “race suicide” (concerns that white women were not having enough babies) fueled a reproductive control regime that pressured middle-class white women to reproduce robustly. At the same time, the state enacted anti-immigrant laws, undermined the integrity of Native families, and protected various forms of racial segregation and white supremacy, all of which attacked the reproductive dignity of millions of women. Also in these decades, many African American women escaped the brutal and sexually predatory Jim Crow culture of the South, and middle-class white women gained greater sexual freedom and access to reproductive health care, including contraceptive services.
During the Great Depression, the government devised the Aid to Dependent Children program to provide destitute “worthy” white mothers with government aid while often denying such supports to women of color forced to subordinate their motherhood to agricultural and domestic labor. Following World War II, as the Civil Rights movement gathered form, focus, and adherents, and as African American and other women of color claimed their rights to motherhood and social provision, white policymakers railed against “welfare queens” and defined motherhood as a class privilege, suitable only for those who could afford to give their children “advantages.” The state, invoking the “population bomb,” fought to reduce the birth rates of poor women and women of color through sterilization and mandatory contraception, among other strategies. Between 1960 and 1980, white feminists employed the consumerist language of “choice” as part of the campaign for legalized abortion, even as Native, black, Latina, immigrant, and poor women struggled to secure the right to give birth to and raise their children with dignity and safety. The last decades of the 20th century saw severe cuts in social programs designed to aid low-income mothers and their children, cuts to funding for public education and housing, court decisions that dramatically reduced poor women’s access to reproductive health care including abortion, and the emergence of a powerful, often violent, anti-abortion movement. 
In response, in 1994 a group of women of color activists articulated the theory of reproductive justice, splicing together “social justice” and “reproductive rights.” The resulting Reproductive Justice movement, which would become increasingly influential in the 21st century, defined reproductive health, rights, and justice as human rights due to all persons and articulated what each individual requires to achieve these rights: the right not to have children, the right to have children, and the right to the social, economic, and environmental conditions necessary to raise children in healthy, peaceful, and sustainable households and communities.
Rosina A. Lozano
Language rights are an integral part of civil rights. They provide the tools that permit individuals to engage with and participate in society. The broad use of the Spanish language in the United States by both citizens and immigrants—it is the second-most-spoken language in the country by far—has a long history. Spanish was the first European governing language in parts of the future United States that included the Southwest, portions of the Louisiana Purchase, and Florida. The use of the language did not disappear when these regions became part of the United States, but rather persisted in some locales as a politically important language. In the 20th century, Spanish-speaking immigrants entered not just the Southwest and Florida, but also Chicago, New York, the South, Michigan, and other locales across the country in large numbers. Throughout the 20th century and into the 21st century, Spanish speakers and their advocates have reasserted their linguistic rights by fighting for monolingual speakers’ right to use Spanish in legal settings, in public, as voters, as elected officials, at work, and in education. The politics of the Spanish language have only grown in importance as the largest influx of Spanish-speaking immigrants ever has entered the United States. This demographic shift makes the longer history of Spanish a crucial backstory for future language-policy decisions.
Andrew J. Falk
Americans in and out of government have relied on media and popular culture to construct the national identity, frame debates on military interventions, communicate core values abroad, and motivate citizens around the world to act in prescribed ways. During the late 19th century, as the United States emerged as a world power and expanded overseas, Americans adopted an ethos of worldliness in their everyday lives, even as some expressed worry about the nation’s position on war and peace. During the interwar period of the 1920s and 1930s, though America failed to join the League of Nations and retreated from foreign engagements, the nation also increased cultural interactions with the rest of the world through the export of motion pictures, music, consumer products, food, fashion, and sports. The policies and character of the Second World War were in part shaped by propaganda that evolved from earlier information campaigns. As the United States confronted communism during the Cold War, the government sanitized its cultural weapons to win the hearts and minds of Americans, allies, enemies, and nonaligned nations. But some cultural producers dissented from America’s “containment policy,” refashioned popular media for global audiences, and sparked a change in Washington’s cultural-diplomacy programs. An examination of popular culture also shows how people in the “Third World” deftly used the media to encourage superpower action. In the 21st century, activists and revolutionaries can be considered the inheritors of this tradition because they use social media to promote their political agendas. In short, understanding the roles popular culture played as America engaged the world greatly expands our understanding of modern American foreign relations.
Courts and legislatures in colonial America and the early American republic developed and refined a power to compel civilians to assist peace and law enforcement officers in arresting wrongdoers, keeping the peace, and other matters of law enforcement. This power to command civilian cooperation was known as the posse comitatus or “power of the county.” Rooted in early modern English countryside law enforcement, the posse comitatus became an important police institution in 18th- and 19th-century America. The posse comitatus was typically composed of able-bodied white male civilians who were temporarily deputized to aid a sheriff or constable. But if this “power of the county” was insufficient, law enforcement officers were often authorized to call on the military to serve as the posse comitatus.
The posse comitatus proved particularly important in buttressing slavery in the American South. Slaveholders pushed for and especially benefited from laws that required citizens to assist in the recapture of local runaway slaves and fugitive slaves who crossed into states without slavery. Slave patrols were rooted in the posse comitatus, which originated as a compulsory and uncompensated institution; slaveholders in the American South later added financial incentives for those who acted in the place of a posse to recapture slaves on the run from their owners.
The widespread use of the posse comitatus in southern slave law became part of the national discussion about slavery during the early American republic as national lawmakers contemplated how to deal with the problem of fugitive slaves who fled to free states. This dialogue culminated with the Fugitive Slave Law of 1850, in which the US Congress authorized officials to “summon and call to their aid the bystanders, or posse comitatus” and declared that “all good citizens are hereby commanded to aid and assist in the prompt and efficient execution of this law, whenever their services may be required.” During Reconstruction, the Radical Republican Congress used the posse comitatus to enforce laws that targeted conquered Confederates. After the end of Reconstruction in 1877, Southern states pushed Congress to create what would come to be known as the “Posse Comitatus Act,” which prohibited the use of federal military forces for law enforcement. The history of the posse comitatus in early America is thus best understood as a story about and an example of the centralization of government authority and its ramifications.
Substantial numbers of Asian Americans and Asian immigrants moved into suburbs across the United States after World War II, bringing distinctive everyday lifeways, identities, worldviews, family types, and community norms that remade much of American suburbia. Although Asian Americans had been excluded from suburbs on racial grounds since the late 19th century, American Cold War objectives in Asia and the Pacific and domestic American civil rights struggles afforded Asian Americans, especially Chinese and Japanese Americans, increased access to suburban housing in the 1950s. Following passage of the Immigration Act of 1965 and the Fair Housing Act of 1968, new groups of Asian Americans, particularly Filipino, Vietnamese, Thai, Korean, and South Asian Indian, joined Chinese and Japanese Americans in settling in earnest into all kinds of suburban neighborhoods. At the turn of the 21st century, a majority of Asian Americans resided in the suburbs, which also became the preferred gateway communities for new immigrants who often bypassed urban cores and moved straight to the suburbs when they arrived.
Entrance into highly racialized postwar suburbs defined by white middle-class norms and segregated white privilege did not, however, mean that Asian Americans gained entry or assimilated into whiteness. While many certainly aspired to and reinforced long-standing white suburban ideals, others revamped, contested, and outright fractured dominant notions of the suburban good life. By the 1980s Asian Americans of various ethnic and national backgrounds had transformed the sights, sounds, and smells of suburban landscapes throughout the country. They made claims on suburban space and asserted a “right to the suburb” through a range of social and cultural practices, often in physical places, especially shopping plazas, grocery stores, restaurants, religious centers, and schools. Yet as Asian Americans tried to become full-fledged participants in suburban culture and life, their presence, ethnic expressions, and ways of life sparked tensions with other mostly white suburbanites that led to heated debates over immigration, race, multiculturalism, and assimilation in American society.
The history of post-World War II Asian American suburban cultures highlights suburbia as a principal setting for Asian American experiences and the making of Asian American identity during the second half of the 20th century. More broadly, the Asian American experience reveals how control over the suburban ideal and the making of suburban space in the United States was and remains a contested, layered process. It also underscores the racial and ethnic diversification of metropolitan America and how pressing social, political, economic, and cultural issues in US society played out increasingly on the suburban stage. Moreover, Asian Americans built communities and social networks at precisely the moment when the authentic “American” community was supposedly in decline, providing a powerful counterpunch to those who blame nonwhite populations, particularly immigrants, for fracturing an otherwise unified American culture or sense of togetherness.
American cities expanded during the late 19th century, as industrial growth was fueled by the arrival of millions of immigrants and migrants. Poverty rates escalated, overwhelming existing networks of private charities. Progressive reformers created relief organizations and raised public awareness of urban poverty. The devastating effects of the Great Depression inspired greater focus on poverty from state and federal agencies. The Social Security Act, the greatest legacy of the New Deal, would provide a safety net for millions of Americans. During the postwar era of general prosperity, federal housing policies often reinforced and deepened racial and socioeconomic inequality and segregation. The 1960s War on Poverty created vital aid programs that expanded access to food, housing, and health care. These programs also prompted a rising tide of conservative backlash against perceived excesses. Fueled by such critical sentiments, the Reagan administration implemented dramatic cuts to assistance programs. Later, the Clinton administration further reformed welfare by tying aid to labor requirements. Throughout the 20th century, the urban homeless struggled to survive in hostile environments. Skid row areas housed the homeless for decades, providing shelter, food, and social interaction within districts that were rarely visited by the middle and upper classes. The loss of such spaces to urban renewal and gentrification in many cities left many of the homeless unsheltered and dislocated.
Robert G. Parkinson
According to David Ramsay, one of the first historians of the American Revolution, “in establishing American independence, the pen and press had merit equal to that of the sword.” Because notions of unity among the thirteen American colonies were unstable and fragile, print acted as a binding agent that reduced the chances that the colonies would fail to support one another when war with Britain broke out in 1775.
Two major types of print dealt with the political process of the American Revolution: pamphlets and newspapers. Pamphlets were one of the most important conveyors of ideas during the imperial crisis. Often written by elites under pseudonyms and published by booksellers, they have long been regarded by historians as the lifeblood of the American Revolution. There were also three dozen newspaper printers in the American mainland colonies at the start of the Revolution, each producing a four-page issue every week. These weekly papers, or one-sheet broadsides that appeared in American cities even more frequently, were the most important communication avenue to keep colonists informed of events hundreds of miles away. Because of the structure of the newspaper business in the 18th century, the stories that appeared in each paper were “exchanged” from other papers in different cities, creating a uniform effect akin to a modern news wire. The exchange system allowed for the same story to appear across North America, and it provided the Revolutionaries with a method to shore up that fragile sense of unity. It is difficult to imagine American independence—as a popular idea let alone a possible policy decision—without understanding how print worked in colonial America in the mid-18th century.
Steven A. Riess
Professional sports teams are athletic organizations comprising expert players hired by club owners whose revenues originally derived from admission fees charged to spectators watching games in enclosed ballparks or indoor arenas. Teams are usually members of a league that schedules a championship season, although independent teams can also arrange their own contests. The first professional baseball teams emerged in the East and Midwest in the 1860s, most notably the all-salaried, undefeated Cincinnati Red Stockings of 1869. The first league was the haphazardly organized National Association of Professional Base Ball Players (1871), supplanted five years later by the more profit-oriented National League (NL), which set up strict rules for franchise locations, financing, and management–employee relations (including a reserve clause in 1879, which bound players to their original employer) and barred African Americans after 1884. Once the NL prospered, rival major leagues also sprang up, notably the American Association in 1882 and the American League in 1901.
Major League Baseball (MLB) became a model for the professionalization of football, basketball, and hockey, which all had short-lived professional leagues around the turn of the century. The National Football League and the National Hockey League of the 1920s were underfinanced regional operations, and their teams often went out of business, while the National Basketball Association was not even organized until 1949.
Professional team sports gained considerable popularity after World War II. The leagues dealt with such problems as franchise relocations and nationwide expansion, conflicts with interlopers, limiting player salaries, and racial integration. The NFL became the most successful operation by securing rich national television contracts, supplanting baseball as the national pastime in the 1970s. All these leagues became lucrative investments. With the rise of “free agency,” professional team athletes became extremely well paid, currently averaging more than $2 million a year.
Maureen A. Flanagan
The decades from the 1890s into the 1920s produced reform movements in the United States that resulted in significant changes to the country’s social, political, cultural, and economic institutions. The impulse for reform emanated from a pervasive sense that the country’s democratic promise was failing. Political corruption seemed endemic at all levels of government. An unregulated capitalist industrial economy exploited workers and threatened to create a serious class divide, especially as the legal system protected the rights of business over labor. Mass urbanization was shifting the country from a rural, agricultural society to an urban, industrial one characterized by poverty, disease, crime, and cultural clash. Rapid technological advancements brought new, and often frightening, changes into daily life that left many people feeling that they had little control over their lives. Movements for socialism, woman suffrage, and rights for African Americans, immigrants, and workers belied the rhetoric of the United States as a just and equal democratic society for all its members.
Responding to the challenges presented by these problems, and fearful that without substantial change the country might experience class upheaval, groups of Americans proposed undertaking significant reforms. Underlying all proposed reforms was a desire to bring more justice and equality into a society that seemed increasingly to lack these ideals. Yet there was no agreement among these groups about the exact threat that confronted the nation, the means to resolve problems, or how to implement reforms. Despite this lack of agreement, all so-called Progressive reformers were modernizers. They sought to make the country’s democratic promise a reality by confronting its flaws and seeking solutions. All Progressivisms were seeking a via media, a middle way between relying on older ideas of 19th-century liberal capitalism and the more radical proposals to reform society through either social democracy or socialism. Despite differences among Progressives, the types of Progressivisms put forth, and the successes and failures of Progressivism, this reform era raised into national discourse debates over the nature and meaning of democracy, how and for whom a democratic society should work, and what it meant to be a forward-looking society. It also led to the implementation of an activist state.
Laura A. Belmonte
From the revolutionary era to the post-9/11 years, public and private actors have attempted to shape U.S. foreign relations by persuading mass audiences to embrace particular policies, people, and ways of life. Although the U.S. government conducted wartime propaganda activities prior to the 20th century, it had no official propaganda agency until the Committee on Public Information (CPI) was formed in 1917. For the next two years, CPI aimed to generate popular support for the United States and its allies in World War I. In 1938, as part of its Good Neighbor Policy, the Franklin Roosevelt administration launched official informational and cultural exchanges with Latin America. Following American entry into World War II, the U.S. government created a new propaganda agency, the Office of War Information (OWI). Like CPI, OWI was disbanded once hostilities ended. But in the fall of 1945, to combat the threats of anti-Americanism and communism, President Harry S. Truman broke with precedent and ordered the continuation of U.S. propaganda activities in peacetime. After several reorganizations within the Department of State, all U.S. cultural and information activities came under the purview of the newly created U.S. Information Agency (USIA) in 1953. Following the dissolution of USIA in 1999, the State Department reassumed authority over America’s international information and cultural programs through its Office of International Information Programs.
Commercialized sexuality became a prominent feature of American urban settings in the early 19th century, when young men migrated far from the watchful eyes of family as soldiers and laborers. Concentrated in large populations, and unable to afford the comforts of marriage, these men constituted a reliable pool of customers for women who sold sexual access to their bodies. These women turned to prostitution on a casual or steady basis as a survival strategy in a sex-segregated labor market that paid women perilously low wages, or in response to family disruptions such as paternal or spousal abandonment. Prostitution could be profitable, and it provided some women with a path toward economic independence, although it brought risks of venereal disease, addiction, violence, harassment by law enforcement, and unintended pregnancy. By mid-century most American cities tolerated red-light districts where brothels thrived as part of the urban sporting culture. Fears that white women were being coerced into prostitution led to the “white slavery” scare of the 1910s, spurring a concerted attack on brothels by progressive reformers. These reformers used the emergency of World War I to close public brothels, pushing America’s sex markets into clandestine spaces and empowering pimps’ control over women’s sexual labor. World War II raised concerns about soldiers’ venereal health that prompted the U.S. military to experiment with schemes for regulating prostitution that had been developed earlier during the Spanish–American War, as well as in the Philippines and Puerto Rico. After the war, the introduction of antibiotics and the celebration of marriage and family nudged prostitution into the margins of society, where women who sold sex were seen as psychologically deviant, yet men who purchased sex were thought to be sexually liberated.
The dawning of second-wave feminism gave birth to the sex workers’ rights movement and a new critique of the criminalization of prostitution. Nevertheless, attitudes about prostitution continue to divide activists, and sex workers still bear the brunt of criminalization.
It is virtually impossible to understand the history of the American experience without Protestantism. The theological and religious descendants of the Protestant Reformation arrived in the United States in the early 17th century, shaped American culture in the 18th century, grew dramatically in the 19th century, and continued to be the guardians of American religious life in the 20th century. Protestantism, of course, is not monolithic. In fact, the very idea at the heart of Protestantism—the translation of the Bible into vernacular languages so it can be read and interpreted by all men and women—has resulted in thousands of different denominations, all claiming to be true to the teachings of scripture.
Protestantism, with its emphasis on the belief that human beings can access God as individuals, flourished in a nation that celebrated democracy and freedom. During the period of British colonization, especially following the so-called Glorious Revolution of 1688, Protestantism went hand in hand with British concepts of political liberty. As the British people celebrated their rights-oriented philosophy of government and compared their freedoms with the tyranny of France and other absolute monarchies in Europe, they also extolled their religious freedom to read and interpret the Bible for themselves. Following the American Revolution, this historic connection between political liberty and Protestant liberty proved compatible with the kind of democratic individualism that emerged in the decades preceding the Civil War and that, in many respects, continues to define American political culture.
Protestantism, of course, is first and foremost a religious movement. The proliferation of Protestant denominations provides the best support for G. K. Chesterton’s quip that “America is a nation with the soul of a church.” Spiritual individualism, a commitment to the authority of an inspired Bible, and the idea that faith in the Christian gospel is all that is needed to be saved from eternal punishment have transformed the lives of millions of ordinary Americans over the course of the last four hundred years.