Spencer D. Bakich
The Persian Gulf War of 1990–1991 was something of a paradox. From the American perspective, the war had the hallmarks of a resounding victory. Responding to a flagrant case of interstate aggression by Iraq against Kuwait, the George H. W. Bush administration assembled a substantial international coalition to deter further Iraqi attacks against its neighbors in the Gulf and to compel Saddam Hussein to quit Kuwait without war. When the latter proved infeasible, the United States led that coalition in forcibly ousting Iraq’s military from Kuwait, substantially degrading Iraqi combat power in the process. The war’s outcome resulted from an auspiciously altered geopolitical landscape at the end of the Cold War, the overwhelming superiority of American power vis-à-vis Iraq, and a US decision-making process that tightly knitted military and diplomatic objectives into a coherent—and coherently executed—wartime strategy. However, America’s historically lopsided victory in the Persian Gulf War proved fleeting. Iraq’s surviving military forces retained the capacity to crush domestic challenges to the Ba’athist regime and to threaten its Gulf neighbors. President Bush’s vision of a post-war new world order notwithstanding, Gulf security depended heavily on continuing military missions years after the Persian Gulf War ended. Despite wartime tactical and strategic successes, grand strategic success eluded the United States in the years after the war.
In the years after the Civil War, Polish immigrants became an important part of the American working class. They actively participated in the labor movement and played key roles in various industrial strikes ranging from the 1877 Railroad Strike through the rise of the CIO and the post-1945 era of prosperity. Over time, the Polish American working class became acculturated and left its largely immigrant past behind while maintaining itself as an ethnic community. It also witnessed a good deal of upward mobility, especially over several generations. This ethnic community, however, continued to be refreshed with immigrants throughout the 20th century.
As with the larger American working class, Polish American workers were hard hit by changes in the industrial structure of the United States. Deindustrialization turned the centers of much of the Polish American community into the Rust Belt. This, despite a radical history, caused many to react by turning toward conservative causes in the late 20th and early 21st centuries.
The reproductive experiences of women and girls in the 20th-century United States followed historical patterns shaped by the politics of race and class. Laws and policies governing reproduction generally regarded white women as legitimate reproducers and potentially fit mothers and defined women of color as unfit for reproduction and motherhood; regulations provided for rewards and punishments accordingly. In addition, public policy and public rhetoric defined “population control” as the solution to a variety of social and political problems in the United States, including poverty, immigration, the “quality” of the population, environmental degradation, and “overpopulation.” Throughout the century, nonetheless, women, communities of color, and impoverished persons challenged official efforts, at times reducing or even eliminating barriers to reproductive freedom and community survival.
Between 1900 and 1930, decades marked by increasing urbanization, industrialization, and immigration, eugenic fears of “race suicide” (concerns that white women were not having enough babies) fueled a reproductive control regime that pressured middle-class white women to reproduce robustly. At the same time, the state enacted anti-immigrant laws, undermined the integrity of Native families, and protected various forms of racial segregation and white supremacy, all of which attacked the reproductive dignity of millions of women. Also in these decades, many African American women escaped the brutal and sexually predatory Jim Crow culture of the South, and middle-class white women gained greater sexual freedom and access to reproductive health care, including contraceptive services.
During the Great Depression, the government devised the Aid to Dependent Children program to provide destitute “worthy” white mothers with government aid while often denying such supports to women of color forced to subordinate their motherhood to agricultural and domestic labor. Following World War II, as the Civil Rights movement gathered form, focus, and adherents, and as African American and other women of color claimed their rights to motherhood and social provision, white policymakers railed against “welfare queens” and defined motherhood as a class privilege, suitable only for those who could afford to give their children “advantages.” The state, invoking the “population bomb,” fought to reduce the birth rates of poor women and women of color through sterilization and mandatory contraception, among other strategies. Between 1960 and 1980, white feminists employed the consumerist language of “choice” as part of the campaign for legalized abortion, even as Native, black, Latina, immigrant, and poor women struggled to secure the right to give birth to and raise their children with dignity and safety. The last decades of the 20th century saw severe cuts in social programs designed to aid low-income mothers and their children, cuts to funding for public education and housing, court decisions that dramatically reduced poor women’s access to reproductive health care including abortion, and the emergence of a powerful, often violent, anti-abortion movement. 
In response, in 1994 a group of women of color activists articulated the theory of reproductive justice, splicing together “social justice” and “reproductive rights.” The resulting Reproductive Justice movement, which would become increasingly influential in the 21st century, defined reproductive health, rights, and justice as human rights due to all persons and articulated what each individual requires to achieve these rights: the right not to have children, the right to have children, and the right to the social, economic, and environmental conditions necessary to raise children in healthy, peaceful, and sustainable households and communities.
Rosina A. Lozano
Language rights are an integral part of civil rights. They provide the tools that permit individuals to engage with and participate in society. The broad use of the Spanish language in the United States by both citizens and immigrants—it is the second-most-spoken language in the country by far—has a long history. Spanish was the first European governing language in parts of the future United States that included the Southwest, portions of the Louisiana Purchase, and Florida. The use of the language did not disappear when these regions became part of the United States, but rather persisted in some locales as a politically important language. In the 20th century, Spanish-speaking immigrants entered not just the Southwest and Florida, but also Chicago, New York, the South, Michigan, and other locales across the country in large numbers. Throughout the 20th century and into the 21st century, Spanish speakers and their advocates have reasserted their cultural preference by fighting for monolingual speakers’ right to use Spanish in legal settings, in public, as voters, as elected officials, at work, and in education. The politics of the Spanish language have only grown in importance as the largest influx of Spanish-speaking immigrants ever has entered the United States. This demographic shift makes the longer history of Spanish a crucial backstory for future language-policy decisions.
Andrew J. Falk
Americans in and out of government have relied on media and popular culture to construct the national identity, frame debates on military interventions, communicate core values abroad, and motivate citizens around the world to act in prescribed ways. During the late 19th century, as the United States emerged as a world power and expanded overseas, Americans adopted an ethos of worldliness in their everyday lives, even as some expressed worry about the nation’s position on war and peace. During the interwar period of the 1920s and 1930s, though America failed to join the League of Nations and retreated from foreign engagements, the nation also increased cultural interactions with the rest of the world through the export of motion pictures, music, consumer products, food, fashion, and sports. The policies and character of the Second World War were in part shaped by propaganda that evolved from earlier information campaigns. As the United States confronted communism during the Cold War, the government sanitized its cultural weapons to win the hearts and minds of Americans, allies, enemies, and nonaligned nations. But some cultural producers dissented from America’s “containment policy,” refashioned popular media for global audiences, and sparked a change in Washington’s cultural-diplomacy programs. An examination of popular culture also shows how people in the “Third World” deftly used the media to encourage superpower action. In the 21st century, activists and revolutionaries can be considered the inheritors of this tradition because they use social media to promote their political agendas. In short, understanding the roles popular culture played as America engaged the world greatly expands our understanding of modern American foreign relations.
The People’s (or Populist) Party represented the last major third-party effort to prevent the emergence of large-scale corporate capitalism in the United States. Founded in 1891, the party sought to unite the producers of wealth—farmers and workers—into a political coalition dedicated to breaking the hold of private bankers over the nation’s monetary system, controlling monopolies through government ownership, and opening up unused land to actual settlers. Industrial workers and their unions were initially wary of the new party, but things changed after the traumatic labor unrest of 1894: Coxey’s March, the nationwide coal strike, and the Pullman boycott. At that time, the American Federation of Labor (AFL) debated some form of alliance with the Populists. Although the Federation rejected such an alliance in both 1894 and 1895 by the slimmest of margins, it did elect a labor Populist—John McBride of the United Mine Workers of America (UMWA)—to the presidency in 1894. This Populist insurgency represents the closest that the main body of the nation’s labor movement ever came to forming a labor party resembling those that arose in industrialized Europe, and its failure helps explain why American workers were unable to mobilize politically to challenge the emerging economic order dominated by large corporate enterprises.
While the agrarian leaders of the People’s Party at first sought the backing of industrial workers, especially those associated with the AFL, they shunned labor’s support after the trauma of 1894. Party officials like Herman Taubeneck, James Weaver, and Tom Watson feared that labor’s support would taint the party with radicalism and violence, warned that trade unionists sought to control the party, and took steps designed to alienate industrial workers. They even justified their retreat from the broad-based Omaha Platform (1892) on the grounds that it would drive the trade unionists they called “socialists” from the party.
Courts and legislatures in colonial America and the early American republic developed and refined a power to compel civilians to assist peace and law enforcement officers in arresting wrongdoers, keeping the peace, and other matters of law enforcement. This power to command civilian cooperation was known as the posse comitatus or “power of the county.” Rooted in early modern English countryside law enforcement, the posse comitatus became an important police institution in 18th- and 19th-century America. The posse comitatus was typically composed of able-bodied white male civilians who were temporarily deputized to aid a sheriff or constable. But if this “power of the county” was insufficient, law enforcement officers were often authorized to call on the military to serve as the posse comitatus.
The posse comitatus proved particularly important in buttressing slavery in the American South. Slaveholders pushed for and especially benefited from laws that required citizens to assist in the recapture of local runaway slaves and fugitive slaves who crossed into states without slavery. Though slave patrols were rooted in the posse comitatus, the posse comitatus originated as a compulsory and noncompensated institution. Slaveholders in the American South later added financial incentives for those who acted in the place of a posse to recapture slaves on the run from their owners.
The widespread use of the posse comitatus in southern slave law became part of the national discussion about slavery during the early American republic as national lawmakers contemplated how to deal with the problem of fugitive slaves who fled to free states. This dialogue culminated with the Fugitive Slave Law of 1850, in which the US Congress authorized officials to “summon and call to their aid the bystanders, or posse comitatus” and declared that “all good citizens are hereby commanded to aid and assist in the prompt and efficient execution of this law, whenever their services may be required.” During Reconstruction, the Radical Republican Congress used the posse comitatus to enforce laws that targeted conquered Confederates. After the end of Reconstruction in 1877, Southern states pushed Congress to create what would come to be known as the “Posse Comitatus Act,” which prohibited the use of federal military forces for law enforcement. The history of the posse comitatus in early America is thus best understood as a story about and an example of the centralization of government authority and its ramifications.
Substantial numbers of Asian Americans and Asian immigrants moved into suburbs across the United States after World War II, bringing distinctive everyday lifeways, identities, worldviews, family types, and community norms that remade much of American suburbia. Although Asian Americans had been excluded from suburbs on racial grounds since the late 19th century, American Cold War objectives in Asia and the Pacific and domestic American civil rights struggles afforded Asian Americans, especially Chinese and Japanese Americans, increased access to suburban housing in the 1950s. Following passage of the Immigration Act of 1965 and the Fair Housing Act of 1968, new groups of Asian Americans, particularly Filipino, Vietnamese, Thai, Korean, and South Asian Indian, joined Chinese and Japanese Americans in settling in earnest into all kinds of suburban neighborhoods. At the turn of the 21st century, a majority of Asians resided in the suburbs, which also became the preferred gateway communities for new immigrants who often bypassed urban cores and moved straight to the suburbs when they arrived.
Entrance into highly racialized postwar suburbs defined by white middle-class norms and segregated white privilege did not, however, mean that Asian Americans gained entry or assimilated into whiteness. While many certainly aspired to and reinforced long-standing white suburban ideals, others revamped, contested, and outright fractured dominant notions of the suburban good life. By the 1980s Asian Americans of various ethnic and national backgrounds had transformed the sights, sounds, and smells of suburban landscapes throughout the country. They made claims on suburban space and asserted a “right to the suburb” through a range of social and cultural practices, often in physical places, especially shopping plazas, grocery stores, restaurants, religious centers, and schools. Yet as Asian Americans tried to become full-fledged participants in suburban culture and life, their presence, ethnic expressions, and ways of life sparked tensions with other mostly white suburbanites that led to heated debates over immigration, race, multiculturalism, and assimilation in American society.
The history of post-World War II Asian American suburban cultures highlights suburbia as a principal setting for Asian American experiences and the making of Asian American identity during the second half of the 20th century. More broadly, the Asian American experience reveals how control over the suburban ideal and the making of suburban space in the United States was and remains a contested, layered process. It also underscores the racial and ethnic diversification of metropolitan America and how pressing social, political, economic, and cultural issues in US society played out increasingly on the suburban stage. Moreover, Asian Americans built communities and social networks at precisely the moment when the authentic “American” community was supposedly in decline, providing a powerful counterpunch to those who blame nonwhite populations, particularly immigrants, for fracturing an otherwise unified American culture or sense of togetherness.
American cities expanded during the late 19th century, as industrial growth was fueled by the arrival of millions of immigrants and migrants. Poverty rates escalated, overwhelming existing networks of private charities. Progressive reformers created relief organizations and raised public awareness of urban poverty. The devastating effects of the Great Depression inspired greater focus on poverty from state and federal agencies. The Social Security Act, the greatest legacy of the New Deal, would provide a safety net for millions of Americans. During the postwar era of general prosperity, federal housing policies often reinforced and deepened racial and socioeconomic inequality and segregation. The 1960s War on Poverty created vital aid programs that expanded access to food, housing, and health care. These programs also prompted a rising tide of conservative backlash against perceived excesses. Fueled by such critical sentiments, the Reagan administration implemented dramatic cuts to assistance programs. Later, the Clinton administration further reformed welfare by tying aid to labor requirements. Throughout the 20th century, the urban homeless struggled to survive in hostile environments. Skid row areas housed the homeless for decades, providing shelter, food, and social interaction within districts that were rarely visited by the middle and upper classes. The loss of such spaces to urban renewal and gentrification in many cities left many of the homeless unsheltered and dislocated.
Robert G. Parkinson
According to David Ramsay, one of the first historians of the American Revolution, “in establishing American independence, the pen and press had merit equal to that of the sword.” Because notions of unity among the thirteen American colonies were unstable and fragile, print acted as a binding agent that reduced the risk that the colonies would fail to support one another when war with Britain broke out in 1775.
Two major types of print dealt with the political process of the American Revolution: pamphlets and newspapers. Pamphlets were one of the most important conveyors of ideas during the imperial crisis. Often written by elites under pseudonyms and published by booksellers, they have long been held by historians as the lifeblood of the American Revolution. There were also three dozen newspaper printers in the American mainland colonies at the start of the Revolution, each producing a four-page issue every week. These weekly papers, or one-sheet broadsides that appeared in American cities even more frequently, were the most important communication avenue to keep colonists informed of events hundreds of miles away. Because of the structure of the newspaper business in the 18th century, the stories that appeared in each paper were “exchanged” from other papers in different cities, creating a uniform effect akin to a modern news wire. The exchange system allowed for the same story to appear across North America, and it provided the Revolutionaries with a method to shore up that fragile sense of unity. It is difficult to imagine American independence—as a popular idea let alone a possible policy decision—without understanding how print worked in colonial America in the mid-18th century.
Steven A. Riess
Professional sports teams are athletic organizations comprising talented, expert players hired by club owners whose revenues originally derived from admission fees charged to spectators seeing games in enclosed ballparks or indoor arenas. Teams are usually members of a league that schedules a championship season, although independent teams also can arrange their own contests. The first professional baseball teams emerged in the East and Midwest in the 1860s, most notably the all-salaried undefeated Cincinnati Red Stockings of 1869. The first league was the haphazardly organized National Association of Professional Base Ball Players (1871), supplanted five years later by the more profit-oriented National League (NL), which set up strict rules for franchise locations, financing, and management–employee relations (including a reserve clause in 1879, which bound players to their original employer) and barred African Americans after 1884. Once the NL prospered, rival major leagues also sprang up, notably the American Association in 1882 and the American League in 1901.
Major League Baseball (MLB) became a model for the professionalization of football, basketball, and hockey, which all had short-lived professional leagues around the turn of the century. The National Football League and the National Hockey League of the 1920s were underfinanced regional operations, and their teams often went out of business, while the National Basketball Association was not even organized until 1949.
Professional team sports gained considerable popularity after World War II. The leagues dealt with such problems as franchise relocations and nationwide expansion, conflicts with interlopers, limiting player salaries, and racial integration. The NFL became the most successful operation by securing rich national television contracts, supplanting baseball as the national pastime in the 1970s. All these leagues became lucrative investments. With the rise of “free agency,” professional team athletes became extremely well paid, currently averaging more than $2 million a year.
Maureen A. Flanagan
The decades from the 1890s into the 1920s produced reform movements in the United States that resulted in significant changes to the country’s social, political, cultural, and economic institutions. The impulse for reform emanated from a pervasive sense that the country’s democratic promise was failing. Political corruption seemed endemic at all levels of government. An unregulated capitalist industrial economy exploited workers and threatened to create a serious class divide, especially as the legal system protected the rights of business over labor. Mass urbanization was shifting the country from a rural, agricultural society to an urban, industrial one characterized by poverty, disease, crime, and cultural clash. Rapid technological advancements brought new, and often frightening, changes into daily life that left many people feeling that they had little control over their lives. Movements for socialism, woman suffrage, and rights for African Americans, immigrants, and workers belied the rhetoric of the United States as a just and equal democratic society for all its members.
Responding to the challenges presented by these problems, and fearful that without substantial change the country might experience class upheaval, groups of Americans proposed undertaking significant reforms. Underlying all proposed reforms was a desire to bring more justice and equality into a society that seemed increasingly to lack these ideals. Yet there was no agreement among these groups about the exact threat that confronted the nation, the means to resolve problems, or how to implement reforms. Despite this lack of agreement, all so-called Progressive reformers were modernizers. They sought to make the country’s democratic promise a reality by confronting its flaws and seeking solutions. All Progressivisms were seeking a via media, a middle way between relying on older ideas of 19th-century liberal capitalism and the more radical proposals to reform society through either social democracy or socialism. Despite differences among Progressives, the types of Progressivisms put forth, and the successes and failures of Progressivism, this reform era raised into national discourse debates over the nature and meaning of democracy, how and for whom a democratic society should work, and what it meant to be a forward-looking society. It also led to the implementation of an activist state.
Laura A. Belmonte
From the revolutionary era to the post-9/11 years, public and private actors have attempted to shape U.S. foreign relations by persuading mass audiences to embrace particular policies, people, and ways of life. Although the U.S. government conducted wartime propaganda activities prior to the 20th century, it had no official propaganda agency until the Committee on Public Information (CPI) was formed in 1917. For the next two years, CPI aimed to generate popular support for the United States and its allies in World War I. In 1938, as part of its Good Neighbor Policy, the Franklin Roosevelt administration launched official informational and cultural exchanges with Latin America. Following American entry into World War II, the U.S. government created a new propaganda agency, the Office of War Information (OWI). Like CPI, OWI was disbanded once hostilities ended. But in the fall of 1945, to combat the threats of anti-Americanism and communism, President Harry S. Truman broke with precedent and ordered the continuation of U.S. propaganda activities in peacetime. After several reorganizations within the Department of State, all U.S. cultural and information activities came under the purview of the newly created U.S. Information Agency (USIA) in 1953. Following the dissolution of USIA in 1999, the State Department reassumed authority over America’s international information and cultural programs through its Office of International Information Programs.
Commercialized sexuality became a prominent feature of American urban settings in the early 19th century when young men migrated far from the watchful eyes of family as soldiers and laborers. Concentrated in large populations, and unable to afford the comforts of marriage, these men constituted a reliable pool of customers for women who sold sexual access to their bodies. These women turned to prostitution on a casual or steady basis as a survival strategy in a sex-segregated labor market that paid women perilously low wages, or in response to family disruptions such as paternal or spousal abandonment. Prostitution could be profitable, and it provided some women with a path toward economic independence, although it brought risks of venereal disease, addiction, violence, harassment by law enforcement, and unintended pregnancy. By mid-century most American cities tolerated red-light districts where brothels thrived as part of the urban sporting culture. Fears that white women were being coerced into prostitution led to the “white slavery” scare of the 1910s, spurring a concerted attack on brothels by progressive reformers. These reformers used the emergency of World War I to close public brothels, pushing America’s sex markets into clandestine spaces and empowering pimps’ control over women’s sexual labor. World War II raised concerns about soldiers’ venereal health that prompted the US military to experiment with different schemes for regulating prostitution that had been developed earlier during the Spanish–American War, as well as in the Philippines and Puerto Rico. After the war, the introduction of antibiotics and the celebration of marriage and family nudged prostitution into the margins of society, where women who sold sex were seen as psychologically deviant, yet men who purchased sex were thought to be sexually liberated.
The dawning of second-wave feminism gave birth to the sex workers’ rights movement and a new critique of the criminalization of prostitution. Nevertheless, attitudes about prostitution continue to divide activists, and sex workers still bear the brunt of criminalization.
It is virtually impossible to understand the history of the American experience without Protestantism. The theological and religious descendants of the Protestant Reformation arrived in the United States in the early 17th century, shaped American culture in the 18th century, grew dramatically in the 19th century, and continued to be the guardians of American religious life in the 20th century. Protestantism, of course, is not monolithic. In fact, the very idea at the heart of Protestantism—the translation of the Bible into vernacular languages so it can be read and interpreted by all men and women—has resulted in thousands of different denominations, all claiming to be true to the teachings of scripture.
Protestantism, with its emphasis on the belief that human beings can access God as individuals, flourished in a nation that celebrated democracy and freedom. During the period of British colonization, especially following the so-called Glorious Revolution of 1688, Protestantism went hand in hand with British concepts of political liberty. As the British people celebrated their rights-oriented philosophy of government and compared their freedoms with the tyranny of France and other absolute monarchies in Europe, they also extolled the religious freedom that they had to read and interpret the Bible for themselves. Following the American Revolution, this historic connection between political liberty and Protestant liberty proved to be compatible with the kind of democratic individualism that emerged in the decades preceding the Civil War and, in many respects, continues to define American political culture.
Protestantism, of course, is first and foremost a religious movement. The proliferation of Protestant denominations provides the best support for G. K. Chesterton’s quip that “America is a nation with the soul of a church.” Spiritual individualism, a commitment to the authority of an inspired Bible, and the idea that faith in the Christian gospel is all that is needed to be saved from eternal punishment have transformed the lives of millions and millions of ordinary Americans over the course of the last four hundred years.
Public authorities are agencies created by governments to engage directly in the economy for public purposes. They differ from standard agencies in that they operate outside the administrative framework of democratically accountable government. Since they generate their own operating income by charging users for goods and services and borrow for capital expenses based on projections of future revenues, they can avoid the input from voters and the regulations that control public agencies funded by tax revenues.
Institutions built on the public authority model exist at all levels of government and in every state. A few of these enterprises, such as the Tennessee Valley Authority and the Port Authority of New York and New Jersey, are well known. Thousands more toil in relative obscurity, operating toll roads and bridges, airports, transit systems, cargo ports, entertainment venues, sewer and water systems, and even parking garages. Despite their ubiquity, these agencies are not well understood. Many release little information about their internal operations. It is not even possible to say conclusively how many exist, since experts disagree about how to define them, and states do not systematically track them.
One thing we do know about public authorities is that, over the course of the 20th century, these institutions became a major component of American governance. Immediately following the Second World War, they played a minor role in public finance. But by the early 21st century, borrowing by authorities constituted well over half of all public borrowing at the sub-federal level. This change means that, increasingly, the leaders of these entities, rather than elected officials, make key decisions about where and how to build public infrastructure and steer economic development in the United States.
D. Bradford Hunt
Public housing emerged during the New Deal as a progressive effort to end the scourge of dilapidated housing in American cities. Reformers argued that the private market had failed to provide decent, safe, and affordable housing, and they convinced Congress to provide deep subsidies to local housing authorities to build and manage modern, low-cost housing projects for the working poor. Well-intentioned but ultimately misguided policy decisions encouraged large-scale developments, concentrated poverty and youth, and starved public housing of needed resources. Further, the antipathy of private interests to public competition and the visceral resistance of white Americans to racial integration saddled public housing with many enemies and few friends. While residents often formed tight communities and fought for improvements, stigmatization and neglect undermined the success of many projects; a sizable fraction became disgraceful and tangible symbols of systemic racism toward the nation’s African American poor. Federal policy had few answers and retreated in the 1960s, eventually making a neoliberal turn to embrace public-private partnerships for delivering affordable housing. Housing vouchers and tax credits effectively displaced the federal public housing program. In the 1990s, the Clinton administration encouraged the demolition and rebuilding of troubled projects using vernacular “New Urbanist” designs to house “mixed-income” populations. Policy problems, political weakness, and an ideology of homeownership in the United States meant that a robust, public-centered program of housing for use rather than profit could not be sustained.
Adam M. Sowards
For more than a century after the republic’s founding in the 1780s, American law reflected the ideal that the commons—the public domain—should be turned into private property. As Americans became concerned about resource scarcity, waste, and monopolies at the end of the 19th century, reform-minded bureaucrats and scientists convinced Congress to maintain in perpetuity some of the nation’s land as public. This shift offered a measure of protection and an alternative to private property regimes. The federal agencies that primarily manage these lands today—U.S. Forest Service (USFS), National Park Service (NPS), U.S. Fish and Wildlife Service (USFWS), and Bureau of Land Management (BLM)—have worked since their origins in the early decades of the 20th century to fulfill their diverse, competing, evolving missions. Meanwhile, the public and Congress have continually demanded new and different goals as the land itself has functioned and responded in interdependent ways. In the mid-20th century, the agencies intensified their management, hoping they could satisfy the rising—and often conflicting—demands American citizens placed on the public lands. This intensification often worsened public lands’ ecology and increased political conflict, resulting in a series of new laws in the 1960s and 1970s. Those laws strengthened the role of science and the public in influencing agency practices while providing more opportunities for litigation. Predictably, since the late 1970s, these developments have polarized public lands’ politics. The economies, but also the identities, of many Americans remain entwined with the public lands, making political standoffs—over endangered species, oil production, privatizing land, and more—common and increasingly intractable. Because the public lands are national in scope but used by local people for all manner of economic and recreational activities, they have been and remain microcosms of the federal democratic system and all its conflicted nature.
Nicholas J. Cull
Public opinion has been part of US foreign relations in two key ways. As one would expect in a democracy, the American public has shaped the foreign policy of its government. No less significantly, the United States has sought to influence foreign public opinion as a tool of its diplomacy, now known as public diplomacy. The US public has also been a target of foreign attempts at influence with varying degrees of success. While analysis across the span of US history reveals a continuity of issues and approaches, issues of public opinion gained unprecedented salience in the second decade of the 21st century. This salience was not matched by scholarship.
Joseph E. Hower
Government employees are an essential part of the early-21st-century labor movement in the United States. Teachers, firefighters, and police officers are among the most heavily unionized occupations in America, but public-sector union members also include street cleaners and nurses, janitors and librarians, zookeepers and engineers. Despite cultural stereotypes that continue to associate unions with steel or auto workers, public employees are five times more likely to be members of unions than workers in private industry. Today, nearly half of all union members work for federal, state, or local governments.
It was not always so. Despite a long, rich history of workplace and ballot box activism, government workers were marginal to the broader labor movement until the second half of the 20th century. Excluded from the legal breakthroughs that reshaped American industry in the 1930s, government workers lacked the basic organizing and bargaining rights extended to their private-sector counterparts. A complicated, and sometimes convoluted, combination of discourse and doctrine held that government employees were, as union leader Jerry Wurf later put it, a “servant to a master” rather than “a worker with a boss.” Inspired by the material success of workers in mass industry and moved by the moral clarity of the Black Freedom struggle, government workers demanded an end to their second-class status through one of the most consequential, and least recognized, social movements of the late 20th century. Yet their success at improving the pay, benefits, and conditions of government work also increased the cost of government services, imposing new obligations at a time of dramatic change in the global economy. In the resulting crunch, unionized public workers came under political pressure, particularly from fiscal conservatives who charged that their bargaining rights and political power were incompatible with a new age of austerity and limits.