Kathleen A. Brosnan and Jacob Blackwell
Throughout history, food needs have bonded humans to nature. The transition to agriculture constituted a slow but revolutionary ecological transformation. After 1500
Spanning countries across the globe, the antinuclear movement was the combined effort of millions of people to challenge the superpowers’ reliance on nuclear weapons during the Cold War. Encompassing an array of tactics, from radical dissent to public protest to opposition within the government, this movement succeeded in constraining the arms race and helping to make the use of nuclear weapons politically unacceptable. Antinuclear activists were critical to the establishment of arms control treaties, although they failed to achieve the abolition of nuclear weapons, as anticommunists, national security officials, and proponents of nuclear deterrence within the United States and Soviet Union actively opposed the movement. Opposition to nuclear weapons evolved in tandem with the Cold War and the arms race, leading to a rapid decline in antinuclear activism after the Cold War ended.
From its inception as a nation in 1789, the United States has engaged in an environmental diplomacy that has included attempts to gain control of resources, as well as formal diplomatic efforts to regulate the use of resources shared with other nations and peoples. American environmental diplomacy has sought to gain control of natural resources, to conserve those resources for the future, and to protect environmental amenities from destruction. As an acquirer of natural resources, the United States has focused on arable land as well as on ocean fisheries, although around 1900, the focus on ocean fisheries turned into a desire to conserve marine resources from unregulated harvesting.
The main 20th-century U.S. goal was to extend beyond its borders its Progressive-era desire to utilize resources efficiently, meaning the greatest good for the greatest number for the longest time. For most of the 20th century, the United States was the leader in promoting global environmental protection through the best science, especially emphasizing wildlife. Near the end of the century, U.S. government science policy was increasingly out of step with global environmental thinking, and the United States often found itself on the outside. Most notably, the attempts to address climate change moved ahead with almost every country in the world except the United States.
While a few monographs focus squarely on environmental diplomacy, it is safe to say that historians have not come close to tapping the potential of the intersection of the environmental and diplomatic history of the United States.
Foreign economic policy involves the mediation and management of economic flows across borders. Over two and a half centuries, the context for U.S. foreign economic policy has transformed. Once a fledgling republic on the periphery of the world economy, the United States has become the world’s largest economy, the arbiter of international economic order, and a predominant influence on the global economy. Throughout this transformation, the making of foreign economic policy has entailed delicate tradeoffs between diverse interests—political and material, foreign and domestic, sectional and sectoral, and so on. Ideas and beliefs have also shaped U.S. foreign economic policy—from Enlightenment-era convictions about the pacifying effects of international commerce to late 20th-century convictions about the efficacy of free markets.
The Soviet Union’s successful launch of the first artificial satellite Sputnik 1 on October 4, 1957, captured global attention and achieved the initial victory in what would soon become known as the space race. This impressive technological feat and its broader implications for Soviet missile capability rattled the confidence of the American public and challenged the credibility of U.S. leadership abroad. With the U.S.S.R.’s launch of Sputnik, and then later the first human spaceflight in 1961, U.S. policymakers feared that the public and political leaders around the world would view communism as a viable and even more dynamic alternative to capitalism, tilting the global balance of power away from the United States and towards the Soviet Union.
Reactions to Sputnik confirmed what members of the U.S. National Security Council had predicted: the image of scientific and technological superiority had very real, far-reaching geopolitical consequences. By signaling Soviet technological and military prowess, Sputnik solidified the link between space exploration and national prestige, setting a course for nationally funded space exploration for years to come. For over a decade, both the Soviet Union and the United States funneled significant financial and personnel resources into achieving impressive firsts in space, as part of a larger effort to win alliances in the Cold War contest for global influence.
From a U.S. vantage point, the space race culminated in the first Moon landing in July 1969. In 1961, President John F. Kennedy proposed Project Apollo, a lunar exploration program, as a tactic for restoring U.S. prestige in the wake of Soviet cosmonaut Yuri Gagarin’s spaceflight and the failure of the Bay of Pigs invasion. To achieve Kennedy’s goal of sending a man to the Moon and returning him safely back to Earth by the end of the decade, the United States mobilized a workforce in the hundreds of thousands. Project Apollo became the most expensive government-funded civilian engineering program in U.S. history, at one point stretching to more than 4 percent of the federal budget. The United States’ substantial investment in winning the space race reveals the significant status of soft power in American foreign policy strategy during the Cold War.
Early 20th-century American labor and working-class history is a subfield of American social history that focuses attention on the complex lives of working people in a rapidly changing global political and economic system. Once focused closely on institutional dynamics in the workplace and electoral politics, labor history has expanded and refined its approach to include questions about the families, communities, identities, and cultures workers have developed over time. With a critical eye on the limits of liberal capitalism and democracy for workers’ welfare, labor historians explore individual and collective struggles against exclusion from opportunity, as well as accommodation to political and economic contexts defined by rapid and volatile growth and deep inequality.
Particularly important are the ways that workers both defined and were defined by differences of race, gender, ethnicity, class, and place. Individual workers and organized groups of working Americans both transformed and were transformed by the main struggles of the industrial era, including conflicts over the place of former slaves and their descendants in the United States, mass immigration and migrations, technological change, new management and business models, the development of a consumer economy, the rise of a more active federal government, and the evolution of popular culture.
The period between 1896 and 1945 saw a crucial transition in the labor and working-class history of the United States. At its outset, Americans were working many more hours a day than the eight for which they had fought hard in the late 19th century. On average, Americans labored fifty-four to sixty-three hours per week in dangerous working conditions (approximately 35,000 workers died in accidents annually at the turn of the century). By 1920, half of all Americans lived in growing urban neighborhoods, and for many of them chronic unemployment, poverty, and deep social divides had become a regular part of life. Workers had little power in either the Democratic or Republican party. They faced a legal system that gave them no rights at work but the right to quit, judges who took the side of employers in the labor market by issuing thousands of injunctions against even nonviolent workers’ organizing, and vigilantes and police forces that did not hesitate to repress dissent violently. The ranks of organized labor were shrinking in the years before the economy began to recover in 1897. Dreams of a more democratic alternative to wage labor and corporate-dominated capitalism had been all but destroyed. Workers struggled to find their place in an emerging consumer-oriented culture that assumed everyone ought to strive for the often unattainable, and not necessarily desirable, marks of middle-class respectability.
Yet American labor emerged from World War II with the main sectors of the industrial economy organized, with greater earning potential than any previous generation of American workers, and with unprecedented power as an organized interest group that could appeal to the federal government to promote its welfare. Though American workers as a whole had made no grand challenge to the nation’s basic corporate-centered political economy in the preceding four and one-half decades, they entered the postwar world with a greater level of power, and a bigger share in the proceeds of a booming economy, than anyone could have imagined in 1896. The labor and working-class history of the United States between 1900 and 1945, then, is the story of how working-class individuals, families, and communities—members of an extremely diverse American working class—managed to carve out positions of political, economic, and cultural influence, even as they remained divided among themselves, dependent upon corporate power, and increasingly invested in an individualistic, competitive, acquisitive culture.
Gregory A. Daddis
For nearly a decade, American combat soldiers fought in South Vietnam to help sustain an independent, noncommunist nation in Southeast Asia. After U.S. troops departed in 1973, the collapse of South Vietnam in 1975 prompted a lasting search to explain the United States’ first lost war. Historians of the conflict and participants alike have since critiqued the ways in which civilian policymakers and uniformed leaders applied—some argued misapplied—military power that led to such an undesirable political outcome. While some claimed U.S. politicians failed to commit their nation’s full military might to a limited war, others contended that most officers fundamentally misunderstood the nature of the war they were fighting. Still others argued “winning” was essentially impossible given the true nature of a struggle over Vietnamese national identity in the postcolonial era. On their own, none of these arguments fully satisfy. Contemporary policymakers clearly understood the difficulties of waging a war in Southeast Asia against an enemy committed to national liberation. Yet the faith of these Americans in their power to resolve deep-seated local and regional sociopolitical problems eclipsed the possibility there might be limits to that power. By asking military strategists to simultaneously fight a war and build a nation, senior U.S. policymakers had asked too much of those crafting military strategy to deliver on overly ambitious political objectives. In the end, the Vietnam War exposed the limits of what American military power could achieve in the Cold War era.
On January 5, 2014—the fiftieth anniversary of President Lyndon Johnson’s launch of the War on Poverty—the New York Times asked a panel of opinion leaders a simple question: “Does the U.S. Need Another War on Poverty?” While the answers varied, all the invited debaters accepted the martial premise of the question—that a war on poverty had been fought and that eliminating poverty was, without a doubt, a “fight,” or a “battle.”
Yet the debate over the manner—martial or not—by which the federal government and public policy has dealt with the issue of poverty in the United States is still very much an open-ended one.
The evolution and development of the postwar American welfare state is a story not only of a number of “wars,” or individual political initiatives, against poverty, but also about the growth of institutions within and outside government that seek to address, alleviate, and eliminate poverty and its concomitant social ills. It is a complex and at times messy story, interwoven with the wider historical trajectory of this period: civil rights, the rise and fall of a “Cold War consensus,” the emergence of a counterculture, the Vietnam War, the credibility gap, the rise of conservatism, the end of “welfare,” and the emergence of compassionate conservatism. Mirroring the broader organization of the American political system, with a relatively weak center of power and delegated authority and decision-making in fifty states, the welfare model has developed and grown over decades. Policies viewed in one era as unmitigated failures have instead over time evolved and become part of the fabric of the welfare state.
Daryl Joji Maeda
The Asian American Movement was a social movement for racial justice, most active during the late 1960s through the mid-1970s, which brought together people of various Asian ancestries in the United States who protested against racism and U.S. neo-imperialism, demanded changes in institutions such as colleges and universities, organized workers, and sought to provide social services such as housing, food, and healthcare to poor people. As one of its signal achievements, the Movement created the category “Asian American” (coined by historian and activist Yuji Ichioka), which encompasses the multiple Asian ethnic groups who have migrated to the United States. Its founding principle of coalitional politics emphasizes solidarity among Asians of all ethnicities, multiracial solidarity among Asian Americans as well as with African, Latino, and Native Americans in the United States, and transnational solidarity with peoples around the globe impacted by U.S. militarism.
The movement participated in solidarity work with other Third World peoples in the United States, including the Third World Liberation Front strikes at San Francisco State College and University of California, Berkeley. The Movement fought for housing rights for poor people in the urban cores of San Francisco, Los Angeles, New York City, Seattle, and Philadelphia; it created arts collectives, published newspapers and magazines, and protested vigorously against the Vietnam War. It also extended to Honolulu, where activists sought to preserve land rights in rural Hawai’i. It contributed to the larger radical movement for power and justice that critiqued capitalism and neo-imperialism, which flourished during the 1960s and 1970s.
Thomas J. Sugrue
Racism in the United States has long been a national problem, not a regional phenomenon. The long and well-documented history of slavery, Jim Crow laws, and racial violence in the South overshadows the persistent reality of racial discrimination, systemic segregation, and entrenched inequality north of the Mason-Dixon line. From the mid-19th century forward, African Americans and their allies mounted a series of challenges to racially separate schools, segregated public accommodations, racially divided workplaces, endemic housing segregation, and discriminatory policing. The northern civil rights movement expanded dramatically in the aftermath of the Great Migration of blacks northward and the intensification of segregation in northern hotels, restaurants, and theaters, workplaces, housing markets, and schools in the early 20th century. During the Great Depression and World War II, emboldened civil rights organizations engaged in protest, litigation, and lobbying efforts to undermine persistent racial discrimination and segregation. Their efforts resulted in legal and legislative victories against racially separate and unequal institutions, particularly workplaces and stores. But segregated housing and schools remained more impervious to change. By the 1960s, many black activists in the North grew frustrated with the pace of change, even as they succeeded in increasing black representation in elected office, in higher education, and in certain sectors of the economy. In the late 20th century, civil rights activists launched efforts to fight the ongoing problem of police brutality and the rise of the prison-industrial complex. And they pushed, mostly through the courts, for the protection of the fragile gains of the civil rights era. The black freedom struggle in the North remained incomplete in the face of ongoing segregation, persistent racism, and ongoing racial inequality in employment, education, income, and wealth.
Cambodians entered the United States as refugees after a group of Cambodian Communists known as the Khmer Rouge, led by the French-educated Pol Pot, won a civil war that had raged from March 1970 to April 1975 and proceeded to rule the country with extraordinary brutality. In power from April 17, 1975, to January 7, 1979, they destroyed all the major institutions in the country. An estimated 1.7 million people out of an estimated total population of 7.9 million died from executions, hunger, disease, injuries, coerced labor, and exposure to the elements. The refuge-seekers came in three waves: (1) just before the Khmer Rouge takeover, (2) during the regime’s existence, and (3) after the regime was overthrown. Some former Khmer Rouge personnel, who had escaped to Vietnam because they opposed Pol Pot’s extremist ideology and savage practices, returned in late December 1978, accompanied by 120,000 Vietnamese troops, to topple the government of their former comrades. A second civil war then erupted along the Thai-Cambodian border pitting the rump Khmer Rouge against two groups of non-communist combatants. Though fighting among themselves, all three groups opposed the new Cambodian government that was supported and controlled by Vietnam. When hundreds of thousands of Cambodians, along with Laotians and Vietnamese, showed up at the Thai-Cambodian border to seek refuge in Thailand, the Thai government and military did not welcome them. Thailand treated the Cambodians especially harshly for reasons related to the Thai officials’ concerns about the internal security of their country.
Almost 158,000 Cambodians gained entry into the United States between 1975 and 1994, mainly as refugees but with a smaller number as immigrants and “humanitarian parolees.” Cambodian ethnic communities sprang up on American soil, many of them in locations chosen by the U.S. Office of Refugee Resettlement. By the time the 1990 U.S. census was taken, Cambodians could be found in all fifty states. The refugees encountered enormous difficulties adapting to life in the United States. Only about 5 percent of them, mostly educated people from the first wave of refugees who came in 1975 and who, therefore, did not experience the atrocities of the Khmer Rouge era, managed to find white-collar jobs, often serving as intermediaries between their compatriots and the larger American society. About 40 to 50 percent of the Cambodian newcomers who arrived in the second and third waves found employment in blue-collar occupations. The rest of the population has relied on welfare and other forms of public assistance. A significant portion of this last group is composed of households headed by women whose fathers, husbands, or sons the Khmer Rouge had killed. It is they who have had to struggle the hardest to keep themselves and their children alive. Many women had to learn to become the main breadwinners in their families even though they had never engaged in wage labor in their homeland. Large numbers of refugees have suffered from post-traumatic stress disorder but have received very little help to deal with the symptoms. Some children, lacking role models, have not done well academically and dropped out of school. Others have joined gangs. Despite myriad difficulties, Cambodians in the United States are determined to resuscitate their social institutions and culture that the Khmer Rouge had tried to destroy during their reign of terror.
By reviving Cambodian classical dance, music, and other performing and visual arts, and by rebuilding institutions, particularly Buddhist temples, they are trying valiantly to transcend the tragedies that befell them in order to survive as a people and a culture.
In September 1962, the National Farm Workers Association (NFWA) held its first convention in Fresno, California, initiating a multiracial movement that would result in the creation of the United Farm Workers (UFW) and the first contracts for farm workers in the state of California. Led by Cesar Chavez, the union contributed a number of innovations to the art of social protest, including the most successful consumer boycott in the history of the United States. Chavez welcomed contributions from numerous ethnic and racial groups, men and women, young and old. For a time, the UFW was the realization of Martin Luther King Jr.’s beloved community—people from different backgrounds coming together to create a socially just world. During the 1970s, Chavez struggled to maintain the momentum created by the boycott as the state of California became more involved in adjudicating labor disputes under the California Agricultural Labor Relations Act (ALRA). Although Chavez and the UFW ultimately failed to establish a permanent, national union, their successes and strategies continue to influence movements for farm worker justice today.
Carol L. Higham
Comparing Catholic and Protestant missionaries in North America can be a herculean task. It means comparing many religious groups, at least five governments, and hundreds of groups of Indians. But missions to the Indians played important roles in social, cultural, and political changes for Indians, Europeans, and Americans from the very beginning of contact in the 1500s to the present. By comparing Catholic and Protestant missions to the Indians, this article provides a better understanding of the relationship between these movements and their functions in the history of borders and frontiers, including how the missions changed both European and Indian cultures.
Claudrena N. Harold
The civil rights movement in the urban South transformed the political, economic, and cultural landscape of post–World War II America. Between 1955 and 1968, African Americans and their white allies relied on nonviolent direct action, political lobbying, litigation, and economic boycotts to dismantle the Jim Crow system. Not all but many of the movement’s most decisive political battles occurred in the cities of Montgomery and Birmingham, Alabama; Nashville and Memphis, Tennessee; Greensboro and Durham, North Carolina; and Atlanta, Georgia. In these and other urban centers, civil rights activists launched full-throttle campaigns against white supremacy, economic exploitation, and state-sanctioned violence against African Americans. Their fight for racial justice coincided with monumental changes in the urban South as the upsurge in federal spending in the region created unprecedented levels of economic prosperity in the newly forged “Sunbelt.”
A dynamic and multifaceted movement that encompassed a wide range of political organizations and perspectives, the black freedom struggle proved successful in dismantling legal segregation. The passage of the Civil Rights Act of 1964 and the Voting Rights Act of 1965 expanded black southerners’ economic, political, and educational opportunities. And yet, many African Americans continued to struggle as they confronted not just the long-term effects of racial discrimination and exclusion but also the new challenges engendered by deindustrialization and urban renewal as well as entrenched patterns of racial segregation in the public-school system.
The issue of genocide and American Indian history has been contentious. Many writers see the massive depopulation of the indigenous population of the Americas after 1492 as a clear-cut case of genocide. Other writers, however, contend that European and U.S. actions toward Indians were deplorable but were rarely if ever genocidal. To a significant extent, disagreements about the pervasiveness of genocide in the history of the post-Columbian Western Hemisphere, in general, and U.S. history, in particular, pivot on definitions of genocide. Conservative definitions emphasize intentional actions and policies of governments that result in very large population losses, usually from direct killing. More liberal definitions call for less stringent criteria for intent, focusing more on outcomes. They do not necessarily require direct sanction by state authorities; rather, they identify societal forces and actors. They also allow for several intersecting forces of destruction, including dispossession and disease. Because debates about genocide easily devolve into quarrels about definitions, an open-ended approach to the question of genocide that explores several phases and events provides the possibility of moving beyond the present stalemate. However one resolves the question of genocide in American Indian history, it is important to recognize that European and U.S. settler colonial projects unleashed massively destructive forces on Native peoples and communities. These include violence resulting directly from settler expansion, intertribal violence (frequently aggravated by colonial intrusions), enslavement, disease, alcohol, loss of land and resources, forced removals, and assaults on tribal religion, culture, and language. The configuration and impact of these forces varied considerably in different times and places according to the goals of particular colonial projects and the capacities of colonial societies and institutions to pursue them.
The capacity of Native people and communities to directly resist, blunt, or evade colonial invasions proved equally important.
Timothy S. Huebner
The Supreme Court of the United States stands at the head of the nation’s judicial system. Created in Article III of the Constitution of 1787 but obscured by the other branches of government during the first few decades of its history, the Court came into its own as a co-equal branch in the early 19th century. Its exercise of judicial review—the power that it claimed to determine the constitutionality of legislative acts—gave the Court a unique status as the final arbiter of the nation’s constitutional conflicts. From the slavery question during the antebellum era to abortion and gay rights in more recent times, the Court has decided cases brought to it by individual litigants, and in doing so has shaped American constitutional and legal development. Because the Court is composed of unelected justices who serve “during good behavior,” its rise in stature has not gone uncontested. Throughout the nation’s history, Congress, the president, and organized interest groups have all attempted to influence the Court’s jurisdiction, composition, and decision making. The Court’s prominence reflects Americans’ historically paradoxical attitudes toward the judiciary: they have often been suspicious of the power of unelected judges at the same time that they have relied on independent judicial institutions to resolve their deepest disputes.
Sean P. Harvey
“Race,” as a concept denoting a fundamental division of humanity and usually encompassing cultural as well as physical traits, was crucial in early America. It provided the foundation for the colonization of Native land, the enslavement of American Indians and Africans, and a common identity among socially unequal and ethnically diverse Europeans. Longstanding ideas and prejudices merged with aims to control land and labor, a dynamic reinforced by ongoing observation and theorization of non-European peoples. Although before colonization, neither American Indians, nor Africans, nor Europeans considered themselves unified “races,” Europeans endowed racial distinctions with legal force and philosophical and scientific legitimacy, while Natives appropriated categories of “red” and “Indian,” and slaves and freed people embraced those of “African” and “colored,” to imagine more expansive identities and mobilize more successful resistance to Euro-American societies. The origin, scope, and significance of “racial” difference were questions of considerable transatlantic debate in the age of Enlightenment and they acquired particular political importance in the newly independent United States.
Since the beginning of European exploration in the 15th century, voyagers called attention to the peoples they encountered, but European, American Indian, and African “races” did not exist before colonization of the so-called New World. Categories of “Christian” and “heathen” were initially most prominent, though observations also encompassed appearance, gender roles, strength, material culture, subsistence, and language. As economic interests deepened and colonies grew more powerful, classifications distinguished Europeans from “Negroes” or “Indians,” but at no point in the history of early America was there a consensus that “race” denoted bodily traits only. Rather, it was a heterogeneous compound of physical, intellectual, and moral characteristics passed on from one generation to another. While Europeans assigned blackness and African descent priority in codifying slavery, skin color was secondary to broad dismissals of the value of “savage” societies, beliefs, and behaviors in providing a legal foundation for dispossession.
“Race” originally denoted a lineage, such as a noble family or a domesticated breed, and concerns over purity of blood persisted as 18th-century Europeans applied the term—which dodged the controversial issue of whether different human groups constituted “varieties” or “species”—to describe a roughly continental distribution of peoples. Drawing upon the frameworks of scripture, natural and moral philosophy, and natural history, scholars endlessly debated whether different races shared a common ancestry, whether traits were fixed or susceptible to environmentally produced change, and whether languages or the body provided the best means to trace descent. Racial theorization boomed in the U.S. early republic, as some citizens found dispossession and slavery incompatible with natural-rights ideals, while others reconciled any potential contradictions through assurances that “race” was rooted in nature.
The Immigration Act of 1924 was in large part the result of a deep political and cultural divide in America between heavily immigrant cities and far less diverse small towns and rural areas. The 1924 legislation, together with growing residential segregation, midcentury federal urban policy, and postwar suburbanization, undermined scores of ethnic enclaves in American cities between 1925 and the 1960s. The deportation of Mexicans and their American children during the Great Depression, the incarceration of West Coast Japanese Americans during World War II, and the wartime and postwar shift of so many jobs to suburban and Sunbelt areas also reshaped many US cities in these years. The Immigration Act of 1965, which enabled the immigration of large numbers of people from Asia, Latin America, and, eventually, Africa, helped to revitalize many depressed urban areas and inner-ring suburbs. In cities and suburbs across the country, the response to the new immigration since 1965 has ranged from welcoming to hostile. The national debate over immigration in the early 21st century reflects both familiar and newer cultural, linguistic, religious, racial, and regional rifts. However, urban areas with a history of immigrant incorporation remain the most politically supportive of immigrants, just as they were a century ago.
Post-1945 immigration to the United States differed dramatically from America’s earlier 19th- and 20th-century immigration patterns, most notably in the sharp rise in the number of immigrants from Asia. Beginning in the late 19th century, the U.S. government took steps to bar immigration from Asia. The establishment of the national origins quota system in the 1924 Immigration Act narrowed the entryway for eastern and central Europeans, making western Europe the dominant source of immigrants. These policies shaped the racial and ethnic profile of the American population before 1945. Signs of change began to occur during and after World War II. The recruitment of temporary agricultural workers from Mexico led to an influx of Mexicans, and the repeal of Asian exclusion laws opened the door for Asian immigrants. Responding to complex international politics during the Cold War, the United States also formulated a series of refugee policies, admitting refugees from Europe, the western hemisphere, and later Southeast Asia. The movement of people to the United States increased drastically after 1965, when immigration reform ended the national origins quota system. The intricate and intriguing history of U.S. immigration after 1945 thus demonstrates how the United States related to a fast-changing world, its less restrictive immigration policies increasing the fluidity of the American population, with a substantial impact on American identity and domestic policy.
Mass transit has been part of the urban scene in the United States since the early 19th century. Regular steam ferry service began in New York City in the early 1810s and horse-drawn omnibuses plied city streets starting in the late 1820s. Expanding networks of horse railways emerged by the mid-19th century. The electric streetcar became the dominant mass transit vehicle a half century later. During this era, mass transit had a significant impact on American urban development. Mass transit’s importance in the lives of most Americans started to decline with the growth of automobile ownership in the 1920s, except for a temporary rise in transit ridership during World War II. In the 1960s, congressional subsidies began to reinvigorate mass transit and heavy-rail systems opened in several cities, followed by light rail systems in several others in the next decades. Today concerns about environmental sustainability and urban revitalization have stimulated renewed interest in the benefits of mass transit.