21-40 of 191 Results for: 20th Century: Post-1945

Article

Tyson Reeder

The United States has shared an intricate and turbulent history with Caribbean islands and nations since its inception. In its relations with the Caribbean, the United States has displayed the dueling tendencies of imperialism and anticolonialism that characterized its foreign policy toward South America and the rest of the world. For nearly two and a half centuries, the Caribbean has stood at the epicenter of some of the US government’s most controversial and divisive foreign policies. After the American Revolution severed political ties between the United States and the British West Indies, US officials and traders hoped to expand their political and economic influence in the Caribbean. US trade in the Caribbean played an influential role in the events that led to the War of 1812. The Monroe Doctrine provided a blueprint for reconciling imperial ambitions in the Caribbean with anti-imperial sentiment. During the mid-19th century, Americans debated the propriety of annexing Caribbean islands, especially Cuba. After the Spanish-American War of 1898, the US government took an increasingly imperialist approach to its relations with the Caribbean, acquiring some islands as federal territories and augmenting its political, military, and economic influence in others. Contingents of the US population and government disapproved of such imperialistic measures, and beginning in the 1930s the US government softened, but did not relinquish, its influence in the Caribbean. Between the 1950s and the end of the Cold War, US officials wrestled with how to exert influence in the Caribbean in a postcolonial world. Since the end of the Cold War, the United States has intervened in Caribbean domestic politics to enhance democracy, continuing its oscillation between democratic and imperial impulses.

Article

The NAACP, established in 1909, was formed as an integrated organization to confront racism in the United States rather than seeing the issue as simply a southern problem. It is the longest-running civil rights organization and continues to operate today. Originally called the National Negro Committee, the organization adopted the name NAACP on May 30, 1910. Organized to promote racial equality and integration, the NAACP pursued this goal via legal cases, political lobbying, and public campaigns. Early campaigns involved lobbying for national anti-lynching legislation, pursuing desegregation through the US Supreme Court in areas such as housing and higher education, and securing voting rights. The NAACP is renowned for the US Supreme Court case of Brown v. Board of Education (1954), which desegregated primary and secondary schools and is seen as a catalyst for the civil rights movement (1955–1968). It also pursued public education by promoting African American achievements in education and the arts to counteract racial stereotypes. The organization published a monthly journal, The Crisis, and promoted African American art forms and culture as another means to advance equality. NAACP branches were established all across the United States and became a network of information, campaigning, and finance that underpinned activism. Youth groups and university branches mobilized younger members of the community. Women were also invaluable to the NAACP in local, regional, and national decision-making processes and campaigning. The organization sought to integrate African Americans and other minorities into the American social, political, and economic model as codified by the US Constitution.

Article

In September 1962, the National Farm Workers Association (NFWA) held its first convention in Fresno, California, initiating a multiracial movement that would result in the creation of the United Farm Workers (UFW) and the first contracts for farm workers in the state of California. Led by Cesar Chavez, the union contributed a number of innovations to the art of social protest, including the most successful consumer boycott in the history of the United States. Chavez welcomed contributions from numerous ethnic and racial groups, men and women, young and old. For a time, the UFW was the realization of Martin Luther King Jr.’s beloved community—people from different backgrounds coming together to create a socially just world. During the 1970s, Chavez struggled to maintain the momentum created by the boycott as the state of California became more involved in adjudicating labor disputes under the California Agricultural Labor Relations Act (ALRA). Although Chavez and the UFW ultimately failed to establish a permanent, national union, their successes and strategies continue to influence movements for farm worker justice today.

Article

Chemical and biological weapons represent two distinct types of munitions that share some common policy implications. While chemical weapons and biological weapons are different in terms of their development, manufacture, use, and the methods necessary to defend against them, they are commonly united in matters of policy as “weapons of mass destruction,” along with nuclear and radiological weapons. Both chemical and biological weapons have the potential to cause mass casualties, require some technical expertise to produce, and can be employed effectively by both nation states and non-state actors. U.S. policies in the early 20th century were informed by preexisting taboos against poison weapons and the American Expeditionary Forces’ experiences during World War I. The United States promoted restrictions in the use of chemical and biological weapons through World War II, but increased research and development work at the outset of the Cold War. In response to domestic and international pressures during the Vietnam War, the United States drastically curtailed its chemical and biological weapons programs and began supporting international arms control efforts such as the Biological and Toxin Weapons Convention and the Chemical Weapons Convention. U.S. chemical and biological weapons policies significantly influence U.S. policies in the Middle East and the fight against terrorism.

Article

Ivón Padilla-Rodríguez

Child migration has garnered widespread media coverage in the 21st century, becoming a central topic of national political discourse and immigration policymaking. Contemporary surges of child migrants are part of a much longer history of migration to the United States. In the first half of the 20th century, millions of European and Asian child migrants passed through immigration inspection stations in New York Harbor and San Francisco Bay. Even though some accompanied and unaccompanied European child migrants experienced detention at Ellis Island, most were processed and admitted into the United States fairly quickly in the early 20th century. Few of the European child migrants were deported from Ellis Island. Predominantly accompanied Chinese and Japanese child migrants, however, like Latin American and Caribbean migrants in recent years, were more frequently subjected to family separation, abuse, detention, and deportation at Angel Island. Once inside the United States, both European and Asian children struggled to overcome poverty, labor exploitation, educational inequity, the attitudes of hostile officials, and public health problems. After World War II, Korean refugee “orphans” came to the United States under the Refugee Relief Act of 1953 and the Immigration and Nationality Act. European, Cuban, and Indochinese refugee children were admitted into the United States through a series of ad hoc programs and temporary legislation until the 1980 Refugee Act created a permanent mechanism for the admission of refugee and unaccompanied children. Exclusionary immigration laws, the hardening of US international boundaries, and the United States’ preference for refugees who fled Communist regimes made unlawful entry the only option for thousands of accompanied and unaccompanied Mexican, Central American, and Haitian children in the second half of the 20th century. Black and brown migrant and asylum-seeking children were forced to endure educational deprivation, labor trafficking, mandatory detention, deportation, and deadly abuse by US authorities and employers at US borders and inside the country.

Article

Patrick William Kelly

The relationship between Chile and the United States pivoted on the intertwined questions of how much political and economic influence Americans would exert over Chile and the degree to which Chileans could chart their own path. Given its tradition of constitutional government and relative economic development, Chile established itself as a regional power player in Latin America. Unencumbered by the direct US military interventions that marked the history of the Caribbean, Central America, and Mexico, Chile was a leader in movements to promote Pan-Americanism, inter-American solidarity, and anti-imperialism. But the advent of the Cold War in the 1940s, and especially the 1959 Cuban Revolution, brought an increase in bilateral tensions. The United States turned Chile into a “model democracy” for the Alliance for Progress, but frustration over its failures to enact meaningful social and economic reform polarized Chilean society, resulting in the election of the Marxist Salvador Allende in 1970. The most contentious period in US-Chilean relations came during the Nixon administration, which worked alongside anti-Allende Chileans to destabilize Allende’s government, overthrown by the Chilean military on September 11, 1973. The Pinochet dictatorship (1973–1990), while anti-Communist, clashed with the United States over Pinochet’s radicalization of the Cold War and the issue of Chilean human rights abuses. The Reagan administration, which came to power on a platform that rejected the Carter administration’s critique of Chile, later reversed course and began to support the return of democracy to Chile, which took place in 1990. Since then, Pinochet’s legacy of neoliberal restructuring of the Chilean economy has loomed large, overshadowed perhaps only by his unexpected role in fomenting a global culture of human rights that has ended the era of impunity for Latin American dictators.

Article

Daniel Pope

Nuclear power in the United States has had an uneven history and faces an uncertain future. Promising in the 1950s electricity “too cheap to meter,” nuclear power has failed to come close to that goal, although it has carved out approximately a 20 percent share of American electrical output. Two decades after World War II, General Electric and Westinghouse offered electric utilities completed “turnkey” plants at a fixed cost, hoping these “loss leaders” would create a demand for further projects. During the 1970s the industry boomed, but it also brought forth a large-scale protest movement. Since then, partly because of that movement and because of the drama of the 1979 Three Mile Island accident, nuclear power has plateaued, with only one reactor completed since 1995. Several factors account for the failed promise of nuclear energy. Civilian power has never fully shaken its military ancestry or its connotations of weaponry and warfare. American reactor designs borrowed from those of nuclear submarines. Concerns about weapons proliferation stymied industry hopes for breeder reactors that would produce plutonium as a byproduct. Federal regulatory agencies dealing with civilian nuclear energy also have military roles. Those connections have provided some advantages to the industry, but they have also generated fears. Not surprisingly, the “anti-nukes” movement of the 1970s and 1980s was closely bound to movements for peace and disarmament. The industry’s disappointments must also be understood in a wider energy context. Nuclear power grew rapidly in the late 1960s and 1970s as domestic petroleum output shrank and environmental objections to coal came to the fore. At the same time, however, slowing economic growth and an emphasis on energy efficiency reduced demand for new power output. In the 21st century, new reactor designs and the perils of fossil-fuel-caused global warming have once again raised hopes for nuclear energy, but natural gas and renewables now compete favorably against new nuclear projects. Economic factors have been the main reason that nuclear power has stalled in the last forty years. Highly capital intensive, nuclear projects have all too often taken too long to build and cost far more than initially forecast. The lack of standard plant designs, the need for expensive safety and security measures, and the inherent complexity of nuclear technology have all contributed to nuclear power’s inability to make its case on cost persuasively. Nevertheless, nuclear power may survive and even thrive if the nation commits to curtailing fossil fuel use or if, as the Trump administration proposes, it opts for subsidies to keep reactors operating.

Article

The 1969 Supreme Court ruling in Tinker v. Des Moines established that students in public elementary and secondary schools do not “shed their constitutional rights to freedom of speech or expression at the schoolhouse gate.” Before Tinker, students often faced punishment from school officials for their role in protests both on and off campus. A rise in civil rights protests and the role of young people in the social movements of the 1960s led to frequent conflicts between students and school administrators. Many black students were especially vocal in contesting racial discrimination at school in the two decades following the 1954 Brown v. Board of Education decision. But before Tinker, students in public elementary and secondary schools were not considered to have any constitutional rights, including the right to free expression. Some of these students brought lawsuits in response to punishments they believed unfairly disciplined them for participating in legitimate protests. The political activism of young people and developments in First Amendment law eventually brought the Constitution into the public school classroom, leading to Tinker and other cases that established students’ rights.

Article

The civil rights movement in the urban South transformed the political, economic, and cultural landscape of post–World War II America. Between 1955 and 1968, African Americans and their white allies relied on nonviolent direct action, political lobbying, litigation, and economic boycotts to dismantle the Jim Crow system. Many, though not all, of the movement’s most decisive political battles occurred in the cities of Montgomery and Birmingham, Alabama; Nashville and Memphis, Tennessee; Greensboro and Durham, North Carolina; and Atlanta, Georgia. In these and other urban centers, civil rights activists launched full-throttle campaigns against white supremacy, economic exploitation, and state-sanctioned violence against African Americans. Their fight for racial justice coincided with monumental changes in the urban South as the upsurge in federal spending in the region created unprecedented levels of economic prosperity in the newly forged “Sunbelt.” A dynamic and multifaceted movement that encompassed a wide range of political organizations and perspectives, the black freedom struggle proved successful in dismantling legal segregation. The passage of the Civil Rights Act of 1964 and the Voting Rights Act of 1965 expanded black southerners’ economic, political, and educational opportunities. And yet, many African Americans continued to struggle as they confronted not just the long-term effects of racial discrimination and exclusion but also the new challenges engendered by deindustrialization and urban renewal as well as entrenched patterns of racial segregation in the public-school system.

Article

Clodagh Harrington

The Clinton scandals have settled in the annals of American political history in the context of the era’s recurrent presidential misbehavior. Viewed through a historical lens, the activities, investigation, and impeachment trial of the forty-second president are almost inevitably measured against the weight of Watergate and Iran-Contra. As a result, the actions and consequences of this high-profile moment in the late-20th-century political history of the United States arguably took on a weightier meaning than they might otherwise have. If Watergate tested the U.S. constitutional system to its limits and Iran-Contra was arguably as grave, the Clinton affair was crisis-light by comparison. Originating with an investigation into a failed 1970s Arkansas land deal by Bill Clinton and his wife, the saga developed to include such meandering subplots as Filegate, Travelgate, Troopergate, the death of White House counsel Vince Foster, and, most infamously, the president’s affair with a White House intern. Unlike with Richard Nixon and Ronald Reagan, even Bill Clinton’s most ardent critics could not find a national security threat among the myriad scandals linked to his name. By the time that Justice Department appointee Robert Fiske was replaced as prosecutor by the infinitely more zealous Kenneth Starr, the case had become synonymous with the culture wars that permeated 1990s American society. As the Whitewater and related tentacles of the investigation failed to result in any meaningfully negative impact on the president, it was his marital infidelities that came closest to unseating him. Though the Independent Counsel pursued him with vigor, his supporters remained loyal while his detractors spotted political opportunity in his lapses in judgment. Certain key factors made the Clinton scandal particular to its era. First, in an unprecedented development, the personal indiscretion aspect of the story broke via the Internet. In addition, had the Independent Counsel legislation not been renewed, prosecutor Fiske would likely have wrapped up his investigation in a timely fashion with no intention of pursuing an impeachment path. Finally, the relentless cable news cycle and increasingly febrile partisan atmosphere of the decade ensured that the nation remained as focused as it was divided on the topic.

Article

The US working class and the institutional labor movement were shaped by anticommunism. Anticommunism preceded the founding of the Soviet Union and the Cold War, and this early history affected the later experience. It reinforced conservative positions on union issues even in the period before the Cold War, and forged the alliances that influenced the labor movement’s direction, including the campaign to organize the South, the methods and structures of unions, and US labor’s foreign policy positions. While the Communist Party of the USA (CP) was a hierarchical organization straitjacketed by its allegiance to the Soviet Union, the unions it fostered cultivated radical democratic methods; anticommunism, by contrast, often justified opposition to militancy and obstructed progressive policies. In the hottest moments of the postwar development of domestic anticommunism, unions and their members were vilified and purged from the labor movement, forced to take loyalty oaths, and fired for their association with the CP. The Cold War in the working class removed critical perspectives on capitalism, reinforced a moderate and conservative labor officialdom, and led to conformity with the state on foreign policy issues.

Article

American history is replete with instances of counterinsurgency, an unsurprising reality considering that the United States has always engaged in empire building and has therefore needed to pacify resistance to its expansion. For much of its existence, the U.S. has relied on its Army to pacify insurgents. While the U.S. Army used traditional military formations and technology to battle peer enemies, the same strategy did not succeed against opponents who relied on speed and surprise. Indeed, in several instances, insurgents sought to fight the U.S. Army on terms that rendered superior manpower and technology irrelevant. By introducing counterinsurgency as a strategy, the U.S. Army attempted to identify and neutralize insurgents and the infrastructure that supported them. Discussions of counterinsurgency involve complex terms, so readers are provided with simplified yet accurate definitions and explanations. Understanding these terms also provides continuity between conflicts. While certain counterinsurgency measures worked during the American Civil War, the Indian Wars, and the war in the Philippines, the concept failed during the Vietnam War. The complexities of counterinsurgency require readers to familiarize themselves with its history, relevant scholarship, and terminology—in particular, counterinsurgency, pacification, and infrastructure.

Article

The first credit reporting organizations emerged in the United States during the 19th century to address problems of risk and uncertainty in an expanding market economy. Early credit reporting agencies assisted merchant lenders by collecting and centralizing information about the business activities and reputations of unknown borrowers throughout the country. These agencies quickly evolved into commercial surveillance networks, amassing huge archives of personal information about American citizens and developing credit rating systems to rank them. Shortly after the Civil War, separate credit reporting organizations devoted to monitoring consumers, rather than businesspeople, also began to emerge to assist credit-granting retailers. By the early 20th century, hundreds of local credit bureaus dissected the personal affairs of American consumers, forming the genesis of a national consumer credit surveillance infrastructure. The history of American credit reporting reveals fundamental links between the development of modern capitalism and contemporary surveillance society. These connections became increasingly apparent during the late 20th century as technological advances in computing and networked communication fueled the growth of new information industries, raising concerns about privacy and discrimination. These connections and concerns, however, are not new. They can be traced to 19th-century credit reporting organizations, which turned personal information into a commodity and converted individual biographies into impersonal financial profiles and risk metrics. As these disembodied identities and metrics became authoritative representations of one’s reputation and worth, they exerted real effects on one’s economic life chances and social legitimacy. While drawing attention to capitalism’s historical twin, surveillance, the history of credit reporting illuminates the origins of surveillance-based business models that became ascendant during the 21st century.

Article

Michael J. Bustamante

The Cuban Revolution transformed the largest island nation of the Caribbean into a flashpoint of the Cold War. After overthrowing US-backed ruler Fulgencio Batista in early 1959, Fidel Castro established a socialist, anti-imperialist government that defied the island’s history as a dependent and dependable ally of the United States. But the Cuban Revolution is not only significant for its challenge to US interests and foreign policy prerogatives. For Cubans, it fundamentally reordered their lives, inspiring multitudes yet also driving thousands of others to migrate to Miami and other points north. Sixty years later, Fidel Castro may be dead and the Soviet Union may be long gone. Cuban socialism has become more hybrid in economic structure, and in 2014 the Cuban and US governments moved to restore diplomatic ties. But Cuba’s leaders continue to insist that “the Revolution,” far from a terminal political event, is still alive. Today, as the founding generation of Cuban leaders passes from the scene, “the Revolution” faces another important crossroads of uncertainty and reform.

Article

Frederick Rowe Davis

The history of DDT and pesticides in America is overshadowed by four broad myths. The first myth suggests that DDT was the first insecticide deployed widely by American farmers. The second indicates that DDT was the most toxic pesticide to wildlife and humans alike. The third myth assumes that Rachel Carson’s Silent Spring (1962) was an exposé of the problems of DDT rather than a broad indictment of American dependency on chemical insecticides. The fourth and final myth reassures Americans that the ban on DDT late in 1972 resolved the pesticide paradox in America. Over the course of the 20th century, agricultural chemists developed insecticides from plants with phytotoxic properties (“botanical” insecticides) and a range of chemicals including heavy metals such as lead and arsenic, chlorinated hydrocarbons like DDT, and organophosphates like parathion. All of the synthetic insecticides carried profound unintended consequences for landscapes and wildlife alike. More recently, chemists have returned to nature and developed chemical analogs of the botanical insecticides, first with the synthetic pyrethroids and now with the neonicotinoids. Despite their recent introduction, neonicotinoids have become widely used in agriculture, and there are suspicions that these chemicals contribute to declines in bees and grassland birds.

Article

Michael K. Rosenow

In the broader field of thanatology, scholars investigate rituals of dying, attitudes toward death, evolving trajectories of life expectancy, and more. Applying a lens of social class means studying similar themes but focusing on the men, women, and children who worked for wages in the United States. Working people were more likely to die from workplace accidents, occupational diseases, or episodes of work-related violence. In most periods of American history, it was more dangerous to be a wage worker than it was to be a soldier. Battlegrounds were not just the shop floor but also the terrain of labor relations. American labor history has been filled with violent encounters between workers asserting their views of economic justice and employers defending their private property rights. These clashes frequently turned deadly. Labor unions and working-class communities extended an ethos of mutualism and solidarity from the union halls and picket lines to memorial services and gravesites. They lauded martyrs to movements for human dignity and erected monuments to honor the fallen. Aspects of ethnicity, race, and gender added layers of meaning that intersected with and refracted through individuals’ economic positions. Workers’ encounters with death and the way they made sense of loss and sacrifice in some ways overlapped with Americans from other social classes in terms of religious custom, ritual practice, and material consumption. Their experiences were not entirely unique but diverged in significant ways.

Article

The decolonization of the European overseas empires had its intellectual roots early in the modern era, but its culmination occurred during the Cold War that loomed large in post-1945 international history. This culmination thus coincided with the American rise to superpower status and presented the United States with a dilemma. While the United States was philosophically sympathetic to the aspirations of anticolonial nationalist movements abroad, its vastly greater postwar global security burdens made it averse to the instability that decolonization might bring and that communists might exploit. This fear, and the need to share those burdens with European allies who were themselves still colonial landlords, led Washington to proceed cautiously. The three “waves” of the decolonization process—medium-sized in the late 1940s, large in the half-decade around 1960, and small in the mid-1970s—prompted the American use of a variety of tools and techniques to influence how it unfolded. Prior to independence, this influence was usually channeled through the metropolitan authority then winding down. After independence, Washington continued and often expanded the use of these tools, in most cases on a bilateral basis. In some theaters, such as Korea, Vietnam, and the Congo, the use of certain of these tools, notably covert espionage or overt military operations, meant that Cold War dynamics enveloped, intensified, and took over local decolonization struggles. In most theaters, other tools, such as traditional or public diplomacy or economic or technical development aid, kept the Cold War in the background as a local transition unfolded. In all cases, the overriding American imperative was to minimize instability and neutralize actors on the ground who could invite communist gains.

Article

Megan Threlkeld

The issue of compulsory military service has been contested in the United States since before its founding. In a nation characterized by both liberalism and republicanism, there is an inherent tension between the idea that individuals should be able to determine their own destiny and the idea that all citizens have a duty to serve their country. Prior to the 20th century, conscription occurred mainly on the level of local militias, first in the British colonies and later in individual states. It was during the Civil War that the first federal drafts were instituted, both in the Union and the Confederacy. In the North, the draft was unpopular and largely ineffective. Congress revived national conscription when the United States entered World War I and established the Selective Service System to oversee the process. That draft ended when U.S. belligerency ended in 1918. The first peacetime draft was implemented in 1940; with the exception of one year, it remained in effect until 1973. Its most controversial days came during the Vietnam War, when thousands of people across the country demonstrated against it and, in some cases, outright refused to be inducted. The draft stopped with the end of the war, but in 1980, Congress reinstated compulsory Selective Service registration. More than two decades into the 21st century, male citizens and immigrant noncitizens are still required to register within thirty days of their eighteenth birthday. The very idea of “selective service” is ambiguous. It is selective because not everyone is conscripted, but it is compulsory because one can be prosecuted for failing to register or to comply with orders of draft boards. Especially during the Cold War, one of the system’s main functions was not to procure soldiers but to identify and exempt from service those men best suited for other endeavors framed as national service: higher education, careers in science and engineering, and even supporting families. That fact, combined with the decentralized nature of the Selective Service System itself, left the process vulnerable to the prejudices of local draft boards and meant that those most likely to be drafted were poor and nonwhite.

Article

Probably no American president was more thoroughly versed in matters of national security and foreign policy before entering office than Dwight David Eisenhower. As a young military officer, Eisenhower served stateside in World War I and then in Panama and the Philippines in the interwar years. On assignments in Washington and Manila, he worked on war plans, gaining an understanding that national security entailed economic and psychological factors in addition to manpower and materiel. In World War II, he commanded Allied forces in the European Theatre of Operations and honed his skills in coalition building and diplomacy. After the war, he oversaw the German occupation and then became Army Chief of Staff as the nation hastily demobilized. At the onset of the Cold War, Eisenhower embraced President Harry S. Truman’s containment doctrine and participated in the discussions leading to the 1947 National Security Act establishing the Central Intelligence Agency, the National Security Council, and the Department of Defense. After briefly retiring from the military, Eisenhower twice returned to public service at the behest of President Truman to assume the temporary chairmanship of the Joint Chiefs of Staff and then, following the outbreak of the Korean War, to become the first Supreme Allied Commander, Europe, charged with transforming the North Atlantic Treaty Organization into a viable military force. These experiences colored Eisenhower’s foreign policy views, which in turn led him to seek the presidency. He viewed the Cold War as a long-term proposition and worried that Truman’s military buildup would overtax finite American resources. He sought a coherent strategic concept that would be sustainable over the long haul without adversely affecting the free enterprise system and American democratic institutions. He also worried that Republican Party leaders were dangerously insular. As president, his New Look policy pursued a cost-effective strategy of containment by means of increased reliance on nuclear forces over more expensive conventional ones, sustained existing regional alliances and developed new ones, sought an orderly process of decolonization under Western guidance, resorted to covert operations to safeguard vital interests, and employed psychological warfare in the battle with communism for world opinion, particularly in the so-called Third World. His foreign policy laid the basis for what would become the overall American strategy for the duration of the Cold War. The legacy of that policy, however, was decidedly mixed. Eisenhower avoided the disaster of global war, but technological innovations did not produce the fiscal savings that he had envisioned. The NATO alliance expanded and mostly stood firm, but other alliances were more problematic. Decolonization rarely proceeded as smoothly as envisioned and caused conflict with European allies. Covert operations had long-term negative consequences. In Southeast Asia and Cuba, the Eisenhower administration’s policies bequeathed a poisoned chalice for succeeding administrations.

Article

Judge Glock

Despite almost three decades of strong and stable growth after World War II, the US economy, like the economies of many developed nations, faced new headwinds and challenges after 1970. Although the US economy eventually overcame many of them, and continues to be one of the most dynamic in the world, it could not recover its mid-century economic miracle of rapid and broad-based economic growth. There are three major ways the US economy changed in this period. First, the US economy endured and eventually conquered the problem of high inflation, even as it instituted new policies that prioritized price stability over the so-called “Keynesian” goal of full employment. Although these new policies led to over two decades of moderate inflation and stable growth, the 2008 financial crisis challenged the post-Keynesian consensus and led to new demands for government intervention in downturns. Second, the government’s overall influence on the economy increased dramatically. Although the government deregulated several sectors in the 1970s and 1980s, such as transportation and banking, it also created new types of social and environmental regulation that were more pervasive. And although it occasionally cut spending, on the whole government spending increased substantially in this period, until it reached about 35 percent of the economy. Third, the US economy became more open to the world, and it imported more manufactured goods, even as it became more based on “intangible” products and on services rather than on manufacturing. These shifts created new economic winners and losers. Some institutions that thrived in the older economy, such as unions, which once comprised over a third of the workforce, became shadows of their former selves. The new service economy also created more gains for highly educated workers and for investors in quickly growing businesses, while blue-collar workers’ wages stagnated, at least in relative terms. Most of the trends that affected the US economy in this period were long-standing and continued over decades. Major national and international crises in this period, from the end of the Cold War, to the first Gulf War in 1991, to the September 11 attacks of 2001, seemed to have only a mild or transient impact on the economy. Two events that were of lasting importance were, first, the United States leaving the gold standard in 1971, which led to high inflation in the short term and more stable monetary policy over the long term; and second, the 2008 financial crisis, which seemed to permanently decrease American economic output even while it increased political battles about the involvement of government in the economy. The US economy at the beginning of the third decade of the 21st century was richer than it had ever been, and remained in many respects the envy of the world. But widening income gaps meant that many Americans felt left behind in this new economy and led some to worry that the stability and predictability of the old economy had been lost.