Robert R. Gioielli
By the late 19th century, American cities like Chicago and New York were marvels of the industrializing world. The shock urbanization of the previous quarter century, however, brought on a host of environmental problems. Skies were acrid with coal smoke, and streams ran fetid with raw sewage. Disease outbreaks were common, and parks and green space were rare. In response to these hazards, particular groups of urban residents organized a series of activist movements, from the 1890s until the end of the 20th century, to reform public and private policies and practices. Those environmental burdens were never felt equally, with the working class, poor, immigrants, and minorities bearing an overwhelming share of the city’s toxic load. By the 1930s, many of the Progressive era reform efforts were finally bearing fruit. Air pollution was regulated, access to clean water improved, and even America’s smallest cities built robust networks of urban parks. But despite this invigoration of the public sphere, after World War II, for many the solution to the challenges of a dense modern city was a private choice: suburbanization. Rather than continue to work to reform and reimagine the city, they chose to leave it, retreating to the verdant (and pollution-free) greenfields at the city’s edge. These moves, encouraged and subsidized by local and federal policies, provided healthier environments for the mostly white, middle-class suburbanites, but created a new set of environmental problems for the poor, working-class, and minority residents they left behind. Drained of resources and capital, cities struggled to maintain aging infrastructure and regulate remaining industry, and then exacerbated problems with destructive urban renewal and highway construction projects.
These remaining urban residents responded with a dynamic series of activist movements that emerged out of the social and community activism of the 1960s and presaged the contemporary environmental justice movement.
The development of nuclear technology had a profound influence on the global environment following the Second World War, with ramifications for scientific research, the modern environmental movement, and conceptualizations of pollution more broadly. Government sponsorship of studies on nuclear fallout and waste dramatically reconfigured the field of ecology, leading to the widespread adoption of the ecosystem concept and new understandings of food webs as well as biogeochemical cycles. These scientific endeavors of the atomic age came to play a key role in the formation of environmental research to address a variety of pollution problems in industrialized countries. Concern about invisible radiation served as a foundation for new ways of thinking about chemical risks for activists like Rachel Carson and Barry Commoner as well as many scientists, government officials, and the broader public. Their reservations were not unwarranted, as nuclear weapons and waste resulted in radioactive contamination of the environment around nuclear-testing sites and especially fuel-production facilities. Scholars date the start of the “Anthropocene” period, during which human activity began to have substantial effects on the environment, variously from the beginning of human farming roughly 8,000 years ago to the emergence of industrialism in the 19th century. But all agree that the advent of nuclear weapons and power has dramatically changed the potential for environmental alterations. Our ongoing attempts to harness the benefits of the atomic age while lessening its negative impacts will need to confront the substantial environmental and public-health issues that have plagued nuclear technology since its inception.
Gabriella M. Petrick
This is an advance summary of a forthcoming article in the Oxford Research Encyclopedia of American History.
American food in the twentieth and twenty-first centuries is characterized by abundance. Unlike the hardscrabble existence of many earlier Americans, the “Golden Age of Agriculture” brought the bounty produced in fields across the United States to both consumers and producers. While the “Golden Age” technically ended as World War I began, larger quantities of relatively inexpensive food became the norm for most Americans as more fresh foods, rather than staple crops, made their way to urban centers and rising real wages made it easier to purchase these comestibles.
The application of science and technology to food production from the field to the kitchen cabinet, or even more crucially the refrigerator by the mid-1930s, reflects the changing demographics and affluence of American society as much as it does the inventiveness of scientists and entrepreneurs. Perhaps the single most important symbol of overabundance in the United States is the postwar Green Revolution. The vast increase in agricultural production based on improved agronomics provoked both praise and criticism, as exemplified by Time magazine’s critique of Rachel Carson’s Silent Spring in September 1962 or, more recently, the politics of genetically modified foods.
Echoing what occurred at the turn of the twentieth century, food production, politics, and policy at the turn of the twenty-first century have become a proxy for larger ideological agendas and the fractured nature of class in the United States. Battles over the following issues speak to which Americans have access to affordable, nutritious food: organic versus conventional farming, antibiotic use in meat production, dissemination of food stamps, contraction of farm subsidies, the rapid growth of “dollar stores,” alternative diets (organic, vegetarian, vegan, paleo, etc.), and, perhaps most ubiquitous of all, the “obesity epidemic.” These arguments carry moral and ethical values, as each side deems some foods and diets virtuous and others corrupting. While Americans have long held a variety of food ideologies that meld health, politics, and morality, exemplified by Sylvester Graham and John Harvey Kellogg in the nineteenth and early twentieth centuries, among others, newer constructions of these ideologies reflect concerns over the environment, rural Americans, climate change, self-determination, and the role of government in individual lives. In other words, food can be used as a lens to understand larger issues in American society while at the same time allowing historians to explore the intimate details of everyday life.
Changing foodways, the consumption and production of food, access to food, and debates over food shaped the nature of American cities in the 20th century. As American cities transformed from centers of industrialization at the start of the century to post-industrial societies at the end of the 20th century, food cultures in urban America shifted in response to the ever-changing urban environment. Cities remained centers of food culture, diversity, and food reform despite these shifts.
Growing populations and waves of immigration changed the nature of food cultures throughout the United States in the 20th century. These changes were significant, all contributing to an evolving sense of American food culture. For urban denizens, however, food choice and availability were dictated and shaped by a variety of powerful social factors, including class, race, ethnicity, gender, and laboring status. While cities possessed an abundance of food and a variety of locations in which to consume it, fresh food often remained difficult for the urban poor to obtain as the 20th century ended.
As markets expanded from 1900 to 1950, regional geography became a less important factor in determining what types of foods were available. In the second half of the 20th century, even global geography became less important to food choices. Citrus fruit from the West Coast was readily available in northeastern markets near the start of the century, and off-season fruits and vegetables from South America filled shelves in grocery stores by the end of the 20th century. Urban Americans became further disconnected from their food sources, but this dislocation spurred counter-movements that embraced ideas of local, seasonal foods and a rethinking of the city’s relationship with its food sources.
Humans have utilized American forests for a wide variety of uses from the pre-Columbian period to the present. Native Americans heavily shaped forests to serve their needs, helping to create fire ecologies in many forests. English settlers harvested these forests for trade, to clear land, and for domestic purposes. The arrival of the Industrial Revolution in the early 19th century rapidly expanded the rate of logging. By the Civil War, many areas of the Northeast were logged out. Post–Civil War forests in the Great Lakes states, the South, and then the Pacific Northwest fell with increasing speed to feed the insatiable demands of the American economy, facilitated by rapid technological innovation that allowed for growing cuts. By the late 19th century, growing concerns about the future of American timber supplies spurred the conservation movement, personified by forester Gifford Pinchot and the creation of the U.S. Forest Service with Pinchot as its head in 1905. After World War II, the Forest Service worked closely with the timber industry to cut wide swaths of the nation’s last virgin forests. These gargantuan harvests led to the growth of the environmental movement. Beginning in the 1970s, environmentalists began to use legal means to halt logging in the ancient forests, and the listing of the northern spotted owl under the Endangered Species Act was the final blow to most logging on Forest Service lands in the Northwest. Yet not only does the timber industry remain a major employer in forested parts of the nation today, but alternative forest economies have also developed around more sustainable industries such as tourism.
According to the First Amendment of the US Constitution, Congress is barred from abridging the freedom of the press (“Congress shall make no law . . . abridging the freedom of speech, or of the press”). In practice, the history of press freedom is far more complicated than this simple constitutional right suggests. Over time, the meaning of the First Amendment has changed greatly. The Supreme Court largely ignored the First Amendment until the 20th century, leaving the scope of press freedom to state courts and legislatures. Since World War I, jurisprudence has greatly expanded the types of publication protected from government interference. The press now has broad rights to publish criticism of public officials, salacious material, private information, national security secrets, and much else. To understand the shifting history of press freedom, however, it is important to understand not only the expansion of formal constitutional rights but also how those rights have been shaped by such factors as economic transformations in the newspaper industry, the evolution of professional standards in the press, and the broader political and cultural relations between politicians and the press.
While American gambling has a historical association with the lawlessness of the frontier and with the wasteful leisure practices of Southern planters, it was in large cities where American gambling first flourished as a form of mass leisure, and as a commercial enterprise of significant scale. In the urban areas of the Mid-Atlantic, the Northeast, and the upper Midwest, for the better part of two centuries the gambling economy was deeply intertwined with municipal politics and governance, the practices of betting were a prominent feature of social life, and controversies over the presence of gambling, both legal and illegal, were at the center of public debate. In New York and Chicago in particular, but also in Cleveland, Pittsburgh, Detroit, Baltimore, and Philadelphia, gambling channeled money to municipal police forces and sustained machine politics. In the eyes of reformers, gambling corrupted governance and corroded social and economic interactions. Big city gambling has changed over time, often in a manner reflecting important historical processes and transformations in economics, politics, and demographics. Yet irrespective of such change, from the onset of Northern urbanization during the 19th century, through much of the 20th century, gambling held steady as a central feature of city life and politics. From the poolrooms where recently arrived Irish New Yorkers bet on horseracing after the Civil War, to the corner stores where black and Puerto Rican New Yorkers bet on the numbers game in the 1960s, the gambling activity that covered the urban landscape produced argument and controversy, particularly with respect to drawing the line between crime and leisure, and over the question of where and to what ends the money of the gambling public should be directed.
Marjorie J. Spruill
The late 20th century saw gender roles transformed as the so-called Second Wave of American feminism that began in the 1960s gained support. By the early 1970s public opinion increasingly favored the movement, and politicians in both major political parties supported it. In 1972 Congress overwhelmingly approved the Equal Rights Amendment (ERA) and sent it to the states. Many quickly ratified, prompting women committed to traditional gender roles to organize. However, by 1975 ERA opponents led by veteran Republican activist Phyllis Schlafly, founder of Stop ERA, had slowed the ratification process, although federal support for feminism continued. Congresswoman Bella Abzug (D-NY), inspired by the United Nations’ International Women’s Year (IWY) program, introduced a bill approved by Congress that mandated state and national IWY conferences at which women would produce recommendations to guide the federal government on policy regarding women. Federal funding of these conferences (held in 1977), and the fact that feminists were appointed to organize them, led to an escalation in tensions between feminist and conservative women, and the conferences proved to be profoundly polarizing events. Feminists elected most of the delegates to the culminating IWY event, the National Women’s Conference held in Houston, Texas, and the “National Plan of Action” adopted there endorsed a wide range of feminist goals including the ERA, abortion rights, and gay rights. But the IWY conferences presented conservatives with a golden opportunity to mobilize, and anti-ERA, pro-life, and anti-gay groups banded together as never before. By the end of 1977, these groups, supported by conservative Catholics, Mormons, and evangelical and fundamentalist Protestants, had come together to form a “Pro-Family Movement” that became a powerful force in American politics. By 1980 they had persuaded the Republican Party to drop its support for women’s rights. Afterward, as Democrats continued to support feminist goals and the GOP presented itself as the defender of “family values,” national politics became more deeply polarized and bitterly partisan.
Gentrification is one of the most controversial issues in American cities today. But it also remains one of the least understood. Few agree on how to define it or whether it is boon or curse for cities. Gentrification has changed over time and has a history dating back to the early 20th century. Historically, gentrification has had a smaller demographic impact on American cities than suburbanization or immigration. But since the late 1970s, gentrification has dramatically reshaped cities like Seattle, San Francisco, and Boston. Furthermore, districts such as the French Quarter in New Orleans, New York City’s Greenwich Village, and Georgetown in Washington DC have had an outsized influence on the political, cultural, and architectural history of cities. Gentrification thus must be examined alongside suburbanization as one of the major historical trends shaping the 20th-century American metropolis.
During the 20th century, the black population of the United States transitioned from largely rural to mostly urban. In the early 1900s the majority of African Americans lived in rural, agricultural areas. Depictions of black people in popular culture often focused on pastoral settings, like the cotton fields of the rural South. But a dramatic shift occurred during the Great Migrations (1914–1930 and 1941–1970) when millions of rural black southerners relocated to US cities.
Motivated by economic opportunities in urban industrial areas during World Wars I and II, African Americans opted to move to southern cities as well as to urban centers in the Northeast, Midwest, and West Coast. New communities emerged that contained black social and cultural institutions, and musical and literary expressions flourished. Black migrants who left the South exercised voting rights, sending the first black representatives to Congress in the 20th century. Migrants often referred to themselves as “New Negroes,” pointing to their social, political, and cultural achievements, as well as their use of armed self-defense during violent racial confrontations, as evidence of their new stance on race.
Sarah B. Snyder
In its formulation of foreign policy, the United States takes account of many priorities and factors, including national security concerns, economic interests, and alliance relationships. An additional factor with significance that has risen and fallen over time is human rights, or more specifically violations of human rights. The extent to which the United States should consider such abuses or seek to moderate them has been and continues to be the subject of considerable debate.
The Immigration Act of 1924 was in large part the result of a deep political and cultural divide in America between heavily immigrant cities and far less diverse small towns and rural areas. The 1924 legislation, together with growing residential segregation, midcentury federal urban policy, and postwar suburbanization, undermined scores of ethnic enclaves in American cities between 1925 and the 1960s. The deportation of Mexicans and their American children during the Great Depression, the incarceration of West Coast Japanese Americans during World War II, and the wartime and postwar shift of so many jobs to suburban and Sunbelt areas also reshaped many US cities in these years. The Immigration Act of 1965, which enabled the immigration of large numbers of people from Asia, Latin America, and, eventually, Africa, helped to revitalize many depressed urban areas and inner-ring suburbs. In cities and suburbs across the country, the response to the new immigration since 1965 has ranged from welcoming to hostile. The national debate over immigration in the early 21st century reflects both familiar and newer cultural, linguistic, religious, racial, and regional rifts. However, urban areas with a history of immigrant incorporation remain the most politically supportive of such people, just as they were a century ago.
Post-1945 immigration to the United States differed fairly dramatically from America’s earlier 20th- and 19th-century immigration patterns, most notably in the dramatic rise in numbers of immigrants from Asia. Beginning in the late 19th century, the U.S. government took steps to bar immigration from Asia. The establishment of the national origins quota system in the 1924 Immigration Act narrowed the entryway for eastern and central Europeans, making western Europe the dominant source of immigrants. These policies shaped the racial and ethnic profile of the American population before 1945. Signs of change began to occur during and after World War II. The recruitment of temporary agricultural workers from Mexico led to an influx of Mexicans, and the repeal of Asian exclusion laws opened the door for Asian immigrants. Responding to complex international politics during the Cold War, the United States also formulated a series of refugee policies, admitting refugees from Europe, the western hemisphere, and later Southeast Asia. The movement of people to the United States increased drastically after 1965, when immigration reform ended the national origins quota system. The intricate and intriguing history of U.S. immigration after 1945 thus demonstrates how the United States related to a fast-changing world, its less restrictive immigration policies increasing the fluidity of the American population, with a substantial impact on American identity and domestic policy.
Mass transit has been part of the urban scene in the United States since the early 19th century. Regular steam ferry service began in New York City in the early 1810s and horse-drawn omnibuses plied city streets starting in the late 1820s. Expanding networks of horse railways emerged by the mid-19th century. The electric streetcar became the dominant mass transit vehicle a half century later. During this era, mass transit had a significant impact on American urban development. Mass transit’s importance in the lives of most Americans started to decline with the growth of automobile ownership in the 1920s, except for a temporary rise in transit ridership during World War II. In the 1960s, congressional subsidies began to reinvigorate mass transit and heavy-rail systems opened in several cities, followed by light rail systems in several others in the next decades. Today concerns about environmental sustainability and urban revitalization have stimulated renewed interest in the benefits of mass transit.
By serving travelers and commerce, roads and streets unite people and foster economic growth. But as they develop, roads and streets also disrupt old patterns, upset balances of power, and isolate some as they serve others. The consequent disagreements leave historical records documenting social struggles that might otherwise be overlooked. For long-distance travel in America before the middle of the 20th century, roads were generally poor alternatives, resorted to when superior means of travel, such as river and coastal vessels, canal boats, or railroads were unavailable. Most roads were unpaved, unmarked, and vulnerable to the effects of weather. Before the railroads, for travelers willing to pay the toll, rare turnpikes and plank roads could be much better. Even in towns, unpaved streets were common until the late 19th century, and persisted into the 20th. In the late 19th century, rapid urban growth, rural free delivery of the mails, and finally the proliferation of electric railways and bicycling contributed to growing pressure for better roads and streets. After 1910, the spread of the automobile accelerated the trend, but only with great controversy, especially in cities. Partly in response to the controversy, advocates of the automobile organized to promote state and county motor highways funded substantially by gasoline taxes; such roads were intended primarily for motor vehicles. In the 1950s, massive federal funds accelerated the trend; by then, motor vehicles were the primary transportation mode for both long and short distances. The consequences have been controversial, and alternatives have been attracting growing interest.
Thomas A. Reinstein
The United States has a rich history of intelligence in the conduct of foreign relations. Since the Revolutionary War, intelligence has been most relevant to U.S. foreign policy in two ways. Intelligence analysis helps to inform policy. Intelligence agencies also have carried out covert action—secret operations—to influence political, military, or economic conditions in foreign states. The American intelligence community has developed over a long period, and major changes to that community have often occurred because of contingent events rather than long-range planning. Throughout their history, American intelligence agencies have used intelligence gained from both human and technological sources to great effect. Often, U.S. intelligence agencies have been forced to rely on technological means of intelligence gathering for lack of human sources. Recent advances in cyberwarfare have made technology even more important to the American intelligence community.
At the same time, the relationship between intelligence and national-security–related policymaking has often been dysfunctional. Indeed, though some American policymakers have used intelligence avidly, many others have used it haphazardly or not at all. Bureaucratic fights also have crippled the American intelligence community. Several high-profile intelligence failures tend to dominate the recent history of intelligence and U.S. foreign relations. Some of these failures were due to lack of intelligence or poor analytic tradecraft. Others came because policymakers failed to use the intelligence they had. In some cases, policymakers have also pressured intelligence officers to change their findings to better suit those policymakers’ goals. And presidents have often preferred to use covert action to carry out their preferred policies without paying attention to intelligence analysis. The result has been constant debate about the appropriate role of intelligence in U.S. foreign relations.
Kelly J. Shannon
Historian James A. Bill famously described America’s relationship with Iran as a tragedy. “Few international relationships,” he wrote, “have had a more positive beginning than that which characterized Iranian-American contacts for more than a century.” The nations’ first diplomatic dealings in the 1850s resulted in a treaty of friendship, and although the U.S. government remained largely aloof from Iranian affairs until World War II, many Iranians saw Americans and the United States positively by the early 20th century. The United States became more deeply involved with Iran during the Second World War, and the two nations were close allies during the Cold War. Yet they became enemies following the 1979 Iranian Revolution. How did this happen?
The events that led to the Islamic Republic of Iran dubbing the United States the “Great Satan” in 1979 do indeed contain elements of tragedy. By the late 19th century, Iran—known to Americans as “Persia” until the 1930s—was caught in the middle of the imperial “Great Game” between Great Britain and Russia. Although no European power formally colonized Iran, Britain and Russia developed “spheres of influence” in the country and meddled constantly in Iran’s affairs. As Iranians struggled to create a modern, independent nation-state, they looked to disinterested third parties for help in their struggle to break free from British and Russian control. Consequently, many Iranians came to see the United States as a desirable ally. Activities of individual Americans in Iran from the mid-19th century onward, ranging from Presbyterian missionaries who built hospitals and schools to economic experts who advised Iran’s government, as well as the United States’ own revolutionary and democratic history, fostered a positive view of the United States among Iranians. The two world wars drew the United States into more active involvement in the Middle East, and following both conflicts, the U.S. government defended Iran’s sovereignty against British and Soviet manipulation.
The event that caused the United States to lose the admiration of many Iranians occurred in 1953, when the U.S. Central Intelligence Agency and the British Secret Intelligence Service staged a coup, which overthrew Iran’s democratically elected prime minister, Mohammad Mossadegh, because he nationalized Iran’s oil industry. The coup allowed Iran’s shah, Mohammad Reza Shah Pahlavi, to transform himself from a constitutional monarch into an absolute ruler. The 1953 coup, coupled with the subsequent decades of U.S. support for the Shah’s politically repressive regime, resulted in anti-American resentment that burst forth during the 1979 Iranian Revolution. The two nations have been enemies ever since. This article traces the origins and evolution of the U.S. relationship with Iran from the 19th through the early 21st centuries.
Justus D. Doenecke
For the United States, isolationism is best defined as avoidance of wars outside the Western Hemisphere, particularly in Europe; opposition to binding military alliances; and the unilateral freedom to act politically and commercially unrestrained by mandatory commitments to other nations. Until the controversy over American entry into the League of Nations, isolationism was never subject to debate. The United States could expand its territory, protect its commerce, and even fight foreign powers without violating its traditional tenets. Once President Woodrow Wilson sought membership in the League, however, Americans saw isolationism as a foreign policy option, not simply something taken for granted. A fundamental foreign policy tenet now became a faction, limited to a group of people branded as “isolationists.” Its high point came during the years 1934–1937, when Congress, noting the challenge of the totalitarian nations to the international status quo, passed the neutrality acts to insulate the country from global entanglements.
Once World War II broke out in Europe, President Franklin D. Roosevelt increasingly sought American participation on the side of the Allies. Isolationists unsuccessfully fought FDR’s legislative proposals, beginning with repeal of the arms embargo and ending with the convoying of supplies to Britain. The America First Committee (1940–1941), however, so effectively mobilized anti-interventionist opinion as to make the president more cautious in his diplomacy.
If the Japanese attack on Pearl Harbor permanently ended classic isolationism, by 1945 a “new isolationism” voiced suspicion of the United Nations, the Truman Doctrine, aid to Greece and Turkey, the Marshall Plan, the North Atlantic Treaty Organization, and U.S. participation in the Korean War. Yet, because the “new isolationists” increasingly advocated militant unilateral measures to confront Communist Russia and China, often doing so to advance the fortunes of the Republican party, they exposed themselves to charges of inconsistency and generally faded away in the 1950s. Since the 1950s, many Americans have opposed various military involvements, including the ones in Vietnam, Iraq, and Afghanistan, but few envision returning to an era when the United States avoids all commitments.
Racism and xenophobia, but also resilience and community building, characterize the return of thousands of Japanese Americans, or Nikkei, to the West Coast after World War II. Although the specific histories of different regions shaped the resettlement experiences for Japanese Americans, Los Angeles provides an instructive case study. For generations, the City of Angels has been home to one of the nation’s largest and most diverse Nikkei communities and the ways in which Japanese Americans rebuilt their lives and institutions resonate with the resettlement experience elsewhere.
Before World War II, greater Los Angeles was home to a vibrant Japanese American population. First generation immigrants, or Issei, and their American-born children, the Nisei, forged dynamic social, economic, cultural, and spiritual institutions out of various racial exclusions. World War II uprooted the community as Japanese Americans left behind their farms, businesses, and homes. In the best instances, they were able to entrust their property to neighbors or other sympathetic individuals. More often, the uncertainty of their future led Japanese Americans to sell off their property, far below the market price. Upon the war’s end, thousands of Japanese Americans returned to Los Angeles, often to financial ruin.
Upon their arrival in the Los Angeles area, Japanese Americans continued to face deep-seated prejudice, all the more accentuated by an overall dearth of housing. Without a place to live, they sought refuge in communal hostels set up in prewar institutions that had survived the war, such as Christian and Buddhist churches. Meanwhile, others found housing in temporary trailer camps set up by the War Relocation Authority (WRA), and later administered by the Federal Public Housing Authority (FPHA), in areas such as Burbank, Sun Valley, Hawthorne, Santa Monica, and Long Beach. Although some local religious groups and others welcomed the returnees, white homeowners, who viewed the settlement of Japanese Americans as a threat to their property values, often mobilized to protest the construction of these camps. The last of these camps closed in 1956, demonstrating the hardship some Japanese Americans still faced in reintegrating into society. Even when the returnees were able to leave the camps, they still faced racially restrictive housing covenants and, when those practices were ruled unconstitutional, exclusionary lending. Although new suburban enclaves of Japanese Americans eventually developed in areas such as Gardena, West Los Angeles, and Pacoima by the 1960s, the pathway to those destinations was far from easy. Ultimately, the resettlement of Japanese Americans in Los Angeles after their mass incarceration during World War II took place within the intertwined contexts of lingering anti-Japanese racism, Cold War politics, and the suburbanization of Southern California.
In the post-1945 period, jazz moved rapidly from one major avant-garde revolution (the birth of bebop) to another (the emergence of free jazz) while developing a profusion of subgenres (hard bop, progressive, modal, Third Stream, soul jazz) and a new idiomatic persona (cool or hip) that originated as a form of African American resistance but soon became a signature of transgression and authenticity across the modern arts and culture. Jazz’s long-standing affiliation with African American urban life and culture intensified through its central role in the Black Arts Movement of the 1960s. By the 1970s, jazz, now fully eclipsed in popular culture by rock ’n’ roll, turned to electric instruments and fractured into a multitude of hyphenated styles (jazz-funk, jazz-rock, fusion, Latin jazz). The move away from acoustic performance and traditional codes of blues and swing musicianship generated a neoclassical reaction in the 1980s that coincided with a mission to establish an orthodox jazz canon and honor the music’s history in elite cultural institutions. Post-1980s jazz has been characterized by tension between tradition and innovation, earnest preservation and intrepid exploration, Americanism and internationalism.