Courtney Q. Shah
A concerted movement to promote sex education in America emerged in the early 20th century as part of a larger public health movement that also responded to the previous century’s concerns about venereal disease, prostitution, “seduction,” and “white slavery.” Sex education, therefore, offered a way to protect people (especially privileged women) from sexual activity of all kinds—consensual and coerced. Widespread introduction of sex education into public schools, however, did not occur until after World War I. Sex education programs in schools tended to focus on training for heterosexual marriage at a time when high school attendance spiked in urban and suburban areas. Teachers often segregated male and female students.
Beyond teaching boys about male anatomy and girls about female anatomy, reformers and educators often conveyed different messages and used different materials, depending on the race of their students. Erratic desegregation efforts during the Civil Rights movement sparked a new crisis in sex education programs. Parents and administrators considered sexuality education even more dangerous in the context of a racially integrated classroom. The backlash against sex education in the schools kept pace with the backlash against integration, with each often used to bolster the other. Opponents of integration and sex education, for example, often used racial language to scare parents about what children were learning, and with whom.
In the 1980s and 1990s, the political power of the evangelical movement in the United States attracted support for “abstinence-only” curricula that relied on scare tactics and traditional assumptions about gender and sexuality. The ever-expanding acceptance (both legal and social) of lesbian, gay, bisexual, or transgender identity directly challenged the conservative turn of abstinence-until-marriage sex education programs. The politics of gender, race, class, and sexual orientation have consistently shaped and limited sex education.
Radicalism in the United States since 1945 has been varied, complex, and often fragmented, making it difficult to analyze as a coherent movement. Communist and pro-Soviet organizations remained active after World War II, but a proliferation of noncommunist groups in the 1940s and 1950s, formed by those disillusioned with Marxist theory or the Soviet Union, began to chart a new course for the American Left. Eschewing much of the previous focus on labor, the proletariat, and Marxist doctrine, American postwar radical organizations realigned around humanist values, moral action, democracy, and even religion, with tenuous connections to Marxism, if any. The parameters of postwar radical moral theory were not always clearly defined, and questions of strategy and vision caused frequent divisions among activists. Nonetheless, claims of individual dignity and freedom continued to frame left radicalism into the late 20th century, emphasizing identity politics, community-building initiatives, and cultural expression in the streets of U.S. cities and the halls of academia. The presidential campaign of Bernie Sanders in 2016 helped revitalize leftist rhetoric on the national stage with its calls for racial and economic equality on moral terms.
Rap, the musical practice of hip hop culture in which vocalists, or MCs, recite lyrics over an instrumental beat, emerged out of the political and economic transformations of New York City after the 1960s. Black and Latinx youth, many of them Caribbean immigrants, created this new cultural form in response to racism, poverty, urban renewal, deindustrialization, and inner-city violence. The new form eventually spread beyond New York to all regions of the United States as artists from Los Angeles, New Orleans, Miami, and Chicago began releasing rap music with their own distinct sounds. Despite efforts to demonize and censor rap music and hip hop culture, rap has served as a pathway of social mobility for many black and Latinx youth. Many artists have enjoyed crossover success in acting, advertising, and business. Rap music has also sparked new conversations about issues such as electoral politics, gender and sexuality, crime, policing, and mass incarceration, as well as technology.
America’s tremendous diversities of faith, region, and ethnicity complicate efforts to generalize about the relationship between religious groups and the labor movement. Americans’ historic and widely shared commitment to Christianity masks deep divisions: between white Christians and black Christians, between Catholics and Protestants, between northern Protestants and southern Protestants, and between “modernist” Protestants (who view the Bible in metaphorical terms as a source of ethical guidance and emphasize social justice) and “fundamentalist” Protestants (who view the Bible literally and eschew social activism in favor of individual evangelizing). Work, class, and the role of the labor movement add extra dimensions to these complexities, which are multiplied when considering non-Christian traditions such as Judaism or the other world religious communities that have grown in the United States since the immigration reforms of 1965.
Nevertheless, scholars accept a general narrative that delineates key periods, themes, and players over the course of the twentieth century. From the turn of the 20th century until the 1930s, the relationship between religion and labor was shaped by the centrality of the American Federation of Labor (AFL) in the labor movement, the development of a “social gospel” among northern mainline Protestants, and the massive immigration from southern and eastern Europe that brought millions of Catholic and Jewish workers into the United States before it largely ended in the 1920s. These developments were sometimes in tension. The AFL favored craft unionism and placed a premium on organizing skilled male workers; it therefore left out many of the unskilled new arrivals (as well as African Americans and most women). Consequently, the shape of “religion and labor” formed primarily around the dynamic between the AFL and Protestant social reformers, without much regard to the large masses of unorganized Catholic, Jewish, and African American workers.
These dynamics shifted in the Great Depression. The Congress of Industrial Organizations (CIO), begun as a committee within the AFL in 1934, sought to organize entire industries, skilled and unskilled alike, and ethnic Catholics and Jews became unionized in large numbers. Even traditional racial barriers in the labor movement began crumbling in some industries. And the labor movement expanded its geographical ambition, pushing aggressively into the South. In turn, the religious voices associated with the labor movement broadened and deepened. Labor’s new alliances with Catholics, Jews, African Americans, and southern evangelicals helped to push the ranks of organized workers to historic highs in the 1950s.
This coalition has faced divisive, even disastrous headwinds since the 1960s. The strength of anticommunism, especially within religious groups, caused some religious workers to retreat from the reformist ambitions of the labor movement and sparked a conservative religious movement deeply opposed to labor and liberalism. Race became an ever-hotter flashpoint. Although religiously affiliated civil rights reformers often forged alliances with unions, the backlash and resistance to civil rights among portions of the white working class undermined the efficacy of labor unions as sources of social cohesion. Perhaps most profoundly, the economy as a whole transformed from an urban-industrial to a post-urban service model. Organized labor has floundered in the wake of these changes, and the concomitant resurgence of a traditionalist, individualistic, and therapeutic religious culture has offered the remains of the labor movement little to partner with.
Jimmy Carter’s “Crisis of Confidence Speech” of July 1979 was a critical juncture in post-1945 U.S. politics, but it also marks an exemplary pivot in post-1945 religion. Five dimensions of faith shaped the president’s sermon. The first concerned the shattered consensus of American religion. When Carter encouraged Americans to recapture a spirit of unity, he spoke in a heartfelt but spent language more suitable to Dwight Eisenhower’s presidency than his own. By 1979, the Protestant-Catholic-Jewish consensus of Eisenhower’s time was fractured into a dynamic pluralism, remaking American religion in profound ways. Carter’s speech revealed a second revolution of post-1945 religion when it decried religion’s polarization and politicization. Carter sought to heal ruptures that were dividing the nation between what observers, two decades hence, would label “red” (conservative Republican) and “blue” (liberal Democratic) constituencies. Yet his endeavors failed, as would be evidenced in the religious politics of Ronald Reagan’s era, which followed. Carter championed community values as the answer to his society’s problems, aware of yet a third dawning reality: globalization. The virtues of localism that Carter espoused were in fact implicated in (and complicated by) transnational forces of change that saw immigration, missionary enterprises, and state and non-state actors internationalizing the American religious experience. A fourth illuminating dimension of Carter’s speech was its critique of America’s gospel of wealth. Although this “born-again” southerner was a product of the evangelical South’s revitalized free-market capitalism, he lamented how laissez-faire Christianity had become America’s lingua franca. Finally, Carter wrestled with secularization, revealing a fifth feature of post-1945 America. Even though faith commitments were increasingly cordoned off from formal state functions during this time, the nation’s political discourse acquired a pronounced religiosity. Carter contributed by framing mundane issues (such as energy) in moral contexts that drew no hard-and-fast boundaries between matters of the soul and governance. Drawn from the political and economic crises of his moment, Carter’s speech thus also reveals the all-enveloping tide of religion in America’s post-1945 age.
Rock and roll, a popular music craze of the mid-1950s, turned a loud, fast, and sexy set of sounds rooted in urban, black, working-class, and southern America into the pop preference of suburban, white, young, and northern America as well. By the late 1960s, those fans and their British counterparts made their own version, more politicized and experimental and called simply rock—the summoning sound of the counterculture. Rock’s aura soon faded: it became as much entertainment staple as dissident form, with subcategories as disparate as singer-songwriter, heavy metal, alternative, and “classic rock.” Where rock and roll was integrated and heterogeneous, rock was largely white and homogeneous, policing its borders. Notoriously, rock fans detonated disco records in 1979. By the 1990s, the rock and roll style was hip-hop, with its youth appeal and rebelliousness; post‒baby boomer bands gave rock some last vanguard status; and suburbanites found classic rock in New Country. This century’s notions of rock and roll have blended thoroughly, from genre “mash-ups” to superstar performers who are almost categories unto themselves and new sounds such as EDM beats. Still, crossover moments evoke rock and roll; assertions of authenticity evoke rock. Because rock and roll, and rock, epitomize cultural ideals and group identities, their definitions have been constantly debated. Initial argument focused on challenging genteel, professional notions of musicianship and behavior. Later discourse took up cultural incorporation and social empowerment, with issues of gender and commercialism as prominent as race and artistry. Rock and roll promised one kind of revolution to the post-1945 United States; rock another. The resulting hope and confusion have never been fully sorted out, with mixed consequences for American music and cultural history.
In 1835, Alexis de Tocqueville argued in Democracy in America that there were “two great nations in the world.” They had started from different historical points but seemed to be heading in the same direction. As expanding empires, they faced the challenges of defeating nature and constructing a civilization for the modern era. Although they adhered to different governmental systems, “each of them,” de Tocqueville declared, “seems marked out by the will of Heaven to sway the destinies of half the globe.”
De Tocqueville’s words were prophetic. In the 19th century, Russian and American intellectuals and diplomats struggled to understand the roles that their countries should play in the new era of globalization and industrialization. Despite their differing understandings of how development should happen, both sides believed in their nation’s vital role in guiding the rest of the world. American adherents of liberal developmentalism often argued that a free flow of enterprise, trade, investment, information, and culture was the key to future growth. They held that the primary obligation of American foreign policy was to defend that freedom by pursuing an “open door” policy and free access to markets. They believed that the American model would work for everyone and that the United States had an obligation to share its system with the old and underdeveloped nations around it.
A similar sense of mission developed in Russia. Russian diplomats had for centuries struggled to establish defensive buffers around the periphery of their empire. They had linked economic development to national security, and they had argued that their geographic expansion represented a “unification” of peoples as opposed to a conquering of them. In the 19th century, after the Napoleonic Wars and the failed Decembrist Revolution, tsarist policymakers fought to defend autocracy, orthodoxy, and nationalism from domestic and international critics. As in the United States, Imperial and later Soviet leaders envisioned themselves as the emissaries of the Enlightenment to the backward East and as protectors of tradition and order for the chaotic and revolutionary West.
These visions of order clashed in the 20th century as the Soviet Union and the United States became superpowers. Conflicts began early, with the American intervention in the 1918–1921 Russian civil war. Tensions that had previously been based on differing geographic and strategic interests then assumed an ideological valence, as the fight between East and West became a struggle between the political economies of communism and capitalism. Foreign relations between the two countries experienced boom and bust cycles that took the world to the brink of nuclear holocaust and yet maintained a strategic balance that precluded the outbreak of global war for fifty years. This article will examine how that relationship evolved and how it shaped the modern world.
Robert O. Self
Few decades in American history reverberate with as much historical reach or glow as brightly in living mythology as the 1960s. During those years Americans reanimated and reinvented the core political principles of equality and liberty but, in a primal clash that resonates more than half a century later, fiercely contested what those principles meant, and for whom. For years afterward, the decade’s appreciators considered the era to have its own “spirit,” defined by greater freedoms and a deeper, more authentic personhood, and given breath by a youthful generation’s agitation for change in nearly every dimension of national life. To its detractors in subsequent decades, the era was marked by immature radical fantasies and dangerous destabilizations of the social order, behind which lay misguided youthful enthusiasms and an overweening, indulgent federal government. We need not share either conviction to appreciate the long historical shadow cast by the decade’s clashing of left, right, and center and its profound influence over the political debates, cultural logics, and social practices of the many years that followed.
The decade’s political and ideological clashes registered with such force because post–World War II American life was characterized by a society-wide embrace of antiradicalism and a prescribed normalcy. Having emerged from the war as the lone undamaged capitalist industrial power, the United States exerted enormous influence throughout the globe after 1945—so much so that some historians have called the postwar years a “pax Americana.” In its own interest and in the interest of its Western allies, the United States engaged in a Cold War standoff with the Soviet Union over the fate of Europe and no less over the fate of developing countries on every continent. Fiercely anticommunist abroad and at home, U.S. elites stoked fears of the damage communism could do, whether in Eastern Europe or in a public school textbook. Americans of all sorts in the postwar years embraced potent ideologies justifying the prevailing order, whether that order was capitalist, patriarchal, racial, or heterosexual. They pursued a postwar “normalcy” defined by nuclear family domesticity and consumer capitalism in the shadow cast by the threat of communism and, after 1949, global thermonuclear war with the Soviet Union. This prevailing order was stultifying, and its rupture in the 1960s was the origin point of the decade’s great dramas.
The social movements of that decade drew Americans from the margins of citizenship—African Americans, Latina/os, Native Americans, women, and gay men and lesbians, among others—into epochal struggles over the withheld promise of equality. For the first time since 1861, an American war deeply split the nation, nearly destroying a major political party and intensifying a generational revolt already under way. Violence, including political assassinations at the highest level, bombings and assassinations of African Americans, bombings by left-wing groups like the Weathermen, and major urban uprisings by African Americans against police and property, bathed the country in blood. The New Deal liberalism of Presidents Franklin D. Roosevelt and Harry S. Truman reached its postwar peak in 1965 under President Lyndon Johnson’s Great Society and then retreated amid acrimony and backlash, as a new conservative politics gained traction. All this took place in the context of a “global 1960s,” in which societies in Western and Eastern Europe, Latin America, Africa, and elsewhere experienced similar generational rebellions, quests for meaningful democracy, and disillusionment with American global hegemony. From the first year of the decade to the last, the 1960s were a watershed era that marked the definitive end of a “postwar America” defined by easy Cold War dualities, presumptions of national innocence, and political calcification.
To explain the foregoing, this essay is organized in five sections. First comes a broad overview of the decade, highlighting some of its indelible moments and seminal political events. The next four sections correspond to the four signature historical developments of the 1960s. Discussed first is the collapse of the political consensus that predominated in national life following World War II. We can call this consensus “Vital Center liberalism,” after the title of a 1949 book by Arthur Schlesinger Jr., or “Cold War liberalism.” The assault on this consensus from both the New Left and the New Right is one of the defining stories of the 1960s. Second is the resurgence, after a decades-long interregnum dating to Reconstruction, of African American political agency. The black freedom struggle of the 1960s was far more than a social movement for civil rights. To shape the conditions of national life and the content of public debate in ways impossible under Jim Crow, black Americans called for nothing less than a spiritual and political renewal of the country. Third, and following from the latter, is the emergence within the American liberal tradition of a new emphasis on expanding individual rights and ending invidious discrimination. Forged in conjunction with the black freedom movement by women, Latino/as, Asian Americans, Native Americans, and homophiles (as early gay rights activists were called) and gay liberationists, this new emphasis profoundly changed American law and set the terms of political debate for the next half century. Fourth and finally, the 1960s witnessed the flourishing of a broad and diverse culture of anti-authoritarianism. In art, politics, and social behavior, this anti-authoritarianism took many forms, but at its heart lay two distinct historical phenomena: an ecstatic celebration of youth, manifest in the tension between the World War II generation and the baby boom generation, and an intensification of the long-standing conflict in American life between individualism and hierarchical order.
Despite the disruptions, rebellions, and challenges to authority in the decade, the political and economic elite proved remarkably resilient and preserved much of the prevailing order. This is not to discount the foregoing account of challenges to that order or to suggest that social change in the 1960s made little difference in American life. However, in grappling with this fascinating decade we are confronted with the paradox of outsized events and enormous transformations in law, ideology, and politics alongside a continuation, even an entrenchment, of traditional economic and political structures and practices.
The decade of the 1980s represented a turning point in American history—a crucial era, marked by political conservatism and an individualistic ethos. The 1980s also witnessed a dramatic series of developments in U.S. foreign relations, first an intensification of the Cold War with the Soviet Union and then a sudden relaxation of tensions and the effective end of the Cold War with an American victory. All of these developments were advanced and symbolized in the presidential administration of Ronald Reagan (1981–1989), a polarizing figure but a highly successful political leader. Reagan dominates our memories of the 1980s as few other American leaders dominate memories of other eras. Reagan and the political movement he led—Reaganism—are central to the history of the 1980s. Both their successes and their failures, which became widely acknowledged in the later years of the decade, should be noted. Reaganite conservatives won political victories by rolling back state power in many realms, most of all in terms of taxation and regulation. They also succeeded in putting America at the unquestioned pinnacle of the world order through a victory over the Soviet Union in the Cold War, although this was unforeseen by America’s Cold Warriors when the 1980s began. The failures of Reaganite conservatism include its handling of rising poverty levels, the HIV/AIDS crisis, and worsening racial tensions, all problems that Reaganites either did little to stem or positively contributed to. In foreign affairs, Reaganites pursued a “war on terror” of questionable success, and their approach to Third World arenas of conflict, including Central America, exacted a terrible human toll.
The 1950s have typically been seen as a complacent, conservative time between the end of World War II and the radical 1960s, when anticommunism and the Cold War subverted reform and undermined civil liberties. But the era can also be seen as a very liberal time in which meeting the Communist threat led to Keynesian economic policies, the expansion of New Deal programs, and advances in civil rights. Politically, it was “the Eisenhower Era,” dominated by a moderate Republican president, a high level of bipartisan cooperation, and a foreign policy committed to containing communism. Culturally, it was an era of middle-class conformity, which also gave us abstract expressionism, rock and roll, Beat poetry, and a grassroots challenge to Jim Crow.
Ansley T. Erickson
“Urban infrastructure” calls to mind railways, highways, and sewer systems. Yet the school buildings—red brick, limestone, or concrete, low-slung, turreted, or glass-fronted—that hold and seek to shape the city’s children are ubiquitous forms of infrastructure as well. Schools occupy one of the largest line items in a municipal budget, and as many as a fifth of a city’s residents spend the majority of their waking hours in school classrooms, hallways, and gymnasiums. In the 19th and 20th centuries urban educational infrastructure grew, supported by a developing consensus for publicly funded and publicly governed schools (if rarely fully accessible to all members of the public). Even before the state committed to other forms of social welfare, from pensions to public health, and to other infrastructure, from transit to fire protection, schooling was a government function.
This commitment to public education ultimately was national, but schools in cities had their own story. Schooling in the United States is chiefly a local affair: Constitutional responsibility for education lies with the states; power is then further decentralized as states entrust decisions about school function and funding to school districts. School districts can be as small as a single town or a part of a city. Such localism is one reason that it is possible to speak about schools in U.S. cities as having a particular history, determined as much by the specificities of urban life as by national questions of citizenship, economy, religion, and culture.
While city schools have been distinct, they have also been nationally influential. Urban scale both allowed for and demanded the most extensive educational system-building. Urban growth and diversity galvanized innovation, via exploration in teaching methods, curriculum, and understanding of children and communities. And they generated intense conflict. Throughout U.S. history, urban residents from myriad social, political, religious, and economic positions have struggled to define how schools would operate, for whom, and who would decide.
During the 19th and 20th centuries, U.S. residents struggled over the purposes, funding, and governance of schools in cities shaped by capitalism, nativism, and white supremacy. They built a commitment to schooling as a public function of their cities, with many compromises and exclusions. In the 21st century, old struggles re-emerged in new form, perhaps raising the question of whether schools will continue as public, urban infrastructure.
In the seventy years since the end of World War II (1939–1945), postindustrialization—the exodus of manufacturing and growth of finance and services—has radically transformed the economy of North American cities. Metropolitan areas are increasingly home to transnational firms that administer dispersed production networks that span the world. A few major global centers host large banks that coordinate flows of finance capital necessary not only for production, but also increasingly for education, infrastructure, municipal government, housing, and nearly every other aspect of life. In cities of the global north, fewer workers produce goods and more produce information, entertainment, and experiences. Women have steadily entered the paid workforce, where they often do the feminized work of caring for children and the ill, cleaning homes, and preparing meals. Like the Gilded Age city, the postindustrial city creates immense social divisions, injustices, and inequalities: penthouses worth millions and rampant homelessness, fifty-dollar burgers and an epidemic of food insecurity, and unparalleled wealth and long-standing structural unemployment all exist side by side. The key features of the postindustrial service economy are the increased concentration of wealth, the development of a privileged and celebrated workforce of professionals, and an economic system reliant on hyperexploited service workers whose availability is conditioned by race, immigration status, and gender.
Christopher W. Schmidt
One of the most significant protest campaigns of the civil rights era, the lunch counter sit-in movement began on February 1, 1960, when four young African American men sat down at the whites-only lunch counter of the Woolworth store in Greensboro, North Carolina. Refused service, the four college students sat quietly until the store closed. They continued their protest on the following days, each day joined by more fellow students. Students in other southern cities learned what was happening and started their own demonstrations, and in just weeks, lunch counter sit-ins were taking place across the South. By the end of the spring, tens of thousands of black college and high school students, joined in some cases by sympathetic white students, had joined the sit-in movement. Several thousand went to jail for their efforts after being arrested on charges of trespass, disorderly conduct, or whatever other laws southern police officers believed they could use against the protesters.
The sit-ins arrived at a critical juncture in the modern black freedom struggle. The preceding years had brought major breakthroughs, such as the Supreme Court’s Brown v. Board of Education school desegregation ruling in 1954 and the successful Montgomery bus boycott of 1955–1956, but by 1960, activists were struggling to develop next steps. The sit-in movement energized and transformed the struggle for racial equality, moving the leading edge of the movement from the courtrooms and legislative halls to the streets and putting a new, younger generation of activists on the front lines. It gave birth to the Student Nonviolent Coordinating Committee, one of the most important activist groups of the 1960s. It directed the nation’s attention to the problem of racial discrimination in private businesses that served the public, pressured business owners in scores of southern cities to open their lunch counters to African American customers, and set in motion a chain of events that would culminate in the Civil Rights Act of 1964, which banned racial discrimination in public accommodations across the nation.
Since the social sciences began to emerge as scholarly disciplines in the last quarter of the 19th century, they have frequently offered authoritative intellectual frameworks that have justified, and even shaped, a variety of U.S. foreign policy efforts. They played an important role in U.S. imperial expansion in the late 19th and early 20th centuries. Scholars devised racialized theories of social evolution that legitimated the confinement and assimilation of Native Americans and endorsed civilizing schemes in the Philippines, Cuba, and elsewhere. As attention shifted to Europe during and after World War I, social scientists working at the behest of Woodrow Wilson attempted to engineer a “scientific peace” at Versailles. The desire to render global politics the domain of objective, neutral experts intensified during World War II and the Cold War. After 1945, the social sciences became increasingly central players in foreign affairs, offering intellectual frameworks—like modernization theory—and bureaucratic tools—like systems analysis—that shaped U.S. interventions in developing nations, guided nuclear strategy, and justified the increasing use of the U.S. military around the world.
Throughout these eras, social scientists often reinforced American exceptionalism—the notion that the United States stands at the pinnacle of social and political development, and as such has a duty to spread liberty and democracy around the globe. The scholarly embrace of conventional political values was not the result of state coercion or financial co-optation; by and large social scientists and policymakers shared common American values. But other social scientists used their knowledge and intellectual authority to critique American foreign policy. The history of the relationship between social science and foreign relations offers important insights into the changing politics and ethics of expertise in American public policy.
K. Tsianina Lomawaima
In 1911, a group of American Indian intellectuals organized what would become known as the Society of American Indians, or SAI. SAI members convened in annual meetings between 1911 and 1923, and for much of that period the Society’s executive offices were a hub for political advocacy, lobbying Congress and the Office of Indian Affairs (OIA), publishing a journal, offering legal assistance to Native individuals and tribes, and maintaining an impressively voluminous correspondence across the country with American Indians, “Friends of the Indian” reformers, political allies, and staunch critics. Notable Native activists, clergy, entertainers, professionals, speakers, and writers—as well as Native representatives from on- and off-reservation communities—were active in the Society. They worked tirelessly to meet daunting, unrealistic expectations, principally to deliver a unified voice of Indian “public opinion” and to pursue controversial political goals without appearing too radical, especially obtaining U.S. citizenship for Indian individuals and allowing Indian nations to access the U.S. Court of Claims. They maintained their myriad activities with scant financial resources, relying solely on the unpaid labor of dedicated Native volunteers. By 1923, the challenges had exhausted the Society’s substantial human and minuscule financial capital. The Native “soul of unity” demanded by non-Native spectators and hoped for by SAI leaders could no longer hold the center, and the SAI dissolved. Their work was not in vain, but citizenship and the ability to file claims materialized in circumscribed forms. In 1924 Congress passed the Indian Citizenship Act, granting birthright citizenship to American Indians, but citizenship for Indians was deemed compatible with continued wardship status. In 1946 Congress established an Indian Claims Commission, not a court, and successful claims could only result in monetary compensation, not regained lands.
Conceptions of what constitutes a street gang or a youth gang have varied since the seminal sociological studies on these entities in the 1920s. Organizations of teenage youths and young adults in their twenties, congregating in public spaces and acting collectively, were fixtures of everyday life in American cities throughout the 20th century. While few studies historicize gangs in their own right, historians in a range of subfields cast gangs as key actors in critical dimensions of the American urban experience: the formation and defense of ethno-racial identities and communities; the creation and maintenance of segregated metropolitan spaces; the shaping of gender norms and forms of sociability in working-class districts; the structuring of contentious political mobilization challenging police practices and municipal policies; the evolution of underground and informal economies and organized crime activities; and the epidemic of gun violence that spread through minority communities in many major cities at the end of the 20th and beginning of the 21st centuries.
Although groups of white youths patrolling the streets of working-class neighborhoods and engaging in acts of defensive localism were commonplace in the urban Northeast, Mid-Atlantic, and Midwest states by the mid-19th century, street gangs exploded onto the urban landscape in the early 20th century as a consequence of massive demographic changes related to the wave of immigration from Europe, Asia, and Latin America and the migration of African Americans from the South. As immigrants and migrants moved into urban working-class neighborhoods and industrial workplaces, street gangs proliferated at the boundaries of ethno-racially defined communities, shaping the context within which immigrant and second-generation youths negotiated Americanization and learned the meanings of race and ethnicity. Although social workers in some cities noted the appearance of some female gangs by the 1930s, the milieu of youth gangs during this era was male dominated, and codes of honor and masculinity were often at stake in increasingly violent clashes over territory and resources like parks and beaches.
The interplay of race, ethnicity, and masculinity continued to shape the world of gangs in the 1940s and 1950s, when white male gangs claiming to defend the whiteness of their communities used terror tactics to reinforce the boundaries of ghettos and barrios in many cities. Such aggressions spurred the formation of fighting gangs in black and Latino neighborhoods, where youths entered into at times deadly combat against their aggressors but also fought for honor, respect, and status with rivals within their communities. In the 1960s and 1970s, with civil rights struggles and ideologies of racial empowerment circulating through minority neighborhoods, some of these same gangs, often with the support of community organizers affiliated with political organizations like the Black Panther Party, turned toward defending the rights of their communities and participating in contentious politics. However, such projects were cut short by the fierce repression of gangs in minority communities by local police forces, working at times in collaboration with the Federal Bureau of Investigation. By the mid-1970s, following the withdrawal of the Black Panthers and other mediating organizations from cities like Chicago and Los Angeles, so-called “super-gangs” claiming the allegiance of thousands of youths began federating into opposing camps—“People” against “Folks” in Chicago, “Crips” against “Bloods” in LA—to wage war for control of emerging drug markets. In the 1980s and 1990s, with minority communities dealing with high unemployment, cutbacks in social services, failing schools, hyperincarceration, drug trafficking, gun violence, and toxic relations with increasingly militarized police forces waging local “wars” against drugs and gangs, gangs proliferated in cities throughout the Sun Belt. Their prominence within popular and political discourse nationwide made them symbols of the urban crisis and of the cultural deficiencies that some believed had caused it.
Becky Nicolaides and Andrew Wiese
Mass migration to suburban areas was a defining feature of American life after 1945. Before World War II, just 13 percent of Americans lived in suburbs. By 2010, however, suburbia was home to more than half of the U.S. population. The nation’s economy, politics, and society suburbanized in important ways. Suburbia shaped habits of car dependency and commuting, patterns of spending and saving, and experiences with issues as diverse as race and taxes, energy and nature, privacy and community. The owner-occupied, single-family home, surrounded by a yard and set in a neighborhood outside the urban core, came to define everyday experience for most American households, and in the world of popular culture and the imagination, suburbia was the setting for the American dream. The nation’s suburbs were an equally critical economic landscape, home to vital high-tech industries, retailing, “logistics,” and office employment. In addition, American politics rested on a suburban majority, and over several decades, suburbia incubated political movements across the partisan spectrum, from grassroots conservatism to centrist meritocratic individualism, environmentalism, feminism, and social justice. In short, suburbia was a key setting for postwar American life.
Even as suburbia grew in magnitude and influence, it also grew more diverse, coming to reflect a much broader cross section of America itself. This encompassing shift unfolded across two key chronological stages in suburban history since 1945: the expansive, racialized, mass suburbanization of the postwar years (1945–1970) and an era of intensive social diversification and metropolitan complexity (since 1970). In the first period, suburbia witnessed the expansion of segregated white privilege, bolstered by government policies and exclusionary practices and reinforced by grassroots political movements. In the second period, suburbia came to house a broader cross section of Americans, who brought with them a wide range of outlooks, lifeways, values, and politics. Suburbia became home to large numbers of immigrants, ethnic groups, African Americans, the poor, the elderly, and diverse family types. In the face of stubborn exclusionism by affluent suburbs, inequality persisted across metropolitan areas and manifested anew in proliferating poorer, distressed suburbs. Reform efforts sought to alleviate metro-wide inequality and promote sustainable development, using coordinated regional approaches. In recent years, the twin discourses of suburban crisis and suburban rejuvenation captured the continued complexity of America’s suburbs.
Since the turn of the 20th century, teachers have tried to find a balance between bettering their own career prospects as workers and educating their students as public servants. To reach a workable combination, teachers have utilized methods drawn from union movements, the militant and labor-conscious approach favored by the American Federation of Teachers (AFT), as well as from professional organizations, the tradition from which the National Education Association (NEA) arose. Because teachers lacked the federally guaranteed labor rights that private-sector workers enjoyed after Congress passed the National Labor Relations Act in 1935, teachers’ fortunes—in terms of collective bargaining rights, control over classroom conditions, pay, and benefits—often remained tied to the broader public-sector labor movement and to state rather than federal law.
Opponents of teacher unionization consistently charged that as public servants paid by tax revenues, teachers and other public employees should not be allowed to form unions. Further, because women constituted the vast majority of teachers and union organizing often represented a “manly” domain, the opposition’s approach worked well, preventing teachers from gaining widespread union recognition. But by the late 1960s and early 1970s, thanks to an improved economic climate and invigoration from the women’s movement, civil rights struggles, and the New Left, both AFT and NEA teacher unionism surged forward, infused with a powerful militancy devoted to strikes and other political action, and appeared poised to capture federal collective bargaining rights. Their newfound assertiveness proved ill-timed, however.
After the economic problems of the mid-1970s, opponents of teacher unions once again seized the opportunity to portray teacher unions and other public-sector unions as greedy and privileged interest groups functioning at the public’s expense. President Ronald Reagan accentuated this point when he fired all of the more than 10,000 striking air traffic controllers during the 1981 Professional Air Traffic Controllers Organization (PATCO) strike. Facing such opposition, teacher unions—and public-sector unions in general—shifted their efforts away from strikes and toward endorsing political candidates and lobbying governments to pass favorable legislation.
Even within these constraints, public-sector unions enjoyed a large degree of success from the 1990s through the early 2000s, even as private-sector union membership plunged to less than 10 percent of the workforce. After the Great Recession of 2008, however, austerity politics targeted teachers and other public-sector workers and renewed political confrontations over the legitimacy of teacher unions.
Timothy James LeCain
The history of technology and environmental history are both relatively young disciplines among Americanists, and during their early years they developed as distinctly different and even antithetical fields, at least in topical terms. Historians of technology initially focused on human-made and presumably “unnatural” technologies, whereas environmental historians focused on nonhuman and presumably “natural” environments. However, in more recent decades, both disciplines have moved beyond this oppositional framing. Historians of technology increasingly came to view anthropogenic artifacts such as cities, domesticated animals, and machines as extensions of the natural world rather than its antithesis. Even the British and American Industrial Revolutions constituted not a distancing of humans from nature, as some scholars have suggested, but rather a deepening entanglement with the material environment. At the same time, many environmental historians were moving beyond the field’s initial emphasis on the ideal of an American and often Western “wilderness” to embrace a concept of the environment that includes humans and productive work. Nonetheless, many environmental historians continued to emphasize the independent agency of the nonhuman environment of organisms and things. This insistence that not everything could be reduced to human culture remained the field’s most distinctive feature.
Since the turn of the millennium, the two fields have increasingly come together in a variety of synthetic approaches, including Actor Network Theory, envirotechnical analysis, and neomaterialist theory. As the influence of the cultural turn has waned, the environmental historians’ emphasis on the independent agency of the nonhuman has come to the fore, gaining wider influence as it is applied to the dynamic “nature” or “wildness” that some scholars argue exists within both the technological and natural environment. The foundational distinctions between the history of technology and environmental history may now be giving way to more materially rooted attempts to understand how a dynamic hybrid environment helps to create human history in all of its dimensions—cultural, social, and biological.
In the decade after 1965, radicals responded to the alienating features of America’s technocratic society by developing alternative cultures that emphasized authenticity, individualism, and community. The counterculture emerged from a handful of 1950s bohemian enclaves, most notably the Beat subcultures in the Bay Area and Greenwich Village. But new influences shaped an eclectic and decentralized counterculture after 1965, first in San Francisco’s Haight-Ashbury district, then in urban areas and college towns, and, by the 1970s, on communes and in myriad counter-institutions. The psychedelic drug cultures around Timothy Leary and Ken Kesey gave rise to a mystical bent in some branches of the counterculture and influenced counterculture style in countless ways: acid rock redefined popular music; tie-dye, long hair, repurposed clothes, and hip argot established a new style; and sexual mores loosened. Yet the counterculture’s reactionary elements were strong. In many counterculture communities, gender roles mirrored those of mainstream society, and aggressive male sexuality inhibited feminist spins on the sexual revolution. Entrepreneurs and corporate America refashioned the counterculture aesthetic into a marketable commodity, ignoring the counterculture’s incisive critique of capitalism; yet for others the counterculture became the basis of authentic “right livelihoods.” Meanwhile, the politics of the counterculture defy ready categorization. The popular imagination often conflates hippies with radical peace activists, but New Leftists frequently excoriated the counterculture for rejecting political engagement in favor of hedonistic escapism or libertarian individualism. Both views miss the most important political aspect of the counterculture: the embodiment of a decentralized anarchist bent, expressed in the formation of counter-institutions like underground newspapers, urban and rural communes, head shops, and food co-ops. As the counterculture faded after 1975, its legacies became apparent in the redefinition of the American family, the advent of the personal computer, an increasing ecological and culinary consciousness, and the marijuana legalization movement.