121–140 of 191 Results for: 20th Century: Post-1945

Article

Austin McCoy

Rap, the musical practice of hip hop culture in which vocalists, or MCs, recite lyrics over an instrumental beat, emerged out of the political and economic transformations of New York City after the 1960s. Black and Latinx youth, many of them Caribbean immigrants, created this new cultural form in response to racism, poverty, urban renewal, deindustrialization, and inner-city violence. The new form eventually spread beyond New York to all regions of the United States as artists from Los Angeles, New Orleans, Miami, and Chicago began releasing rap music with their own distinct sounds. Despite efforts to demonize and censor rap music and hip hop culture, rap has served as a pathway to social mobility for many black and Latinx youth. Many artists have enjoyed crossover success in acting, advertising, and business. Rap music has also sparked new conversations about issues such as electoral politics, gender and sexuality, crime, policing, mass incarceration, and technology.

Article

Ronald Reagan’s foreign policy legacy remains hotly contested, and as new archival sources come to light, those debates are more likely to intensify than to recede into the background. In dealings with the Soviet Union, the Reagan administration set the superpowers on a course for the (largely) peaceful end of the Cold War. Reagan began his outreach to Soviet leaders almost immediately after taking office and enjoyed some success, even if the dominant theme of the period in public perception remains fear of Reagan as a “button-pusher.” Mikhail Gorbachev’s election to the post of General Secretary proved the turning point. Reagan, now confident in US strength, and Gorbachev, keen to reduce the financial burden of the arms race, ushered in a new, cooperative phase of the Cold War. Elsewhere, in particular Latin America, the administration’s focus on fighting communism led it to support human rights–abusing regimes at the same time as it lambasted Moscow’s transgressions in that regard. Even so, over the course of the 1980s, the United States began pushing for democratization around the world, even where Reagan and his advisors had initially resisted it, fearing a communist takeover. In part this was a result of public pressure, but the White House recognized and came to support the rising tide of democratization. When Reagan left office, a great many countries that had been authoritarian were no longer, often at least in part because of US policy. US–Soviet relations had improved to such an extent that Reagan’s vice president and successor, George H. W. Bush, worried that the United States had gone too far in working with Gorbachev and been hoodwinked.

Article

America’s tremendous diversities of faith, region, and ethnicity complicate efforts to generalize about relationships between religious groups and the labor movement. Americans’ historic and widely shared commitment to Christianity masks deep divisions: between white Christians and black Christians, between Catholics and Protestants, between northern Protestants and southern Protestants, and between “modernist” Protestants (who view the Bible in metaphorical terms as a source of ethical guidance and emphasize social justice) and “fundamentalist” Protestants (who read the Bible literally and eschew social activism in favor of individual evangelizing). Work, class, and the role of the labor movement add extra dimensions to these complexities, which are multiplied when considering non-Christian traditions such as Judaism or the other world religious communities that have grown in the United States since the immigration reforms of 1965. Nevertheless, scholars accept a general narrative that delineates key periods, themes, and players over the course of the twentieth century. From the turn of the 20th century until the 1930s, the relationship between religion and labor was shaped by the centrality of the American Federation of Labor (AFL) in the labor movement, the development of a “social gospel” among northern mainline Protestants, and the massive immigration from southern and eastern Europe that brought millions of Catholic and Jewish workers into the United States before it largely ended in the 1920s. These developments were sometimes in tension. The AFL favored craft unionism and placed a premium on organizing skilled male workers; it therefore left out many of the unskilled new arrivals (as well as African Americans and most women). Consequently, the shape of “religion and labor” formed primarily around the dynamic between the AFL and Protestant social reformers, without much regard to the large masses of unorganized Catholic, Jewish, and African American workers.
These dynamics shifted in the Great Depression. The Congress of Industrial Organizations (CIO), begun as a committee within the AFL in 1935, sought to organize entire industries—skilled and unskilled alike—and ethnic Catholics and Jews joined unions in large numbers. Even traditional racial barriers in the labor movement began crumbling in some industries. And the labor movement expanded its geographical ambition, pushing aggressively into the South. In turn, the religious voices associated with the labor movement broadened and deepened. Labor’s new alliances with Catholics, Jews, African Americans, and southern evangelicals helped push the ranks of organized workers to historic highs in the 1950s. This coalition has faced divisive, even disastrous, headwinds since the 1960s. The strength of anticommunism, especially within religious groups, caused some religious workers to retreat from the reformist ambitions of the labor movement and sparked a conservative religious movement deeply opposed to labor and liberalism. Race became an ever-hotter flashpoint. Although religiously affiliated civil rights reformers often forged alliances with unions, the backlash and resistance to civil rights among portions of the white working class undermined the efficacy of labor unions as sources of social cohesion. Perhaps most profoundly, the economy as a whole transformed from an urban-industrial to a post-urban service model. Organized labor has floundered in the wake of these changes, and the concomitant resurgence of a traditionalist, individualistic, and therapeutic religious culture has offered the remains of the labor movement little to partner with.

Article

The Great Depression of 1929–1941 not only brought economic and social crisis but also forced families, churches, and religious organizations to reckon with individual and social suffering in ways that they had not done in the United States since the Civil War. This reckoning introduced a period of both theological and institutional transformation. Theologians wrestled not only with the domestic depression, but also with international instability as they faced questions about pacifism, economic and racial justice, and religious persecution. Ordinary people prayed for rain and revival. Many turned to their religious communities to wrestle together with the troubles they faced, or turned from those communities in disappointment and despair. During the decades before the Great Depression, religious institutions across the United States had expanded their charitable efforts and their social reform campaigns, but the Depression wiped out the support for that work just as Americans needed it most. The New Deal brought a new set of questions about the relative roles of church and state in welfare and reform and introduced a period of religious ferment and church–state realignment. At the same time, the discontent and dislocation that the Great Depression wrought on local communities meant that individuals, families, and communities wrestled with deep theological questions together, often in ways that fractured old religious alliances and forged new ones. For American Jews and some Catholics, events in Europe proved even more troubling than those at home, and local communities reorganized around international activism and engagement.

Article

Jimmy Carter’s “Crisis of Confidence Speech” of July 1979 was a critical juncture in post-1945 U.S. politics, but it also marks an exemplary pivot in post-1945 religion. Five dimensions of faith shaped the president’s sermon. The first concerned the shattered consensus of American religion. When Carter encouraged Americans to recapture a spirit of unity, he spoke in a heartfelt but spent language more suitable to Dwight Eisenhower’s presidency than his own. By 1979, the Protestant-Catholic-Jewish consensus of Eisenhower’s time had fractured into a dynamic pluralism, remaking American religion in profound ways. Carter’s speech revealed a second revolution of post-1945 religion when it decried its polarization and politicization. Carter sought to heal ruptures that were dividing the nation between what observers, two decades hence, would label “red” (conservative Republican) and “blue” (liberal Democratic) constituencies. Yet his endeavors failed, as would be evidenced in the religious politics of Ronald Reagan’s era, which followed. Carter championed community values as the answer to his society’s problems, aware of yet a third dawning reality: globalization. The virtues of localism that Carter espoused were in fact implicated in (and complicated by) transnational forces of change that saw immigration, missionary enterprises, and state and non-state actors internationalizing the American religious experience. A fourth illuminating dimension of Carter’s speech was its critique of America’s gospel of wealth. Although this “born-again” southerner was a product of the evangelical South’s revitalized free-market capitalism, he lamented how laissez-faire Christianity had become America’s lingua franca. Finally, Carter wrestled with secularization, revealing a fifth feature of post-1945 America. Even though faith commitments were increasingly cordoned off from formal state functions during this time, the nation’s political discourse acquired a pronounced religiosity.
Carter contributed by framing mundane issues (such as energy) in moral contexts that drew no hard-and-fast boundaries between matters of the soul and governance. Drawn from the political and economic crises of his moment, Carter’s speech thus also reveals the all-enveloping tide of religion in America’s post-1945 age.

Article

The “Chinese 49’ers” who arrived in the United States a decade before the American Civil War constituted the first large wave of Asian migrants to America and transplanted the first Asian cuisine to America. Chinese food was the first ethnic cuisine to be highly commodified at the national level as a type of food primarily to be prepared and consumed away from home. At the end of the 19th century, food from China began to attract a fast-growing non-Chinese clientele of diverse ethnic backgrounds in major cities across the nation, and by 1980 Chinese food had become the most popular ethnic cuisine in the United States, aided by a renewal of Chinese immigration to America. Chinese food also has been a vital economic lifeline for Chinese Americans as one of the two main sources of employment (laundries being the other) for Chinese immigrants and families for decades. Its development, therefore, is an important chapter in American history and a central part of the Chinese American experience. The multiple and often divergent trends in the U.S. Chinese-food industry show that it is at a crossroads today. Its future hinges on the extent to which Chinese Americans can significantly alter their position in the social and political arena and on China’s ability to transform the economic equation in its relationship with the United States.

Article

Katherine R. Jewell

The term “Sunbelt” connotes a region defined by its environment. “Belt” suggests the broad swath of states stretching from the Atlantic coast across Texas and Oklahoma and the Southwest to southern California. “Sun” suggests its temperate—even hot—climate. Yet in contrast to the industrial northeastern and midwestern Rust Belt (or perhaps “Frost Belt”), the term’s emergence at the end of the 1960s evoked an optimistic, opportunistic brand. Free from snowy winters, cooled by air conditioning, and beckoning with Florida’s sandy beaches and California’s surf, the Sunbelt states did draw more American migrants in the 1950s and 1960s than the deindustrializing centers of the North and East. But the term “Sunbelt” also captures an emerging political culture that defies regional boundaries. The term originated less as a diagnosis of an environmental climate than of a political one, associated with the new patterns of migration in the mid-20th century. It defined a new regional identity: politically, economically, in policy, demographically, and socially, as well as environmentally. The Sunbelt received federal money in an unprecedented manner, particularly because of rising Cold War defense spending on research and military bases, and its urban centers grew in patterns unlike those of the old Northeast and Midwest, thanks to the policy innovations wrought by local boosters, business leaders, and politicians, which defined the politics associated with the region after the 1970s. Yet scholars continue to debate whether the Sunbelt’s emergence reflects a new regional identity or something else.

Article

Eric Weisbard

Rock and roll, a popular music craze of the mid-1950s, turned a loud, fast, and sexy set of sounds rooted in urban, black, working class, and southern America into the pop preference as well of suburban, white, young, and northern America. By the late 1960s, those fans and British counterparts made their own version, more politicized and experimental and just called rock—the summoning sound of the counterculture. Rock’s aura soon faded: it became as much entertainment staple as dissident form, with subcategories as disparate as singer-songwriter, heavy metal, alternative, and “classic rock.” Where rock and roll was integrated and heterogeneous, rock was largely white and homogeneous, policing its borders. Notoriously, rock fans detonated disco records in 1979. By the 1990s, rock and roll style was hip-hop, with its youth appeal and rebelliousness; post–baby boomer bands gave rock some last vanguard status; and suburbanites found classic rock in New Country. This century’s notions of rock and roll have blended thoroughly, from genre “mash-ups” to superstar performers almost categories unto themselves and new sounds such as EDM beats. Still, crossover moments evoke rock and roll; assertions of authenticity evoke rock. Because rock and roll, and rock, epitomize cultural ideals and group identities, their definitions have been constantly debated. Initial argument focused on challenging genteel, professional notions of musicianship and behavior. Later discourse took up cultural incorporation and social empowerment, with issues of gender and commercialism as prominent as race and artistry. Rock and roll promised one kind of revolution to the post-1945 United States; rock another. The resulting hope and confusion have never been fully sorted out, with mixed consequences for American music and cultural history.

Article

While presidents have historically been the driving force behind foreign policy decision-making, Congress has used its constitutional authority to influence the process. The nation’s founders designed a system of checks and balances aimed at establishing a degree of equilibrium in foreign affairs powers. Though the president is the commander-in-chief of the armed forces and the country’s chief diplomat, Congress holds responsibility for declaring war and can also exert influence over foreign relations through its powers over taxation and appropriation, while the Senate possesses authority to approve or reject international agreements. This separation of powers compels the executive branch to work with Congress to achieve foreign policy goals, but it also sets up conflict over what policies best serve national interests and the appropriate balance between executive and legislative authority. Since the founding of the Republic, presidential power over foreign relations has accreted in fits and starts at the legislature’s expense. When core American interests have come under threat, legislators have undermined or surrendered their power by accepting presidents’ claims that defense of national interests required strong executive action. This trend peaked during the Cold War, when invocations of national security enabled the executive to amass unprecedented control over America’s foreign affairs.

Article

In 1835, Alexis de Tocqueville argued in Democracy in America that there were “two great nations in the world.” They had started from different historical points but seemed to be heading in the same direction. As expanding empires, they faced the challenges of defeating nature and constructing a civilization for the modern era. Although they adhered to different governmental systems, “each of them,” de Tocqueville declared, “seems marked out by the will of Heaven to sway the destinies of half the globe.” De Tocqueville’s words were prophetic. In the 19th century, Russian and American intellectuals and diplomats struggled to understand the roles that their countries should play in the new era of globalization and industrialization. Despite their differing understandings of how development should happen, both sides believed in their nation’s vital role in guiding the rest of the world. American adherents of liberal developmentalism often argued that a free flow of enterprise, trade, investment, information, and culture was the key to future growth. They held that the primary obligation of American foreign policy was to defend that freedom by pursuing an “open door” policy and free access to markets. They believed that the American model would work for everyone and that the United States had an obligation to share its system with the old and underdeveloped nations around it. A similar sense of mission developed in Russia. Russian diplomats had for centuries struggled to establish defensive buffers around the periphery of their empire. They had linked economic development to national security, and they had argued that their geographic expansion represented a “unification” of peoples as opposed to a conquering of them. In the 19th century, after the Napoleonic Wars and the failed Decembrist Revolution, tsarist policymakers fought to defend autocracy, orthodoxy, and nationalism from domestic and international critics. 
As in the United States, Imperial and later Soviet leaders envisioned themselves as the emissaries of the Enlightenment to the backward East and as protectors of tradition and order for the chaotic and revolutionary West. These visions of order clashed in the 20th century as the Soviet Union and the United States became superpowers. Conflicts began early, with the American intervention in the 1918–1921 Russian civil war. Tensions that had previously been based on differing geographic and strategic interests then assumed an ideological valence, as the fight between East and West became a struggle between the political economies of communism and capitalism. Foreign relations between the two countries experienced boom and bust cycles that took the world to the brink of nuclear holocaust and yet maintained a strategic balance that precluded the outbreak of global war for fifty years. This article will examine how that relationship evolved and how it shaped the modern world.

Article

Robert O. Self

Few decades in American history reverberate with as much historical reach or glow as brightly in living mythology as the 1960s. During those years Americans reanimated and reinvented the core political principles of equality and liberty but, in a primal clash that resonates more than half a century later, fiercely contested what those principles meant, and for whom. For years afterward, the decade’s appreciators considered the era to have its own “spirit,” defined by greater freedoms and a deeper, more authentic personhood, and given breath by a youthful generation’s agitation for change in nearly every dimension of national life. To its detractors in subsequent decades, the era was marked by immature radical fantasies and dangerous destabilizations of the social order, behind which lay misguided youthful enthusiasms and an overweening, indulgent federal government. We need not share either conviction to appreciate the long historical shadow cast by the decade’s clashing of left, right, and center and its profound influence over the political debates, cultural logics, and social practices of the many years that followed. The decade’s political and ideological clashes registered with such force because post–World War II American life was characterized by a society-wide embrace of antiradicalism and a prescribed normalcy. Having emerged from the war as the lone undamaged capitalist industrial power, the United States exerted enormous influence throughout the globe after 1945—so much that some historians have called the postwar years a “pax Americana.” In its own interest and in the interest of its Western allies, the United States engaged in a Cold War standoff with the Soviet Union over the fate of Europe and no less over the fate of developing countries on every continent. Fiercely anticommunist abroad and at home, U.S. elites stoked fears of the damage communism could do, whether in Eastern Europe or in a public school textbook. 
Americans of all sorts in the postwar years embraced potent ideologies justifying the prevailing order, whether that order was capitalist, patriarchal, racial, or heterosexual. They pursued a postwar “normalcy” defined by nuclear family domesticity and consumer capitalism in the shadow cast by the threat of communism and, after 1949, global thermonuclear war with the Soviet Union. This prevailing order was stultifying, and its rupture in the 1960s is the origin point of the decade’s great dramas. The social movements of that decade drew Americans from the margins of citizenship—African Americans, Latina/os, Native Americans, women, and gay men and lesbians, among others—into epochal struggles over the withheld promise of equality. For the first time since 1861, an American war deeply split the nation, nearly destroying a major political party and intensifying a generational revolt already under way. Violence, including political assassinations at the highest level, bombings and assassinations of African Americans, bombings by left-wing groups like the Weathermen, and major urban uprisings by African Americans against police and property bathed the country in more blood. The New Deal liberalism of Presidents Franklin D. Roosevelt and Harry S. Truman reached its postwar peak in 1965 under President Lyndon Johnson’s Great Society and then retreated amid acrimony and backlash, as a new conservative politics gained traction. All this took place in the context of a “global 1960s,” in which societies in Western and Eastern Europe, Latin America, Africa, and elsewhere experienced similar generational rebellions, quests for meaningful democracy, and disillusionment with American global hegemony. From the first year of the decade to the last, the 1960s were a watershed era that marked the definitive end of a “postwar America” defined by easy Cold War dualities, presumptions of national innocence, and political calcification.
To explain the foregoing, this essay is organized in five sections. First comes a broad overview of the decade, highlighting some of its indelible moments and seminal political events. The next four sections correspond to the four signature historical developments of the 1960s. Discussed first is the collapse of the political consensus that predominated in national life following World War II. We can call this consensus “Vital Center liberalism,” after the title of a 1949 book by Arthur Schlesinger Jr., or “Cold War liberalism.” Its assault from both the New Left and the New Right is one of the defining stories of the 1960s. Second is the resurgence, after a decades-long interregnum dating to Reconstruction, of African American political agency. The black freedom struggle of the 1960s was far more than a social movement for civil rights. To shape the conditions of national life and the content of public debate in ways impossible under Jim Crow, black Americans called for nothing less than a spiritual and political renewal of the country. Third, and following from the second, is the emergence within the American liberal tradition of a new emphasis on expanding individual rights and ending invidious discrimination. Forged in conjunction with the black freedom movement by women, Latino/as, Asian Americans, Native Americans, and homophiles (as early gay rights activists were called) and gay liberationists, this new emphasis profoundly changed American law and set the terms of political debate for the next half century. Fourth and finally, the 1960s witnessed the flourishing of a broad and diverse culture of anti-authoritarianism.
In art, politics, and social behavior, this anti-authoritarianism took many forms, but at its heart lay two distinct historical phenomena: an ecstatic celebration of youth, manifest in the tension between the World War II generation and the baby boom generation, and an intensification of the long-standing conflict in American life between individualism and hierarchical order. Despite the disruptions, rebellions, and challenges to authority in the decade, the political and economic elite proved remarkably resilient and preserved much of the prevailing order. This is not to discount the foregoing account of challenges to that order or to suggest that social change in the 1960s made little difference in American life. However, in grappling with this fascinating decade we are confronted with the paradox of outsized events and enormous transformations in law, ideology, and politics alongside a continuation, even an entrenchment, of traditional economic and political structures and practices.

Article

Doug Rossinow

The decade of the 1980s represented a turning point in American history—a crucial era, marked by political conservatism and an individualistic ethos. The 1980s also witnessed a dramatic series of developments in U.S. foreign relations, first an intensification of the Cold War with the Soviet Union and then a sudden relaxation of tensions and the effective end of the Cold War with an American victory. All of these developments were advanced and symbolized in the presidential administration of Ronald Reagan (1981–1989), a polarizing figure but a highly successful political leader. Reagan dominates memories of the 1980s as few other American leaders dominate memories of their eras. Reagan and the political movement he led—Reaganism—are central to the history of the 1980s. Both their successes and their failures, which became widely acknowledged in the later years of the decade, should be noted. Reaganite conservatives won political victories by rolling back state power in many realms, most of all in terms of taxation and regulation. They also succeeded in putting America at the unquestioned pinnacle of the world order through a victory over the Soviet Union in the Cold War, although this was unforeseen by America’s Cold Warriors when the 1980s began. The failures of Reaganite conservatism include its handling of rising poverty levels, the HIV/AIDS crisis, and worsening racial tensions—problems that Reaganites either did little to stem or positively contributed to. In foreign affairs, Reaganites pursued a “war on terror” of questionable success, and their approach to Third World arenas of conflict, including Central America, exacted a terrible human toll.

Article

Jennifer Delton

The 1950s have typically been seen as a complacent, conservative time between the end of World War II and the radical 1960s, when anticommunism and the Cold War subverted reform and undermined civil liberties. But the era can also be seen as a very liberal time in which meeting the Communist threat led to Keynesian economic policies, the expansion of New Deal programs, and advances in civil rights. Politically, it was “the Eisenhower Era,” dominated by a moderate Republican president, a high level of bipartisan cooperation, and a foreign policy committed to containing communism. Culturally, it was an era of middle-class conformity, which also gave us abstract expressionism, rock and roll, Beat poetry, and a grassroots challenge to Jim Crow.

Article

L. Benjamin Rolsky

Few decades in the history of America resonate more with the American people than the 1960s. Freedom, justice, and equality seemed to define the immediate futures of many of America’s historically most ostracized citizens. Despite the nostalgia that tends to characterize past and present analyses of the sixties, this imaginative work is important to consider when narrating the subsequent decade: the 1970s. Such nostalgia in considering the 1960s speaks to a sense of loss, or something worked at but not quite achieved in the eyes of the nation and its inhabitants. What happened to their aspirations? Where did they retreat to? And, perhaps more importantly, to what extent did “the spirit” of the 1960s catalyze its antithesis in the 1970s? In many ways the 1970s was a transitional period for the nation because these years were largely defined by various instances of cultural, or tribal, warfare. These events and their key actors are often under-represented in histories of late-20th-century America, yet they were formative experiences for the nation and their legacy endures in contemporary moments of polarization, division, and contestation. In this sense the 1970s were neither “liberal” nor “conservative,” but instead laid the groundwork for such terms to calcify into the non-negotiable discourse now known simply as the culture wars. The tone of the time was somber for many, and the period may be best understood as having occasioned a kind of “collective nervous breakdown.” For some, the erosion of trust in America’s governing institutions presented an unparalleled opportunity for political and electoral revolution. For others, it was the stuff of nightmares. America had fractured, and it was not clear how the pieces would be put back together.

Article

Sergio González

In the spring of 1982, six faith communities in Arizona and California declared themselves places of safe harbor for the hundreds of thousands of Salvadorans and Guatemalans who had been denied legal proceedings for political asylum in the United States. Alleging that immigration officials had intentionally miscategorized Central Americans as “economic migrants” in order to accelerate their deportation, humanitarian organizations, legal advocates, and religious bodies sought alternatives for aid within their faiths’ scriptural teachings and the juridical parameters offered by international and national human rights and refugee law. Known as the sanctuary movement, this decade-long interfaith mobilization of lay and clerical activists indicted the US detention and deportation system and the country’s foreign policy initiatives in Latin America as morally bankrupt while arguing that human lives, regardless of documentation status, were sacred. In accusing the United States of being a violator of both domestic and international refugee legislation, subsequently exposing hundreds of thousands of people to persecution, torture, and death, the movement tested the idea that the country had always extended welcome to victims of persecution. Along with a broad network of anti-interventionist and humanitarian aid organizations, sanctuary galvanized more than 60,000 participants in 500 faith communities across the nation. By the 1990s, the movement had spurred congressional action in support of Central American asylees and served as the model for a renewed movement for sanctuary in support of undocumented Americans in the 21st century.

Article

Victor McFarland

The relationship between the United States and Saudi Arabia has shaped the history of both countries. Soon after the Saudi kingdom was founded in 1932, American geologists discovered enormous oil reserves near the Persian Gulf. Oil-driven development transformed Saudi society. Many Americans came to work in Saudi Arabia, while thousands of Saudis studied and traveled in the United States. During the mid-20th century, the American-owned oil company Aramco and the US government worked to strengthen the Saudi regime and empower conservative forces in the kingdom—not only to protect American oil interests, but also to suppress nationalist and leftist movements in Saudi Arabia and elsewhere in the Middle East. The partnership was complicated by disagreement over Israel, triggering an Arab oil embargo against the United States in 1973–1974. During the 1970s, Saudi Arabia became the world’s largest oil exporter, nationalized Aramco, and benefited from surging oil prices. In partnership with the United States, it used its new wealth at home to launch a huge economic development program, and abroad to subsidize political allies like the Afghan mujahideen. The United States led a massive military operation to expel Iraqi forces from Kuwait in 1990–1991, protecting the Saudi regime but angering Saudis who opposed their government’s close relationship with the United States. One result was the rise of Osama bin Laden’s al-Qaeda network and the 9/11 attacks, carried out by a largely Saudi group of hijackers. Despite public opposition on both sides, after 2001 the United States and Saudi Arabia continued their commercial relationship and their political partnership, originally directed against the Soviet Union and Nasser’s Egypt, and later increasingly aimed at Iran.

Article

Ansley T. Erickson

“Urban infrastructure” calls to mind railways, highways, and sewer systems. Yet the school buildings—red brick, limestone, or concrete, low-slung, turreted, or glass-fronted—that hold and seek to shape the city’s children are ubiquitous forms of infrastructure as well. Schools occupy one of the largest line items in a municipal budget, and as many as a fifth of a city’s residents spend the majority of their waking hours in school classrooms, hallways, and gymnasiums. In the 19th and 20th centuries urban educational infrastructure grew, supported by a developing consensus for publicly funded and publicly governed schools (if rarely fully accessible to all members of the public). Even before states committed to other forms of social welfare, from pensions to public health, and infrastructure, from transit to fire protection, schooling was a government function. This commitment to public education ultimately was national, but schools in cities had their own story. Schooling in the United States is chiefly a local affair: constitutional responsibility for education lies with the states; power is then further decentralized as states entrust decisions about school function and funding to school districts, which can be as small as a single town or a part of a city. Such localism is one reason it is possible to speak of schools in U.S. cities as having a particular history, determined as much by the specificities of urban life as by national questions of citizenship, economy, religion, and culture. While city schools have been distinct, they have also been nationally influential. Urban scale both allowed for and demanded the most extensive educational system-building. Urban growth and diversity galvanized innovation through exploration in teaching methods, curriculum, and understanding of children and communities. And it generated intense conflict. Throughout U.S. history, urban residents from myriad social, political, religious, and economic positions have struggled to define how schools would operate, for whom, and who would decide. During the 19th and 20th centuries, U.S. residents struggled over the purposes, funding, and governance of schools in cities shaped by capitalism, nativism, and white supremacy. They built a commitment to schooling as a public function of their cities, with many compromises and exclusions. In the 21st century, old struggles have re-emerged in new forms, raising the question of whether schools will continue as public, urban infrastructure.

Article

In the seventy years since the end of World War II (1939–1945), postindustrialization—the exodus of manufacturing and the growth of finance and services—has radically transformed the economy of North American cities. Metropolitan areas are increasingly home to transnational firms that administer dispersed production networks spanning the world. A few major global centers host large banks that coordinate the flows of finance capital necessary not only for production, but also increasingly for education, infrastructure, municipal government, housing, and nearly every other aspect of life. In cities of the global north, fewer workers produce goods and more produce information, entertainment, and experiences. Women have steadily entered the paid workforce, where they often do the feminized work of caring for children and the ill, cleaning homes, and preparing meals. Like the Gilded Age city, the postindustrial city creates immense social divisions, injustices, and inequalities: penthouses worth millions and rampant homelessness, fifty-dollar burgers and an epidemic of food insecurity, and unparalleled wealth and long-standing structural unemployment all exist side by side. The key features of the postindustrial service economy are the increased concentration of wealth, the development of a privileged and celebrated workforce of professionals, and an economic system reliant on hyperexploited service workers whose availability is conditioned by race, immigration status, and gender.

Article

Christopher W. Schmidt

One of the most significant protest campaigns of the civil rights era, the lunch counter sit-in movement began on February 1, 1960, when four young African American men sat down at the whites-only lunch counter of the Woolworth store in Greensboro, North Carolina. Refused service, the four college students sat quietly until the store closed. They continued their protest on the following days, joined each day by more fellow students. Students in other southern cities learned what was happening and started their own demonstrations, and within weeks lunch counter sit-ins were taking place across the South. By the end of the spring, tens of thousands of black college and high school students, joined in some cases by sympathetic white students, had taken part in the sit-in movement. Several thousand went to jail for their efforts after being arrested on charges of trespass, disorderly conduct, or whatever other laws southern police officers believed they could use against the protesters. The sit-ins arrived at a critical juncture in the modern black freedom struggle. The preceding years had brought major breakthroughs, such as the Supreme Court’s Brown v. Board of Education school desegregation ruling in 1954 and the successful Montgomery bus boycott of 1955–1956, but by 1960 activists were struggling to develop next steps. The sit-in movement energized and transformed the struggle for racial equality, moving the leading edge of the movement from the courtrooms and legislative halls to the streets and putting a new, younger generation of activists on the front lines. It gave birth to the Student Nonviolent Coordinating Committee, one of the most important activist groups of the 1960s. It directed the nation’s attention to the problem of racial discrimination in private businesses that served the public, pressured business owners in scores of southern cities to open their lunch counters to African American customers, and set in motion a chain of events that culminated in the Civil Rights Act of 1964, which banned racial discrimination in public accommodations across the nation.

Article

Since the social sciences began to emerge as scholarly disciplines in the last quarter of the 19th century, they have frequently offered authoritative intellectual frameworks that have justified, and even shaped, a variety of U.S. foreign policy efforts. They played an important role in U.S. imperial expansion in the late 19th and early 20th centuries. Scholars devised racialized theories of social evolution that legitimated the confinement and assimilation of Native Americans and endorsed civilizing schemes in the Philippines, Cuba, and elsewhere. As attention shifted to Europe during and after World War I, social scientists working at the behest of Woodrow Wilson attempted to engineer a “scientific peace” at Versailles. The desire to render global politics the domain of objective, neutral experts intensified during World War II and the Cold War. After 1945, the social sciences became increasingly central players in foreign affairs, offering intellectual frameworks—like modernization theory—and bureaucratic tools—like systems analysis—that shaped U.S. interventions in developing nations, guided nuclear strategy, and justified the increasing use of the U.S. military around the world. Throughout these eras, social scientists often reinforced American exceptionalism—the notion that the United States stands at the pinnacle of social and political development, and as such has a duty to spread liberty and democracy around the globe. The scholarly embrace of conventional political values was not the result of state coercion or financial co-optation; by and large social scientists and policymakers shared common American values. But other social scientists used their knowledge and intellectual authority to critique American foreign policy. The history of the relationship between social science and foreign relations offers important insights into the changing politics and ethics of expertise in American public policy.