
Article

Over the course of the 19th century, American cities developed from small seaports and trading posts to large metropolises. Not surprisingly, foodways and other areas of daily life changed accordingly. In 1800, the dietary habits of urban Americans were similar to those of the colonial period. Food provisioning was very local. Farmers, hunters, fishermen, and dairymen from a few miles away brought food by rowboat, ferryboat, and horse cart to centralized public markets within established cities. Dietary options were seasonal as well as regional. Few public dining options existed outside of taverns, which offered lodging as well as food. Most Americans, even in urban areas, ate their meals at home, which in many cases were attached to their workshops, countinghouses, and offices. These patterns changed significantly over the course of the 19th century, thanks largely to demographic changes and technological developments. By the turn of the 20th century, urban Americans relied on a food-supply system that was highly centralized and in the throes of industrialization. Cities developed complex restaurant sectors, and majority immigrant populations dramatically shaped and reshaped cosmopolitan food cultures. Furthermore, with growing populations, lax regulation, and corrupt political practices in many cities, issues arose periodically concerning the safety of the food supply. In sum, the roots of today's urban food systems were laid down over the course of the 19th century.

Article

Changing foodways, the consumption and production of food, access to food, and debates over food shaped the nature of American cities in the 20th century. As American cities transformed from centers of industrialization at the start of the century to post-industrial societies at its end, food cultures in urban America shifted in response to the ever-changing urban environment. Cities remained centers of food culture, diversity, and food reform despite these shifts. Growing populations and waves of immigration changed the nature of food cultures throughout the United States in the 20th century. These changes were significant, all contributing to an evolving sense of American food culture. For urban denizens, however, food choice and availability were dictated and shaped by a variety of powerful social factors, including class, race, ethnicity, gender, and laboring status. While cities possessed an abundance of food and a variety of places in which to consume it, fresh food often remained difficult for the urban poor to obtain as the 20th century ended. As markets expanded from 1900 to 1950, regional geography became a less important factor in determining what types of foods were available. In the second half of the 20th century, even global geography became less important to food choices. Citrus fruit from the West Coast was readily available in northeastern markets near the start of the century, and off-season fruits and vegetables from South America filled shelves in grocery stores by the end of the 20th century. Urban Americans became further disconnected from their food sources, but this dislocation spurred counter-movements that embraced ideas of local, seasonal foods and a rethinking of the city's relationship with its food sources.

Article

Jeffrey F. Taffet

In the first half of the 20th century, and more actively in the post–World War II period, the United States government used economic aid programs to advance its foreign policy interests. US policymakers generally believed that support for economic development in poorer countries would help create global stability, which would limit military threats and strengthen the global capitalist system. Aid was offered on a country-by-country basis to guide political development; its implementation reflected views about how humanity had advanced in richer countries and how it could and should similarly advance in poorer regions. Humanitarianism did play a role in driving US aid spending, but it was consistently secondary to political considerations. Overall, while funding varied over time, amounts spent were always substantial. Between 1946 and 2015, the United States offered almost $757 billion in economic assistance to countries around the world ($1.6 trillion in inflation-adjusted 2015 dollars). Assessing the impact of this spending is difficult; there has long been disagreement among scholars and politicians about how much economic growth, if any, resulted from aid spending, as well as similar disputes about its utility in advancing US interests. Nevertheless, for most political leaders, even without solid evidence of successes, aid often seemed to be the best option for constructively engaging poorer countries and trying to create the kind of world in which the United States could be secure and prosperous.

Article

Daniel Sargent

Foreign economic policy involves the mediation and management of economic flows across borders. Over two and a half centuries, the context for U.S. foreign economic policy has transformed. Once a fledgling republic on the periphery of the world economy, the United States has become the world’s largest economy, the arbiter of international economic order, and a predominant influence on the global economy. Throughout this transformation, the making of foreign economic policy has entailed delicate tradeoffs between diverse interests—political and material, foreign and domestic, sectional and sectoral, and so on. Ideas and beliefs have also shaped U.S. foreign economic policy—from Enlightenment-era convictions about the pacifying effects of international commerce to late 20th-century convictions about the efficacy of free markets.

Article

Economic nationalism tended to dominate U.S. foreign trade policy throughout the long 19th century, from the end of the American Revolution to the beginning of World War I, owing to a pervasive American sense of economic and geopolitical insecurity and American fear of hostile powers, especially the British but also the French and Spanish and even the Barbary States. Following the U.S. Civil War, leading U.S. protectionist politicians sought to curtail European trade policies and to create a U.S.-dominated customs union in the Western Hemisphere. American proponents of trade liberalization increasingly found themselves outnumbered in the halls of Congress, as the "American System" of economic nationalism grew in popularity alongside the perceived need for foreign markets. Protectionist advocates in the United States viewed the American System as a panacea that promised not only to provide the federal government with revenue but also to artificially insulate American infant industries from undue foreign-market competition through high protective tariffs and subsidies, and to retaliate against real and perceived threats to U.S. trade. Throughout this period, the United States underwent a great struggle over foreign trade policy. By the late 19th century, the era's boom-and-bust global economic system led to a growing perception that the United States needed more access to foreign markets as an outlet for the country's surplus goods and capital. But whether the United States would obtain foreign market access through free trade or through protectionism became the subject of a sharp debate over the proper course of U.S. foreign trade policy. By the time the United States acquired a colonial empire from the Spanish in 1898, this debate had effectively merged into debates over the course of U.S. imperial expansion. The country's more expansionist-minded economic nationalists came out on top. The overwhelming 1896 victory of William McKinley, the Republican Party's "Napoleon of Protection," marked the beginning of substantial expansion of U.S. foreign trade through a mixture of protectionism and imperialism in the years leading up to World War I.

Article

Humans have utilized American forests for a wide variety of uses from the pre-Columbian period to the present. Native Americans heavily shaped forests to serve their needs, helping to create fire ecologies in many forests. English settlers harvested these forests for trade, to clear land, and for domestic purposes. The arrival of the Industrial Revolution in the early 19th century rapidly expanded the rate of logging. By the Civil War, many areas of the Northeast were logged out. Post–Civil War forests in the Great Lakes states, the South, and then the Pacific Northwest fell with increasing speed to feed the insatiable demands of the American economy, facilitated by rapid technological innovation that allowed for growing cuts. By the late 19th century, growing concerns about the future of American timber supplies spurred the conservation movement, personified by forester Gifford Pinchot and the creation of the U.S. Forest Service with Pinchot as its head in 1905. After World War II, the Forest Service worked closely with the timber industry to cut wide swaths of the nation’s last virgin forests. These gargantuan harvests led to the growth of the environmental movement. Beginning in the 1970s, environmentalists began to use legal means to halt logging in the ancient forests, and the listing of the northern spotted owl under the Endangered Species Act was the final blow to most logging on Forest Service lands in the Northwest. Yet not only does the timber industry remain a major employer in forested parts of the nation today, but alternative forest economies have also developed around more sustainable industries such as tourism.

Article

Kathryn C. Statler

U.S.-French relations are long-standing, complex, and primarily cooperative in nature. Various crises have punctuated long periods of stability in the alliance, but after each conflict the Franco-American friendship emerged stronger than ever. Official U.S.-French relations began during the early stages of the American Revolution, when Louis XVI's regime came to America's aid by providing money, arms, and military advisers. French assistance, best symbolized by the Marquis de Lafayette, was essential to the revolution's success. The subsequent French Revolution and Napoleon Bonaparte's rise to power also benefitted the United States when Napoleon's woes in Europe and the Caribbean forced him to sell the entire Louisiana territory to the United States in 1803. Franco-American economic and cultural contacts increased throughout the 19th century, as trade between the two countries prospered and as Americans flocked to France to study art, architecture, music, and medicine. The French gift of the Statue of Liberty in the late 19th century solidified Franco-American bonds, which became even more secure during World War I. Indeed, during the war, the United States provided France with trade, loans, military assistance, and millions of soldiers, viewing such aid as repayment for French help during the American Revolution. World War II once again saw the United States fighting in France to liberate the country from Nazi control. The Cold War complicated the Franco-American relationship in new ways as American power waxed and French power waned. Washington and Paris clashed over military conflict in Vietnam, the Suez Crisis, and European security (the North Atlantic Treaty Organization, or NATO, in particular) during the 1950s and 1960s. Ultimately, after French President Charles de Gaulle's retirement, the Franco-American alliance stabilized by the mid-1970s and has flourished ever since, despite brief moments of crisis, such as the 2003 Second Gulf War in Iraq.

Article

Franklin D. Roosevelt was US president in extraordinarily challenging times. The impact of both the Great Depression and World War II makes historians' discussion of his approach to foreign relations highly contested and controversial. He was one of the most experienced people to hold the office, having served in the Wilson administration as Assistant Secretary of the Navy, completed two terms as Governor of New York, and held a raft of political offices. At heart, he was an internationalist who believed in an engaged and active role for the United States in the world. During his first two terms as president, Roosevelt had to temper his international engagement in response to public opinion and to politicians who wanted to focus on domestic problems and were wary of the risks of involvement in conflict. As the world crisis deepened in the 1930s, his engagement revived. He adopted a gradualist approach to educating the American people in the dangers facing their country and led them to eventual participation in war and a greater role in world affairs. There were clearly mistakes in his diplomacy along the way, and his leadership often appeared flawed, with an ambiguous legacy founded on political expediency, expanded executive power, vague idealism, and a chronic lack of clarity in preparing Americans for postwar challenges. Nevertheless, his policies to prepare the United States for the coming war saw his country emerge from years of depression to become an economic superpower. Likewise, his mobilization of his country's enormous resources, support of key allies, and the holding together of a "Grand Alliance" in World War II not only brought victory but saw the United States become a dominant force in the world. Ultimately, Roosevelt's idealistic vision, tempered with a sound appreciation of national power, would transform the global position of the United States and inaugurate what Henry Luce described as "the American Century."

Article

Sam Lebovic

According to the First Amendment of the US Constitution, Congress is barred from abridging the freedom of the press (“Congress shall make no law . . . abridging the freedom of speech, or of the press”). In practice, the history of press freedom is far more complicated than this simple constitutional right suggests. Over time, the meaning of the First Amendment has changed greatly. The Supreme Court largely ignored the First Amendment until the 20th century, leaving the scope of press freedom to state courts and legislatures. Since World War I, jurisprudence has greatly expanded the types of publication protected from government interference. The press now has broad rights to publish criticism of public officials, salacious material, private information, national security secrets, and much else. To understand the shifting history of press freedom, however, it is important to understand not only the expansion of formal constitutional rights but also how those rights have been shaped by such factors as economic transformations in the newspaper industry, the evolution of professional standards in the press, and the broader political and cultural relations between politicians and the press.

Article

Bacon’s Rebellion (1676–1677) was an uprising in the Virginia colony that its participants experienced as both a civil breakdown and a period of intense cosmic disorder. Although Thomas Hobbes had introduced his theory of state sovereignty a quarter century earlier, the secularizing connotations of his highly naturalized conceptualization of power had yet to make major inroads on a post-Reformation culture that was only gradually shifting from Renaissance providentialism to Enlightenment rationalism. Instead, the period witnessed a complicated interplay of providential beliefs and Hobbist doctrines. In the aftermath of the English civil war (1642–1651), this mingling of ideologies had prompted the Puritans’ own experimentation with Hobbes’s ideas, often in tandem with a Platonic spiritualism that was quite at odds with Hobbes’s own philosophical skepticism. The Restoration of 1660 had given an additional boost to Hobbism as his ideas won a number of prominent adherents in Charles II’s government. The intermingling of providentialism and Hobbism gave Bacon’s Rebellion its particular aura of heightened drama and frightening uncertainty. In the months before the uprising, the outbreak of a war on the colony’s frontier with the Doeg and Susquehannock peoples elicited fears in the frontier counties of a momentous showdown between faithful planters and God’s enemies. In contrast, Governor Sir William Berkeley’s establishmentarian Protestantism encouraged him to see the frontiersmen’s vigilantism as impious, and the government’s more measured response to the conflict as inherently godlier because tied to time-tested hierarchies and institutions. Greatly complicating this already confusing scene, the colony also confronted a further destabilizing force in the form of the new Hobbist politics emerging from the other side of the ocean. In addition to a number of alarming policies emanating from Charles II’s court in the 1670s that sought to enhance the English state’s supremacy over the colonies, Hobbes’s doctrines also informed the young Nathaniel Bacon Jr.’s stated rationale for leading frontiersmen against local Indian communities without Berkeley’s authorization. Drawing on the Hobbes-influenced civil war-era writings of his relation the Presbyterian lawyer Nathaniel Bacon, the younger Bacon made the protection of the colony’s Christian brotherhood a moral priority that outweighed even the preservation of existing civil relations and public institutions. While Berkeley’s antagonism toward this Hobbesian argument led him to lash out forcibly against Bacon as a singularly great threat to Virginia’s commonwealth, it was ordinary Virginians who most consequentially resisted Bacon’s strange doctrines. Yet a division persisted. Whereas the interior counties firmly rejected Bacon’s Hobbism in favor of the colony’s more traditional bonds to God and king, the frontier counties remained more open to a Hobbesian politics that promised their protection.

Article

Carolyn Podruchny and Stacy Nation-Knapper

From the 15th century to the present, the trade in animal fur has been an economic venture with far-reaching consequences for both North Americans and Europeans (a category that includes North Americans of European descent). One of the earliest forms of exchange between Europeans and North Americans, the trade in fur was about the garment business, global and local politics, social and cultural interaction, hunting, ecology, colonialism, gendered labor, kinship networks, and religion. European fashion, specifically the desire for hats that marked male status, was a primary driver for the global fur-trade economy until the late 19th century, while European desires for marten, fox, and other luxury furs to make and trim clothing comprised a secondary part of the trade. Other animal hides, including deer and bison, provided sturdy leather from which belts for the machines of the early Industrial Era were cut. European cloth, especially cotton and wool, became central to the trade for Indigenous peoples who sought materials that were lighter and dried faster than skin clothing. The multiple perspectives on the fur trade included those of the European men and Indigenous men and women actually conducting the trade; the Indigenous male and female trappers; European trappers; the European men and women producing trade goods; Indigenous "middlemen" (men and women) who conducted their own fur trade to benefit from European trade companies; laborers hauling the furs and trade goods; all those who built, managed, and sustained trading posts located along waterways and trails across North America; and those Europeans who manufactured and purchased the products made of fur and the trade goods desired by Indigenous peoples. As early as the 17th century, European empires used fur-trade monopolies to establish colonies in North America, and later fur-trading companies brought imperial trading systems inland, while Indigenous peoples drew Europeans into their own patterns of trade and power. By the 19th century, the fur trade had covered most of the continent, and its networks of business, alliances, and families, along with the founding of new communities, led to new peoples, including the Métis, who were descended from the mixing of European and Indigenous peoples. Trading territories, monopolies, and alliances with Indigenous peoples shaped how European concepts of statehood played out in the making of European-descended nation-states and in the development of treaties with Indigenous peoples. The fur trade flourished in northern climes until well into the 20th century, after which time economic development, resource exploitation, changes in fashion, and politics in North America and Europe limited its scope and scale. Many Indigenous people continue today to hunt and trap animals and have fought in courts for Indigenous rights to resources, land, and sovereignty.

Article

While American gambling has a historical association with the lawlessness of the frontier and with the wasteful leisure practices of Southern planters, it was in large cities that American gambling first flourished as a form of mass leisure and as a commercial enterprise of significant scale. In the urban areas of the Mid-Atlantic, the Northeast, and the upper Midwest, for the better part of two centuries the gambling economy was deeply intertwined with municipal politics and governance, the practices of betting were a prominent feature of social life, and controversies over the presence of gambling, both legal and illegal, were at the center of public debate. In New York and Chicago in particular, but also in Cleveland, Pittsburgh, Detroit, Baltimore, and Philadelphia, gambling channeled money to municipal police forces and sustained machine politics. In the eyes of reformers, gambling corrupted governance and corroded social and economic interactions. Big-city gambling has changed over time, often in a manner reflecting important historical processes and transformations in economics, politics, and demographics. Yet irrespective of such change, from the onset of Northern urbanization during the 19th century through much of the 20th century, gambling held steady as a central feature of city life and politics. From the poolrooms where recently arrived Irish New Yorkers bet on horseracing after the Civil War, to the corner stores where black and Puerto Rican New Yorkers bet on the numbers game in the 1960s, the gambling activity that covered the urban landscape produced argument and controversy, particularly with respect to drawing the line between crime and leisure and over the question of where and to what ends the money of the gambling public should be directed.

Article

Jerry Watkins

Regional variation, race, gender presentation, and class differences mean that there are many "Gay Souths." Same-sex desire has been a feature of the human experience since the beginning, but the meanings, expressions, and ability to organize one's life around desire have shifted profoundly since the invention of sexuality in the mid-19th century. World War II represented a key transition in gay history, as it gave many people a language for their desires. During the Cold War, government officials conflated sex, race, and gender transgression with subversion and punished accordingly through state committees. These forces profoundly shaped gay social life, and rather than following a straight line from closet to liberation, gays in the South have meandered. Movement rather than stasis, circulation rather than congregation, the local rather than the stranger, and creative uses of space and place mean that the gay South is distinct from the rest of the country, though not wholly unique.

Article

Throughout US history, Americans have used ideas about gender to understand power, international relations, military behavior, and the conduct of war. Since Joan Wallach Scott called on scholars in 1986 to consider gender a “useful category of analysis,” historians have looked beyond traditional diplomatic and military sources and approaches to examine cultural sources, the media, and other evidence to try to understand the ideas that Americans have relied on to make sense of US involvement in the world. From casting weak nations as female to assuming that all soldiers are heterosexual males, Americans have deployed mainstream assumptions about men’s and women’s proper behavior to justify US diplomatic and military interventions in the world. State Department pamphlets describing newly independent countries in the 1950s and 1960s featured gendered imagery like the picture of a young Vietnamese woman on a bicycle that was meant to symbolize South Vietnam, a young nation in need of American guidance. Language in news reports and government cables, as well as film representations of international affairs and war, expressed gendered dichotomies such as protector and protected, home front and battlefront, strong and weak leadership, and stable and rogue states. These and other episodes illustrate how thoroughly gender shaped important dimensions about the character and the making of US foreign policy and historians’ examinations of diplomatic and military history.

Article

Throughout American history, gender, meaning notions of essential differences between women and men, has shaped how Americans have defined and engaged in productive activity. Work has been a key site where gendered inequalities have been produced, but work has also been a crucible for rights claims that have challenged those inequalities. Federal and state governments long played a central role in generating and upholding gendered policy. Workers and advocates have debated whether to advance laboring women’s cause by demanding equality with men or different treatment that accounted for women’s distinct responsibilities and disadvantages. Beginning in the colonial period, constructions of dependence and independence derived from the heterosexual nuclear family underscored a gendered division of labor that assigned distinct tasks to the sexes, albeit varied by race and class. In the 19th century, gendered expectations shaped all workers’ experiences of the Industrial Revolution, slavery and its abolition, and the ideology of free labor. Early 20th-century reform movements sought to beat back the excesses of industrial capitalism by defining the sexes against each other, demanding protective labor laws for white women while framing work done by women of color and men as properly unregulated. Policymakers reinforced this framework in the 1930s as they built a welfare state that was rooted in gendered and racialized constructions of citizenship. In the second half of the 20th century, labor rights claims that reasoned from the sexes’ distinctiveness increasingly gave way to assertions of sex equality, even as the meaning of that equality was contested. As the sex equality paradigm triumphed in the late 20th and early 21st centuries, seismic economic shifts and a conservative business climate narrowed the potential of sex equality laws to deliver substantive changes to workers.

Article

The late 20th century saw gender roles transformed as the so-called Second Wave of American feminism that began in the 1960s gained support. By the early 1970s public opinion increasingly favored the movement and politicians in both major political parties supported it. In 1972 Congress overwhelmingly approved the Equal Rights Amendment (ERA) and sent it to the states. Many quickly ratified, prompting women committed to traditional gender roles to organize. However, by 1975 ERA opponents led by veteran Republican activist Phyllis Schlafly, founder of Stop ERA, had slowed the ratification process, although federal support for feminism continued. Congresswoman Bella Abzug (D-NY), inspired by the United Nations’ International Women’s Year (IWY) program, introduced a bill approved by Congress that mandated state and national IWY conferences at which women would produce recommendations to guide the federal government on policy regarding women. Federal funding of these conferences (held in 1977), and the fact that feminists were appointed to organize them, led to an escalation in tensions between feminist and conservative women, and the conferences proved to be profoundly polarizing events. Feminists elected most of the delegates to the culminating IWY event, the National Women’s Conference held in Houston, Texas, and the “National Plan of Action” adopted there endorsed a wide range of feminist goals including the ERA, abortion rights, and gay rights. But the IWY conferences presented conservatives with a golden opportunity to mobilize, and anti-ERA, pro-life, and anti-gay groups banded together as never before. By the end of 1977, these groups, supported by conservative Catholics, Mormons, and evangelical and fundamentalist Protestants, had come together to form a “Pro-Family Movement” that became a powerful force in American politics. By 1980 they had persuaded the Republican Party to drop its support for women’s rights. Afterward, as Democrats continued to support feminist goals and the GOP presented itself as the defender of “family values,” national politics became more deeply polarized and bitterly partisan.

Article

The issue of genocide and American Indian history has been contentious. Many writers see the massive depopulation of the indigenous population of the Americas after 1492 as a clear-cut case of genocide. Other writers, however, contend that European and U.S. actions toward Indians were deplorable but were rarely if ever genocidal. To a significant extent, disagreements about the pervasiveness of genocide in the history of the post-Columbian Western Hemisphere, in general, and U.S. history, in particular, pivot on definitions of genocide. Conservative definitions emphasize intentional actions and policies of governments that result in very large population losses, usually from direct killing. More liberal definitions call for less stringent criteria for intent, focusing more on outcomes. They do not necessarily require direct sanction by state authorities; rather, they identify societal forces and actors. They also allow for several intersecting forces of destruction, including dispossession and disease. Because debates about genocide easily devolve into quarrels about definitions, an open-ended approach to the question of genocide that explores several phases and events provides the possibility of moving beyond the present stalemate. However one resolves the question of genocide in American Indian history, it is important to recognize that European and U.S. settler colonial projects unleashed massively destructive forces on Native peoples and communities. These include violence resulting directly from settler expansion, intertribal violence (frequently aggravated by colonial intrusions), enslavement, disease, alcohol, loss of land and resources, forced removals, and assaults on tribal religion, culture, and language. The configuration and impact of these forces varied considerably in different times and places according to the goals of particular colonial projects and the capacities of colonial societies and institutions to pursue them. The capacity of Native people and communities to directly resist, blunt, or evade colonial invasions proved equally important.

Article

Gentrification is one of the most controversial issues in American cities today. But it also remains one of the least understood. Few agree on how to define it or on whether it is a boon or a curse for cities. Gentrification has changed over time and has a history dating back to the early 20th century. Historically, gentrification has had a smaller demographic impact on American cities than suburbanization or immigration. But since the late 1970s, gentrification has dramatically reshaped cities like Seattle, San Francisco, and Boston. Furthermore, districts such as the French Quarter in New Orleans, New York City's Greenwich Village, and Georgetown in Washington, DC, have had an outsized influence on the political, cultural, and architectural history of cities. Gentrification thus must be examined alongside suburbanization as one of the major historical trends shaping the 20th-century American metropolis.

Article

Not many bilateral relationships in modern world history have as many twists and turns as the one shared by the United States and Germany. Their relationship has waxed and waned like few others: from mild indifference to faint and uncomplicated appreciation in the 18th and 19th centuries, to growing awareness, rivalry, and then outright hostility in the early 20th century, to codependent enabling and then horrific existential conflict in the mid-20th century, and finally to occupation, reconstruction, and mutually supportive and global-stabilizing friendship after 1945. With early trajectories that informed and mirrored each other, the paths of the United States and Germany eventually collided in the First World War, a conflict born of issues that were only half-resolved in its immediate aftermath. The outbreak of the Second World War laid bare the incompleteness of the Versailles settlement, leading to the complete and utter destruction of German civil society and ushering in an era of American supremacy in the Western world. The United States’ ambitious effort to reconstruct German society along American lines defined the relationship during the postwar period, but a turn toward nationalism in both countries in the early 21st century resurrected old questions and fears.

Article

American cities have been transnational in nature since the first urban spaces emerged during the colonial period. Yet the specific shape of the relationship between American cities and the rest of the world has changed dramatically in the intervening years. In the mid-20th century, the increasing integration of the global economy with the American economy began to reshape US cities. In the Northeast and Midwest, the once robust manufacturing centers and factories that had sustained their residents (and their tax bases) left, first for the South and West, and then for cities and towns outside the United States, as capital grew more mobile and businesses sought lower wages and tax incentives elsewhere. That same global capital, combined with federal subsidies, created boomtowns in the once-rural South and West. Nationwide, city boosters began to pursue alternatives to heavy industry, once understood to be the undisputed guarantor of a healthy urban economy. Increasingly, US cities organized themselves around the service economy, both in high-end, white-collar sectors like finance, consulting, and education, and in low-end pink-collar and no-collar sectors like food service, hospitality, and health care. A new legal infrastructure related to immigration made US cities more racially, ethnically, and linguistically diverse than ever before. At the same time, some US cities were agents of economic globalization themselves. Dubbed "global cities" by celebrants and critics of the new economy alike, these cities achieved power and prestige in the late 20th century not only because they had survived the ruptures of globalization but because they helped to determine its shape. By the end of the 20th century, cities not routinely listed among the "global city" elite jockeyed to claim "world-class" status, investing in high-end art, entertainment, technology, education, and health care amenities to attract and retain the high-income white-collar workers understood to be the last hope for cities hollowed out by deindustrialization and global competition. Today, the extreme differences between "global cities" and the rest of US cities, and the extreme socioeconomic stratification seen in cities of all stripes, are a key concern of urbanists.