Conceptions of what constitutes a street gang or a youth gang have varied since the seminal sociological studies on these entities in the 1920s. Organizations of teenage youths and young adults in their twenties, congregating in public spaces and acting collectively, were fixtures of everyday life in American cities throughout the 20th century. While few studies historicize gangs in their own right, historians in a range of subfields cast gangs as key actors in critical dimensions of the American urban experience: the formation and defense of ethno-racial identities and communities; the creation and maintenance of segregated metropolitan spaces; the shaping of gender norms and forms of sociability in working-class districts; the structuring of contentious political mobilization challenging police practices and municipal policies; the evolution of underground and informal economies and organized crime activities; and the epidemic of gun violence that spread through minority communities in many major cities at the end of the 20th and beginning of the 21st centuries.
Although groups of white youths patrolling the streets of working-class neighborhoods and engaging in acts of defensive localism were commonplace in the urban Northeast, Mid-Atlantic, and Midwest states by the mid-19th century, street gangs exploded onto the urban landscape in the early 20th century as a consequence of massive demographic changes related to the wave of immigration from Europe, Asia, and Latin America and the migration of African Americans from the South. As immigrants and migrants moved into urban working-class neighborhoods and industrial workplaces, street gangs proliferated at the boundaries of ethno-racially defined communities, shaping the context within which immigrant and second-generation youths negotiated Americanization and learned the meanings of race and ethnicity. Although social workers in some cities noted the appearance of some female gangs by the 1930s, the milieu of youth gangs during this era was male dominated, and codes of honor and masculinity were often at stake in increasingly violent clashes over territory and resources like parks and beaches.
The interplay of race, ethnicity, and masculinity continued to shape the world of gangs in the 1940s and 1950s, when white male gangs claiming to defend the whiteness of their communities used terror tactics to reinforce the boundaries of ghettos and barrios in many cities. Such aggressions spurred the formation of fighting gangs in black and Latino neighborhoods, where youths entered into at times deadly combat against their aggressors but also fought for honor, respect, and status with rivals within their communities. In the 1960s and 1970s, with civil rights struggles and ideologies of racial empowerment circulating through minority neighborhoods, some of these same gangs, often with the support of community organizers affiliated with political organizations like the Black Panther Party, turned toward defending the rights of their communities and participating in contentious politics. However, such projects were cut short by the fierce repression of gangs in minority communities by local police forces, working at times in collaboration with the Federal Bureau of Investigation. By the mid-1970s, following the withdrawal of the Black Panthers and other mediating organizations from cities like Chicago and Los Angeles, so-called “super-gangs” claiming the allegiance of thousands of youths began federating into opposing camps—“People” against “Folks” in Chicago, “Crips” against “Bloods” in LA—to wage war for control of emerging drug markets. In the 1980s and 1990s, with minority communities dealing with high unemployment, cutbacks in social services, failing schools, hyperincarceration, drug trafficking, gun violence, and toxic relations with increasingly militarized police forces waging local “wars” against drugs and gangs, gangs proliferated in cities throughout the urban Sun Belt. Their prominence within popular and political discourse nationwide made them symbols of the urban crisis and of the cultural deficiencies that some believed had caused it.
From the 1890s to World War I, progressive reformers in the United States called upon their local, state, and federal governments to revitalize American democracy and address the most harmful social consequences of industrialization. The emergence of an increasingly powerful administrative state, which intervened on behalf of the public welfare in the economy and society, generated significant levels of conflict. Some of the opposition came from conservative business interests, who denounced state labor laws and other market regulations as meddlesome interferences with liberty of contract. But the historical record of the Progressive Era also reveals a broad undercurrent of resistance from ordinary Americans, who fought for personal liberty against the growth of police power in such areas as public health administration and the regulation of radical speech. Their struggles in the streets, statehouses, and courtrooms of the United States in the early 20th century shaped the legal culture of the period and revealed the contested meaning of individual liberty in a new social age.
Ann Durkin Keating
Since the beginning of the 19th century, outlying areas of American cities have been home to a variety of settlements and enterprises with close links to urban centers. Beginning in the early 19th century, the increasing scale of business and industrial enterprises separated workplaces from residences. This allowed some urban dwellers to live at a distance from their place of employment and commute to work. Others lived in the shadow of factories located at some distance from the city center. Still others provided food or raw materials for urban residents and businesses. The availability of employment led to further suburban growth. Changing intracity transportation, including railroads, interurbans, streetcars, and cable cars, enabled people and businesses to locate beyond the limits of a walking city.
By the late 19th century, metropolitan areas across the United States included outlying farm centers, industrial towns, residential rail (or streetcar) suburbs, and recreational/institutional centers. With suburbs generally located along rail or ferry lines into the early 20th century, the physical development of metropolitan areas often resembled a hub and spokes. However, across metropolitan regions, suburbs served a great range of functions and housed diverse populations. With the advent of automobile commutation and the growing use of trucks to haul freight, suburban development took place between railroad lines, filling in the earlier hub-and-spoke pattern and creating a more continuous built-up area.
Although suburban settlements were integrally connected to their neighbors and within a metropolitan economy and society, independent suburban governments emerged to serve these outlying settlements and keep them separate. Developers often took the lead in providing differential services (and regulations). Suburban governments emerged as hybrid forms, serving relatively homogeneous populations by providing only some urban functions. Well before 1945, suburbs were home to a wide range of work and residents.
Since the turn of the 20th century, teachers have tried to find a balance between bettering their own career prospects as workers and educating their students as public servants. To reach a workable combination, teachers have utilized methods drawn from union movements, the militant and labor-conscious approach favored by the American Federation of Teachers (AFT), as well as from professional organizations, the tradition from which the National Education Association (NEA) arose. Because teachers lacked the federally guaranteed labor rights that private-sector workers enjoyed after Congress passed the National Labor Relations Act in 1935, teachers’ fortunes—in terms of collective bargaining rights, control over classroom conditions, pay, and benefits—often remained tied to the broader public-sector labor movement and to state rather than federal law.
Opponents of teacher unionization consistently charged that as public servants paid by tax revenues, teachers and other public employees should not be allowed to form unions. Further, because women constituted the vast majority of teachers and union organizing often represented a “manly” domain, the opposition’s approach worked quite well, successfully preventing teachers from gaining widespread union recognition. But by the late 1960s and early 1970s, thanks to an improved economic climate and invigoration from the women’s movement, civil rights struggles, and the New Left, both AFT and NEA teacher unionism surged forward, infused with a powerful militancy devoted to strikes and other political action, and appeared poised to capture federal collective bargaining rights. Their newfound assertiveness proved ill-timed, however.
After the economic problems of the mid-1970s, opponents of teacher unions once again seized the opportunity to portray teacher unions and other public-sector unions as greedy and privileged interest groups functioning at the public’s expense. President Ronald Reagan accentuated this point when he fired all of the more than 10,000 striking air traffic controllers during the 1981 Professional Air Traffic Controllers Organization (PATCO) strike. Facing such opposition, teacher unions—and public-sector unions in general—shifted their efforts away from strikes and toward endorsing political candidates and lobbying governments to pass favorable legislation.
Given these constraints, public-sector unions enjoyed a large degree of success in the 1990s through the early 2000s, even as private-sector union membership plunged to less than 10 percent of the workforce. After the Great Recession of 2008, however, austerity politics targeted teachers and other public-sector workers and renewed political confrontations surrounding the legitimacy of teacher unions.
H. Paul Thompson Jr.
The temperance and prohibition movement—a social reform movement that pursued many approaches to limit or prohibit the use and/or sale of alcoholic beverages—is arguably the longest-running reform movement in US history, extending from the 1780s through the repeal of national prohibition in 1933. During this 150-year period the movement experienced many ideological, organizational, and methodological changes. Probably the most widely embraced antebellum reform, the movement was explicitly evangelical in many of its earliest assumptions and much of its earliest literature, but over time it assumed an increasingly secular image while retaining strong ties to organized religion. During the movement’s first fifty years, its definition of temperance evolved successively from avoiding drunkenness, to abstaining from all distilled beverages, to abstaining from all intoxicating beverages (i.e., “teetotalism”). During these years, reformers sought merely to persuade others of their views—what was called “moral suasion.” But by the 1840s many reformers began seeking the coercive power of local and state governments to prohibit the “liquor traffic.” These efforts were called “legal suasion,” and in the early 20th century, when local and state laws were deemed insufficient, movement leaders turned to the federal government. Throughout its history, movement leaders produced an extensive and well-preserved serial and monographic literature to chronicle their efforts, which makes the movement relatively easy to study.
No fewer than five national temperance organizations rose and fell across the movement’s history, aided by many other organizations that also promoted the message with great effect. Grassroots reformers organized innumerable state and local temperance societies and fraternal lodges committed to abstinence. Temperance reformers, hailing from nearly every conceivable demographic, networked through a series of national and international temperance conventions, and at any given time were pursuing a diverse and often conflicting array of priorities and methodologies.
Finally, during the Progressive Era, reformers focused their hatred of alcohol almost exclusively on saloons and the liquor traffic. Through groundbreaking lobbying efforts and a fortuitous convergence of social and political forces, reformers witnessed the ratification of the Eighteenth Amendment in January 1919, which established national prohibition. Despite such a long history of reform, the success seemed sudden and caught many in the movement off guard. The rise of liquor-related violence, a transformation in federal-state relations, increasingly organized and outspoken opposition, the Great Depression, and a realignment of political party coalitions all culminated in the sweeping repudiation of prohibition and its Republican supporters in the 1932 presidential election. On December 5, 1933, the Twenty-first Amendment to the Constitution repealed the Eighteenth Amendment, returning liquor regulation to the states, which have since maintained a wide variety of ever-changing laws controlling the sale of alcoholic beverages. But national prohibition permanently altered the federal government’s role in law enforcement, and its legacy remains.
Ross A. Kennedy
World War I profoundly affected the United States. It led to an expansion of America’s permanent military establishment, a foreign policy focused on reforming world politics, and American preeminence in international finance. In domestic affairs, America’s involvement in the war exacerbated class, racial, and ethnic conflict. It also heightened both the ethos of voluntarism in progressive ideology and the progressive desire to step up state intervention in the economy and society. These dual impulses had a coercive thrust that sometimes advanced progressive goals of a more equal, democratic society and sometimes repressed any perceived threat to a unified war effort. Ultimately the combination of progressive and repressive coercion undermined support for the Democratic Party, shifting the nation’s politics in a conservative direction as it entered the 1920s.
One of the pervasive myths about the United States is that it has never had a socialist movement comparable to other industrialized nations. Yet in the early 20th century a vibrant Socialist Party and socialist movement flourished in the United States. Created in 1901, the Socialist Party of America unsurprisingly declared its primary goal to be the collectivization of the means of production. Yet the party’s highly decentralized and democratic structure enabled it to adapt to the needs and cultures of diverse constituencies in different regions of the country. Among those attracted to the movement in its heyday were immigrant and native-born workers and their families, tenant farmers, middle-class intellectuals, socially conscious millionaires, urban reformers, and feminists. Party platforms regularly included the reform interests of these groups as well as the long-term goal of eradicating capitalism. By 1912, the Socialist Party boasted an impressive record of electoral successes at the local, state, and national levels. U.S. Socialists could also point with pride to over three hundred English and foreign-language Socialist periodicals, some with subscription rates that rivaled those of the major urban daily newspapers.
Yet Socialists faced numerous challenges in their efforts to build a viable third-party movement in the United States. On the one hand, progressive reformers in the Democratic and Republican parties sought to coopt Socialists. On the other hand, the Socialist Party encountered challenges on the left from anarchists, syndicalists, communists, and Farmer-Labor Party activists. The Socialist Party was particularly weakened by government repression during World War I, by the postwar Red Scare, and by a communist insurgency within its ranks in the aftermath of the war. By the onset of the Great Depression, the Communist Party would displace the Socialist Party as the leading voice of radical change in the United States.
Ted R. Bromund
The Special Relationship is a term used to describe the close relations between the United States and the United Kingdom. It applies particularly to the governmental realms of foreign, defense, security, and intelligence policy, but it also captures a broader sense that both public and private relations between the United States and Britain are particularly deep and close. The Special Relationship is thus a term for a reality that came into being over time as the result of political leadership as well as ideas and events outside the formal arena of politics.
After the political break of the American Revolution and in spite of sporadic cooperation in the 19th century, it was not until the Great Rapprochement of the 1890s that the idea that Britain and the United States had a special kind of relationship took hold. This decade, in turn, created the basis for the Special Relationship, a term first used by Winston Churchill in 1944. Churchill did the most to build the relationship, convinced as he was that close friendship between Britain and the United States was the cornerstone of world peace and prosperity. During and after the Second World War, many others on both sides of the Atlantic came to agree with Churchill.
The post-1945 era witnessed a flowering of the relationship, which was cemented—not without many controversies and crises—by the emerging Cold War against the Soviet Union. After the end of the Cold War in 1989, the relationship remained close, though it was severely tested by further security crises, Britain’s declining defense spending, the evolving implications of Britain’s membership in the European Union, the relative decline of Europe, and an increasing U.S. interest in Asia. Yet on many public and private levels, relations between the United States and Britain continue to be particularly deep, and thus the Special Relationship endures.
Although the League of Nations was the first permanent organization established with the purpose of maintaining international peace, it built on the work of a series of 19th-century intergovernmental institutions. The destructiveness of World War I led American and British statesmen to champion a league as a means of maintaining postwar global order. In the United States, Woodrow Wilson followed his predecessors, Theodore Roosevelt and William Howard Taft, in advocating American membership in an international peace league, although Wilson’s vision for reforming global affairs was more radical. In Britain, public opinion had begun to coalesce in favor of a league from the outset of the war, though David Lloyd George and many of his Cabinet colleagues were initially skeptical of its benefits. However, Lloyd George was determined to establish an alliance with the United States and warmed to the league idea when Jan Christian Smuts presented a blueprint for an organization that served that end.
The creation of the League was a predominantly British and American affair. Yet Wilson was unable to convince Americans to commit themselves to membership in the new organization. The Franco-British-dominated League enjoyed some early successes. Its high point was reached when Europe was infused with the “Spirit of Locarno” in the mid-1920s and the United States played an economically crucial, if politically constrained, role in advancing Continental peace. This tenuous basis for international order collapsed as a result of the economic chaos of the early 1930s, as the League proved incapable of containing the ambitions of revisionist powers in Europe and Asia. Despite its ultimate limitations as a peacekeeping body, recent scholarship has emphasized the League’s relative successes in stabilizing new states, safeguarding minorities, managing the evolution of colonies into notionally sovereign states, and policing transnational trafficking; in doing so, it paved the way for the creation of the United Nations.
For almost a century and a half, successive American governments adopted a general policy of neutrality on the world stage, eschewing involvement in European conflicts and, after the Quasi War with France, alliances with European powers. Neutrality, enshrined as a core principle of American foreign relations by the outgoing President George Washington in 1796, remained such for more than a century.
Finally, in the 20th century, the United States emerged as a world power and a belligerent in the two world wars and the Cold War. This article explores the modern conflict between traditional American attitudes toward neutrality and the global agenda embraced by successive U.S. governments, beginning with entry into the First World War. With the United States immersed in these titanic struggles, the traditional U.S. support for neutrality eroded considerably. During the First World War, the United States showed some sympathy for the predicaments of the remaining neutral powers. In the Second World War it applied considerable pressure to those states still trading with Germany. During the Cold War, the United States was sometimes impatient with the choices of states to remain uncommitted in the global struggle, while at times it showed understanding for neutrality and pursued constructive relations with neutral states. The wide varieties of neutrality in each of these conflicts complicated the choices of U.S. policy makers. Americans remained torn between memory of their own long history of neutrality and a capacity to understand its potential value, on one hand, and a predilection to approach conflicts as moral struggles, on the other.
Paul V. Murphy
Americans grappled with the implications of industrialization, technological progress, urbanization, and mass immigration with startling vigor and creativity in the 1920s even as large numbers kept their eyes as much on the past as on the future. American industrial engineers and managers were global leaders in mass production, and millions of citizens consumed factory-made products, including electric refrigerators and vacuum cleaners, technological marvels like radios and phonographs, and that most revolutionary of mass-produced durables, the automobile. They flocked to commercial amusements (movies, sporting events, amusement parks) and absorbed mass culture in their homes, through the radio and commercial recordings. In the major cities, skyscrapers drew Americans upward while thousands of new miles of roads scattered them across the country. Even while embracing the dynamism of modernity, Americans repudiated many of the progressive impulses of the preceding era. The transition from war to peace in 1919 and 1920 was tumultuous, marked by class conflict, a massive strike wave, economic crisis, and political repression. Exhausted by reform, war, and social experimentation, millions of Americans recoiled from central planning and federal power and sought determinedly to bypass traditional politics in the 1920s. This did not mean a retreat from active and engaged citizenship; Americans fought bitterly over racial equality, immigration, religion, morals, Prohibition, economic justice, and politics. In a greatly divided nation, citizens experimented with new forms of nationalism, cultural identity, and social order that could be alternately exclusive and pluralistic. Whether repressive or tolerant, such efforts held the promise of unity amid diversity; even those in the throes of reaction sought new ways of integration. The result was a nation at odds with itself, embracing modernity, sometimes heedlessly, while seeking desperately to retain a grip on the past.
Between 1880 and 1929, industrialization and urbanization expanded in the United States faster than ever before. Industrialization, meaning manufacturing in factory settings with machines and a labor force divided into specialized tasks to increase production, stimulated urbanization, meaning the growth of cities in both population and physical size. During this period, urbanization spread out into the countryside and up into the sky, thanks to new construction methods that made taller buildings possible. Having people concentrated into small areas accelerated economic activity, thereby producing more industrial growth. Industrialization and urbanization thus reinforced one another, augmenting the speed with which such growth would have otherwise occurred.
Industrialization and urbanization affected Americans everywhere, but especially in the Northeast and Midwest. Technological developments in construction, transportation, and illumination, all connected to industrialization, changed cities forever, most immediately those north of Washington, DC and east of Kansas City. Cities themselves fostered new kinds of industrial activity on large and small scales. Cities were also the places where businessmen raised the capital needed to industrialize the rest of the United States. Later changes in production and transportation made urbanization less acute by making it possible for people to buy cars and live further away from downtown areas in new suburban areas after World War II ended.
James J. Connolly
The convergence of mass politics and the growth of cities in 19th-century America produced sharp debates over the character of politics in urban settings. The development of what came to be called machine politics, primarily in the industrial cities of the East and Midwest, generated sharp criticism of its reliance on the distribution of patronage and favor trading, its emphatic partisanship, and the plebian character of the “bosses” who practiced it. Initially, upper- and middle-class businessmen spearheaded opposition to this kind of politics, but during the late 19th and early 20th centuries, labor activists, women reformers, and even some ethnic spokespersons confronted “boss rule” as well. These challenges did not succeed in bringing an end to machine politics where it was well established, but the reforms they generated during the Progressive Era reshaped local government in most cities. In the West and Southwest, where cities were younger and partisan organizations less entrenched, business leaders implemented Progressive municipal reforms to consolidate their power. Whether dominated by a reform regime or a party machine, urban politics and governance became more centralized by 1940 and less responsive to the concerns and demands of workers and immigrants.
Relations between the United States and Argentina can be best described as a cautious embrace punctuated by moments of intense frustration. Although never the center of U.S.–Latin American relations, Argentina has attempted to create a position of influence in the region. As a result, the United States has worked with Argentina and other nations of the Southern Cone—the region of South America that comprises Uruguay, Paraguay, Argentina, Chile, and southern Brazil—on matters of trade and economic development as well as hemispheric security and leadership. While Argentina has attempted to assert its position as one of Latin America’s most developed nations and therefore a regional leader, the equal partnership sought from the United States never materialized for the Southern Cone nation. Instead, competition for markets and U.S. interventionist and unilateral tendencies kept Argentina from attaining the influence and wealth it so desired. At the same time, the United States saw Argentina as an unreliable ally too sensitive to the pull of its volatile domestic politics. The two nations enjoyed moments of cooperation in World War I, the Cold War, and the 1990s, when Argentine leaders could balance this particular external partnership with internal demands. Yet at these times Argentine leaders found themselves walking a fine line as detractors back home saw cooperation with the United States as a violation of their nation’s sovereignty and autonomy. There has always been potential for a productive partnership, but each side’s intransigence and unique concerns limited this relationship’s accomplishments and led to a historical imbalance of power.
James F. Siekmeier
Throughout the 19th and 20th centuries, U.S. officials often viewed Bolivia as both a potential “test case” for U.S. economic foreign policy and a place where Washington’s broad visions for Latin America might be implemented relatively easily. After World War II, Washington leaders sought to show both Latin America and the nonindustrialized world that a relatively open economy could produce significant economic wealth for Bolivia’s working and middle classes, thus giving the United States a significant victory in the Cold War. Washington sought a Bolivia widely open to U.S. influence, and Bolivia often seemed an especially pliable country. In order to achieve their goals in Bolivia, U.S. leaders dispensed a large amount of economic assistance to the country in the 1950s—a remarkable development in two senses. First, the U.S. government, generally loath to aid Third World nations, gave this assistance to a revolutionary regime. Second, the U.S. aid program for Bolivia proved to be a precursor to the Alliance for Progress, the massive aid program for Latin America in the 1960s that constituted the largest U.S. economic aid program in the Third World. Although U.S. leaders achieved their goal of a relatively stable, noncommunist Bolivia, the decision in the late 1950s to significantly increase U.S. military assistance to Bolivia’s relatively small military emboldened that military, which staged a coup in 1964, snuffing out democracy for nearly two decades. The country’s long history of dependency in both export markets and public- and private-sector capital investment led Washington leaders to think that dependency would translate into leverage over Bolivian policy. However, the historical record is mixed in this regard. Some Bolivian governments have accommodated U.S. demands; others have successfully resisted them.
Patrick William Kelly
The relationship between Chile and the United States pivoted on the intertwined questions of how much political and economic influence Americans would exert over Chile and the degree to which Chileans could chart their own path. Given Chile’s tradition of constitutional government and relative economic development, it established itself as a regional power player in Latin America. Unencumbered by the direct US military interventions that marked the history of the Caribbean, Central America, and Mexico, Chile was a leader in movements to promote Pan-Americanism, inter-American solidarity, and anti-imperialism. But the advent of the Cold War in the 1940s, and especially the 1959 Cuban Revolution, brought an increase in bilateral tensions. The United States turned Chile into a “model democracy” for the Alliance for Progress, but frustration over its failures to enact meaningful social and economic reform polarized Chilean society, resulting in the election of Marxist Salvador Allende in 1970. The most contentious period in US-Chilean relations came during the Nixon administration, which worked, alongside anti-Allende Chileans, to destabilize Allende’s government, which the Chilean military overthrew on September 11, 1973. The Pinochet dictatorship (1973–1990), while anti-Communist, clashed with the United States over Pinochet’s radicalization of the Cold War and the issue of Chilean human rights abuses. The Reagan administration, which came to power on a platform rejecting the Carter administration’s critique of Chile, nevertheless reversed course and began to support the return of democracy to Chile, which took place in 1990. Since then, Pinochet’s legacy of neoliberal restructuring of the Chilean economy looms large, overshadowed perhaps only by his unexpected role in fomenting a global culture of human rights that has ended the era of impunity for Latin American dictators.
Jason C. Parker
The decolonization of the European overseas empires had its intellectual roots early in the modern era, but its culmination occurred during the Cold War that loomed large in post-1945 international history. This culmination thus coincided with the American rise to superpower status and presented the United States with a dilemma. While the United States was philosophically sympathetic to the aspirations of anticolonial nationalist movements abroad, its vastly greater postwar global security burdens made it averse to the instability that decolonization might bring and that communists might exploit. This fear, and the need to share those burdens with European allies who were themselves still colonial landlords, led Washington to proceed cautiously. The three “waves” of the decolonization process—medium-sized in the late 1940s, large in the half-decade around 1960, and small in the mid-1970s—prompted the United States to use a variety of tools and techniques to influence how each unfolded.
Prior to independence, this influence was usually channeled through the metropolitan authority then winding down. After independence, Washington continued and often expanded the use of these tools, in most cases on a bilateral basis. In some theaters, such as Korea, Vietnam, and the Congo, certain of these tools, notably covert espionage or overt military operations, allowed Cold War dynamics to envelop, intensify, and subsume local decolonization struggles. In most theaters, other tools, such as traditional or public diplomacy and economic or technical development aid, kept the Cold War in the background as a local transition unfolded. In all cases, the overriding American imperative was to minimize instability and neutralize actors on the ground who might invite communist gains.
U.S. imperialism took a variety of forms in the early 20th century, ranging from colonies in Puerto Rico and the Philippines to protectorates in Cuba, Panama, and other countries in Latin America, and open door policies such as the one pursued in China. Formal colonies were ruled by U.S.-appointed colonial governors and supported by U.S. troops. Protectorates and open door policies promoted business expansion overseas through American oversight of foreign governments and, when economic and strategic interests were threatened, the deployment of U.S. Marines. In all of these imperial forms, U.S. empire-building both reflected and shaped complex social, cultural, and political histories, with ramifications for both foreign nations and the United States itself.
Melissa A. McEuen
The Second World War changed the United States for women, and women in turn transformed their nation. Over three hundred fifty thousand women volunteered for military service, while twenty times as many stepped into civilian jobs, including positions previously closed to them. More than seven million women who had not been wage earners before the war joined eleven million women already in the American work force. Between 1941 and 1945, an untold number moved away from their hometowns to take advantage of wartime opportunities, but many more remained in place, organizing home front initiatives to conserve resources, to build morale, to raise funds, and to fill jobs left by men who entered military service.
The U.S. government, together with the nation’s private sector, instructed women on many fronts and carefully scrutinized their responses to the wartime emergency. The foremost message to women—that their activities and sacrifices would be needed only “for the duration” of the war—was both a promise and an order, suggesting that the war and the opportunities it created would end simultaneously. Social mores were tested by the demands of war, allowing women to benefit from the shifts and make alterations of their own. Yet dominant gender norms provided ways to maintain social order amidst fast-paced change, and when some women challenged these norms, they faced harsh criticism. Race, class, sexuality, age, religion, education, and region of birth, among other factors, combined to limit opportunities for some women while expanding them for others.
However temporary and unprecedented the wartime crisis, American women would find that their individual and collective experiences from 1941 to 1945 prevented them from stepping back into a prewar social and economic structure. By stretching and reshaping gender norms and roles, World War II and the women who lived it laid solid foundations for the various civil rights movements that would sweep the United States and grip the American imagination in the second half of the 20th century.
After World War II, Okinawa was placed under U.S. military rule and administratively separated from mainland Japan. This occupation lasted from 1945 to 1972, and in these decades Okinawa became the “Keystone of the Pacific,” a leading strategic site in U.S. military expansionism in Asia and the Pacific. U.S. rule during this Cold War period was characterized by violence and coercion, including a staggering scale of sexual violence against Okinawan women by U.S. military personnel. At the same time, the occupation also facilitated numerous cultural encounters between the occupiers and the occupied, leading to a flourishing cross-cultural grassroots exchange. A movement to establish American-style domestic science (i.e., home economics) in the occupied territory became a particularly important feature of this exchange, one that mobilized an assortment of women—home economists, military wives, club women, university students, homemakers—from the United States, Okinawa, and mainland Japan. The postwar domestic science movement turned Okinawa into a vibrant theater of Cold War cultural performance where women of diverse backgrounds collaborated to promote modern homemaking and build friendship across racial and national divides. As these women took their commitment to domesticity and multiculturalism into the larger terrain of the Pacific, they articulated the complex intertwining of women, domesticity, the military, and empire.