Ronald Reagan’s foreign policy legacy remains hotly contested, and as new archival sources come to light, those debates are more likely to intensify than to recede into the background. In dealings with the Soviet Union, the Reagan administration set the superpowers on a course for the (largely) peaceful end of the Cold War. Reagan began his outreach to Soviet leaders almost immediately after taking office and enjoyed some success, even if the dominant theme of the period remains fears of Reagan as a “button-pusher” in the public’s perception. Mikhail Gorbachev’s election to the post of General Secretary proved the turning point. Reagan, now confident in US strength, and Gorbachev, keen to reduce the financial burden of the arms race, ushered in a new, cooperative phase of the Cold War. Elsewhere, in particular Latin America, the administration’s focus on fighting communism led it to support human rights–abusing regimes at the same time as it lambasted Moscow’s transgressions in that regard. But even so, over the course of the 1980s, the United States began pushing for democratization around the world, even where Reagan and his advisors had initially resisted it, fearing a communist takeover. In part, this was a result of public pressure, but the White House recognized and came to support the rising tide of democratization. When Reagan left office, a great many countries that had been authoritarian were no longer, often at least in part because of US policy. US–Soviet relations had improved to such an extent that Reagan’s vice president and successor, George H. W. Bush, worried that the administration had gone too far in working with Gorbachev and been hoodwinked.
Economic nationalism tended to dominate U.S. foreign trade policy throughout the long 19th century, from the end of the American Revolution to the beginning of World War I, owing to a pervasive American sense of economic and geopolitical insecurity and American fear of hostile powers, especially the British but also the French and Spanish and even the Barbary States. Following the U.S. Civil War, leading U.S. protectionist politicians sought to curtail European trade policies and to create a U.S.-dominated customs union in the Western Hemisphere. American proponents of trade liberalization increasingly found themselves outnumbered in the halls of Congress, as the “American System” of economic nationalism grew in popularity alongside the perceived need for foreign markets. Protectionist advocates in the United States viewed the American System as a panacea that not only promised to provide the federal government with revenue but also to artificially insulate American infant industries from undue foreign-market competition through high protective tariffs and subsidies, and to retaliate against real and perceived threats to U.S. trade.
Throughout this period, the United States itself underwent a great struggle over foreign trade policy. By the late 19th century, the era’s boom-and-bust global economic system led to a growing perception that the United States needed more access to foreign markets as an outlet for the country’s surplus goods and capital. But whether the United States would obtain foreign market access through free trade or through protectionism led to a great debate over the proper course of U.S. foreign trade policy. By the time that the United States acquired a colonial empire from the Spanish in 1898, this same debate over U.S. foreign trade policy had effectively merged into debates over the course of U.S. imperial expansion. The country’s more expansionist-minded economic nationalists came out on top. The overwhelming 1896 victory of William McKinley—the Republican party’s “Napoleon of Protection”—marked the beginning of substantial expansion of U.S. foreign trade through a mixture of protectionism and imperialism in the years leading up to World War I.
Kathryn C. Statler
U.S.-French relations are long-standing, complex, and primarily cooperative in nature. Various crises have punctuated long periods of stability in the alliance, but after each conflict the Franco-American friendship emerged stronger than ever. Official U.S.-French relations began during the early stages of the American Revolution, when Louis XVI’s regime came to America’s aid by providing money, arms, and military advisers. French assistance, best symbolized by the Marquis de Lafayette, was essential in the revolution’s success. The subsequent French Revolution and Napoleon Bonaparte’s rise to power also benefitted the United States when Napoleon’s woes in Europe and the Caribbean forced him to sell the entire Louisiana territory to the United States, in 1803. Franco-American economic and cultural contacts increased throughout the 19th century, as trade between the two countries prospered and as Americans flocked to France to study art, architecture, music, and medicine. The French gift of the Statue of Liberty in the late 19th century solidified Franco-American bonds, which became even more secure during World War I. Indeed, during the war, the United States provided France with trade, loans, military assistance, and millions of soldiers, viewing such aid as repayment for French help during the American Revolution. World War II once again saw the United States fighting in France to liberate the country from Nazi control. The Cold War complicated the Franco-American relationship in new ways as American power waxed and French power waned. Washington and Paris clashed over military conflict in Vietnam, the Suez Crisis, and European security (the North Atlantic Treaty Organization or NATO, in particular) during the 1950s and 1960s. Ultimately, after French President Charles de Gaulle’s retirement, the Franco-American alliance stabilized by the mid-1970s and has flourished ever since, despite brief moments of crisis, such as the 2003 Second Gulf War in Iraq.
U.S. imperialism took a variety of forms in the early 20th century, ranging from colonies in Puerto Rico and the Philippines to protectorates in Cuba, Panama, and other countries in Latin America, and open door policies such as that in China. Formal colonies would be ruled with U.S.-appointed colonial governors and supported by U.S. troops. Protectorates and open door policies promoted business expansion overseas through American oversight of foreign governments and, in the case of threats to economic and strategic interests, the deployment of U.S. marines. In all of these imperial forms, U.S. empire-building both reflected and shaped complex social, cultural, and political histories with ramifications for both foreign nations and America itself.
David A. Nichols
From 1783 to 1830, American Indian policy reflected the new American nation-state’s desire to establish its own legitimacy and authority, by controlling Native American peoples and establishing orderly and prosperous white settlements in the continental interior. The Federalists focused on securing against Native American claims and attacks several protected enclaves of white settlement (Ohio, Kentucky, Tennessee), established—often violently—during the Revolutionary War. They used treaties to draw a legal boundary between these enclaves and Indian communities, and annuities and military force to keep Indians on their side of the line. The Jeffersonian Republicans adopted a more expansive plan of development, coupled with the promotion of Native American dependency. Treaty commissioners persuaded chiefs to cede road easements and riverfront acreage that the government used to link and develop dispersed white settlements. Meanwhile, the War Department built trading factories whose cheap merchandise would lure Indians into commercial dependency, and agents offered Indian families agricultural equipment and training, hoping that Native American farmers would no longer need “extensive forests” to support themselves. These pressures helped engender nativist movements in the Old Northwest and southeast, and Indian men from both regions fought the United States in the War of 1812, reinforcing frontier settlers’ view that Indians were a security threat. After this war’s end, the United States adopted a strategy of containment, pressuring Indian leaders to cede most of their peoples’ lands, confining Indians to enclaves, financing vocational schooling for Indian children, and encouraging Native peoples voluntarily to move west of the Mississippi. This policy, however, proved too respectful of Indian autonomy for the frontier settlers and politicians steadily gaining influence in the national government. 
After these settlers elected one of their own, Andrew Jackson, to the presidency, American Indian policy would enter a much more coercive and violent phase, as white Americans redefined the nation-state as a domain of white supremacy ethnically cleansed of indigenous peoples.
Oil played a central role in shaping US policy toward Iraq over the course of the 20th century. The United States first became involved in Iraq in the 1920s as part of an effort to secure a role for American companies in Iraq’s emerging oil industry. As a result of State Department efforts, American companies gained a 23.75 percent ownership share of the Iraq Petroleum Company in 1928. In the 1940s, US interest in the country increased as a result of the Cold War with the Soviet Union. To defend against a perceived Soviet threat to Middle East oil, the US supported British efforts to “secure” the region. After nationalist officers overthrew Iraq’s British-supported Hashemite monarchy in 1958 and established friendly relations with the Soviet Union, the United States cultivated an alliance with the Iraqi Baath Party as an alternative to the Soviet-backed regime. The effort to cultivate an alliance with the Baath foundered as a result of the Baath’s perceived support for Arab claims against Israel. The breakdown of US-Baath relations led the Baath to forge an alliance with the Soviet Union. With Soviet support, the Baath nationalized the Iraq Petroleum Company in 1972. Rather than resulting in a “supply cutoff,” Soviet economic and technical assistance allowed for a rapid expansion of the Iraqi oil industry and an increase in Iraqi oil flowing to world markets. As Iraq experienced a dramatic oil boom in the 1970s, the United States looked to the country as a lucrative market for US exports and adopted a policy of accommodation with regard to the Baath. This policy of accommodation gave rise to close strategic and military cooperation throughout the 1980s as Iraq waged war against Iran. When Iraq invaded Kuwait and seized control of its oil fields in 1990, the United States shifted to a policy of Iraqi containment.
The United States organized an international coalition that quickly ejected Iraqi forces from Kuwait, but chose not to pursue regime change for fear of destabilizing the country and wider region. Throughout the 1990s, the United States adhered to a policy of Iraqi containment but came under increasing pressure to overthrow the Baath and dismantle its control over the Iraqi oil industry. In 2003, the United States seized upon the 9/11 terrorist attacks as an opportunity to implement this policy of regime change and oil reprivatization.
Olivia L. Sohns
Moral, political, and strategic factors have contributed to the emergence and durability of the U.S.-Israel alliance. It took decades for American support for Israel to evolve from “a moral stance” to treating Israel as a “strategic asset” to adopting a policy of “strategic cooperation.” The United States supported Israel’s creation in 1948 not only because of the lobbying efforts of American Jews but also due to humanitarian considerations stemming from the Holocaust. Beginning in the 1950s, Israel sought to portray itself as an ally of the United States on grounds that America and Israel were fellow liberal democracies and shared a common Judeo-Christian cultural heritage. By the mid-1960s, Israel was considered a strategic proxy of American power in the Middle East in the Cold War, while the Soviet Union armed the radical Arab nationalist states and endorsed Palestinian “people’s wars of national liberation” against Israel. Over the subsequent decades, Israel repeatedly sought to demonstrate that it was allied with the United States in opposing instability in the region that might threaten U.S. interests. Israel also sought to portray itself as a liberal democracy despite its continued occupation of territories that it conquered in the Arab-Israeli War of 1967. After the terrorist attacks of September 11, 2001, and the rise of regional instability and radicalism in the Middle East following the 2003 U.S. invasion of Iraq and the Arab Spring of 2011, Israel’s expertise in the realms of counterterrorism and homeland security provided a further basis for U.S.-Israel military-strategic cooperation. Although American and Israeli interests are not identical, and there have been disagreements between the two countries regarding the best means to secure comprehensive Arab-Israeli and Israeli-Palestinian peace, the foundations of the relationship are strong enough to overcome crises that would imperil a less robust alliance.
Relations between the United States and Mexico have rarely been easy. Ever since the United States invaded its southern neighbor and seized half of its national territory in the 19th century, the two countries have struggled to establish a relationship based on mutual trust and respect. Over the two centuries since Mexico’s independence, the governments and citizens of both countries have played central roles in shaping each other’s political, economic, social, and cultural development. Although this process has involved—even required—a great deal of cooperation, relations between the United States and Mexico have more often been characterized by antagonism, exploitation, and unilateralism. This long history of tensions has contributed to the three greatest challenges that these countries face together today: economic development, immigration, and drug-related violence.
The United States–Mexico War was the first war in which the United States engaged in a conflict with a foreign nation for the purpose of conquest. It was also the first conflict in which trained soldiers (from West Point) played a large role. The war’s end transformed the United States into a continental nation as it acquired a vast portion of Mexico’s northern territories. In addition to shaping U.S.–Mexico relations into the present, the conflict also led to the forcible incorporation of Mexicans (who became Mexican Americans) as the nation’s first Latinos. Yet, the war has been identified as the nation’s “forgotten war” because few Americans know the causes and consequences of this conflict. Within fifteen years of the war’s end, the conflict faded from popular memory, but it did not disappear, due to the outbreak of the U.S. Civil War. By contrast, the U.S.–Mexico War is prominently remembered in Mexico as having caused the loss of half of the nation’s territory, and as an event that continues to shape Mexico’s relationship with the United States. Official memories (or national histories) of war affect international relations, and also shape how each nation’s population views citizens of other countries. Not surprisingly, there is a stark difference in the ways that American citizens and Mexican citizens remember and forget the war: Americans refer to the “Mexican American War” or the “U.S.–Mexican War,” for example, while Mexicans identify the conflict as the “War of North American Intervention.”
On April 4, 1949, twelve nations signed the North Atlantic Treaty: the United States, Canada, Iceland, the United Kingdom, Belgium, the Netherlands, Luxembourg, France, Portugal, Italy, Norway, and Denmark. For the United States, the North Atlantic Treaty signaled a major shift in foreign policy. Gone was the traditional aversion to “entangling alliances,” dating back to George Washington’s farewell address. The United States had entered into a collective security arrangement designed to preserve peace in Europe.
With the creation of the North Atlantic Treaty Organization (NATO), the United States took on a clear leadership role on the European continent. Allied defense depended on US military power, most notably the nuclear umbrella. Reliance on the United States unsurprisingly created problems. Doubts about the strength of the transatlantic partnership and rumors of a NATO in shambles were (and are) commonplace, as were anxieties about the West’s strength in comparison to NATO’s Eastern counterpart, the Warsaw Pact. NATO, it turned out, was more than a Cold War institution. After the fall of the Berlin Wall and the collapse of the Soviet Union, the Alliance remained vital to US foreign policy objectives. The only invocation of Article V, the North Atlantic Treaty’s collective defense clause, came in the wake of the September 11, 2001 terrorist attacks. Over the last seven decades, NATO has symbolized both US power and its challenges.
Little Saigon is the preferred name of Vietnamese refugee communities throughout the world. This article focuses primarily on the largest such community, in Orange County, California. This suburban ethnic enclave is home to the largest concentration of overseas Vietnamese, nearly 200,000, or 10 percent of the Vietnamese American population. Because of its size, location, and demographics, Little Saigon is also home to some of the most influential intellectuals, entertainers, businesspeople, and politicians in the Vietnamese diaspora, many of whom are invested in constructing Little Saigon as a transnational oppositional party to the government of Vietnam. Unlike traditional immigrant ethnic enclaves, Little Saigon is a refugee community whose formation and development emerged in large part from America’s efforts to atone for its epic defeat in Vietnam by at least sparing some of its wartime allies a life under communism. Much of Little Saigon’s cultural politics revolves around this narrative of rescue, although the number of guilt-ridden Americans grows smaller and more conservative, while the loyalists of the pre-1975 Saigon regime struggle to instill in the younger generation of Vietnamese an appreciation of their refugee roots.
In the early 20th century, West Virginia coal miners and mine operators fought a series of bloody battles that raged for two decades and prompted national debates over workers’ rights. Miners in the southern part of the state lived in towns wholly owned by coal companies and attempted to join the United Mine Workers of America (UMWA) to negotiate better working conditions but most importantly to restore their civil liberties. Mine operators saw unionization as a threat to their businesses and rights and hired armed guards to patrol towns and prevent workers from organizing. The operators’ allies in local and state government used their authority to help break strikes by sending troops to strike districts, declaring martial law, and jailing union organizers in the name of law and order. Observers around the country were shocked at the levels of violence as well as the conditions that fueled the battles. The Mine Wars include the Paint Creek–Cabin Creek Strike of 1912–1913, the so-called 1920 Matewan Massacre, the 1920 Three Days Battle, and the 1921 Battle of Blair Mountain. In this struggle over unionism, the coal operators prevailed, and West Virginia miners continued to work in nonunion mines and live in company towns through the 1920s.
An overview of Euro-American internal migration in the United States between 1940 and 1980 explores the overall population movement away from rural areas to cities and suburban areas. Although focused on white Americans and their migrations, there are similarities to the Great Migration of African Americans, who continued to move out of the South during the mid-20th century. In the early period, the industrial areas in the North and West attracted most of the migrants. Mobilization for World War II loosened rural dwellers who had long been kept in place by low wages, political disfranchisement, and low educational attainment. The war also attracted significant numbers of women to urban centers in the North and West. After the war, migration increased, enticing white Americans to become not just less rural but also increasingly suburban. The growth of suburbs throughout the country was prompted by racial segregation in housing that made many suburban areas white and earmarked many urban areas for people of color. The result was incredible growth in suburbia: from 22 million living in those areas in 1940 to triple that in 1970. Later in the period, as the Steelbelt rusted, the rise of the West as a migration magnet was spurred by development strategies, federal investment in infrastructure, and military bases. Sunbelt areas made investments and stood ready to recruit industries and, of course, people, especially from Rustbelt areas in the North. By the dawn of the 21st century, half of the American population resided in suburbs.
Rebecca J. Mead
Woman suffragists in the United States engaged in a sustained, difficult, and multigenerational struggle: seventy-two years elapsed between the Seneca Falls convention (1848) and the passage of the Nineteenth Amendment (1920). During these years, activists gained confidence, developed skills, mobilized resources, learned to maneuver through the political process, and built a social movement. This essay describes key turning points and addresses internal tensions as well as external obstacles in the U.S. woman suffrage movement. It identifies important strategic, tactical, and rhetorical approaches that supported women’s claims for the vote and influenced public opinion, and shows how the movement was deeply connected to contemporaneous social, economic, and political contexts.
Kelly A. Ryan
Patriarchy profoundly affected social relations and the daily lives of individuals in early America by supporting the elaboration of both racial differences and sexual hierarchies. Patriarchal ideals held that men should supervise women and that economic, sexual, legal, and political power rested with men. Laws and religious practices demanded women’s subordination to men, and governmental and extralegal controls on women’s sexual and familial lives buttressed patriarchal ideals and practices by enforcing their dependence on white men.
Women played a variety of roles within households, which differed according to region, race, generation, and condition of servitude. Marriage was central to the delineation of white women’s roles, and slavery was critical to developing ideas and laws affecting African American women’s place in society. Interactions with Europeans brought patriarchal influences into native women’s lives. Indian servitude and slavery, European missionary efforts, and cross-cultural diplomacy resulted in the transmission of patriarchal practices that undermined Indian women’s access to political, sexual, economic, and religious power.
Some women gained esteem for fulfilling their duties within the household and community, while others resisted patriarchal customs and forged their own paths. Some women served as agents of patriarchy and used their status or positions to oppress other women. White women often held power over others in their households, including servants and slaves, and in the early republic some of the public sphere activities of middle-class white women targeted the homes of Native Americans, African Americans, and poor women for uplift. Other women resisted subordination and found autonomy by pursuing their own goals. Sexuality was a critical arena in which women could breach dictates on behavior and advance their own agenda, though not always without consequences. Women in urban communities found greater economic opportunities, and some religious communities, like the Society of Friends, allowed women a larger role in decision making and religious speech. Though patriarchal structures would change over time, the idea of men as the leaders of the household and society was remarkably resilient through the 19th century.
Sheila L. Skemp
Historians once assumed that, because women in the era of the American Revolution could not vote and showed very little interest in attaining the franchise, they were essentially apolitical beings. Scholars now recognize that women were actively engaged in the debates that accompanied the movement toward independence, and that after the war many sought a more expansive political role for themselves. Moreover, men welcomed women’s support for the war effort. Although they saw women as especially fit for domestic duties, many men continued to seek women’s political guidance and help even after the war ended.
Granted, those women who wanted a more active and unmediated relationship to the body politic faced severe legal and ideological obstacles. The common law system of coverture gave married women no control over their bodies or their property, and thus accorded them no formal venue to express their political opinions. Religious convention had it that women, the “weaker sex,” were the authors of original sin. The ideology associated with “republicanism” argued that the attributes of independence, self-reliance, physical strength, and bravery were exclusively masculine virtues. Many observers characterized women as essentially selfish and frivolous creatures who hungered after luxuries and could not contain their carnal appetites. Nevertheless, some women carved out political roles for themselves.
In the lead-up to the war, many women played active, even essential roles in various non-consumption movements, promising to refrain from purchasing English goods, and attacking those merchants who refused to boycott prohibited goods. Some took to the streets, participating in riots that periodically disturbed the tranquility of colonial cities. A few published plays and poems proclaiming their patriotic views. Those women who would become loyalists were also active, never reluctant to express their disapproval of the protest movement.
During the war, many women demonstrated their loyalty to the patriot cause by shouldering the burdens of absent husbands. They managed farms and businesses. First in Philadelphia, and then in other cities, women went from door to door collecting money for the Continental Army. Some accompanied husbands to the battlefront, where they tended to the material needs of soldiers. A very few disguised themselves as men and joined the army, exposing as a lie the notion that only men had the capacity to sacrifice their lives for the good of the country. Loyalist women continued to express their political views, even though doing so brought them little more than physical suffering and emotional pain. African American women took advantage of wartime chaos to run away from their masters and forge new, independent lives for themselves.
After the war, women marched in parades, lobbied and petitioned legislators, attended sessions of Congress, and participated in political rallies—lending their support to particular candidates or factions. Elite women published novels, poems, and plays. Some hosted salons where men and women gathered to discuss political issues. In New Jersey, single property-owning women voted.
By the end of the century, however, proponents of women’s political rights lost ground, in part because new “scientific” notions of gender difference prepared the way for the concept of “separate spheres.” Politics became more organized, leaving little room for women to express their views “out of doors,” even as judges and legislators defined women as naturally dependent. Still, white, middle class women in particular took advantage of better educational opportunities, finding ways to influence the public sphere without demanding formal political rights. They read, wrote, and organized benevolent societies, laying the groundwork for the antebellum reform movements of the mid-19th century.
Catherine A. Brekus
Historically, women in colonial North America and the United States have been deeply influenced by their religious traditions. Even though world religions like Judaism, Christianity, Buddhism, Hinduism, and Islam are based on scriptural traditions that portray women as subordinate to men, women have made up the majority of most religious groups in America. While some Americans have used religious arguments to limit women’s legal, political, and economic rights, others have drawn on scripture to defend women’s dignity and equality. Women’s religious beliefs have shaped every aspect of their lives, including their choices about how to structure their time, their attitudes toward sexuality and the body, and their understanding of suffering. Unlike early American Catholic women, who saw their highest religious calling as the sisterhood, most white colonial women identified their primary religious vocation as ministering to their families. In the 19th century, however, white Protestant women became increasingly involved in reform movements like temperance, abolitionism, and women’s suffrage, and African American, Native American, Asian American, and Latina women used religious arguments to challenge assumptions about white racial supremacy. In the 20th century, growing numbers of women from many different religious traditions have served as religious leaders, and in some cases they have also demanded ordination. Despite these dramatic changes in religious life, however, many religiously conservative women opposed the Equal Rights Amendment during the 1970s and early 1980s, and in the first decades of the 21st century they have continued to identify feminism and religion as antithetical.
Shannon K. Withycombe
Throughout the 19th century, American women experienced vast changes regarding possibilities for childbirth and for enhancing or restricting fertility control. At the beginning of the century, issues involving reproduction were discussed primarily in domestic, private settings among women’s networks that included family members, neighbors, or midwives. In the face of massive social and economic changes due to industrialization, urbanization, and immigration, many working-class women became separated from these traditional networks and knowledge and found themselves reliant upon emerging medical systems for care and advice during pregnancy and childbirth. At the same time, upper-class women sought out men in the emerging profession of obstetrics to deliver their babies in hopes of beating the frightening odds against maternal and infant health and even survival. Nineteenth-century reproduction was altered drastically with the printing and commercial boom of the middle of the century. Families could now access contraception and abortion methods and information, which was available earlier in the century albeit in a more private and limited manner, through newspapers, popular books, stores, and from door-to-door salesmen. As fertility control entered these public spaces, many policy makers became concerned about the impacts of such practices on the character and future of the nation. By the 1880s, contraception and abortion came under legal restrictions, just as women and their partners gained access to safer and more effective products than ever before. When the 19th century closed, legislatures and the medical profession raised obstacles that hindered the ability of most women to limit the size of their families as the national fertility rate reached an all-time low. Clearly, American families eagerly seized opportunities to exercise control over their reproductive destinies and their lives.
Jessica M. Frazier
Women on all sides of the US war in Vietnam pushed for an end to the conflict. At a time of renewed feminist fervor, women stepped outside conventional gender roles by publicly speaking out, traveling to a war zone, and entering the male-dominated realm of foreign affairs. Even so, some claimed to stand squarely within the boundaries of womanhood as they undertook such unusual activities. Some American women argued that, as mothers or sisters of soldiers and draft-age men, they held special insight into the war. They spoke of their duty to their families, communities, and nation to act in untraditional, but nevertheless feminine, ways. But women did not act uniformly. Some joined the military as nurses or service personnel to help in the war effort, while others protested the war and served as draft counselors. By the end of the war, some anti-war protestors developed feminist critiques of US involvement in Vietnam that pointed to the war as a symptom of an unjust society that prioritized military dominance over social welfare. As in wars past, the US war in Vietnam created upheavals in gender roles, and as nurses, mothers, lovers, officers, entertainers, and activists, women created new spaces in a changing society.
Melissa A. McEuen
The Second World War changed the United States for women, and women in turn transformed their nation. Over three hundred fifty thousand women volunteered for military service, while twenty times as many stepped into civilian jobs, including positions previously closed to them. More than seven million women who had not been wage earners before the war joined eleven million women already in the American work force. Between 1941 and 1945, an untold number moved away from their hometowns to take advantage of wartime opportunities, but many more remained in place, organizing home front initiatives to conserve resources, to build morale, to raise funds, and to fill jobs left by men who entered military service.
The U.S. government, together with the nation’s private sector, instructed women on many fronts and carefully scrutinized their responses to the wartime emergency. The foremost message to women—that their activities and sacrifices would be needed only “for the duration” of the war—was both a promise and an order, suggesting that the war and the opportunities it created would end simultaneously. Social mores were tested by the demands of war, allowing women to benefit from the shifts and make alterations of their own. Yet dominant gender norms provided ways to maintain social order amidst fast-paced change, and when some women challenged these norms, they faced harsh criticism. Race, class, sexuality, age, religion, education, and region of birth, among other factors, combined to limit opportunities for some women while expanding them for others.
However temporary and unprecedented the wartime crisis, American women would find that their individual and collective experiences from 1941 to 1945 prevented them from stepping back into a prewar social and economic structure. By stretching and reshaping gender norms and roles, World War II and the women who lived it laid solid foundations for the various civil rights movements that would sweep the United States and grip the American imagination in the second half of the 20th century.