Gregory A. Daddis
For nearly a decade, American combat soldiers fought in South Vietnam to help sustain an independent, noncommunist nation in Southeast Asia. After U.S. troops departed in 1973, the collapse of South Vietnam in 1975 prompted a lasting search to explain the United States’ first lost war. Historians of the conflict and participants alike have since critiqued the ways in which civilian policymakers and uniformed leaders applied—some argued misapplied—military power that led to such an undesirable political outcome. While some claimed U.S. politicians failed to commit their nation’s full military might to a limited war, others contended that most officers fundamentally misunderstood the nature of the war they were fighting. Still others argued “winning” was essentially impossible given the true nature of a struggle over Vietnamese national identity in the postcolonial era. On their own, none of these arguments fully satisfies. Contemporary policymakers clearly understood the difficulties of waging a war in Southeast Asia against an enemy committed to national liberation. Yet the faith of these Americans in their power to resolve deep-seated local and regional sociopolitical problems eclipsed the possibility that there might be limits to that power. By asking military strategists to simultaneously fight a war and build a nation, senior U.S. policymakers demanded more than military strategy could deliver in pursuit of overly ambitious political objectives. In the end, the Vietnam War exposed the limits of what American military power could achieve in the Cold War era.
The meaning of the Vietnam War has enduringly divided Americans in the postwar period. In part because the political splits opened up by the war made it an awkward topic for conversation, Vietnam veterans felt a barrier of silence separating them from their fellow citizens. The situation of returning veterans in the war’s waning years serves as a baseline against which to measure subsequent attempts at their social reintegration. Veterans, as embodiments of the experience of the war, became vehicles through which American society could assimilate its troubled and troubling memories.
By the 1980s, greater public understanding of the difficulties of veterans’ homecoming experiences—particularly after the recognition in 1980 of the psychiatric condition post-traumatic stress disorder (PTSD)—helped accelerate efforts to recognize the service and sacrifices of Americans who fought in Vietnam through the creation of memorials. Because the homecoming experience was seen as crucial to the difficulties that a substantial minority of veterans suffered, the concept emerged that the nation needed to embrace its veterans in order to help restore their well-being.
Characteristic ways of talking about the veterans’ experiences coalesced into truisms and parables: the nation and its veterans needed to “reconcile” and “heal”; America must “never again” send young men to fight a war unless the government goes all-out for victory; protesters spat on the veterans and called them “baby killers” when they returned from Vietnam.
Strategists debated what the proper “lessons” of the Vietnam War were and how they should be applied to other military interventions. After the prevalent “overwhelming force” doctrine was discarded in 2003 in the invasion of Iraq, new “lessons” emerged from the Vietnam War: first came the concept of “rapid decisive operations,” and then counterinsurgency came back into vogue. In these interrelated dimensions, American society and politics shaped the memory of the Vietnam War.
The Watergate affair has become synonymous with political corruption and conspiracy. The crisis has, through fact, fiction, and debate, become considerably more than the arrest of five men breaking into the Democratic Party’s national headquarters in the Watergate complex in Washington, DC, in the early hours of Saturday, June 17, 1972. Instead, the term “Watergate” has since come to represent the burglary, its failed cover-up, the press investigation, the Senate inquiry, and the eventual resignation of the thirty-seventh president of the United States, Richard Nixon. Arguably, Watergate has come to encompass all the illegalities of the Nixon administration. The crisis broke when the Vietnam War had already sunk public confidence in the executive to a low ebb, and in the context of a society already fractured by the turbulence of the 1960s. As such, Watergate is seen as the nadir of American democracy in the 20th century.
Perversely, despite contemporaries’ genuine fears for the future of the US democratic system, the scandal highlighted the efficiency of the US governmental machine. The investigations that constituted the Watergate inquiry, which were conducted by the legislative and judicial branches and the fourth estate, exposed corruption in the executive of the United States that stretched to the holder of the highest office. The post-war decades had allowed an imperial presidency to develop, which had threatened the country’s political equilibrium. Watergate disclosed that the presidency had overreached its constitutional powers and responsibilities and had conspired to keep those moves hidden from the electorate. More significantly, however, the forced resignation of Richard Nixon revealed that the checks-and-balances system of government, which was conceived almost 200 years before the Watergate affair, worked as those who devised it had planned. Watergate should illustrate to Americans not just the dangers of consolidating great power in the office of the president, but also the means to counteract such growth.
Mark W. Deets
Since the founding of the United States of America, coinciding with the height of the Atlantic slave trade, U.S. officials have based their relations with West Africa primarily on economic interests. Initially, these interests were established on the backs of slaves, as the Southern plantation economy quickly vaulted the United States to prominence in the Atlantic world. After the U.S. abolition of the slave trade in 1808, however, American relations with West Africa focused on the establishment of the American colony of Liberia as a place of “return” for formerly enslaved persons. Following the turn to “legitimate commerce” in the Atlantic and the U.S. Civil War, the United States largely withdrew from large-scale interaction with West Africa. Liberia remained the notable exception, where prominent Pan-African leaders like Edward Blyden, W. E. B. DuBois, and Marcus Garvey helped foster cultural and intellectual ties between West Africa and the Diaspora in the early 1900s. These ties to Liberia were deepened in the 1920s when Firestone Rubber Corporation of Akron, Ohio, established a long-term lease to harvest rubber. World War II marked a significant increase in American presence and influence in West Africa. Still focused on Liberia, American engagement during the war years included the construction of infrastructure that would prove essential to Allied war efforts and to American security interests during the Cold War. After 1945, the United States competed with the Soviet Union in West Africa for influence and access to important economic and national security resources as African nations ejected colonial regimes across most of the continent. West African independence quickly demonstrated a turn from nationalism to ethnic nationalism, as civil wars engulfed several countries in the postcolonial, and particularly the post-Cold War, era.
After a decade of withdrawal, American interest in West Africa revived with the need for alternative sources of petroleum and concerns about transnational terrorism following the attacks of September 11, 2001.
In the early 20th century, West Virginia coal miners and mine operators fought a series of bloody battles that raged for two decades and prompted national debates over workers’ rights. Miners in the southern part of the state lived in towns wholly owned by coal companies and attempted to join the United Mine Workers of America (UMWA) to negotiate better working conditions but most importantly to restore their civil liberties. Mine operators saw unionization as a threat to their businesses and rights and hired armed guards to patrol towns and prevent workers from organizing. The operators’ allies in local and state government used their authority to help break strikes by sending troops to strike districts, declaring martial law, and jailing union organizers in the name of law and order. Observers around the country were shocked at the levels of violence as well as the conditions that fueled the battles. The Mine Wars include the Paint Creek–Cabin Creek Strike of 1912–1913, the so-called 1920 Matewan Massacre, the 1920 Three Days Battle, and the 1921 Battle of Blair Mountain. In this struggle over unionism, the coal operators prevailed, and West Virginia miners continued to work in nonunion mines and live in company towns through the 1920s.
An overview of Euro-American internal migration in the United States between 1940 and 1980 explores the overall population movement away from rural areas to cities and suburban areas. Although focused on white Americans and their migrations, there are similarities to the Great Migration of African Americans, who continued to move out of the South during the mid-20th century. In the early period, the industrial areas in the North and West attracted most of the migrants. Mobilization for World War II dislodged rural dwellers long kept in place by low wages, political disfranchisement, and low educational attainment. The war also attracted significant numbers of women to urban centers in the North and West. After the war, migration increased, enticing white Americans to become not just less rural but also increasingly suburban. The growth of suburbs throughout the country was prompted by racial segregation in housing that made many suburban areas white and earmarked many urban areas for people of color. The result was incredible growth in suburbia: from 22 million living in those areas in 1940 to triple that in 1970. Later in the period, as the Steelbelt rusted, the rise of the West as a migration magnet was spurred by development strategies, federal investment in infrastructure, and military bases. Sunbelt areas made investments and stood ready to recruit industries and, of course, people, especially from Rustbelt areas in the North. By the dawn of the 21st century, half of the American population resided in suburbs.
Michael Patrick Cullinane
Between 1897 and 1901 the administration of Republican President William McKinley transformed US foreign policy traditions and set a course for empire through interconnected economic policies and an open aspiration to achieve greater US influence in global affairs. The primary changes he undertook as president included the arrangement of inter-imperial agreements with world powers, a willingness to use military intervention as a political solution, the establishment of a standing army, and the adoption of a “large policy” that extended American jurisdiction beyond the North American continent. Opposition to McKinley’s policies coalesced around the annexation of the Philippines and the suppression of the Boxer Rebellion in China. Anti-imperialists challenged McKinley’s policies in many ways, but despite fierce debate, the president’s actions and advocacy for greater American power came to define US policymaking for generations to come. For these reasons, McKinley’s administration merits close study.
Stephen P. Randolph
Best known as Abraham Lincoln’s secretary of state during the Civil War, William Henry Seward conducted full careers as a statesman, politician, and visionary of America’s future, both before and after that traumatic conflict. His greatest legacy, however, lay in his service as secretary of state, leading the diplomatic effort to prevent European intervention in the conflict. His success in that effort marked the margin between the salvation and the destruction of the Union. Seward’s signature qualities of energy, optimism, ambition, and opportunism enabled him to assume a role in the Lincoln administration extending well beyond his diplomatic duties. Those same qualities secured a close working relationship with the president as Seward overcame a rocky first few weeks in office to become Lincoln’s confidant and sounding board.
Seward’s career in politics stretched from the 1830s until 1869. Through that time, he maintained a vision of a United States of America built on opportunity and free labor, powered by government’s active role in internal improvement and education. He foresaw a nation fated to expand across the continent and overseas, with expansion occurring peacefully as a result of American industrial and economic strength and its model of government. During his second term as secretary of state, under the Johnson administration, Seward attempted a series of territorial acquisitions in the Caribbean, the Pacific, and on the North American continent. The state of the post-war nation and its fractious politics precluded success in most of these attempts, but Seward was successful in negotiating and securing Congressional ratification of the purchase of Alaska in 1867. In addition, Seward pursued a series of policies establishing paths followed later by US diplomats, including the open door in China and the acquisition of Hawaii and US naval bases in the Caribbean.
An ungainly word, it has proven tenacious. Since the early Cold War, “Wilsonianism” has been employed by historians and analysts of US foreign policy to denote two historically related but ideologically and operationally distinct approaches to world politics. One is the foreign policy of the term’s eponym, President Woodrow Wilson, during and after World War I—in particular his efforts to engage the United States and other powerful nations in the cooperative maintenance of order and peace through a League of Nations. The other is the tendency of later administrations and political elites to deem an assertive, interventionist, and frequently unilateralist foreign policy necessary to advance national interests and preserve domestic institutions. Both versions of Wilsonianism have exerted massive impacts on US and international politics and culture. Yet both remain difficult to assess or even define. As historical phenomena they are frequently conflated; as philosophical labels they are ideologically freighted. Perhaps the only consensus is that the term implies the US government’s active rather than passive role in the international order.
It is nevertheless important to distinguish Wilson’s “Wilsonianism” from certain doctrines and practices later attributed to him or traced to his influence. The major reasons are two. First, misconceptions surrounding the aims and outcomes of Wilson’s international policies continue to distort historical interpretation in multiple fields, including American political, cultural, and diplomatic history and the history of international relations. Second, these distortions encourage the conflation of Wilsonian internationalism with subsequent yet distinct developments in American foreign policy. The confused result promotes ideological over historical readings of the nation’s past, which in turn constrain critical and creative thinking about its present and future as a world power.
Rebecca J. Mead
Woman suffragists in the United States engaged in a sustained, difficult, and multigenerational struggle: seventy-two years elapsed between the Seneca Falls convention (1848) and the passage of the Nineteenth Amendment (1920). During these years, activists gained confidence, developed skills, mobilized resources, learned to maneuver through the political process, and built a social movement. This essay describes key turning points and addresses internal tensions as well as external obstacles in the U.S. woman suffrage movement. It identifies important strategic, tactical, and rhetorical approaches that supported women’s claims for the vote and influenced public opinion, and shows how the movement was deeply connected to contemporaneous social, economic, and political contexts.
Meg D. O'Sullivan
Women in the United States have drunk, made, bought, sold, and organized both against and for the consumption of alcohol throughout the nation’s history. During the second half of the 20th century, however, women became increasingly visible as social drinkers and alcoholics. Specifically, the 1970s and 1980s marked women’s relationship to alcohol in interesting ways that both echoed moments from the past and ushered in new realities. Throughout these decades, women emerged as: (1) alcoholics who sought recovery in Alcoholics Anonymous or a lesser-known all-women’s sobriety program; (2) anti-alcohol activists who drew authority from their status as mothers; (3) potential criminals who harmed their progeny via fetal alcohol syndrome; and (4) recovery memoirists who claimed their addictions in unprecedented ways.
Two images dominated popular portrayals of American women in the 1950s. One was the fictional June Cleaver, the female lead character in the popular television program “Leave It to Beaver,” which portrayed Cleaver as the stereotypical happy American housewife, the exemplar of postwar American domesticity. The other was Cleaver’s alleged real-life opposite, described in Betty Friedan’s The Feminine Mystique (1963) as miserable, bored, isolated, addicted to tranquilizers, and trapped in look-alike suburban tract houses, which Friedan termed “comfortable concentration camps.” Both stereotypes ignore significant proportions of the postwar female population, both offer simplistic and partial views of domesticity, but both reveal the depth of the influence that lay behind the idea of domesticity, real or fictional. Aided and abetted by psychology, social science theory, advertising, popular media, government policy, law, and discriminatory private sector practices, domesticity was both a myth and a powerful ideology that shaped the trajectories of women’s lives.
Kelly A. Ryan
Patriarchy profoundly affected social relations and the daily lives of individuals in early America by supporting the elaboration of both racial differences and sexual hierarchies. Patriarchal ideals held that men should supervise women and that economic, sexual, legal, and political power rested with men. Laws and religious practices demanded women’s subordination to men, and governmental and extralegal controls on women’s sexual and familial lives buttressed patriarchal ideals and practices by enforcing their dependence on white men.
Women played a variety of roles within households, which differed according to region, race, generation, and condition of servitude. Marriage was central to the delineation of white women’s roles, and slavery was critical to developing ideas and laws affecting African American women’s place in society. Interactions with Europeans brought patriarchal influences into native women’s lives. Indian servitude and slavery, European missionary efforts, and cross-cultural diplomacy resulted in the transmission of patriarchal practices that undermined Indian women’s access to political, sexual, economic, and religious power.
Some women gained esteem for fulfilling their duties within the household and community, while others resisted patriarchal customs and forged their own paths. Some women served as agents of patriarchy and used their status or positions to oppress other women. White women often held power over others in their households, including servants and slaves, and in the early republic some of the public sphere activities of middle-class white women targeted the homes of Native Americans, African Americans, and poor women for uplift. Other women resisted subordination and found autonomy by pursuing their own goals. Sexuality was a critical arena in which women could breach dictates on behavior and advance their own agendas, though not always without consequences. Women in urban communities found greater economic opportunities, and some religious communities, like the Society of Friends, allowed women a larger role in decision making and religious speech. Though patriarchal structures would change over time, the idea of men as the leaders of the household and society was remarkably resilient through the 19th century.
Sheila L. Skemp
Historians once assumed that, because women in the era of the American Revolution could not vote and showed very little interest in attaining the franchise, they were essentially apolitical beings. Scholars now recognize that women were actively engaged in the debates that accompanied the movement toward independence, and that after the war many sought a more expansive political role for themselves. Moreover, men welcomed women’s support for the war effort. Even if they saw women as especially fit for domestic duties, many continued to seek women’s political guidance and help even after the war ended.
Granted, those women who wanted a more active and unmediated relationship to the body politic faced severe legal and ideological obstacles. The common law system of coverture gave married women no control over their bodies or their property, and thus accorded them no formal venue to express their political opinions. Religious convention had it that women, the “weaker sex,” were the authors of original sin. The ideology associated with “republicanism” argued that the attributes of independence, self-reliance, physical strength, and bravery were exclusively masculine virtues. Many observers characterized women as essentially selfish and frivolous creatures who hungered after luxuries and could not contain their carnal appetites. Nevertheless, some women carved out political roles for themselves.
In the lead-up to the war, many women played active, even essential roles in various non-consumption movements, promising to refrain from purchasing English goods, and attacking those merchants who refused to boycott prohibited goods. Some took to the streets, participating in riots that periodically disturbed the tranquility of colonial cities. A few published plays and poems proclaiming their patriotic views. Those women who would become loyalists were also active, never reluctant to express their disapproval of the protest movement.
During the war, many women demonstrated their loyalty to the patriot cause by shouldering the burdens of absent husbands. They managed farms and businesses. First in Philadelphia, and then in other cities, women went from door to door collecting money for the Continental Army. Some accompanied husbands to the battlefront, where they tended to the material needs of soldiers. A very few disguised themselves as men and joined the army, exposing as a lie the notion that only men had the capacity to sacrifice their lives for the good of the country. Loyalist women continued to express their political views, even though doing so brought them little more than physical suffering and emotional pain. African American women took advantage of wartime chaos to run away from their masters and forge new, independent lives for themselves.
After the war, women marched in parades, lobbied and petitioned legislators, attended sessions of Congress, and participated in political rallies—lending their support to particular candidates or factions. Elite women published novels, poems, and plays. Some hosted salons where men and women gathered to discuss political issues. In New Jersey, single property-owning women voted.
By the end of the century, however, proponents of women’s political rights lost ground, in part because new “scientific” notions of gender difference prepared the way for the concept of “separate spheres.” Politics became more organized, leaving little room for women to express their views “out of doors,” even as judges and legislators defined women as naturally dependent. Still, white, middle-class women in particular took advantage of better educational opportunities, finding ways to influence the public sphere without demanding formal political rights. They read, wrote, and organized benevolent societies, laying the groundwork for the antebellum reform movements of the mid-19th century.
Catherine A. Brekus
Historically, women in colonial North America and the United States have been deeply influenced by their religious traditions. Even though world religions like Judaism, Christianity, Buddhism, Hinduism, and Islam are based on scriptural traditions that portray women as subordinate to men, women have made up the majority of most religious groups in America. While some Americans have used religious arguments to limit women’s legal, political, and economic rights, others have drawn on scripture to defend women’s dignity and equality. Women’s religious beliefs have shaped every aspect of their lives, including their choices about how to structure their time, their attitudes toward sexuality and the body, and their understanding of suffering. Unlike early American Catholic women, who saw their highest religious calling as the sisterhood, most white colonial women identified their primary religious vocation as ministering to their families. In the 19th century, however, white Protestant women became increasingly involved in reform movements like temperance, abolitionism, and women’s suffrage, and African-American, Native American, Asian-American, and Latina women used religious arguments to challenge assumptions about white racial supremacy. In the 20th century, growing numbers of women from many different religious traditions have served as religious leaders, and in some cases they have also demanded ordination. Despite these dramatic changes in religious life, however, many religiously conservative women opposed the Equal Rights Amendment during the 1970s and early 1980s, and in the first decades of the 21st century they have continued to identify feminism and religion as antithetical.
Shannon K. Withycombe
Throughout the 19th century, American women experienced vast changes regarding possibilities for childbirth and for enhancing or restricting fertility control. At the beginning of the century, issues involving reproduction were discussed primarily in domestic, private settings among women’s networks that included family members, neighbors, or midwives. In the face of massive social and economic changes due to industrialization, urbanization, and immigration, many working-class women became separated from these traditional networks and knowledge and found themselves reliant upon emerging medical systems for care and advice during pregnancy and childbirth. At the same time, upper-class women sought out men in the emerging profession of obstetrics to deliver their babies in hopes of beating the frightening odds against maternal and infant health and even survival. Nineteenth-century reproduction was altered drastically with the printing and commercial boom of the middle of the century. Families could now access contraception and abortion methods and information through newspapers, popular books, stores, and door-to-door salesmen; such methods had been available earlier in the century, albeit in a more private and limited manner. As fertility control entered these public spaces, many policy makers became concerned about the impacts of such practices on the character and future of the nation. By the 1880s, contraception and abortion came under legal restrictions, just as women and their partners gained access to safer and more effective products than ever before. When the 19th century closed, legislatures and the medical profession raised obstacles that hindered the ability of most women to limit the size of their families as the national fertility rate reached an all-time low. Clearly, American families eagerly seized opportunities to exercise control over their reproductive destinies and their lives.
In the United States, the history of sexual assault in the first half of the 20th century involves multiple contradictions between the ordinary, almost invisible accounts of women of all colors who were raped by fathers, husbands, neighbors, boarders, bosses, hired hands, and other known individuals versus the sensational myths that involved rapacious black men, sly white slavers, libertine elites, and virginal white female victims. Much of the debate about sexual assault revolved around the “unwritten law” that justified “honorable” white men avenging the “defilement” of their women. Both North and South, white people defended lynching and the murder of presumed rapists as “honor killings.” In courtrooms, defense attorneys linked the unwritten law to insanity pleas, arguing that after hearing women tell about their assault, husbands and fathers experienced an irresistible compulsion to avenge the rape of their women. Over time, however, notorious court cases from New York and San Francisco to Indianapolis, Honolulu, and Scottsboro, Alabama, shifted the discourse away from the unwritten law and extralegal “justice” to a more complicated script that demonized unreliable women and absolved imperfect men. National coverage of these cases, made possible by wire services and the Hearst newspaper empire, spurred heated debates concerning the proper roles of men and women. Blockbuster movies like The Birth of a Nation and Gone with the Wind and Book of the Month Club selections such as John Steinbeck’s Of Mice and Men and Richard Wright’s Native Son joined the sensationalized media coverage of high-profile court cases to create new national stereotypes about sexual violence and its causes and culprits. During the 1930s, journalists, novelists, playwrights, and moviemakers increasingly emphasized the culpability of women who, according to this narrative, made themselves vulnerable to assault by stepping outside of their appropriate sphere and tempting men into harming them.
Jessica M. Frazier
Women on all sides of the US war in Vietnam pushed for an end to the conflict. At a time of renewed feminist fervor, women stepped outside conventional gender roles by publicly speaking out, traveling to a war zone, and entering the male-dominated realm of foreign affairs. Even so, some claimed to stand squarely within the boundaries of womanhood as they undertook such unusual activities. Some American women argued that, as mothers or sisters of soldiers and draft-age men, they held special insight into the war. They spoke of their duty to their families, communities, and nation to act in untraditional, but nevertheless feminine, ways. But women did not act uniformly. Some joined the military as nurses or service personnel to help in the war effort, while others protested the war and served as draft counselors. By the end of the war, some anti-war protestors developed feminist critiques of US involvement in Vietnam that pointed to the war as a symptom of an unjust society that prioritized military dominance over social welfare. As in wars past, the US war in Vietnam created upheavals in gender roles, and as nurses, mothers, lovers, officers, entertainers, and activists, women created new spaces in a changing society.
Erica J. Ryan
The first Red Scare, after World War I, and the Red Scare that followed World War II, both impacted American women in remarkably similar ways. Many women found their lives hemmed in by antifeminism and the conservative gender ideology that underwrote anticommunist national identity in 1919, and then again in the late 1940s. This cultural nationalism tied traditional gender norms to the defense of American values and ideals, positioning the family as a bulwark against communism while making women’s performance of gender roles symbolic of national health or sickness. Within this gendered nationalism, the first Red Scare offered opportunities for conservative women to join the antiradical cause as protectors of the home. These same antiradicals maligned radical and progressive women for their feminism and their social activism. The second Red Scare played out in similar fashion. Anticommunism provided a safe platform for conservative women to engage in political activism in defense of the family, and in turn, they participated in broader efforts that attacked and weakened civil rights claims and the social justice efforts of women on the left. In each Red Scare the symbols and rhetoric of anticommunism prioritized women’s relationship to the family, positioning them either as bastions of American virtue or as fundamental threats to the social and political order. Gender proved critical to the construction of patriotism and national identity.
Melissa A. McEuen
The Second World War changed the United States for women, and women in turn transformed their nation. Over three hundred fifty thousand women volunteered for military service, while twenty times as many stepped into civilian jobs, including positions previously closed to them. More than seven million women who had not been wage earners before the war joined eleven million women already in the American work force. Between 1941 and 1945, an untold number moved away from their hometowns to take advantage of wartime opportunities, but many more remained in place, organizing home front initiatives to conserve resources, to build morale, to raise funds, and to fill jobs left by men who entered military service.
The U.S. government, together with the nation’s private sector, instructed women on many fronts and carefully scrutinized their responses to the wartime emergency. The foremost message to women—that their activities and sacrifices would be needed only “for the duration” of the war—was both a promise and an order, suggesting that the war and the opportunities it created would end simultaneously. Social mores were tested by the demands of war, allowing women to benefit from the shifts and make alterations of their own. Yet dominant gender norms provided ways to maintain social order amidst fast-paced change, and when some women challenged these norms, they faced harsh criticism. Race, class, sexuality, age, religion, education, and region of birth, among other factors, combined to limit opportunities for some women while expanding them for others.
However temporary and unprecedented the wartime crisis, American women would find that their individual and collective experiences from 1941 to 1945 prevented them from stepping back into a prewar social and economic structure. By stretching and reshaping gender norms and roles, World War II and the women who lived it laid solid foundations for the various civil rights movements that would sweep the United States and grip the American imagination in the second half of the 20th century.