Emerson W. Baker
The Salem Witch Trials are one of the best-known, most studied, and most important events in early American history. The afflictions started in Salem Village (present-day Danvers), Massachusetts, in January 1692, and by the end of the year the outbreak had spread throughout Essex County and threatened to bring down the newly formed Massachusetts Bay government of Sir William Phips. It may even have helped trigger a witchcraft crisis in Connecticut that same year. The trials are known for their heavy reliance on spectral evidence and for numerous confessions, which helped the accusations grow. A total of 172 people are known to have been formally charged or informally cried out upon for witchcraft in 1692. Witchcraft accusations usually fell on poor and marginalized members of society, but in 1692 many of the colony’s leading members were accused. George Burroughs, a former minister of Salem Village, was one of the nineteen people convicted and executed. In addition to these victims, one man, Giles Corey, was pressed to death, and five died in prison. The last executions took place in September 1692, but it was not until May 1693 that the last trial was held and the last of the accused was freed from prison.
The trials would have lasting repercussions in Massachusetts and signaled the beginning of the end of the Puritan City upon a Hill, an image of American exceptionalism still regularly invoked. The publications ban issued by Governor Phips to prevent criticism of the government would last three years, but ultimately this effort only ensured that the failure of the government to protect innocent lives would never be forgotten. Pardons and reparations for some of the victims and their families were granted by the government in the early 18th century, and the legislature would regularly take up petitions, and discuss further reparations until 1749, more than fifty years after the trials. The last victims were formally pardoned by the governor and legislature of Massachusetts in 2001.
Rachel Hope Cleves
The task of recovering the history of same-sex love among early American women faces daunting challenges of definition and sources. Modern conceptions of same-sex sexuality did not exist in early America, but alternative frameworks did. Many indigenous nations had social roles for female-bodied individuals who lived as men, performed male work, and acquired wives. Early Christian settlers viewed sexual encounters between women as sodomy, but also valued loving dyadic bonds between religious women. Primary sources indicate that same-sex sexual practices existed within western and southern African societies exploited by the slave trade, but little more is known. The word “lesbian” has been used to signify erotics between women since roughly the 10th century, but historians must look to women who led lesbian-like lives in early America rather than to women who self-identified as lesbians. Stories of female husbands who passed as men and married other women were popular in the 18th century. Tales of passing women who served in the military, in the navy, and as pirates also amused audiences and raised the spectre of same-sex sexuality. Some female religious leaders trespassed conventional gender roles and challenged the marital sexual order. Other women conformed to female gender roles, but constructed loving female households. 18th-century pornography depicting lesbian sexual encounters indicates that early Americans were familiar with the concept of sex between women. A few court records exist from prosecutions of early American women for engaging in lewd acts together. Far more common, by the end of the 18th century, were female-authored letters and diaries describing the culture of romantic friendship, which sometimes extended to sexual intimacy. Later in the 19th century, romantic friendship became an important ingredient in the development of lesbian culture and identity.
Steven K. Green
Separation of church and state has long been viewed as a cornerstone of American democracy. At the same time, the concept has remained highly controversial in the popular culture and law. Much of the debate over the application and meaning of the phrase focuses on its historical antecedents. This article briefly examines the historical origins of the concept and its subsequent evolutions in the nineteenth century.
Both sexuality and religion are terms as vexatious to define as they can be alluring to pursue. In the contemporary period, figuring out one’s sexual feelings, identity, and preferences has become a signal aspect of self-formation. Understanding one’s religious feelings, identity, and preferences may seem less pressing, but it is certainly no less complicated. Both terms cause no small amount of confusion. Clearing up some of this confusion requires speaking frankly about delicate matters, and also speaking flatly about enormously complex experiences. Popular media coverage of ecclesiastical sex scandals in America suggests that people enjoy hearing about the profanation of religious duty. Despite the observed, inferred, and alleged sexuality in American religious history, or maybe because of it, eroticism suffuses narrative accounts of American religious history and descriptions of religious actors. In U.S. history, sexuality has often been a key lens through which we have understood the nature of religion, the leaders of religions, and the reasons for religious commitment.
Anne Sarah Rubin
Sherman’s March, more accurately known as the Georgia and Carolinas Campaigns, cut a swath across three states in 1864–1865. It was one of the most significant campaigns of the war, making Confederate civilians “howl” as farms and plantations were stripped of everything edible and all their valuables. Outbuildings, and occasionally homes, were burned; railroads were destroyed; and enslaved workers were emancipated. Long after the war ended, Sherman’s March continued to shape Americans’ memories as one of the most symbolically powerful aspects of the Civil War.
Sherman’s March began with the better-known March to the Sea, which started in Atlanta on November 15, 1864, and concluded in Savannah on December 22 of the same year. Sherman’s men proceeded through South Carolina and North Carolina in February, March, and April of 1865. The study of this military campaign illuminates the relationships between Sherman’s soldiers and Southern white civilians, especially women, and African Americans. Sherman’s men were often uncomfortable with their role as an army of liberation, and African Americans, in particular, found the March to be a double-edged sword.
The tall building—the most popular and conspicuous emblem of the modern American city—stands as an index of economic activity, civic aspirations, and urban development. Enmeshed in the history of American business practices and the maturation of corporate capitalism, the skyscraper is also a cultural icon that performs genuine symbolic functions. Viewed individually or arrayed in a “skyline,” tall buildings invite a focus on their spectacular or superlative aspects. Their patrons have searched for architectural symbols that would project a positive public image, yet the height and massing of skyscrapers were determined as much by prosaic financial calculations as by symbolic pretense. Historically, the production of tall buildings was linked to the broader flux of economic cycles, access to capital, land values, and regulatory frameworks that curbed the self-interests of individual builders in favor of public goods such as light and air. The tall building looms large for urban geographers seeking to chart the shifting terrain of the business district and for social historians of the city who examine the skyscraper’s gendered spaces and labor relations. If tall buildings provide one index of the urban and regional economy, they are also economic activities in and of themselves, linked to the growth of the professions required to plan, finance, design, construct, market, and manage these mammoth collective objects—all of which have vied for control over the ultimate result. Practitioners have debated the tall building’s external expression as the design challenge of the façade became more acute with the advent of the curtain wall attached to a steel frame, eventually dematerializing entirely into sheets of reflective glass. The tall building also reflects prevailing paradigms in urban design, from the retail arcades of 19th-century skyscrapers to the blank plazas of postwar corporate modernism.
The patterns of urban slavery in colonial North American and pre-Civil War US cities reveal the ways in which individual men and women, as well as businesses, institutions, and governmental bodies, employed slave labor and readily adapted the system of slavery to their economic needs and desires. Colonial cities east and west of the Mississippi River, founded initially as military forts, trading posts, and maritime ports, relied on African and Native American slave labor from their beginnings. The importance of slave labor increased in Anglo-American East Coast urban settings in the 18th century as the number of enslaved Africans increased in these colonies, particularly in response to the growth of the tobacco, wheat, and rice industries in the southern colonies. The focus on African slavery led most Anglo-American colonies to outlaw the enslavement of Native Americans, and urban slavery on the East Coast became associated almost solely with people of African descent. In addition, these cities became central nodes in the circum-Atlantic transportation and sale of enslaved people, slave-produced goods, and provisions for slave colonies whose economies centered on plantation goods. West of the Mississippi, urban enslavement of Native Americans, Mexicans, and even a few Europeans continued through the 19th century.
As the thirteen British colonies transitioned to the United States during and after the Revolutionary War, three different directions emerged regarding slavery, each of which would shape the status of people of African descent in cities. The gradual emancipation of enslaved people in states north of Delaware led to the creation of the so-called free states, with large numbers of free blacks moving into cities to take full advantage of freedom and the possibility of creating family and community. Although antebellum northern cities were located within areas where legalized slavery ended, these cities retained economic and political ties to southern slavery. At the same time, the radical antislavery movement developed in Philadelphia, Boston, and New York. Thus, Northern cities were the site of political conflicts between pro- and antislavery forces. In the Chesapeake, as the tobacco economy declined, slave owners manumitted enslaved blacks for whom they did not have enough work, creating large groups of free blacks in cities. But these states began to participate heavily in the domestic slave trade, with important businesses located in cities. And in the Deep South, the recommitment to slavery following the Louisiana Purchase and the emergence of the cotton economy led to the creation of a string of wealthy port cities critical to the transportation of slaves and goods. These cities were situated in local economic geographies that connected rural plantations to urban settings and in national and international economies of exchange of raw and finished goods that fueled industries throughout the Atlantic world. The vast majority of enslaved people in the antebellum South worked on rural farms, but slave labor was a key part of the labor force in southern cities.
Only after the Civil War did slavery and cities become separate in the minds of Americans, as postwar whites north and south created a mythical South in which romanticized antebellum cotton plantations became the primary symbol of American slavery, regardless of the long history of slavery that preceded their existence.
Peter C. Baldwin
Today the term nightlife typically refers to social activities in urban commercial spaces—particularly drinking, dancing, dining, and listening to live musical performances. This was not always so. Cities in the 18th and early 19th centuries knew relatively limited nightlife, most of it occurring in drinking places for men. Theater attracted mixed-gender audiences but was sometimes seen as disreputable in both its content and the character of the audience. Theater owners worked to shed this negative reputation starting in the mid-19th century, while nightlife continued to be tainted by the profusion of saloons, brothels, and gambling halls. Gradual improvements in street lighting and police protection encouraged people to go out at night, as did growing incomes and decreasing hours of labor. Nightlife attracted more women in the decades around 1900 as it expanded and diversified. Dance halls, vaudeville houses, movie theaters, restaurants, and cabarets thrived in the electrified “bright lights” districts of central cities. Commercial entertainment contracted again in the 1950s and 1960s as Americans spent more of their evening leisure hours watching television and began to regard urban public spaces with suspicion. Still, nightlife is viewed as an important component of urban economic life and is actively promoted by many municipal governments.
H. Paul Thompson Jr.
The temperance and prohibition movement—a social reform movement that pursued many approaches to limit or prohibit the use and/or sale of alcoholic beverages—is arguably the longest-running reform movement in US history, extending from the 1780s through the repeal of national prohibition in 1933. During this 150-year period the movement experienced many ideological, organizational, and methodological changes. Probably the most widely embraced antebellum reform, the movement was explicitly evangelical in many of its earliest assumptions and much of its earliest literature, but over time it assumed an increasingly secular image while retaining strong ties to organized religion. During the movement’s first fifty years, its definition of temperance evolved successively from avoiding drunkenness, to abstaining from all distilled beverages, to abstaining from all intoxicating beverages (i.e., “teetotalism”). During these years, reformers sought merely to persuade others of their views—what was called “moral suasion.” But by the 1840s many reformers began seeking the coercive power of local and state governments to prohibit the “liquor traffic.” These efforts were called “legal suasion,” and in the early 20th century, when local and state laws were deemed insufficient, movement leaders turned to the federal government. Throughout its history, movement leaders produced an extensive and well-preserved serial and monographic literature to chronicle their efforts, which makes the movement relatively easy to study.
No fewer than five national temperance organizations rose and fell across the movement’s history, aided by many other organizations that also promoted the message with great effect. Grassroots reformers organized innumerable state and local temperance societies and fraternal lodges committed to abstinence. Temperance reformers, hailing from nearly every conceivable demographic, networked through a series of national and international temperance conventions, and at any given time were pursuing a diverse and often conflicting array of priorities and methodologies.
Finally, during the Progressive Era, reformers focused their hatred for alcohol almost exclusively on saloons and the liquor traffic. Through groundbreaking lobbying efforts and a fortuitous convergence of social and political forces, reformers witnessed the ratification of the Eighteenth Amendment in January 1919, which established national prohibition. Despite such a long history of reform, the success seemed sudden and caught many in the movement off guard. The rise of liquor-related violence, a transformation in federal-state relations, increasingly organized and outspoken opposition, the Great Depression, and a re-alignment of political party coalitions all culminated in the sweeping repudiation of prohibition and its Republican supporters in the 1932 presidential election. On December 5, 1933, the Twenty-first Amendment to the Constitution repealed the Eighteenth Amendment, returning liquor regulation to the states, which have since maintained a wide variety of ever-changing laws controlling the sale of alcoholic beverages. But national prohibition permanently altered the federal government’s role in law enforcement, and its legacy remains.
Ross A. Kennedy
World War I profoundly affected the United States. It led to an expansion of America’s permanent military establishment, a foreign policy focused on reforming world politics, and American preeminence in international finance. In domestic affairs, America’s involvement in the war exacerbated class, racial, and ethnic conflict. It also heightened both the ethos of voluntarism in progressive ideology and the progressive desire to step up state intervention in the economy and society. These dual impulses had a coercive thrust that sometimes advanced progressive goals of a more equal, democratic society and sometimes repressed any perceived threat to a unified war effort. Ultimately the combination of progressive and repressive coercion undermined support for the Democratic Party, shifting the nation’s politics in a conservative direction as it entered the 1920s.
In the decade after 1965, radicals responded to the alienating features of America’s technocratic society by developing alternative cultures that emphasized authenticity, individualism, and community. The counterculture emerged from a handful of 1950s bohemian enclaves, most notably the Beat subcultures in the Bay Area and Greenwich Village. But new influences shaped an eclectic and decentralized counterculture after 1965, first in San Francisco’s Haight-Ashbury district, then in urban areas and college towns, and, by the 1970s, on communes and in myriad counter-institutions. The psychedelic drug cultures around Timothy Leary and Ken Kesey gave rise to a mystical bent in some branches of the counterculture and influenced counterculture style in countless ways: acid rock redefined popular music; tie dye, long hair, repurposed clothes, and hip argot established a new style; and sexual mores loosened. Yet the counterculture’s reactionary elements were strong. In many counterculture communities, gender roles mirrored those of mainstream society, and aggressive male sexuality inhibited feminist spins on the sexual revolution. Entrepreneurs and corporate America refashioned the counterculture aesthetic into a marketable commodity, ignoring the counterculture’s incisive critique of capitalism. Yet the counterculture became the basis of authentic “right livelihoods” for others. Meanwhile, the politics of the counterculture defy ready categorization. The popular imagination often conflates hippies with radical peace activists. But New Leftists frequently excoriated the counterculture for rejecting political engagement in favor of hedonistic escapism or libertarian individualism. Both views miss the most important political aspects of the counterculture, which centered on the embodiment of a decentralized anarchist bent, expressed in the formation of counter-institutions like underground newspapers, urban and rural communes, head shops, and food co-ops. 
As the counterculture faded after 1975, its legacies became apparent in the redefinition of the American family, the advent of the personal computer, an increasing ecological and culinary consciousness, and the marijuana legalization movement.
David M. Robinson
New England transcendentalism is the first significant literary movement in American history, notable principally for the influential works of Ralph Waldo Emerson, Margaret Fuller, and Henry David Thoreau. The movement emerged in the 1830s as a religious challenge to New England Unitarianism. Building on the writings of the Unitarian leader William Ellery Channing, Emerson and others such as Frederic Henry Hedge, George Ripley, James Freeman Clarke, and Theodore Parker developed a theology based on interior, intuitive experience rather than the historical truth of the Bible. By 1836 transcendentalist books from several important religious thinkers began to appear, including Emerson’s Nature, which employed idealist philosophy and Romantic symbolism to examine human interaction with the natural world. Emerson’s Harvard addresses, “The American Scholar” (1837) and the controversial “Divinity School Address” (1838), gave transcendental ideas a wider prominence, and also generated strong resistance that added an element of experiment and danger to the movement’s reputation. In 1840 the transcendentalists founded a journal, the Dial, and Fuller became its first editor, a position that gave her an important role in the movement and a crucial outlet for her own work in literary criticism and women’s rights.
Though it had begun as a religious movement, by the middle 1840s transcendentalism could be better described as a literary movement with growing political engagements on several fronts. Emerson proclaimed it as an era of reform and aligned the transcendentalists with those who resisted the social and political status quo. In her feminist manifesto Woman in the Nineteenth Century (1845), Fuller called for the removal of both legal and social barriers to women’s full potential. In 1845 Henry David Thoreau went to live in the woods by Walden Pond; his memoir of his experience, Walden (1854), became a founding text of modern environmental thinking. Antislavery also became a key concern for many of the transcendentalists, who condemned the Fugitive Slave Act of 1850 and actively resisted the execution of the law after its passage. The transcendentalists, a nineteenth-century cultural avant-garde, continue to exert cultural influence through the durability of their writings, works that shaped many aspects of American national development.
Paul V. Murphy
Americans grappled with the implications of industrialization, technological progress, urbanization, and mass immigration with startling vigor and creativity in the 1920s even as large numbers of them kept their eyes as much on the past as on the future. American industrial engineers and managers were global leaders in mass production, and millions of citizens consumed factory-made products, including electric refrigerators and vacuum cleaners, technological marvels like radios and phonographs, and that most revolutionary of mass-produced durables, the automobile. They flocked to commercial amusements (movies, sporting events, amusement parks) and absorbed mass culture in their homes, through the radio and commercial recordings. In the major cities, skyscrapers drew Americans upward while thousands of new miles of roads scattered them across the country. Even while embracing the dynamism of modernity, Americans repudiated many of the progressive impulses of the preceding era. The transition from war to peace in 1919 and 1920 was tumultuous, marked by class conflict, a massive strike wave, economic crisis, and political repression. Exhausted by reform, war, and social experimentation, millions of Americans recoiled from central planning and federal power and sought determinedly to bypass traditional politics in the 1920s. This did not mean a retreat from active and engaged citizenship; Americans fought bitterly over racial equality, immigration, religion, morals, Prohibition, economic justice, and politics. In a greatly divided nation, citizens experimented with new forms of nationalism, cultural identity, and social order that could be alternately exclusive and pluralistic. Whether repressive or tolerant, such efforts held the promise of unity amid diversity; even those in the throes of reaction sought new ways of integration. The result was a nation at odds with itself, embracing modernity, sometimes heedlessly, while seeking desperately to retain a grip on the past.
The transformation of post-industrial American life in the late 20th and early 21st centuries produced several economically robust metropolitan centers that stand as new models of urban and economic life, featuring well-educated populations engaged in professional practices in education, medical care, design and legal services, and artistic and cultural production. By the early 21st century, these cities dominated the nation’s consciousness economically and culturally, standing in for the most dynamic and progressive sectors of the economy, driven by concentrations of technical and creative talent. The origins of these academic and knowledge centers are rooted in the political economy, including investments shaped by federal policy and philanthropic ambition. Education and health care communities were and remain frequently economically robust but also rife with racial, economic, and social inequality, and riddled with resulting political tensions over development. These information communities incubated and directed the proceeds of the new economy, but also constrained who could access this new mode of wealth in the knowledge economy.
Christopher P. Loss
Until World War II, American universities were widely regarded as good but not great centers of research and learning. This changed completely in the press of wartime, when the federal government pumped billions into military research, anchored by the development of the atomic bomb and radar, and into the education of returning veterans under the GI Bill of 1944. The abandonment of decentralized federal–academic relations marked the single most important development in the history of the modern American university. While it is true that the government had helped to coordinate and fund the university system prior to the war—most notably the country’s network of public land-grant colleges and universities—government involvement after the war became much more hands-on, eventually leading to direct financial support to and legislative interventions on behalf of core institutional activities, not only the public land grants but the nation’s mix of private institutions as well. However, the reliance on public subsidies and legislative and judicial interventions of one kind or another ended up being a double-edged sword: state action made possible the expansion in research and in student access that became the hallmarks of the post-1945 American university; but it also created a rising tide of expectations for continued support that has proven challenging in fiscally stringent times and in the face of ongoing political fights over the government’s proper role in supporting the sector.
Megan Kate Nelson
During the American Civil War, Union and Confederate commanders made the capture and destruction of enemy cities a central feature of their military campaigns. They did so for two reasons. First, most mid-19th-century cities had factories, foundries, and warehouses within their borders, churning out and storing war materiel; military officials believed that if they interrupted or incapacitated the enemy’s ability to arm or clothe themselves, the war would end. Second, it was believed that the widespread destruction of property—especially in major or capital cities—would also damage civilians’ morale, undermining their political convictions and decreasing their support for the war effort.
Both Union and Confederate armies bombarded and burned cities with these goals in mind. Sometimes they fought battles on city streets, but more often Union troops initiated long-term sieges in order to capture Confederate cities and demoralize their inhabitants. Soldiers on both sides were motivated by vengeance when they set fire to city businesses and homes; these acts were controversial, as was defensive burning—the deliberate destruction of one’s own urban center in order to keep its war materiel out of the hands of the enemy.
Urban destruction, particularly long-term sieges, took a psychological toll on (mostly southern) city residents. Many were wounded, lost property, or were forced to become refugees. Because of this, the destruction of cities during the American Civil War provoked widespread discussions about the nature of “civilized warfare” and the role that civilians played in military strategy. Both soldiers and civilians tried to make sense of the destruction of cities in writing, and also in illustrations and photographs; images in particular shaped both northern and southern memories of the war and its costs.
Rioting in the United States since 1800 has adhered to three basic traditions: regulating communal morality, defending community from outside threats, and protesting government abuse of power. Typically, crowds have had the shared interests of class, group affiliation, geography, or a common enemy. Since American popular disorder has frequently served as communal policing, the state—especially municipal police—has had an important role in facilitating, constraining, or motivating unrest.
Rioting in the United States retained strong legitimacy and popular resonance from 1800 to the 1960s. In the decades after the founding, Americans adapted English traditions of restrained mobbing to more diverse, urban conditions. During the 19th century, however, rioting became more violent and ambitious as Americans—especially white men—asserted their right to use violence to police heterogeneous public space. In the 1840s and 1850s, whites combined the lynch mob with the disorderly crowd to create a lethal and effective instrument of white settler sovereignty both in the western territories and in the states. From the 1860s to the 1930s, white communities across the country, particularly in the South, used racial killings and pogroms to seize political power and establish and enforce Jim Crow segregation. Between the 1910s and the 1970s, African Americans and Latinos, increasingly living in cities, rioted to defend their communities against civilian and police violence. The frequency of rioting declined after the urban rebellions of the 1960s, partly due to the militarization of local police. Yet the continued use of aggressive police tactics against racial minorities has contributed to a surge in rioting in US cities in the early 21st century.
J. Mark Souther
Prior to the railroad age, American cities generally lacked reputations as tourist destinations. As railroads created fast, reliable, and comfortable transportation in the 19th century, urban tourism emerged in many cities. Luxury hotels, tour companies, and guidebooks were facilitating and shaping tourists’ experience of cities by the turn of the 20th century. Many cities hosted regional or international expositions that served as significant tourist attractions from the 1870s to the 1910s. Thereafter, cities competed more keenly to attract conventions. Tourism promotion, once handled chiefly by railroad companies, became increasingly professionalized with the formation of convention and visitor bureaus. The rise of the automobile spurred the emergence of motels and theme parks on the suburban periphery, while renewed interest in historic urban cores prompted historic preservation activism and the adaptive reuse of old structures for dining, shopping, and entertainment. Although a few cities, especially Las Vegas, had relied heavily on tourism almost from their inception, by the last few decades of the 20th century few cities could afford to ignore tourism development. New waterfront parks, aquariums, stadiums, and other tourist and leisure attractions facilitated the symbolic transformation of cities from places of production to sites of consumption. Long aimed at a mass market, especially affluent and middle-class whites, tourism promotion embraced market segmentation in the closing years of the 20th century, and a number of attractions and tours appealed to African Americans or LGBTQ communities. If social commentators often complained that cities were developing “tourist bubbles” that concentrated the advantages of tourism in too-small areas and in too few hands, recent trends point to a greater willingness to disperse tourist activity more widely in cities.
By the 21st century, urban tourism was indispensable to many cities even as it continued to contribute to uneven development.
Little Saigon is the preferred name of Vietnamese refugee communities throughout the world. This article focuses primarily on the largest such community, in Orange County, California. This suburban ethnic enclave is home to the largest concentration of overseas Vietnamese, nearly 200,000, or 10 percent of the Vietnamese American population. Because of its size, location, and demographics, Little Saigon is also home to some of the most influential intellectuals, entertainers, businesspeople, and politicians in the Vietnamese diaspora, many of whom are invested in constructing Little Saigon as a transnational opposition to the government of Vietnam. Unlike traditional immigrant ethnic enclaves, Little Saigon is a refugee community whose formation and development emerged in large part from America’s efforts to atone for its epic defeat in Vietnam by at least sparing some of its wartime allies a life under communism. Much of Little Saigon’s cultural politics revolves around this narrative of rescue, although the number of guilt-ridden Americans grows smaller and more conservative, while loyalists of the pre-1975 Saigon regime struggle to instill in the younger generation of Vietnamese an appreciation of their refugee roots.
Rebecca J. Mead
Woman suffragists in the United States engaged in a sustained, difficult, and multigenerational struggle: seventy-two years elapsed between the Seneca Falls convention (1848) and the passage of the Nineteenth Amendment (1920). During these years, activists gained confidence, developed skills, mobilized resources, learned to maneuver through the political process, and built a social movement. This essay describes key turning points and addresses internal tensions as well as external obstacles in the U.S. woman suffrage movement. It identifies important strategic, tactical, and rhetorical approaches that supported women’s claims for the vote and influenced public opinion, and shows how the movement was deeply connected to contemporaneous social, economic, and political contexts.