The 1950s have typically been seen as a complacent, conservative time between the end of World War II and the radical 1960s, when anticommunism and the Cold War subverted reform and undermined civil liberties. But the era can also be seen as a very liberal time in which meeting the Communist threat led to Keynesian economic policies, the expansion of New Deal programs, and advances in civil rights. Politically, it was “the Eisenhower Era,” dominated by a moderate Republican president, a high level of bipartisan cooperation, and a foreign policy committed to containing communism. Culturally, it was an era of middle-class conformity, which also gave us abstract expressionism, rock and roll, Beat poetry, and a grassroots challenge to Jim Crow.
Emerson W. Baker
The Salem Witch Trials are one of the best known, most studied, and most important events in early American history. The afflictions started in Salem Village (present-day Danvers), Massachusetts, in January 1692, and by the end of the year the outbreak had spread throughout Essex County and threatened to bring down the newly formed Massachusetts Bay government of Sir William Phips. It may even have helped trigger a witchcraft crisis in Connecticut that same year. The trials are known for their heavy reliance on spectral evidence and numerous confessions, which helped the accusations grow. A total of 172 people are known to have been formally charged or informally cried out upon for witchcraft in 1692. Usually poor and marginalized members of society were the victims of witchcraft accusations, but in 1692 many of the leading members of the colony were accused. George Burroughs, a former minister of Salem Village, was one of the nineteen people convicted and executed. In addition to these victims, one man, Giles Corey, was pressed to death, and five died in prison. The last executions took place in September 1692, but it was not until May 1693 that the last trial was held and the last of the accused was freed from prison.
The trials would have lasting repercussions in Massachusetts and signaled the beginning of the end of the Puritan City upon a Hill, an image of American exceptionalism still regularly invoked. The publications ban issued by Governor Phips to prevent criticism of the government would last three years, but ultimately this effort only ensured that the failure of the government to protect innocent lives would never be forgotten. Pardons and reparations for some of the victims and their families were granted by the government in the early 18th century, and the legislature would regularly take up petitions, and discuss further reparations until 1749, more than fifty years after the trials. The last victims were formally pardoned by the governor and legislature of Massachusetts in 2001.
Rachel Hope Cleves
The task of recovering the history of same-sex love among early American women faces daunting challenges of definition and sources. Modern conceptions of same-sex sexuality did not exist in early America, but alternative frameworks did. Many indigenous nations had social roles for female-bodied individuals who lived as men, performed male work, and acquired wives. Early Christian settlers viewed sexual encounters between women as sodomy, but also valued loving dyadic bonds between religious women. Primary sources indicate that same-sex sexual practices existed within western and southern African societies exploited by the slave trade, but little more is known. The word “lesbian” has been used to signify erotics between women since roughly the 10th century, but historians must look to women who led lesbian-like lives in early America rather than to women who self-identified as lesbians. Stories of female husbands who passed as men and married other women were popular in the 18th century. Tales of passing women who served in the military, in the navy, and as pirates also amused audiences and raised the spectre of same-sex sexuality. Some female religious leaders trespassed conventional gender roles and challenged the marital sexual order. Other women conformed to female gender roles, but constructed loving female households. 18th-century pornography depicting lesbian sexual encounters indicates that early Americans were familiar with the concept of sex between women. A few court records exist from prosecutions of early American women for engaging in lewd acts together. Far more common, by the end of the 18th century, were female-authored letters and diaries describing the culture of romantic friendship, which sometimes extended to sexual intimacy. Later in the 19th century, romantic friendship became an important ingredient in the development of lesbian culture and identity.
The United States was extremely reluctant to get drawn into the wars that erupted in Asia in 1937 and Europe in 1939. Deeply disillusioned with the experience of World War I, when the large number of trench warfare casualties had resulted in a peace that many Americans believed betrayed the aims they had fought for, the United States sought to avoid all forms of entangling alliances. Deeply embittered by the Depression, which was widely blamed on international bankers and businessmen, Congress enacted legislation that sought to prevent these actors from drawing the country into another war. The American aim was neutrality, but the underlying strength of the United States made it too big to be impartial—a problem that Roosevelt had to grapple with as Germany, Italy, and Japan began to challenge international order in the second half of the 1930s.
Ansley T. Erickson
“Urban infrastructure” calls to mind railways, highways, and sewer systems. Yet the school buildings—red brick, limestone, or concrete, low-slung, turreted, or glass-fronted—that hold and seek to shape the city’s children are ubiquitous forms of infrastructure as well. Schools occupy one of the largest line items in a municipal budget, and as many as a fifth of a city’s residents spend the majority of their waking hours in school classrooms, hallways, and gymnasiums. In the 19th and 20th centuries urban educational infrastructure grew, supported by a developing consensus for publicly funded and publicly governed schools (if rarely fully accessible to all members of the public). Even before states committed to other forms of social welfare, from pensions to public health, and to other infrastructure, from transit to firefighting, schooling was a government function.
This commitment to public education ultimately was national, but schools in cities had their own story. Schooling in the United States is chiefly a local affair: Constitutional responsibility for education lies with the states; power is then further decentralized as states entrust decisions about school function and funding to school districts. School districts can be as small as a single town or a part of a city. Such localism is one reason that it is possible to speak about schools in U.S. cities as having a particular history, determined as much by the specificities of urban life as by national questions of citizenship, economy, religion, and culture.
While city schools have been distinct, they have also been nationally influential. Urban scale both allowed for and demanded the most extensive educational system-building. Urban growth and diversity galvanized innovation, via exploration in teaching methods, curriculum, and understanding of children and communities. And it generated intense conflict. Throughout U.S. history, urban residents from myriad social, political, religious, and economic positions have struggled to define how schools would operate, for whom, and who would decide.
During the 19th and 20th centuries, U.S. residents struggled over the purposes, funding, and governance of schools in cities shaped by capitalism, nativism, and white supremacy. They built a commitment to schooling as a public function of their cities, with many compromises and exclusions. In the 21st century, old struggles re-emerged in new form, perhaps raising the question of whether schools will continue as public, urban infrastructure.
Adam R. Shapiro
The 1925 Scopes trial was a widely followed court case in Dayton, Tennessee, that attracted the attention of the nation. A prosecution against a schoolteacher charged with violating Tennessee’s new law prohibiting the teaching of human evolution, the trial became a great public spectacle that saw debates over the meaning and truth of the Bible, and the relationship between science and religion. The trial is most famous for the involvement of the lawyers William Jennings Bryan (for the prosecution) and Clarence Darrow (for the defense).
Despite being a legally insignificant case, the trial has remained important in American history because it is seen as symbolizing some of the country’s great social issues in the early 20th century: fundamentalist responses to modernity, the autonomy and clout of the “New South,” and the eternal clash between religion and science.
Since many North American indigenous societies also built and inhabited towns, America was not an entirely rural continent before the arrival of Europeans. Nevertheless, when Europeans set out to colonize their “wilderness,” they arrived with a practical and ideological commitment to recreating cities of the sort with which they were familiar on their home continent. The result of their ambitions was the rapid founding and development of European-style cities, the vast majority of which clustered on large bodies of water, either directly on the Atlantic Ocean or on the seas and river estuaries adjacent to it. The pace of city expansion was closely linked to the levels of support for cities among colonists and an economic environment that stimulated urban growth. Some cities grew faster than others, but by the middle of the 18th century even Virginia and Maryland, the most rural colonies, had towns that played a critical cultural, political, and economic role in society. By the revolutionary era, the centrality of North America’s seaports was cemented by their status as crucibles of the conflict. The issue of which seaport was the new United States’ premier city was contested, but the importance of cities to North American society was no longer debated.
Steven K. Green
Separation of church and state has long been viewed as a cornerstone of American democracy. At the same time, the concept has remained highly controversial in the popular culture and law. Much of the debate over the application and meaning of the phrase focuses on its historical antecedents. This article briefly examines the historical origins of the concept and its subsequent evolution in the 19th century.
In the seventy years since the end of World War II (1939–1945), postindustrialization—the exodus of manufacturing and growth of finance and services—has radically transformed the economy of North American cities. Metropolitan areas are increasingly home to transnational firms that administer dispersed production networks that span the world. A few major global centers host large banks that coordinate flows of finance capital necessary not only for production, but also increasingly for education, infrastructure, municipal government, housing, and nearly every other aspect of life. In cities of the global north, fewer workers produce goods and more produce information, entertainment, and experiences. Women have steadily entered the paid workforce, where they often do the feminized work of caring for children and the ill, cleaning homes, and preparing meals. Like the Gilded Age city, the postindustrial city creates immense social divisions, injustices, and inequalities: penthouses worth millions and rampant homelessness, fifty-dollar burgers and an epidemic of food insecurity, and unparalleled wealth and long-standing structural unemployment all exist side by side. The key features of the postindustrial service economy are the increased concentration of wealth, the development of a privileged and celebrated workforce of professionals, and an economic system reliant on hyperexploited service workers whose availability is conditioned by race, immigration status, and gender.
Both sexuality and religion are terms as vexatious to define as they can be alluring to pursue. In the contemporary period, figuring out one’s sexual feelings, identity, and preferences has become a signal aspect of self-formation. Understanding one’s religious feelings, identity, and preferences may seem less pressing, but is certainly no less complicated. Both terms cause no small amount of confusion. Clearing up some of this confusion requires speaking frankly about delicate matters, and also speaking flatly about enormously complex experiences. Popular media coverage of ecclesiastical sex scandals in America suggests that people enjoy hearing about the profanation of religious duty. Despite the observed, inferred, and alleged sexuality in American religious history, or maybe because of it, eroticism suffuses narrative accounts of American religious history and descriptions of religious actors. In U.S. history, sexuality has often been a key lens through which we have understood the nature of religion, the leaders of religions, and the reason for religious commitment.
Anne Sarah Rubin
Sherman’s March, more accurately known as the Georgia and Carolinas Campaigns, cut a swath across three states in 1864–1865. It was one of the most significant campaigns of the war, making Confederate civilians “howl” as farms and plantations were stripped of everything edible and all their valuables. Outbuildings, and occasionally homes, were burned, railroads were destroyed, and enslaved workers were emancipated. Long after the war ended, Sherman’s March continued to shape Americans’ memories as one of the most symbolically powerful aspects of the Civil War.
Sherman’s March began with the better-known March to the Sea, which started in Atlanta on November 15, 1864, and concluded in Savannah on December 22 of the same year. Sherman’s men proceeded through South Carolina and North Carolina in February, March, and April of 1865. The study of this military campaign illuminates the relationships between Sherman’s soldiers and Southern white civilians, especially women, and African Americans. Sherman’s men were often uncomfortable with their role as an army of liberation, and African Americans, in particular, found the March to be a double-edged sword.
The American Revolution was an episode in a transatlantic outcry against the corruption of the British balance of power and liberty institutionalized in the Glorious Revolution of 1688–1689. English speakers during the 18th century reflected on this constitutional crisis within a larger conversation about the problem of human governance. Although many people excluded from Parliament supported political reform, if not revolution, they also sought remedies for the perversion of political power and influence in new forms of social power and influence. This article looks at the convergence of political and social discussions in a common discourse about the nature of power and the ways in which human beings influenced each other. The first section outlines the meanings of power and influence in British politics. The second section uses the novelist Sarah Fielding’s Remarks on Clarissa (1749) to delineate revolutionary notions about social power and influence. The third section turns to the speeches and writings of Edmund Burke in the run-up to the American Revolution to look at how English speakers deployed notions of social power to advocate for political reform.
Christopher W. Schmidt
One of the most significant protest campaigns of the civil rights era, the lunch counter sit-in movement began on February 1, 1960, when four young African American men sat down at the whites-only lunch counter of the Woolworth store in Greensboro, North Carolina. Refused service, the four college students sat quietly until the store closed. They continued their protest on the following days, each day joined by more fellow students. Students in other southern cities learned what was happening and started their own demonstrations, and in just weeks, lunch counter sit-ins were taking place across the South. By the end of the spring, tens of thousands of black college and high school students, joined in some cases by sympathetic white students, had joined the sit-in movement. Several thousand went to jail for their efforts after being arrested on charges of trespass, disorderly conduct, or whatever other laws southern police officers believed they could use against the protesters.
The sit-ins arrived at a critical juncture in the modern black freedom struggle. The preceding years had brought major breakthroughs, such as the Supreme Court’s Brown v. Board of Education school desegregation ruling in 1954 and the successful Montgomery bus boycott of 1955–1956, but by 1960, activists were struggling to develop next steps. The sit-in movement energized and transformed the struggle for racial equality, moving the leading edge of the movement from the courtrooms and legislative halls to the streets and putting a new, younger generation of activists on the front lines. It gave birth to the Student Nonviolent Coordinating Committee, one of the most important activist groups of the 1960s. It directed the nation’s attention to the problem of racial discrimination in private businesses that served the public, pressured business owners in scores of southern cities to open their lunch counters to African American customers, and set in motion a chain of events that would culminate in the Civil Rights Act of 1964, which banned racial discrimination in public accommodations across the nation.
The tall building—the most popular and conspicuous emblem of the modern American city—stands as an index of economic activity, civic aspirations, and urban development. Enmeshed in the history of American business practices and the maturation of corporate capitalism, the skyscraper is also a cultural icon that performs genuine symbolic functions. Viewed individually or arrayed in a “skyline,” tall buildings invite a focus on their spectacular or superlative aspects. Their patrons have searched for architectural symbols that would project a positive public image, yet the height and massing of skyscrapers were determined as much by prosaic financial calculations as by symbolic pretense. Historically, the production of tall buildings was linked to the broader flux of economic cycles, access to capital, land values, and regulatory frameworks that curbed the self-interests of individual builders in favor of public goods such as light and air. The tall building looms large for urban geographers seeking to chart the shifting terrain of the business district and for social historians of the city who examine the skyscraper’s gendered spaces and labor relations. If tall buildings provide one index of the urban and regional economy, they are also economic activities in and of themselves and thus linked to the growth of the professions required to plan, finance, design, construct, market, and manage these mammoth collective objects—all of which have vied for control over the ultimate result. Practitioners have debated the tall building’s external expression as the design challenge of the façade became more acute with the advent of the curtain wall attached to a steel frame, eventually dematerializing entirely into sheets of reflective glass. The tall building also reflects prevailing paradigms in urban design, from the retail arcades of 19th-century skyscrapers to the blank plazas of postwar corporate modernism.
The patterns of urban slavery in colonial North American and pre-Civil War US cities reveal the ways in which individual men and women, as well as businesses, institutions, and governmental bodies, employed slave labor and readily adapted the system of slavery to their economic needs and desires. Colonial cities east and west of the Mississippi River, founded initially as military forts, trading posts, and maritime ports, relied on African and Native American slave labor from their beginnings. The importance of slave labor increased in Anglo-American East Coast urban settings in the 18th century as the number of enslaved Africans increased in these colonies, particularly in response to the growth of the tobacco, wheat, and rice industries in the southern colonies. The focus on African slavery led most Anglo-American colonies to outlaw the enslavement of Native Americans, and urban slavery on the East Coast became associated almost solely with people of African descent. In addition, these cities became central nodes in the circum-Atlantic transportation and sale of enslaved people, slave-produced goods, and provisions for slave colonies whose economies centered on plantation goods. West of the Mississippi, urban enslavement of Native Americans, Mexicans, and even a few Europeans continued through the 19th century.
As the thirteen British colonies transitioned to the United States during and after the Revolutionary War, three different directions emerged regarding slavery, each of which would affect the status of slavery and of people of African descent in cities. The gradual emancipation of enslaved people in states north of Delaware led to the creation of the so-called free states, with large numbers of free blacks moving into cities to take full advantage of freedom and the possibility of creating family and community. Although antebellum northern cities were located within areas where legalized slavery ended, these cities retained economic and political ties to southern slavery. At the same time, the radical antislavery movement developed in Philadelphia, Boston, and New York. Thus, northern cities were the site of political conflicts between pro- and antislavery forces. In the Chesapeake, as the tobacco economy declined, slave owners manumitted enslaved blacks for whom they did not have enough work, creating large groups of free blacks in cities. But these states began to participate heavily in the domestic slave trade, with important businesses located in cities. And in the Deep South, the recommitment to slavery following the Louisiana Purchase and the emergence of the cotton economy led to the creation of a string of wealthy port cities critical to the transportation of slaves and goods. These cities were situated in local economic geographies that connected rural plantations to urban settings and in national and international economies of exchange of raw and finished goods that fueled industries throughout the Atlantic world. The vast majority of enslaved people in the antebellum South worked on rural farms, but slave labor was a key part of the labor force in southern cities.
Only after the Civil War did slavery and cities become separate in the minds of Americans, as postwar whites north and south created a mythical South in which romanticized antebellum cotton plantations became the primary symbol of American slavery, regardless of the long history of slavery that preceded their existence.
Canada has sometimes been called the United States’ attic: a useful feature, but one easily forgotten. Of all countries, it has historically resembled the United States the most closely in terms of culture, geography, economy, society, politics, ideology, and, especially, history. A shared culture—literary, social, legal, and political—is a crucial factor in Canadian-American relations. Geography is at least as important. It provides the United States with strategic insulation to the north and enhances geographic isolation to the east and west. North-south economic links are inevitable and very large. Canada has been a major recipient of American investment, and for most of the time since 1920 it has been the United States’ principal trading partner. Prosperous and self-sufficient, Canada has seldom required American aid. There have been no overtly hostile official encounters since the end of the War of 1812, partly because many Americans tended to believe that Canadians would join the republic; when that did not occur, the United States accepted an independent but friendly Canada as a permanent, useful, and desirable neighbor—North America’s attic. The insulation the attic provided consisted of a common belief in the rule of law, both domestic and international; liberal democracy; a federal constitution; liberal capitalism; and liberal international trade regimes.
That said, the United States, with its large population, huge economy, and military power, insulates Canada from hostile external forces. An attack on Canada from outside the continent is hard to imagine without a simultaneous attack on the United States. Successive American and Canadian governments have reaffirmed the political status quo while favoring mutually beneficial economic and military linkages—bilateral and multilateral. Relations have traditionally been grounded in a negotiating style that is evidence-based, proceeding issue by issue. A sober diplomatic and political context sometimes frames irritations and exclamations, but even these have usually been defined and limited by familiarity. For example, there has always been anti-Americanism in Canada. Most often it consists of sentiments derived from the United States itself, channeled by cultural similarities. No American idea, good or bad, from liberalism to populism, fails to find an echo in Canada. How loud or how soft the echo makes the difference.
Christian J. Koot
Smuggling was a regular feature of the economy of colonial British America in the 17th and 18th centuries. Though the very nature of illicit commerce means that the extent of this trade is incalculable, a wide variety of British and colonial sources testify to the ability of merchants to trade where they pleased and to avoid paying duties in the process. Together, admiralty proceedings, merchant correspondence and account books, customs reports, and petitions demonstrate that illicit trade enriched individuals and allowed settlers to shape their colonies’ development. Smuggling developed in resistance to British economic and political control. British authorities attempted to harness the trade of their Atlantic colonies through a series of laws that restricted overseas commerce (often referred to as the Navigation Acts). This legislation created the opportunity for illicit trade by raising the costs of legal trade. Hampered by insufficient resources, thousands of miles of coastline, and complicit local officials, British customs agents could not prevent smuggling. Economic self-interest and the pursuit of profit certainly motivated smugglers, but because it was tied to a larger transatlantic debate about the proper balance between regulation and free trade, smuggling was also a political act. Through smuggling, colonists rejected what they saw as capricious regulations designed to enrich Britain at their expense.
Janine Giordano Drake
The term “Social Gospel” was coined by ministers and other well-meaning American Protestants with the intention of encouraging the urban and rural poor to understand that Christ cared about them and saw their struggles. The second half of the 19th century saw a rise of both domestic and international missionary fervor. Church and civic leaders feared a future in which freethinkers, agnostics, atheists, and other skeptics dominated spiritual life and well-educated ministers were marginal to American culture. They grew concerned with the rising number of independent and Pentecostal churches that operated without extensive theological training or denominational authority. American Protestants especially feared that immigrant religious and cultural traditions, including Roman Catholicism, Judaism, and Eastern Orthodox Christianity, were not quintessentially American. Most of all, they worried that those belief systems could not promote what they saw as the traditional American values and mores central to the nation.
However, at least on the surface, the Social Gospel did not dwell on extinguishing ideas or traditions. Rather, as was typical of the Progressive Era, it forwarded a wide-ranging set of visions that emphasized scientific and professional expertise, guided by Christian ethics, to solve social and political problems. It fostered an energetic culture of conferences, magazines, and paperback books dedicated to reforming the nation. Books and articles unpacked social surveys that sorted through possible solutions to urban and rural poverty and reported on productive relationships between churches and municipal governments. Pastoral conferences often focused on planning revivals in urban auditoriums, churches, stadiums, or the open air, where participants were confronted not only with old-fashioned gospel messages but also with lectures on what Christians could do to improve their communities.
The Social Gospel’s theological turn stressed the need both for individual redemption from sinful behavior and for the redemption of whole societies from damaged community relationships. Revivalists not only entreated listeners to reject personal habits like drinking, smoking, chewing tobacco, gambling, theater-going, and extramarital sex; they also encouraged listeners to replace the gathering space of the saloon with churches, schools, and public parks. Leaders usually saw themselves as redeeming the “social sin” that produced impoverished neighborhoods, low-wage jobs, preventable diseases, and chronic unemployment, and as offering alternatives that kept businesses intact. In the Social Creed of the Churches (1908), ministers across the denominations proposed industrial reforms limiting work hours and improving working conditions, as well as government regulations setting a living wage and providing protection for the injured, sick, and elderly. Sometimes, Social Gospel leaders defended collective bargaining and built alliances with labor leaders. At other times, they proposed palliative solutions that would instill Christian “brotherhood” on the shop floor and render unions unnecessary. This wavering on principles produced complicated and sometimes tense relationships among union leaders, workers, and Social Gospel leaders.
Elements of the Social Gospel movement have carried even into the 21st century, leading some historians to challenge the idea that the movement died with the close of the Great War. The American Civil Liberties Union and Fellowship of Reconciliation, for example, did not lose any time in keeping alive the Social Gospel’s commitments to protecting the poor and defenseless. However, the rise of “premillennial dispensationalist” theology and the general disillusionment produced by the war’s massive casualties marked a major turning point, if not an endpoint, for the Social Gospel’s influence as a well-funded, Protestant evangelical force. The brutality of the war undermined American optimism—much of it fueled by Social Gospel thinking—about creating a more just, prosperous, and peaceful world. Meanwhile, Attorney General A. Mitchell Palmer’s campaign against alleged anarchists and Bolsheviks immediately after the war—America’s first “Red Scare”—targeted a large number of labor and religious organizations with the accusation that socialist ideas were undemocratic and un-American. By the 1920s, many Social Gospel leaders had distanced themselves from the organized working classes. They either accepted new arrangements for harmonizing the interests of labor and capital or took their left-leaning political ideals underground.
Since the social sciences began to emerge as scholarly disciplines in the last quarter of the 19th century, they have frequently offered authoritative intellectual frameworks that have justified, and even shaped, a variety of U.S. foreign policy efforts. They played an important role in U.S. imperial expansion in the late 19th and early 20th centuries. Scholars devised racialized theories of social evolution that legitimated the confinement and assimilation of Native Americans and endorsed civilizing schemes in the Philippines, Cuba, and elsewhere. As attention shifted to Europe during and after World War I, social scientists working at the behest of Woodrow Wilson attempted to engineer a “scientific peace” at Versailles. The desire to render global politics the domain of objective, neutral experts intensified during World War II and the Cold War. After 1945, social scientists became increasingly central players in foreign affairs, offering intellectual frameworks—like modernization theory—and bureaucratic tools—like systems analysis—that shaped U.S. interventions in developing nations, guided nuclear strategy, and justified the increasing use of the U.S. military around the world.
Throughout these eras, social scientists often reinforced American exceptionalism—the notion that the United States stands at the pinnacle of social and political development, and as such has a duty to spread liberty and democracy around the globe. The scholarly embrace of conventional political values was not the result of state coercion or financial co-optation; by and large social scientists and policymakers shared common American values. But other social scientists used their knowledge and intellectual authority to critique American foreign policy. The history of the relationship between social science and foreign relations offers important insights into the changing politics and ethics of expertise in American public policy.
K. Tsianina Lomawaima
In 1911, a group of American Indian intellectuals organized what would become known as the Society of American Indians, or SAI. SAI members convened in annual meetings between 1911 and 1923, and for much of that period the Society’s executive offices were a hub for political advocacy, lobbying Congress and the Office of Indian Affairs (OIA), publishing a journal, offering legal assistance to Native individuals and tribes, and maintaining an impressively voluminous correspondence across the country with American Indians, “Friends of the Indian” reformers, political allies, and staunch critics. Notable Native activists, clergy, entertainers, professionals, speakers, and writers—as well as Native representatives from on- and off-reservation communities—were active in the Society. They worked tirelessly to meet daunting, unrealistic expectations, principally to deliver a unified voice of Indian “public opinion” and to pursue controversial political goals without appearing too radical, especially obtaining U.S. citizenship for Indian individuals and allowing Indian nations to access the U.S. Court of Claims. They maintained their myriad activities with scant financial resources solely through the unpaid labor of dedicated Native volunteers. By 1923, the challenges exhausted the Society’s substantial human and minuscule financial capital. The Native “soul of unity” demanded by non-Native spectators and hoped for by SAI leaders could no longer hold the center, and the SAI dissolved. Their work was not in vain, but citizenship and the ability to file claims materialized in circumscribed forms. In 1924 Congress passed the Indian Citizenship Act, granting birthright citizenship to American Indians, but citizenship for Indians was deemed compatible with continued wardship status. In 1946 Congress established an Indian Claims Commission, not a court, and successful claims could only result in monetary compensation, not regained lands.