Michael A. McDonnell
The American War for Independence lasted eight years. It was one of the longest and bloodiest wars in America’s history, and yet it was not such a protracted conflict merely because the might of the British armed forces was brought to bear on the hapless colonials. The many divisions among Americans themselves over whether to fight, what to fight for, and who would do the fighting often had tragic and violent consequences. The Revolutionary War was by any measure the first American civil war. Yet national narratives of the Revolution and even much of the scholarship on the era focus more on simple stories of a contest between the Patriots and the British. Loyalists and other opponents of the Patriots are routinely left out of these narratives, or given short shrift. So, too, are the tens of thousands of ordinary colonists—perhaps a majority of the population—who were disaffected or alienated from either side or who tried to tack between the two main antagonists to make the best of a bad situation. Historians now estimate that as many as three-fifths of the colonial population were neither active Loyalists nor Patriots.
When we take the war seriously and begin to think about narratives that capture the experience of the many, rather than the few, an illuminating picture emerges. The remarkably wide scope of the activities of the disaffected during the war—ranging from nonpayment of taxes to draft dodging and even to armed resistance to protect their neutrality—has to be integrated with older stories of militant Patriots and timid Loyalists. Only then can we understand the profound consequences of disaffection—particularly in creating divisions within the states, increasing levels of violence, prolonging the war, and changing the nature of the political settlements in each state. Indeed, the very divisions among diverse Americans that made the War for Independence so long, bitter, and bloody also explain much of the Revolutionary energy of the period. Though it is not as seamless as traditional narratives of the Revolution would suggest, a more complicated story also helps better explain the many problems the new states and eventually the new nation would face. In making this argument, we may finally suggest ways to overcome what John Shy long ago noted as the tendency of scholars to separate the ‘destructive’ War for Independence from the ‘constructive’ political Revolution.
Contagious diseases have long posed a public health challenge for cities, going back to the ancient world. Diseases traveled over trade routes from one city to another. Cities were also crowded and often dirty, ideal conditions for the transmission of infectious disease. The Europeans who settled North America quickly established cities, especially seaports, and contagious diseases soon followed. By the late 17th century, ports like Boston, New York, and Philadelphia experienced occasional epidemics, especially smallpox and yellow fever, usually introduced from incoming ships. Public health officials tried to prevent contagious diseases from entering the ports, most often by establishing a quarantine. These quarantines were occasionally effective, but more often the disease escaped into the cities. By the 18th century, city officials recognized an association between dirty cities and epidemic diseases. The appearance of a contagious disease usually occasioned a concerted effort to clean streets and remove garbage. By the early 19th century, these efforts gave rise to sanitary reform to prevent infectious diseases. Sanitary reform went beyond cleaning streets and removing garbage to ensuring clean water supplies and effective sewage removal. By the end of the century, sanitary reform had done much to clean the cities and reduce the incidence of contagious disease. In the 20th century, public health programs added two new tools: vaccination and antibiotics. Vaccination, first used against smallpox, was developed by scientists against numerous other infectious viral diseases, reducing their incidence substantially. Finally, the development of antibiotics against bacterial infections in the mid-20th century enabled physicians to cure infected individuals. Contagious disease remains a problem—witness AIDS—and public health authorities still rely on quarantine, sanitary reform, vaccination, and antibiotics to keep urban populations healthy.
The Creek Confederacy was a loose coalition of ethnically and linguistically diverse Native American towns that slowly coalesced as a political entity in the 18th and early 19th centuries. Its towns existed in Georgia, Alabama, and northern Florida, and for most of its preremoval history, these towns operated as autonomous entities. Several Creek leaders tried to consolidate power and create a more centralized polity, but these attempts at nation building largely failed. Instead, a fragile and informal confederacy connected the towns together for various cultural rituals as well as for purposes of diplomacy and trade. Disputes over centralization, as well as a host of other connected issues, ultimately led to the Creek War of 1813–1814. In the 1830s, the United States forced most members of the Creek Confederacy to vacate their eastern lands and relocate their nation to Indian Territory. Today, their western descendants are known as the Muskogee (Creek) Nation. Those who remained in the east include members of the federally recognized Seminole Tribe of Florida and the Poarch Band of Creek Indians who live in Alabama.
Vincent J. Cannato
The Ellis Island Immigration Station, located in New York Harbor, opened in 1892 and closed in 1954. During peak years from the 1890s until the 1920s, the station processed an estimated twelve million immigrants. Roughly 75 percent of all immigrants arriving in America during this period passed through Ellis Island. The station was run by the federal Immigration Service and represented a new era of federal control over immigration. Officials at Ellis Island were tasked with regulating the flow of immigration by enforcing a growing body of federal laws that barred various categories of “undesirable” immigrants. As the number of immigrants coming to America increased, so did the size of the inspection facility. In 1907, Ellis Island processed more than one million immigrants. The quota laws of the 1920s slowed immigration considerably and the rise of the visa system meant that Ellis Island no longer served as the primary immigrant inspection facility. For the next three decades, Ellis Island mostly served as a detention center for those ordered deported from the country.
After Ellis Island closed in 1954, the facility fell into disrepair. During a period of low immigration and a national emphasis on assimilation, the immigrant inspection station was forgotten by most Americans. With a revival of interest in ethnicity in the 1970s, Ellis Island attracted more attention, especially from the descendants of immigrants who entered the country through its doors. In the 1980s, large-scale fundraising for the restoration of the neighboring Statue of Liberty led to a similar effort to restore part of Ellis Island. In 1990, the Main Building was reopened to the public as an immigration museum under the National Park Service. Ellis Island has evolved into an iconic national monument with deep meaning for the descendants of the immigrants who arrived there, as well as a contested symbol to other Americans grappling with the realities of contemporary immigration.
John M. Dixon
The Enlightenment, a complex cultural phenomenon that lasted approximately from the late seventeenth century until the early nineteenth century, contained a dynamic mix of contrary beliefs and epistemologies. Its intellectual coherence arguably came from its distinctive historical sensibility, which was rooted in the notion that advances in the natural sciences had gifted humankind with an exceptional opportunity in the eighteenth century for self-improvement and societal progress. That unifying historical outlook was flexible and adaptable. Consequently, many aspects of the Enlightenment were left open to negotiation at local and transnational levels. They were debated by the philosophes who met in Europe’s coffeehouses, salons, and scientific societies. Equally, they were contested outside of Europe through innumerable cross-cultural exchanges as well as via long-distance intellectual interactions.
America—whether it is understood expansively as the two full continents and neighboring islands within the Western Hemisphere or, in a more limited way, as the territory that now constitutes the United States—played an especially prominent role in the Enlightenment. The New World’s abundance of plants, animals, and indigenous peoples fascinated early modern natural historians and social theorists, stimulated scientific activity, and challenged traditional beliefs. By the eighteenth century, the Western Hemisphere was an important site for empirical science and also for the intersection of different cultures of knowledge. At the same time, European conceptions of the New World as an undeveloped region inhabited by primitive savages problematized Enlightenment theories of universal progress. Comparisons of Native Americans to Africans, Asians, and Europeans led to speculation about the existence of separate human species or races. Similarly, the prevalence and profitability of American slavery fueled new and increasingly scientific conceptions of race. Eighteenth-century analyses of human differences complicated contemporary assertions that all men possessed basic natural rights. Toward the end of the eighteenth century, the American Revolution focused international attention on man’s innate entitlement to life, liberty, and happiness. Yet, in a manner that typified the contradictions and paradoxes of the Enlightenment, the founders of the United States opted to preserve slavery and social inequality after winning political freedom from Britain.
Alison L. LaCroix
Federalism refers to the constitutional and political structure of the United States of America, according to which political power is divided among multiple levels of government: the national level of government (also referred to as the “federal” or “general” government) and that of the states. It is a multilayered system of government that reserves some powers to component entities while also establishing an overarching level of government with a specified domain of authority. The structures of federalism are set forth in the Constitution of the United States, although some related ideas and practices predated the founding period and others have developed since. The balance between federal and state power has shifted throughout U.S. history, with assertions of broad national power meeting challenges from supporters of states’ rights and state sovereignty. Federalism is a fundamental value of the American political system, and it has been a controversial political and legal question since the founding period.
Carolyn Podruchny and Stacy Nation-Knapper
From the 15th century to the present, the trade in animal fur has been an economic venture with far-reaching consequences for both North Americans and Europeans (in which North Americans of European descent are included). One of the earliest forms of exchange between Europeans and North Americans, the trade in fur was about the garment business, global and local politics, social and cultural interaction, hunting, ecology, colonialism, gendered labor, kinship networks, and religion. European fashion, specifically the desire for hats that marked male status, was the primary driver of the global fur-trade economy until the late 19th century, while European desires for marten, fox, and other luxury furs to make and trim clothing comprised a secondary part of the trade. Other animal hides, including deer and bison, provided sturdy leather from which belts for the machines of the early Industrial Era were cut. European cloth, especially cotton and wool, became central to the trade for Indigenous peoples, who sought materials that were lighter and dried faster than skin clothing. Those who brought their own perspectives to the fur trade included the European and Indigenous men and women who actually conducted the trade; Indigenous male and female trappers; European trappers; the European men and women who produced trade goods; Indigenous “middlemen” (men and women) who conducted their own fur trade to benefit from European trade companies; the laborers who hauled furs and trade goods; all those who built, managed, and sustained trading posts along waterways and trails across North America; and the Europeans who manufactured and purchased the products made of fur and the trade goods desired by Indigenous peoples. As early as the 17th century, European empires used fur-trade monopolies to establish colonies in North America, and later fur-trading companies brought imperial trading systems inland, while Indigenous peoples drew Europeans into their own patterns of trade and power.
By the 19th century, the fur trade covered most of the continent; its networks of business, alliances, and families, and the founding of new communities, gave rise to new peoples, including the Métis, who were descended from the mixing of European and Indigenous peoples. Trading territories, monopolies, and alliances with Indigenous peoples shaped how European concepts of statehood played out in the making of European-descended nation-states and in the development of treaties with Indigenous peoples. The fur trade flourished in northern climes until well into the 20th century, after which economic development, resource exploitation, changes in fashion, and politics in North America and Europe limited its scope and scale. Many Indigenous people continue to hunt and trap animals today and have fought in the courts for Indigenous rights to resources, land, and sovereignty.
The issue of genocide and American Indian history has been contentious. Many writers see the massive depopulation of the indigenous population of the Americas after 1492 as a clear-cut case of genocide. Other writers, however, contend that European and U.S. actions toward Indians were deplorable but were rarely if ever genocidal. To a significant extent, disagreements about the pervasiveness of genocide in the history of the post-Columbian Western Hemisphere, in general, and U.S. history, in particular, pivot on definitions of genocide. Conservative definitions emphasize intentional actions and policies of governments that result in very large population losses, usually from direct killing. More liberal definitions call for less stringent criteria for intent, focusing more on outcomes. They do not necessarily require direct sanction by state authorities; rather, they identify societal forces and actors. They also allow for several intersecting forces of destruction, including dispossession and disease. Because debates about genocide easily devolve into quarrels about definitions, an open-ended approach to the question of genocide that explores several phases and events provides the possibility of moving beyond the present stalemate. However one resolves the question of genocide in American Indian history, it is important to recognize that European and U.S. settler colonial projects unleashed massively destructive forces on Native peoples and communities. These include violence resulting directly from settler expansion, intertribal violence (frequently aggravated by colonial intrusions), enslavement, disease, alcohol, loss of land and resources, forced removals, and assaults on tribal religion, culture, and language. The configuration and impact of these forces varied considerably in different times and places according to the goals of particular colonial projects and the capacities of colonial societies and institutions to pursue them. The capacity of Native people and communities to directly resist, blunt, or evade colonial invasions proved equally important.
Philippe R. Girard
Haiti (known as Saint-Domingue until it gained its independence from France in 1804) had a noted economic and political impact on the United States during the era of the American Revolution, when it forced U.S. statesmen to confront issues they had generally avoided, most prominently racism and slavery. But the impact of the Haitian Revolution was most tangible in areas like commerce, territorial expansion, and diplomacy. Saint-Domingue served as a staging ground for the French military and navy during the American Revolution and provided troops for the siege of Savannah in 1779. It became the United States’ second-largest commercial partner during the 1780s and 1790s. After Saint-Domingue’s slaves revolted in 1791, many of its inhabitants found refuge in the United States, most notably in Philadelphia, Charleston, and New Orleans. Fears (or hopes) that the slave revolt would spread to the United States were prevalent in public opinion. As Saint-Domingue achieved quasi-autonomous status under the leadership of Toussaint Louverture, it occupied a central place in the diplomacy of John Adams and Thomas Jefferson. The Louisiana Purchase was made possible in part by the failure of a French expedition to Saint-Domingue in 1802–1803. Bilateral trade declined after Haiti acquired its independence in 1804, but Haiti continued to loom large in the African-American imagination, and there were several attempts to use Haiti as a haven for U.S. freedmen. The question of U.S. diplomatic recognition of Haiti also served as a reference point for antebellum debates on slavery, the slave trade, and the status of free people of color in the United States.
Kristin M. Szylvian
Federal housing policy has been primarily devoted to maintaining the economic stability and profitability of the private sector real estate, household finance, and home-building and supply industries since the administration of President Franklin D. Roosevelt (1933–1945). Until the 1970s, federal policy encouraged speculative residential development in suburban areas and extended segregation by race and class. The National Association of Home Builders, the National Association of Realtors, and other allied organizations strenuously opposed federal programs seeking to assist low- and middle-income households and the homeless by forcing recalcitrant suburbs to permit the construction of open-access, affordable dwellings and encouraging the rehabilitation of urban housing. During the 1980s, President Ronald Reagan, a Republican from California, argued it was the government, not the private sector, that was responsible for the gross inequities in social and economic indicators between residents of city, inner ring, and outlying suburban communities. The civic, religious, consumer, labor, and other community-based organizations that tried to mitigate the adverse effects of the “Reagan Revolution” on the affordable housing market lacked a single coherent view or voice. Since that time, housing has become increasingly unaffordable in many metropolitan areas, and segregation by race, income, and ethnicity is on the rise once again. If the home mortgage crisis that began in 2007 is any indication, housing will continue to be a divisive political, economic, and social issue in the foreseeable future.
The national housing goal of a “decent home in a suitable living environment for every American family” not only remains unrealized; many lawmakers now favor eliminating or further restricting the federal commitment to its realization.
Sean P. Harvey
“Race,” as a concept denoting a fundamental division of humanity and usually encompassing cultural as well as physical traits, was crucial in early America. It provided the foundation for the colonization of Native land, the enslavement of American Indians and Africans, and a common identity among socially unequal and ethnically diverse Europeans. Longstanding ideas and prejudices merged with aims to control land and labor, a dynamic reinforced by ongoing observation and theorization of non-European peoples. Although before colonization, neither American Indians, nor Africans, nor Europeans considered themselves unified “races,” Europeans endowed racial distinctions with legal force and philosophical and scientific legitimacy, while Natives appropriated categories of “red” and “Indian,” and slaves and freed people embraced those of “African” and “colored,” to imagine more expansive identities and mobilize more successful resistance to Euro-American societies. The origin, scope, and significance of “racial” difference were questions of considerable transatlantic debate in the age of Enlightenment and they acquired particular political importance in the newly independent United States.
From the beginning of European exploration in the 15th century, voyagers called attention to the peoples they encountered, but European, American Indian, and African “races” did not exist before colonization of the so-called New World. Categories of “Christian” and “heathen” were initially most prominent, though observations also encompassed appearance, gender roles, strength, material culture, subsistence, and language. As economic interests deepened and colonies grew more powerful, classifications distinguished Europeans from “Negroes” or “Indians,” but at no point in the history of early America was there a consensus that “race” denoted bodily traits only. Rather, it was a heterogeneous compound of physical, intellectual, and moral characteristics passed on from one generation to another. While Europeans assigned blackness and African descent priority in codifying slavery, skin color was secondary to broad dismissals of the value of “savage” societies, beliefs, and behaviors in providing a legal foundation for dispossession.
“Race” originally denoted a lineage, such as a noble family or a domesticated breed, and concerns over purity of blood persisted as 18th-century Europeans applied the term—which dodged the controversial issue of whether different human groups constituted “varieties” or “species”—to describe a roughly continental distribution of peoples. Drawing upon the frameworks of scripture, natural and moral philosophy, and natural history, scholars endlessly debated whether different races shared a common ancestry, whether traits were fixed or susceptible to environmentally produced change, and whether languages or the body provided the best means to trace descent. Racial theorization boomed in the U.S. early republic, as some citizens found dispossession and slavery incompatible with natural-rights ideals, while others reconciled any potential contradictions through assurances that “race” was rooted in nature.
Between 1820 and 1924, nearly thirty-six million immigrants entered the United States. Prior to the Civil War, the vast majority of immigrants were northern and western Europeans, though the West Coast received Chinese immigration from the late 1840s onward. At mid-century, the United States received an unprecedented influx of Irish and German immigrants, who included a large number of Catholics and the poor. At the turn of the 20th century, the major senders of immigrants shifted to southern and eastern Europe, and Asians and Mexicans made up a growing portion of newcomers. Throughout the long 19th century, urban settlement remained a popular option for immigrants, and they contributed to the social, cultural, political, economic, and physical growth of the cities they resided in. Foreign-born workers also provided much-needed labor for America’s industrial development. At the same time, intense nativism emerged in cities in opposition to the presence of foreigners, who appeared unfit for American society, a threat to Americans’ jobs, or a source of urban problems such as poverty. Anti-immigrant sentiment resulted in state and federal laws to prevent the immigration of undesirable foreigners, such as the poor, southern and eastern Europeans, and Asians. Cities constituted an integral part of the 19th-century American immigration experience.
The history of American slavery began long before the first Africans arrived at Jamestown in 1619. Evidence from archaeology and oral tradition indicates that for hundreds, perhaps thousands, of years prior, Native Americans had developed their own forms of bondage. This fact should not be surprising, for most societies throughout history have practiced slavery. In her cross-cultural and historical research on comparative captivity, Catherine Cameron found that bondspeople composed 10 percent to 70 percent of the population of most societies, lending credence to Seymour Drescher’s assertion that “freedom, not slavery, was the peculiar institution.” If slavery is ubiquitous, however, it is also highly variable. Indigenous American slavery, rooted in warfare and diplomacy, was flexible, often offering its victims escape through adoption or intermarriage, and it was divorced from racial ideology, deeming all foreigners—men, women, and children, of whatever color or nation—potential slaves. Thus, Europeans did not introduce slavery to North America. Rather, colonialism brought distinct and evolving notions of bondage into contact with one another. At times, these slaveries clashed, but they also reinforced and influenced one another. Colonists, who had a voracious demand for labor and export commodities, exploited indigenous networks of captive exchange, producing a massive global commerce in Indian slaves. This began with the second voyage of Christopher Columbus in 1495 and extended in some parts of the Americas through the twentieth century. During this period, between 2 and 4 million Indians were enslaved. Elsewhere in the Americas, Indigenous people adapted Euro-American forms of bondage. In the Southeast, an elite class of Indians began to hold African Americans in transgenerational slavery and, by 1800, developed plantations that rivaled those of their white neighbors. 
The story of Native Americans and slavery is complicated: millions were victims, some were masters, and the nature of slavery changed over time and varied from one place to another. A significant and long overlooked aspect of American history, Indian slavery shaped colonialism, exacerbated Native population losses, figured prominently in warfare and politics, and influenced Native and colonial ideas about race and identity.
By serving travelers and commerce, roads and streets unite people and foster economic growth. But as they develop, roads and streets also disrupt old patterns, upset balances of power, and isolate some as they serve others. The consequent disagreements leave historical records documenting social struggles that might otherwise be overlooked. For long-distance travel in America before the middle of the 20th century, roads were generally poor alternatives, resorted to when superior means of travel, such as river and coastal vessels, canal boats, or railroads, were unavailable. Most roads were unpaved, unmarked, and vulnerable to the effects of weather. Before the railroads, turnpikes and plank roads, though rare, could be much better for travelers willing to pay the toll. Even in towns, unpaved streets were common until the late 19th century, and persisted into the 20th. In the late 19th century, rapid urban growth, rural free delivery of the mails, and finally the proliferation of electric railways and bicycling contributed to growing pressure for better roads and streets. After 1910, the spread of the automobile accelerated the trend, but only with great controversy, especially in cities. Partly in response to the controversy, advocates of the automobile organized to promote state and county motor highways funded substantially by gasoline taxes; such roads were intended primarily for motor vehicles. In the 1950s, massive federal funds accelerated the trend; by then, motor vehicles were the primary transportation mode for both long and short distances. The consequences have been controversial, and alternatives have been attracting growing interest.
Courts and legislatures in colonial America and the early American republic developed and refined a power to compel civilians to assist peace and law enforcement officers in arresting wrongdoers, keeping the peace, and other matters of law enforcement. This power to command civilian cooperation was known as the posse comitatus or “power of the county.” Rooted in early modern English countryside law enforcement, the posse comitatus became an important police institution in 18th- and 19th-century America. The posse comitatus was typically composed of able-bodied white male civilians who were temporarily deputized to aid a sheriff or constable. But if this “power of the county” was insufficient, law enforcement officers were often authorized to call on the military to serve as the posse comitatus.
The posse comitatus proved particularly important in buttressing slavery in the American South. Slaveholders pushed for and especially benefited from laws that required citizens to assist in the recapture of local runaway slaves and of fugitive slaves who crossed into states without slavery. Slave patrols were rooted in the posse comitatus, which originated as a compulsory and uncompensated institution; slaveholders in the American South later added financial incentives for those who acted in the place of a posse to recapture slaves on the run from their owners.
The widespread use of the posse comitatus in southern slave law became part of the national discussion about slavery during the early American republic as national lawmakers contemplated how to deal with the problem of fugitive slaves who fled to free states. This dialogue culminated with the Fugitive Slave Law of 1850, in which the US Congress authorized officials to “summon and call to their aid the bystanders, or posse comitatus” and declared that “all good citizens are hereby commanded to aid and assist in the prompt and efficient execution of this law, whenever their services may be required.” During Reconstruction, the Radical Republican Congress used the posse comitatus to enforce laws that targeted conquered Confederates. After the end of Reconstruction in 1877, Southern states pushed Congress to create what would come to be known as the “Posse Comitatus Act,” which prohibited the use of federal military forces for law enforcement. The history of the posse comitatus in early America is thus best understood as a story about and an example of the centralization of government authority and its ramifications.
Robert G. Parkinson
According to David Ramsay, one of the first historians of the American Revolution, “in establishing American independence, the pen and press had merit equal to that of the sword.” Because notions of unity among the thirteen American colonies were unstable and fragile, print acted as a binding agent, reducing the risk that the colonies would fail to support one another when war with Britain broke out in 1775.
Two major types of print dealt with the political process of the American Revolution: pamphlets and newspapers. Pamphlets were one of the most important conveyors of ideas during the imperial crisis. Often written by elites under pseudonyms and published by booksellers, they have long been held by historians to be the lifeblood of the American Revolution. There were also three dozen newspaper printers in the American mainland colonies at the start of the Revolution, each producing a four-page issue every week. These weekly papers, or the one-sheet broadsides that appeared in American cities even more frequently, were the most important communication avenue for keeping colonists informed of events hundreds of miles away. Because of the structure of the newspaper business in the 18th century, the stories that appeared in each paper were “exchanged” from other papers in different cities, creating a uniform effect akin to a modern news wire. The exchange system allowed the same story to appear across North America, and it provided the Revolutionaries with a method to shore up that fragile sense of unity. It is difficult to imagine American independence—as a popular idea let alone a possible policy decision—without understanding how print worked in colonial America in the mid-18th century.
Jessica Ellen Sewell
From 1800 to 2000, cities grew enormously and saw an expansion of public spaces to serve the varied needs of a diverse population living in ever more cramped and urban circumstances. While a wide range of commercial semipublic spaces became common in the late 19th century, parks and streets were the best examples of truly public spaces with full freedom of access. Changes in the design and management of streets, sidewalks, squares, parks, and plazas during this period reflect changing ideas about the purpose of public space and how it should be used.
Streets shifted from being used for a wide range of activities, including vending, playing games, and storing goods, to becoming increasingly specialized spaces of movement, designed and managed by the early 20th century for automobile traffic. Sidewalks, which in the early 19th century were paid for and liberally used by adjacent businesses, were similarly specialized as spaces of pedestrian movement. However, the tradition of using streets and sidewalks as a space of public celebration and public speech remained strong throughout the period. During parades and protests, streets and sidewalks were temporarily remade as spaces of the performance of the public, and the daily activities of circulation and commerce were set aside.
In 1800, the main open public spaces in cities were public squares or commons, often used for militia training and public celebration. In the second half of the 19th century, these were augmented by large picturesque parks. Designed as an antidote to urbanity, these parks served the public as a place for leisure, redefining public space as a polite leisure amenity, rather than a place for people to congregate as a public. The addition of playgrounds, recreational spaces, and public plazas in the 20th century served both the physical and mental health of the public. In the late 20th century, responding to neoliberal ideas and urban fiscal crises, the ownership and management of public parks and plazas were increasingly privatized, further challenging public accessibility.
Rachel Hope Cleves
The task of recovering the history of same-sex love among early American women faces daunting challenges of definition and sources. Modern conceptions of same-sex sexuality did not exist in early America, but alternative frameworks did. Many indigenous nations had social roles for female-bodied individuals who lived as men, performed male work, and acquired wives. Early Christian settlers viewed sexual encounters between women as sodomy, but also valued loving dyadic bonds between religious women. Primary sources indicate that same-sex sexual practices existed within western and southern African societies exploited by the slave trade, but little more is known. The word “lesbian” has been used to signify erotics between women since roughly the 10th century, but historians must look to women who led lesbian-like lives in early America rather than to women who self-identified as lesbians. Stories of female husbands who passed as men and married other women were popular in the 18th century. Tales of passing women who served in the military, in the navy, and as pirates also amused audiences and raised the specter of same-sex sexuality. Some female religious leaders trespassed conventional gender roles and challenged the marital sexual order. Other women conformed to female gender roles, but constructed loving female households. 18th-century pornography depicting lesbian sexual encounters indicates that early Americans were familiar with the concept of sex between women. A few court records exist from prosecutions of early American women for engaging in lewd acts together. Far more common, by the end of the 18th century, were female-authored letters and diaries describing the culture of romantic friendship, which sometimes extended to sexual intimacy. Later in the 19th century, romantic friendship became an important ingredient in the development of lesbian culture and identity.
Described as a “chief among chiefs” by the British, and by his arch-rival, William Henry Harrison, as “one of those uncommon geniuses which spring up occasionally to produce revolutions and overturn the established order of things,” Tecumseh impressed all who knew him. Lauded for his oratory, military and diplomatic skills, and, ultimately, his humanity, Tecumseh presided over the greatest Indian resistance movement that had ever been assembled in the eastern half of North America. His genius lay in his ability to fully articulate religious, racial, and cultural ideals born of his people’s existence on fault lines between competing empires and Indian confederacies. Known as “southerners” by their Algonquian relatives, the Shawnees had a history of migrating between worlds. Tecumseh and his brother, Tenskwatawa, converted this inheritance into a widespread social movement in the first decade and a half of the 19th century, when more than a thousand warriors from many different tribes heeded their call to halt American expansion along the border of what is now Ohio and Indiana. Tecumseh articulated a vision of intertribal, pan-Indian unity based on revitalization and reform, and his ambitions very nearly rewrote early American history.
The transformation of post-industrial American life in the late 20th and early 21st centuries produced several economically robust metropolitan centers that stand as new models of urban and economic life, featuring well-educated populations engaged in professional practices in education, medical care, design and legal services, and artistic and cultural production. By the early 21st century, these cities dominated the nation’s consciousness economically and culturally, standing in for the most dynamic and progressive sectors of the economy, driven by concentrations of technical and creative talent. The origins of these academic and knowledge centers lie in the political economy, including investments shaped by federal policy and philanthropic ambition. Education and health care communities frequently were, and remain, economically robust, but they are also rife with racial, economic, and social inequality, and riddled with resulting political tensions over development. These information communities incubated and directed the proceeds of the new economy, but they also constrained who could access this new mode of wealth in the knowledge economy.