Adam J. Hodges
The first Red Scare, which occurred in 1919–1920, emerged out of long-running clashes in the United States over the processes of industrialization, immigration, and urbanization, as well as escalating conflict over the development of a labor movement challenging elite control of the economy. More immediately, the suppression of dissent during World War I and shock over a revolution in Russia that energized anti-capitalist radicals spurred further confrontations during an ill-planned postwar demobilization of the armed forces and economy.
A general strike in Seattle in February 1919 that grew out of wartime grievances among shipbuilders raised the specter of Bolshevik insurrection in the United States. National press attention fanned the flames and continued to do so throughout the year. In fact, 1919 became a record strike year. Massive coal and steel walkouts in the fall shook the industrial economy, while a work stoppage by Boston police became a national sensation and spread fears of a revolutionary breakdown in public order. Ultimately, however, much of the union militancy of the war era was crushed by the end of 1919 and the labor movement entered a period of retrenchment after 1922 that lasted until the 1930s.
Fall 1919 witnessed the creation of two competing Communist parties in the United States after months of press focus on bombs, riots, and strikes. Federal anti-radical investigative operations, which had grown enormously during World War I and continued into 1919, peaked in the so-called “Palmer Raids” of November 1919 and January 1920, named for US Attorney General A. Mitchell Palmer, who authorized them. The excesses of the Department of Justice and the decline of labor militancy caused a shift in press and public attention in 1920, though another Red Scare would escalate after World War II, with important continuities between the two.
Gabriella M. Petrick
American food in the twentieth and twenty-first centuries is characterized by abundance. Unlike the hardscrabble existence of many earlier Americans, the “Golden Age of Agriculture” brought the bounty produced in fields across the United States to both consumers and producers. While the “Golden Age” technically ended as World War I began, larger quantities of relatively inexpensive food became the norm for most Americans as more fresh foods, rather than staple crops, made their way to urban centers and rising real wages made it easier to purchase these comestibles.
The application of science and technology to food production from the field to the kitchen cabinet, or even more crucially the refrigerator by the mid-1930s, reflects the changing demographics and affluence of American society as much as it does the inventiveness of scientists and entrepreneurs. Perhaps the single most important symbol of overabundance in the United States is the postwar Green Revolution. The vast increase in agricultural production based on improved agronomics provoked both praise and criticism, as exemplified by Time magazine’s critique of Rachel Carson’s Silent Spring in September 1962 or, more recently, the politics of genetically modified foods.
Echoing what occurred at the turn of the twentieth century, food production, politics, and policy at the turn of the twenty-first century have become a proxy for larger ideological agendas and the fractured nature of class in the United States. Battles over the following issues speak to which Americans have access to affordable, nutritious food: organic versus conventional farming, antibiotic use in meat production, dissemination of food stamps, contraction of farm subsidies, the rapid growth of “dollar stores,” alternative diets (organic, vegetarian, vegan, paleo, etc.), and, perhaps most ubiquitous of all, the “obesity epidemic.” These arguments carry moral and ethical values, as each side deems some foods and diets virtuous and others corrupting. While Americans have long held a variety of food ideologies that meld health, politics, and morality, exemplified by Sylvester Graham and John Harvey Kellogg in the nineteenth and early twentieth centuries, among others, newer constructions of these ideologies reflect concerns over the environment, rural Americans, climate change, self-determination, and the role of government in individual lives. In other words, food can be used as a lens to understand larger issues in American society while at the same time allowing historians to explore the intimate details of everyday life.
Cindy R. Lobel
Over the course of the 19th century, American cities developed from small seaports and trading posts into large metropolises. Not surprisingly, foodways and other areas of daily life changed accordingly. In 1800, the dietary habits of urban Americans were similar to those of the colonial period. Food provisioning was very local. Farmers, hunters, fishermen, and dairymen from a few miles away brought food by rowboat, ferryboat, and horse cart to centralized public markets within established cities. Dietary options were seasonal as well as regional. Few public dining options existed outside of taverns, which offered lodging as well as food. Most Americans, even in urban areas, ate their meals at home, in dwellings that in many cases were attached to their workshops, countinghouses, and offices.
These patterns changed significantly over the course of the 19th century, thanks largely to demographic changes and technological developments. By the turn of the 20th century, urban Americans relied on a food-supply system that was highly centralized and in the throes of industrialization. Cities developed complex restaurant sectors, and majority immigrant populations dramatically shaped and reshaped cosmopolitan food cultures. Furthermore, with growing populations, lax regulation, and corrupt political practices in many cities, issues arose periodically concerning the safety of the food supply. In sum, the roots of today’s urban food systems were laid down over the course of the 19th century.
Changing foodways (the consumption and production of food, access to food, and debates over food) shaped the nature of American cities in the 20th century. As American cities transformed from centers of industrialization at the start of the 20th century to post-industrial societies at its end, food cultures in urban America shifted in response to the ever-changing urban environment. Cities remained centers of food culture, diversity, and food reform despite these shifts.
Growing populations and waves of immigration changed the nature of food cultures throughout the United States in the 20th century. These changes were significant, all contributing to an evolving sense of American food culture. For urban denizens, however, food choice and availability were dictated and shaped by a variety of powerful social factors, including class, race, ethnicity, gender, and laboring status. While cities possessed an abundance of food and a variety of locations in which to consume it, fresh food often remained difficult for the urban poor to obtain as the 20th century ended.
As markets expanded from 1900 to 1950, regional geography became a less important factor in determining what types of foods were available. In the second half of the 20th century, even global geography became less important to food choices. Citrus fruit from the West Coast was readily available in northeastern markets near the start of the century, and off-season fruits and vegetables from South America filled shelves in grocery stores by the end of the 20th century. Urban Americans became further disconnected from their food sources, but this dislocation spurred counter-movements that embraced ideas of local, seasonal foods and a rethinking of the city’s relationship with its food sources.
Humans have put American forests to a wide variety of uses from the pre-Columbian period to the present. Native Americans heavily shaped forests to serve their needs, helping to create fire ecologies in many forests. English settlers harvested these forests for trade, to clear land, and for domestic purposes. The arrival of the Industrial Revolution in the early 19th century rapidly expanded the rate of logging. By the Civil War, many areas of the Northeast were logged out. After the Civil War, forests in the Great Lakes states, the South, and then the Pacific Northwest fell with increasing speed to feed the insatiable demands of the American economy, facilitated by rapid technological innovation that allowed for ever-larger cuts. By the late 19th century, growing concerns about the future of American timber supplies spurred the conservation movement, personified by forester Gifford Pinchot and the creation of the U.S. Forest Service with Pinchot as its head in 1905. After World War II, the Forest Service worked closely with the timber industry to cut wide swaths of the nation’s last virgin forests. These gargantuan harvests led to the growth of the environmental movement. Beginning in the 1970s, environmentalists began to use legal means to halt logging in the ancient forests, and the listing of the northern spotted owl under the Endangered Species Act was the final blow to most logging on Forest Service lands in the Northwest. Yet not only does the timber industry remain a major employer in forested parts of the nation today, but alternative forest economies have also developed around more sustainable industries such as tourism.
According to the First Amendment of the US Constitution, Congress is barred from abridging the freedom of the press (“Congress shall make no law . . . abridging the freedom of speech, or of the press”). In practice, the history of press freedom is far more complicated than this simple constitutional right suggests. Over time, the meaning of the First Amendment has changed greatly. The Supreme Court largely ignored the First Amendment until the 20th century, leaving the scope of press freedom to state courts and legislatures. Since World War I, jurisprudence has greatly expanded the types of publication protected from government interference. The press now has broad rights to publish criticism of public officials, salacious material, private information, national security secrets, and much else. To understand the shifting history of press freedom, however, it is important to understand not only the expansion of formal constitutional rights but also how those rights have been shaped by such factors as economic transformations in the newspaper industry, the evolution of professional standards in the press, and the broader political and cultural relations between politicians and the press.
Alexander B. Haskell
Bacon’s Rebellion (1676–1677) was an uprising in the Virginia colony that its participants experienced as both a civil breakdown and a period of intense cosmic disorder. Although Thomas Hobbes had introduced his theory of state sovereignty a quarter century earlier, the secularizing connotations of his highly naturalized conceptualization of power had yet to make major inroads on a post-Reformation culture that was only gradually shifting from Renaissance providentialism to Enlightenment rationalism. Instead, the period witnessed a complicated interplay of providential beliefs and Hobbist doctrines. In the aftermath of the English civil war (1642–1651), this mingling of ideologies had prompted the Puritans’ own experimentation with Hobbes’s ideas, often in tandem with a Platonic spiritualism that was quite at odds with Hobbes’s own philosophical skepticism. The Restoration of 1660 had given an additional boost to Hobbism as his ideas won a number of prominent adherents in Charles II’s government.
The intermingling of providentialism and Hobbism gave Bacon’s Rebellion its particular aura of heightened drama and frightening uncertainty. In the months before the uprising, the outbreak of a war on the colony’s frontier with the Doeg and Susquehannock peoples elicited fears in the frontier counties of a momentous showdown between faithful planters and God’s enemies. In contrast, Governor Sir William Berkeley’s establishmentarian Protestantism encouraged him to see the frontiersmen’s vigilantism as impious, and the government’s more measured response to the conflict as inherently godlier because it was tied to time-tested hierarchies and institutions. Greatly complicating this already confusing scene, the colony also confronted a further destabilizing force in the form of the new Hobbist politics emerging from the other side of the ocean. In addition to a number of alarming policies emanating from Charles II’s court in the 1670s that sought to enhance the English state’s supremacy over the colonies, Hobbes’s doctrines also informed the young Nathaniel Bacon Jr.’s stated rationale for leading frontiersmen against local Indian communities without Berkeley’s authorization. Drawing on the Hobbes-influenced civil war-era writings of his relation, the Presbyterian lawyer Nathaniel Bacon, the younger Bacon made the protection of the colony’s Christian brotherhood a moral priority that outweighed even the preservation of existing civil relations and public institutions.
While Berkeley’s antagonism toward this Hobbesian argument led him to lash out forcibly against Bacon as a singularly great threat to Virginia’s commonwealth, it was ordinary Virginians who most consequentially resisted Bacon’s strange doctrines. Yet a division persisted. Whereas the interior counties firmly rejected Bacon’s Hobbism in favor of the colony’s more traditional bonds to God and king, the frontier counties remained more open to a Hobbesian politics that promised their protection.
Carolyn Podruchny and Stacy Nation-Knapper
From the 15th century to the present, the trade in animal fur has been an economic venture with far-reaching consequences for both North Americans and Europeans (in which North Americans of European descent are included). One of the earliest forms of exchange between Europeans and North Americans, the trade in fur was about the garment business, global and local politics, social and cultural interaction, hunting, ecology, colonialism, gendered labor, kinship networks, and religion. European fashion, specifically the desire for hats that marked male status, was a primary driver for the global fur-trade economy until the late 19th century, while European desires for marten, fox, and other luxury furs to make and trim clothing comprised a secondary part of the trade. Other animal hides, including deer and bison, provided sturdy leather from which belts for the machines of the early Industrial Era were cut. European cloth, especially cotton and wool, became central to the trade for Indigenous peoples who sought materials that were lighter and dried faster than skin clothing. The multiple perspectives on the fur trade included the European men and Indigenous men and women actually conducting the trade; the Indigenous male and female trappers; European trappers; the European men and women producing trade goods; Indigenous “middlemen” (men and women) who conducted their own fur trade to benefit from European trade companies; laborers hauling the furs and trade goods; all those who built, managed, and sustained trading posts located along waterways and trails across North America; and those Europeans who manufactured and purchased the products made of fur and the trade goods desired by Indigenous peoples. As early as the 17th century, European empires used fur-trade monopolies to establish colonies in North America, and later fur-trading companies brought imperial trading systems inland, while Indigenous peoples drew Europeans into their own patterns of trade and power. By the 19th century, the fur trade had covered most of the continent, and its networks of business, alliance, and family led to the founding of new communities and the emergence of new peoples, including the Métis, who were descended from the mixing of European and Indigenous peoples. Trading territories, monopolies, and alliances with Indigenous peoples shaped how European concepts of statehood played out in the making of European-descended nation-states and in the development of treaties with Indigenous peoples. The fur trade flourished in northern climes until well into the 20th century, after which time economic development, resource exploitation, changes in fashion, and politics in North America and Europe limited its scope and scale. Many Indigenous people continue today to hunt and trap animals and have fought in courts for Indigenous rights to resources, land, and sovereignty.
While American gambling has a historical association with the lawlessness of the frontier and with the wasteful leisure practices of Southern planters, it was in large cities that American gambling first flourished as a form of mass leisure and as a commercial enterprise of significant scale. In the urban areas of the Mid-Atlantic, the Northeast, and the upper Midwest, for the better part of two centuries the gambling economy was deeply intertwined with municipal politics and governance, the practices of betting were a prominent feature of social life, and controversies over the presence of gambling, both legal and illegal, were at the center of public debate. In New York and Chicago in particular, but also in Cleveland, Pittsburgh, Detroit, Baltimore, and Philadelphia, gambling channeled money to municipal police forces and sustained machine politics. In the eyes of reformers, gambling corrupted governance and corroded social and economic interactions. Big-city gambling has changed over time, often in a manner reflecting important historical processes and transformations in economics, politics, and demographics. Yet irrespective of such change, from the onset of Northern urbanization during the 19th century through much of the 20th century, gambling held steady as a central feature of city life and politics. From the poolrooms where recently arrived Irish New Yorkers bet on horseracing after the Civil War, to the corner stores where black and Puerto Rican New Yorkers bet on the numbers game in the 1960s, the gambling activity that covered the urban landscape produced argument and controversy, particularly with respect to drawing the line between crime and leisure and over the question of where and to what ends the money of the gambling public should be directed.
Throughout American history, gender, meaning notions of essential differences between women and men, has shaped how Americans have defined and engaged in productive activity. Work has been a key site where gendered inequalities have been produced, but work has also been a crucible for rights claims that have challenged those inequalities. Federal and state governments long played a central role in generating and upholding gendered policy. Workers and advocates have debated whether to advance laboring women’s cause by demanding equality with men or different treatment that accounted for women’s distinct responsibilities and disadvantages.
Beginning in the colonial period, constructions of dependence and independence derived from the heterosexual nuclear family underscored a gendered division of labor that assigned distinct tasks to the sexes, albeit varied by race and class. In the 19th century, gendered expectations shaped all workers’ experiences of the Industrial Revolution, slavery and its abolition, and the ideology of free labor. Early 20th-century reform movements sought to beat back the excesses of industrial capitalism by defining the sexes against each other, demanding protective labor laws for white women while framing work done by women of color and men as properly unregulated. Policymakers reinforced this framework in the 1930s as they built a welfare state that was rooted in gendered and racialized constructions of citizenship.
In the second half of the 20th century, labor rights claims that reasoned from the sexes’ distinctiveness increasingly gave way to assertions of sex equality, even as the meaning of that equality was contested. As the sex equality paradigm triumphed in the late 20th and early 21st centuries, seismic economic shifts and a conservative business climate narrowed the potential of sex equality laws to deliver substantive changes to workers.
Marjorie J. Spruill
The late 20th century saw gender roles transformed as the so-called Second Wave of American feminism that began in the 1960s gained support. By the early 1970s public opinion increasingly favored the movement, and politicians in both major political parties supported it. In 1972 Congress overwhelmingly approved the Equal Rights Amendment (ERA) and sent it to the states. Many states quickly ratified it, prompting women committed to traditional gender roles to organize. However, by 1975 ERA opponents led by veteran Republican activist Phyllis Schlafly, founder of Stop ERA, had slowed the ratification process, although federal support for feminism continued. Congresswoman Bella Abzug (D-NY), inspired by the United Nations’ International Women’s Year (IWY) program, introduced a bill approved by Congress that mandated state and national IWY conferences at which women would produce recommendations to guide the federal government on policy regarding women. Federal funding of these conferences (held in 1977), and the fact that feminists were appointed to organize them, led to an escalation in tensions between feminist and conservative women, and the conferences proved to be profoundly polarizing events. Feminists elected most of the delegates to the culminating IWY event, the National Women’s Conference held in Houston, Texas, and the “National Plan of Action” adopted there endorsed a wide range of feminist goals including the ERA, abortion rights, and gay rights. But the IWY conferences presented conservatives with a golden opportunity to mobilize, and anti-ERA, pro-life, and anti-gay groups banded together as never before. By the end of 1977, these groups, supported by conservative Catholics, Mormons, and evangelical and fundamentalist Protestants, had come together to form a “Pro-Family Movement” that became a powerful force in American politics. By 1980 they had persuaded the Republican Party to drop its support for women’s rights. Afterward, as Democrats continued to support feminist goals and the GOP presented itself as the defender of “family values,” national politics became more deeply polarized and bitterly partisan.
The issue of genocide and American Indian history has been contentious. Many writers see the massive depopulation of the indigenous population of the Americas after 1492 as a clear-cut case of genocide. Other writers, however, contend that European and U.S. actions toward Indians were deplorable but were rarely if ever genocidal. To a significant extent, disagreements about the pervasiveness of genocide in the history of the post-Columbian Western Hemisphere, in general, and U.S. history, in particular, pivot on definitions of genocide. Conservative definitions emphasize intentional actions and policies of governments that result in very large population losses, usually from direct killing. More liberal definitions call for less stringent criteria for intent, focusing more on outcomes. They do not necessarily require direct sanction by state authorities; rather, they identify societal forces and actors. They also allow for several intersecting forces of destruction, including dispossession and disease. Because debates about genocide easily devolve into quarrels about definitions, an open-ended approach to the question of genocide that explores several phases and events provides the possibility of moving beyond the present stalemate. However one resolves the question of genocide in American Indian history, it is important to recognize that European and U.S. settler colonial projects unleashed massively destructive forces on Native peoples and communities. These include violence resulting directly from settler expansion, intertribal violence (frequently aggravated by colonial intrusions), enslavement, disease, alcohol, loss of land and resources, forced removals, and assaults on tribal religion, culture, and language. The configuration and impact of these forces varied considerably in different times and places according to the goals of particular colonial projects and the capacities of colonial societies and institutions to pursue them. The capacity of Native people and communities to directly resist, blunt, or evade colonial invasions proved equally important.
Gentrification is one of the most controversial issues in American cities today. But it also remains one of the least understood. Few agree on how to define it or whether it is a boon or a curse for cities. Gentrification has changed over time and has a history dating back to the early 20th century. Historically, gentrification has had a smaller demographic impact on American cities than suburbanization or immigration. But since the late 1970s, gentrification has dramatically reshaped cities like Seattle, San Francisco, and Boston. Furthermore, districts such as the French Quarter in New Orleans, New York City’s Greenwich Village, and Georgetown in Washington, DC, have had an outsized influence on the political, cultural, and architectural history of cities. Gentrification thus must be examined alongside suburbanization as one of the major historical trends shaping the 20th-century American metropolis.
Betsy A. Beasley
American cities have been transnational in nature since the first urban spaces emerged during the colonial period. Yet the specific shape of the relationship between American cities and the rest of the world has changed dramatically in the intervening years. In the mid-20th century, the increasing integration of the global economy within the American economy began to reshape US cities. In the Northeast and Midwest, the once robust manufacturing centers and factories that had sustained their residents—and their tax bases—left, first for the South and West, and then for cities and towns outside the United States, as capital grew more mobile and businesses sought lower wages and tax incentives elsewhere. That same global capital, combined with federal subsidies, created boomtowns in the once-rural South and West. Nationwide, city boosters began to pursue alternatives to heavy industry, once understood to be the undisputed guarantor of a healthy urban economy. Increasingly, US cities organized themselves around the service economy, both in high-end, white-collar sectors like finance, consulting, and education, and in low-end pink-collar and no-collar sectors like food service, hospitality, and health care. A new legal infrastructure related to immigration made US cities more racially, ethnically, and linguistically diverse than ever before.
At the same time, some US cities were agents of economic globalization themselves. Dubbed “global cities” by celebrants and critics of the new economy alike, these cities achieved power and prestige in the late 20th century not only because they had survived the ruptures of globalization but because they helped to determine its shape. By the end of the 20th century, cities that were not routinely listed among the “global city” elite jockeyed to claim “world-class” status, investing in high-end art, entertainment, technology, education, and health care amenities to attract and retain the high-income white-collar workers understood to be the last hope for cities hollowed out by deindustrialization and global competition. Today, the extreme differences between “global cities” and the rest of US cities, and the extreme socioeconomic stratification seen in cities of all stripes, are key concerns of urbanists.
Erik Gellman and Margaret Rung
From the late 1920s through the 1930s, countries on every inhabited continent suffered through a dramatic and wrenching economic contraction termed the Great Depression, an economic collapse that has come to represent the nadir of modern economic history. With national unemployment reaching well into double digits for over a decade, productivity levels falling by half, prices severely depressed, and millions of Americans without adequate food, shelter, or clothing, the United States experienced some of the Great Depression’s severest consequences. The crisis left deep physical, psychological, political, social, and cultural impressions on the national landscape. It encouraged political reform and reaction, renewed labor activism, spurred migration, unleashed grass-roots movements, inspired cultural experimentation, and challenged family structures and gender roles.
Christopher R. Reed
The unanticipated and massive migration of half a million African Americans between 1916 and 1918 from the racially oppressive South to the welcoming North surprised the nation. Directly resulting from the advent of the First World War, the movement of these able-bodied workers provided essential labor to maintain wartime production that sustained the Allied war effort. One-tenth of the people who surged north headed to and remained in Chicago, where their presence challenged the status quo in the areas of employment, external race relations, internal race arrangements, politics, housing, and recreation. Once in the Windy City, this migrant-influenced labor pool expanded with the addition of resident blacks to form the city’s first African American industrial proletariat. Wages for both men and women increased compared to what they had been earning in the South, and local businesses were ready and willing to accommodate these new consumers. A small black business sector became viable and was able to support two banks, and by the mid-1920s, there were multiple stores along Chicago’s State Street forming a virtual “Black Wall Street.” An extant political submachine within Republican Party ranks also increased its power and influence in repeated electoral contests. Importantly, upon scrutiny, the purported social conflict between the Old Settler element and the newcomers was shown to be overblown and inconsequential to black progress.
Revisionist scholarship over the past two decades has served to minimize the first phase of northward movement and has positioned it within the context of a half-century phenomenon under the labels of the “Second Great Migration” and the “Great Black Migration.” No matter what the designation, the voluntary movement of five to six million blacks from what had been their traditional home to the uncertainty of the North and West between the First World War and the Vietnam conflict stands as both a condemnation of regional oppression of the human spirit and aspirations of millions, and a demonstration of group courage in taking on new challenges in new settings. Although Chicago would prove to be “no crystal stair,” it was on many occasions a land of hope and promise for migrants throughout the past century.
During the 20th century, the black population of the United States transitioned from largely rural to mostly urban. In the early 1900s the majority of African Americans lived in rural, agricultural areas. Depictions of black people in popular culture often focused on pastoral settings, like the cotton fields of the rural South. But a dramatic shift occurred during the Great Migrations (1914–1930 and 1941–1970) when millions of rural black southerners relocated to US cities.
Motivated by economic opportunities in urban industrial areas during World Wars I and II, African Americans opted to move to southern cities as well as to urban centers in the Northeast, Midwest, and West Coast. New communities emerged that contained black social and cultural institutions, and musical and literary expressions flourished. Black migrants who left the South exercised voting rights, sending the first black representatives to Congress in the 20th century. Migrants often referred to themselves as “New Negroes,” pointing to their social, political, and cultural achievements, as well as their use of armed self-defense during violent racial confrontations, as evidence of their new stance on race.
Philippe R. Girard
Haiti (known as Saint-Domingue until it gained its independence from France in 1804) had a noted economic and political impact on the United States during the era of the American Revolution, when it forced U.S. statesmen to confront issues they had generally avoided, most prominently racism and slavery. But the impact of the Haitian Revolution was most tangible in areas like commerce, territorial expansion, and diplomacy. Saint-Domingue served as a staging ground for the French military and navy during the American Revolution and provided troops for the siege of Savannah in 1779. It became the United States’ second-largest commercial partner during the 1780s and 1790s. After Saint-Domingue’s slaves revolted in 1791, many of its inhabitants found refuge in the United States, most notably in Philadelphia, Charleston, and New Orleans. Fears (or hopes) that the slave revolt would spread to the United States were prevalent in public opinion. As Saint-Domingue achieved quasi-autonomous status under the leadership of Toussaint Louverture, it occupied a central place in the diplomacy of John Adams and Thomas Jefferson. The Louisiana Purchase was made possible in part by the failure of a French expedition to Saint-Domingue in 1802–1803. Bilateral trade declined after Haiti won its independence in 1804, but Haiti continued to loom large in the African-American imagination, and there were several attempts to use Haiti as a haven for U.S. freedmen. The question of U.S. diplomatic recognition of Haiti also served as a reference point for antebellum debates on slavery, the slave trade, and the status of free people of color in the United States.
The Haymarket Riot and Conspiracy of 1886 is a landmark in American social and political history. On May 4, 1886, during an open-air meeting near Haymarket Square in Chicago, someone threw a dynamite bomb into a squad of police, sparking a riot that resulted in the deaths of seven police officers and at least four rioters. Eight anarchists were brought to trial. Though the bomb-thrower was never apprehended, the eight radical leaders were charged as accessories before the fact for conspiring to murder the police. After the longest criminal trial in Illinois history up to that time, all eight were convicted: seven were condemned to death and one to a long prison term. After all appeals were exhausted, four were executed, one cheated the hangman with a jail-cell suicide, and the death sentences of two others were commuted to life imprisonment (all three incarcerated men were later pardoned by Governor John Peter Altgeld in 1893).
The Haymarket bombing and trial marked a pivotal moment in the history of American social movements. It sparked the nation’s first red scare, whose fury disrupted even moderately leftist movements for a generation. It drove the nation’s labor unions onto a more conservative path than the one they had been following before the bombing. The worldwide labor campaign for clemency for the convicted men became the foundation for the institution of International Workers’ Day on May 1, a holiday ironically observed in most countries except the United States. It also began a tradition within the American left of memorializing the Haymarket defendants as the first martyrs to their cause.
Timothy S. Huebner
The Supreme Court of the United States stands at the head of the nation’s judicial system. Created in Article III of the Constitution of 1787 but obscured by the other branches of government during the first few decades of its history, the Court came into its own as a co-equal branch in the early 19th century. Its exercise of judicial review—the power that it claimed to determine the constitutionality of legislative acts—gave the Court a unique status as the final arbiter of the nation’s constitutional conflicts. From the slavery question during the antebellum era to abortion and gay rights in more recent times, the Court has decided cases brought to it by individual litigants, and in doing so has shaped American constitutional and legal development. The Court is composed of unelected justices who serve “during good behavior,” and its rise in stature has not gone uncontested. Throughout the nation’s history, Congress, the president, and organized interest groups have all attempted to influence the Court’s jurisdiction, composition, and decision making. The Court’s prominence reflects Americans’ historically paradoxical attitudes toward the judiciary: they have often been suspicious of the power of unelected judges at the same time that they have relied on independent judicial institutions to resolve their deepest disputes.