41-58 of 58 Results for: Late 19th-Century History

Article

Fires have plagued American cities for centuries. During the 18th century, the Great Fire of Boston (1760), the First Great Fire of New York City (1776), the First Great New Orleans Fire (1788), and the Great Fire of Savannah (1796) each destroyed hundreds of buildings and challenged municipal authorities to improve safety in an increasingly risky environment. Beginning in the 19th century, with increasing commerce, rapid urbanization, and the rise of industrial capitalism, fires became more frequent and destructive. Several initiatives sought to reduce the risk of fire: volunteer fire companies emerged in all major cities, fire insurance developed to help economic recovery, and municipal infrastructure like fire hydrants became ubiquitous to combat blazes. Despite significant efforts to curb this growing urban problem, fire dangers increased in the late 19th century as cities became epicenters of industry and their populations boomed. The “great” fires of the late 19th and early 20th centuries, like those that took place in Chicago (1871), Boston (1872), Seattle (1889), Baltimore (1904), and San Francisco (1906), fundamentally altered cities. The fires not only destroyed buildings and took lives but also unearthed deep-rooted social tensions. Rebuilding in the aftermath of fire further exacerbated inequalities and divided cities. While fire loss tapered off after 1920, other issues surrounding urban fires heated up. The funneling of resources to suburbs in the post-war white-flight period left inner cities ill-equipped to handle serious conflagrations. In the last few decades, suburban sprawl has created exurban fire regimes, where wildfires collide with cities. Extreme weather events, dependence on fossil fuels, deregulation of risky industries, and a lack of safe and affordable housing have put American metropolitan areas on a path toward another period of “great” fires like those of the late 19th and early 20th centuries.

Article

The decades from the 1890s into the 1920s produced reform movements in the United States that resulted in significant changes to the country’s social, political, cultural, and economic institutions. The impulse for reform emanated from a pervasive sense that the country’s democratic promise was failing. Political corruption seemed endemic at all levels of government. An unregulated capitalist industrial economy exploited workers and threatened to create a serious class divide, especially as the legal system protected the rights of business over labor. Mass urbanization was shifting the country from a rural, agricultural society to an urban, industrial one characterized by poverty, disease, crime, and cultural clash. Rapid technological advancements brought new, and often frightening, changes into daily life that left many people feeling that they had little control over their lives. Movements for socialism, woman suffrage, and rights for African Americans, immigrants, and workers belied the rhetoric of the United States as a just and equal democratic society for all its members. Responding to these challenges, and fearful that without substantial change the country might experience class upheaval, groups of Americans proposed undertaking significant reforms. Underlying all proposed reforms was a desire to bring more justice and equality into a society that seemed increasingly to lack these ideals. Yet there was no agreement among these groups about the exact threat that confronted the nation, the means to resolve problems, or how to implement reforms. Despite this lack of agreement, all so-called Progressive reformers were modernizers. They sought to make the country’s democratic promise a reality by confronting its flaws and seeking solutions. All Progressivisms sought a via media, a middle way between older ideas of 19th-century liberal capitalism and more radical proposals to reform society through either social democracy or socialism. Despite differences among Progressives, the types of Progressivisms put forth, and the successes and failures of Progressivism, this reform era raised into national discourse debates over the nature and meaning of democracy, how and for whom a democratic society should work, and what it meant to be a forward-looking society. It also led to the implementation of an activist state.

Article

Jessica Ellen Sewell

From 1800 to 2000, cities grew enormously and saw an expansion of public spaces to serve the varied needs of a diverse population living in ever more cramped and urban circumstances. While a wide range of commercial semipublic spaces became common in the late 19th century, parks and streets were the best examples of truly public spaces with full freedom of access. Changes in the design and management of streets, sidewalks, squares, parks, and plazas during this period reflect changing ideas about the purpose of public space and how it should be used. Streets shifted from being used for a wide range of activities, including vending, playing games, and storing goods, to becoming increasingly specialized spaces of movement, designed and managed by the early 20th century for automobile traffic. Sidewalks, which in the early 19th century were paid for and liberally used by adjacent businesses, were similarly specialized as spaces of pedestrian movement. However, the tradition of using streets and sidewalks as spaces of public celebration and public speech remained strong throughout the period. During parades and protests, streets and sidewalks were temporarily remade as spaces for the performance of the public, and the daily activities of circulation and commerce were set aside. In 1800, the main open public spaces in cities were public squares or commons, often used for militia training and public celebration. In the second half of the 19th century, these were augmented by large picturesque parks. Designed as an antidote to urbanity, these parks served the public as a place for leisure, redefining public space as a polite leisure amenity, rather than a place for people to congregate as a public. The addition of playgrounds, recreational spaces, and public plazas in the 20th century served both the physical and mental health of the public. In the late 20th century, responding to neoliberal ideas and urban fiscal crises, the ownership and management of public parks and plazas was increasingly privatized, further challenging public accessibility.

Article

Paul Michel Taillon

Railroad workers occupy a singular place in United States history. Working in the nation’s first “big businesses,” they numbered in the hundreds of thousands, came from a wide range of ethnic and racial groups, included both men and women, and performed a wide range of often esoteric tasks. As workers in an industry that shaped the nation’s financial, technological, and political-economic development, railroaders drove the leading edge of industrialization in the 19th century and played a central role in the nation’s economy for much of the 20th. With the legends of “steel-driving” John Henry and “Cannonball” Casey Jones, railroad workers entered the national folklore as Americans pondered the benefits and costs of progress in an industrial age. Those tales highlighted the glamor and rewards, the risks and disparities, and the gender-exclusive and racially hierarchical nature of railroad work. They also offer insight into the character of railroad unionism, which, from its beginnings in the 1860s, was oriented toward craft-based, male-only, white-supremacist forms of organization. Those unions remained fragmented, but they also became among the most powerful in the US labor movement, leveraging the strategic location of their members, especially those who operated the trains, in a central infrastructural industry. That strategic location also ensured that any form of collective organization—and therefore potential disruption of the national economy—would lead to significant state intervention. Thus, the epic railroad labor conflict of the late 19th century generated the first federal labor relations laws in US history, which in turn set important precedents for 20th-century national labor relations policy. At the same time, the industry nurtured the first national all-Black, civil-rights-oriented unions, which played crucial roles in the 20th-century African American freedom struggle. By the mid-20th century, however, with technological change and the railroads entering a period of decline, the number of railroad workers diminished, and with it their once-powerful unions.

Article

Between the turbulent months of April and October 1919, racial violence reached a peak in the United States. Some twenty-six white-on-black massacres took place across the country. Author and civil rights activist James Weldon Johnson dubbed this terrible period the Red Summer, a name that captured both the pervasive racial hostility and the blood spilled in its wake. Yet racial violence has had a long and painful history in the United States. From the moment enslaved Africans arrived in the New World, whites strove cruelly and systematically to maintain power and control over their bodies and labor. Indeed, many interactions between ostensible racial groups have centered on white hostility. A type of brutality that proved especially vicious took the shape of white-on-black race massacres. First appearing in the early 19th century and fading by the end of World War II, these disturbances were used by whites to deny African Americans progress and freedom. Destruction of black communities, massive bloodshed, and lynchings characterized these occurrences. The early 20th century, and particularly the Red Summer, marked a critical moment in the history of race relations in the United States—one that proved deadly to African Americans.

Article

While presidents have historically been the driving force behind foreign policy decision-making, Congress has used its constitutional authority to influence the process. The nation’s founders designed a system of checks and balances aimed at establishing a degree of equilibrium in foreign affairs powers. Though the president is the commander-in-chief of the armed forces and the country’s chief diplomat, Congress holds responsibility for declaring war and can also exert influence over foreign relations through its powers over taxation and appropriation, while the Senate possesses authority to approve or reject international agreements. This separation of powers compels the executive branch to work with Congress to achieve foreign policy goals, but it also sets up conflict over what policies best serve national interests and the appropriate balance between executive and legislative authority. Since the founding of the Republic, presidential power over foreign relations has accreted in fits and starts at the legislature’s expense. When core American interests have come under threat, legislators have undermined or surrendered their power by accepting presidents’ claims that defense of national interests required strong executive action. This trend peaked during the Cold War, when invocations of national security enabled the executive to amass unprecedented control over America’s foreign affairs.

Article

The Spanish-American War is best understood as a series of linked conflicts. Those conflicts punctuated Madrid’s decline to a third-rank European state and marked the United States’ transition from a regional to an imperial power. The central conflict was a brief conventional war fought in the Caribbean and the Pacific between Madrid and Washington. Those hostilities were preceded and followed by protracted and costly guerrilla wars in Cuba and the Philippines. The Spanish-American War was the consequence of the protracted stalemate in the Spanish-Cuban War. The economic and humanitarian distress which accompanied the fighting made it increasingly difficult for the United States to remain neutral, until a series of Spanish missteps and bad fortune in early 1898 hastened the American entry into the war. The US Navy quickly moved to eliminate or blockade the strongest Spanish squadrons in the Philippines and Cuba; Spain’s inability to contest American control of the sea in either theater was decisive and permitted successful American attacks on outnumbered Spanish garrisons in Santiago de Cuba, Puerto Rico, and Manila. The transfer of the Philippines, along with Cuba, Puerto Rico, and Guam, to the United States in the Treaty of Paris confirmed American imperialist appetites in the eyes of the Filipino nationalists, led by Emilio Aguinaldo, and contributed to tensions between the Filipino and American armies in and around Manila. Fighting broke out in February 1899, but the Filipino conventional forces were soon driven back from Manila and were utterly defeated by the end of the year. The Filipino forces that evaded capture re-emerged as guerrillas in early 1900, and for the next two and a half years the United States waged an increasingly severe anti-guerrilla war against Filipino irregulars. Despite Aguinaldo’s capture in early 1901, fighting continued in a handful of provinces until the spring of 1902, when the last organized resistance to American governance ended in Samar and Batangas provinces.

Article

With unique aboveground tombs, massive walls of burial vaults, and a density of historic funerary structures found nowhere else in the United States, the cemeteries of New Orleans are among the most fascinating and historic aspects of the city. The cemeteries reflect the unique climate, history, and culture of New Orleans. Although New Orleans cemeteries share characteristics with burial grounds in Mediterranean and many Latin American countries, such historic “cities of the dead” are rare in the United States. Four major factors guided the evolution of the New Orleans cemetery: (a) the high South Louisiana water table; (b) a need to conserve land in a growing city surrounded by water; (c) French, Spanish, and Caribbean traditions of aboveground burial and tomb building; and (d) neoclassical and Victorian architectural fashions that prevailed during the 19th century, the period during which the cemeteries as we know them developed. New Orleans’ burial traditions contrasted with the predominantly underground interments in the cemeteries of northern Europe, England, and the United States apart from the Gulf Coast. Because of this, tourists often marvel at the exotic nature of the historic New Orleans cemeteries, expressing many of the same impressions and reactions to their architecture, layout, and general character as their 19th-century forebears. New Orleanians also value their unique historic cemeteries, most of which are still active burial grounds.

Article

Jews began to arrive in the present-day South during the late 17th century and established community institutions in Charleston, South Carolina, and Savannah, Georgia, in the colonial era. These communities, along with Richmond, Virginia, accounted for a sizable minority of American Jews during the early 19th century. As Jewish migration to the United States increased, northern urban centers surpassed southern cities as national centers of Jewish life, although a minority of American Jews continued to make their way to southern market hubs in the mid-19th century. From Reconstruction through the “New South” era, Jews played a visible role in the development of the region’s commercial economy, and they organized Jewish institutions wherever they settled in sufficient numbers. In many respects, Jewish experiences in the South mirrored national trends. Jewish life developed similarly in small towns, whether in Georgia, Wisconsin, or California. Likewise, relationships between acculturated Jews and east European newcomers in the late 19th and early 20th centuries played out according to similar dynamics regardless of region. Perhaps the most distinctive feature of Jewish life in the South resulted from Jewish encounters with the region’s particular history of race and racism. The “classical” era of the Civil Rights movement highlights this fact, as southern Jews faced both heightened scrutiny from southern segregationists and frustration from northern coreligionists who supported the movement. Since the 1970s, overall trends in southern history have once again led to changes in the landscape of southern Jewry. Among other factors, the continued migration from rural to urban areas undermined the customer base for once-ubiquitous small-town Jewish retail businesses, and growing urban centers have attracted younger generations of Jewish professionals from both inside and outside the region. Consequently, the 21st-century Jewish South features fewer of the small-town communities that once typified the region, and its larger Jewish centers are not as identifiably “southern” as they once were.

Article

American Populism of the 1880s and 1890s marked the political high-water mark of the social movements of farmers, wage earners, women, and other sectors of society in the years after the Civil War. These movements forged the People’s Party, also known as the Populist Party, which campaigned against corporate power and economic inequality and was one of the most successful third parties in US history. Populist candidates won gubernatorial elections in nine states and gained some forty-five seats in the US Congress, including six seats in the Senate, and in 1892 the Populist presidential candidate, James B. Weaver of Iowa, received over a million votes, more than 8 percent of the total. The Populist Party was not a conventional political party but a coalition of organizations, including the Farmers’ Alliances, the Knights of Labor, and other reform movements, in what the Populists described as a “congress of industrial orders.” These organizations gave the People’s Party its strength and shaped its character as a party of working people with a vision of egalitarian cooperation and solidarity comparable to the labor, farmer-labor, and social-democratic parties in Europe and elsewhere that took shape in the same decades. Despite their egalitarian claims, however, the Populists had at best a mixed attitude towards the struggles for racial equality, and at worst accommodated Indian dispossession, Chinese exclusion, and Jim Crow segregation. In terms of its legacy, veterans of the Populist movement and many of its policy proposals would shape progressive and labor-farmer politics deep into the 20th century, partly by way of the Socialist Party, but mainly by way of the progressive or liberal wings of the Democratic and Republican Parties. At the same time, the adjective “populist” has come to describe a wide variety of political phenomena, including right-wing and nationalist movements, that have no particular connection to the late 19th-century Populism.

Article

The Special Relationship is a term used to describe the close relations between the United States and the United Kingdom. It applies particularly to the governmental realms of foreign, defense, security, and intelligence policy, but it also captures a broader sense that both public and private relations between the United States and Britain are particularly deep and close. The Special Relationship is thus a term for a reality that came into being over time as the result of political leadership as well as ideas and events outside the formal arena of politics. After the political break of the American Revolution and in spite of sporadic cooperation in the 19th century, it was not until the Great Rapprochement of the 1890s that the idea that Britain and the United States had a special kind of relationship took hold. This decade, in turn, created the basis for the Special Relationship, a term first used by Winston Churchill in 1944. Churchill did the most to build the relationship, convinced as he was that close friendship between Britain and the United States was the cornerstone of world peace and prosperity. During and after the Second World War, many others on both sides of the Atlantic came to agree with Churchill. The post-1945 era witnessed a flowering of the relationship, which was cemented—not without many controversies and crises—by the emerging Cold War against the Soviet Union. After the end of the Cold War in 1989, the relationship remained close, though it was severely tested by further security crises, Britain’s declining defense spending, the evolving implications of Britain’s membership in the European Union, the relative decline of Europe, and an increasing U.S. interest in Asia. Yet on many public and private levels, relations between the United States and Britain continue to be particularly deep, and thus the Special Relationship endures.

Article

The relationship between the United States and the island of Ireland combines nostalgic sentimentality with intervention in the sectarian conflict known as the “Troubles.” Irish migration to the United States remains a celebrated and vital part of the American saga, while Irish American interest—and involvement—in the “Troubles” during the second half of the 20th century was a problematic issue in transatlantic relations and for those seeking to establish a peaceful political consensus on the Irish question. Paradoxically, much of the historiography of American–Irish relations addresses the social, economic, and cultural consequences of the Irish in America, yet the major political issue—namely the United States’ approach to the “Troubles”—has only recently become the subject of thorough historiographical inquiry. As much as the Irish have contributed to developments in American history, the American contribution to the Anglo-Irish process, and the ultimate peace process, to end the conflict in Northern Ireland is an example of the peacemaking potential of US foreign policy.

Article

Holly Pinheiro

The United States Colored Troops (USCT) were a collection of Black US Army units, racially segregated as mandated by the US War Department, that served during the Civil War and the Reconstruction era. Their collective military service is widely known for playing critical roles in ending slavery, protecting freedpeople, defeating the Confederate military, enforcing multiple US government policies, and reframing gender ideology while making explicit demands for more racially inclusive conceptions of citizenship. Black men from a wide range of backgrounds and ages made up the 179,000 individuals who served in a USCT regiment: some soldiers were former bondsmen from Confederate states, while others, who were freeborn, came from free states or even from abroad (including Canada). USCT regiments were never exclusively male domains. Numerous Black women supported the US war effort, inside and outside military spaces, in many ways. For example, Susie King Taylor served as a laundress and nurse in the Thirty-Third United States Colored Infantry. Black women are thus important figures in understanding Black Civil War–era military service. Ultimately, USCT regiments and their supporters fought for racial and social justice, during and long after USCT soldiering ended. Their service also provided avenues for prominent abolitionists, including Frederick Douglass, William Still, and Mary Ann Shadd Cary, who used Black military service to demand an end to slavery and racial discrimination. Meanwhile, various Black communities, especially Black women, lobbied to protect their civil rights while working to support USCT soldiers’ training. Additionally, the families of USCT soldiers petitioned the Bureau of Pensions, a branch of the US government, to recognize their collective wartime sacrifices through Civil War pensions. Their collective actions highlight that the history of USCT regiments requires an understanding of Black families and communities whose lived experiences remain relevant today.

Article

The key pieces of antitrust legislation in the United States—the Sherman Antitrust Act of 1890 and the Clayton Act of 1914—contain broad language that has afforded the courts wide latitude in interpreting and enforcing the law. This article chronicles the judiciary’s shifting interpretations of antitrust law and policy over the past 125 years. It argues that jurists, law enforcement agencies, and private litigants have revised their approaches to antitrust to accommodate economic shocks, technological developments, and predominant economic wisdom. Over time an economic logic that prioritizes lowest consumer prices as a signal of allocative efficiency—known as the consumer welfare standard—has replaced the older political objectives of antitrust, such as protecting independent proprietors or small businesses, or reducing wealth transfers from consumers to producers. However, a new group of progressive activists has again called for revamping antitrust so as to revive enforcement against dominant firms, especially in digital markets, and to refocus attention on the political effects of antitrust law and policy. This shift suggests that antitrust may remain a contested field for scholarly and popular debate.

Article

James F. Siekmeier

Throughout the 19th and 20th centuries, U.S. officials often viewed Bolivia as both a potential “test case” for U.S. economic foreign policy and a place where Washington’s broad visions for Latin America might be implemented relatively easily. After World War II, Washington leaders sought to show both Latin America and the nonindustrialized world that a relatively open economy could produce significant economic wealth for Bolivia’s working and middle classes, thus giving the United States a significant victory in the Cold War. Washington sought a Bolivia widely open to U.S. influence, and Bolivia often seemed an especially pliable country. In order to achieve their goals, U.S. leaders dispensed a large amount of economic assistance to Bolivia in the 1950s—a remarkable development in two senses. First, the U.S. government, generally loath to aid Third World nations, gave this assistance to a revolutionary regime. Second, the U.S. aid program for Bolivia proved to be a precursor to the Alliance for Progress, the massive aid program for Latin America in the 1960s that was the largest U.S. economic aid program in the Third World. Although U.S. leaders achieved their goal of a relatively stable, noncommunist Bolivia, the decision in the late 1950s to significantly increase U.S. military assistance to Bolivia’s relatively small military emboldened that military, which staged a coup in 1964, snuffing out democracy for nearly two decades. The country’s long history of dependency in both export markets and public- and private-sector capital investment led Washington leaders to think that dependency would translate into leverage over Bolivian policy. However, the historical record is mixed in this regard. Some Bolivian governments have accommodated U.S. demands; others have successfully resisted them.

Article

Women in the United States have drunk, made, bought, sold, and organized both against and for the consumption of alcohol throughout the nation’s history. During the second half of the 20th century, however, women became increasingly visible as social drinkers and alcoholics. Specifically, the 1970s and 1980s marked women’s relationship to alcohol in interesting ways that both echoed moments from the past and ushered in new realities. Throughout these decades, women emerged as (1) alcoholics who sought recovery in Alcoholics Anonymous or a lesser-known all-women’s sobriety program; (2) anti-alcohol activists who drew authority from their status as mothers; (3) potential criminals who harmed their progeny via fetal alcohol syndrome; and (4) recovery memoirists who claimed their addictions in unprecedented ways.

Article

Throughout the 19th century, American women experienced vast changes regarding possibilities for childbirth and for enhancing or restricting fertility control. At the beginning of the century, issues involving reproduction were discussed primarily in domestic, private settings among women’s networks that included family members, neighbors, or midwives. In the face of massive social and economic changes due to industrialization, urbanization, and immigration, many working-class women became separated from these traditional networks and knowledge and found themselves reliant upon emerging medical systems for care and advice during pregnancy and childbirth. At the same time, upper-class women sought out men in the emerging profession of obstetrics to deliver their babies in hopes of beating the frightening odds against maternal and infant health and even survival. Nineteenth-century reproduction was altered drastically by the printing and commercial boom of the middle of the century. Families could now access contraception and abortion methods and information, which had been available earlier in the century albeit in a more private and limited manner, through newspapers, popular books, and stores, and from door-to-door salesmen. As fertility control entered these public spaces, many policy makers became concerned about the impacts of such practices on the character and future of the nation. By the 1880s, contraception and abortion came under legal restrictions, just as women and their partners gained access to safer and more effective products than ever before. When the 19th century closed, legislatures and the medical profession raised obstacles that hindered the ability of most women to limit the size of their families, even as the national fertility rate reached an all-time low. Clearly, American families eagerly seized opportunities to exercise control over their reproductive destinies and their lives.

Article

Dana M. Caldemeyer

Unlike the anti-unionism that runs through the ranks of employers, worker anti-unionism describes workers who oppose or work against unionization. Anti-union actions can be seen throughout the United States from the early industrial age forward and range from refusing to join a union or follow union orders to actively fighting the union, for example through strikebreaking. Workers’ reasons for acting against the union, however, are far more complex, including the economic gains that come from remaining outside the union, moral opposition to unionism, and spite against the union. These variations in workers’ reasons for rejecting the union thus provide insight into how workers define their place in society as well as their relationship with the union.