1-6 of 6 Results

  • Keywords: 19th century

Article

The Role of Congress in the History of US Foreign Relations  

Clay Silver Katsky

While presidents have historically been the driving force behind foreign policy decision-making, Congress has used its constitutional authority to influence the process. The nation’s founders designed a system of checks and balances aimed at establishing a degree of equilibrium in foreign affairs powers. Though the president is the commander-in-chief of the armed forces and the country’s chief diplomat, Congress holds responsibility for declaring war and can also exert influence over foreign relations through its powers over taxation and appropriation, while the Senate possesses authority to approve or reject international agreements. This separation of powers compels the executive branch to work with Congress to achieve foreign policy goals, but it also sets up conflict over what policies best serve national interests and the appropriate balance between executive and legislative authority. Since the founding of the Republic, presidential power over foreign relations has accreted in fits and starts at the legislature’s expense. When core American interests have come under threat, legislators have undermined or surrendered their power by accepting presidents’ claims that defense of national interests required strong executive action. This trend peaked during the Cold War, when invocations of national security enabled the executive to amass unprecedented control over America’s foreign affairs.

Article

Old and New Directions in the History of Lynching  

John Giggie and Emma Jackson Pepperman

Professional studies of lynching and its tragic history, especially its distinctly American character, depth, and dynamics, evolved in critically important ways from the pioneering scholarship of W. E. B. Du Bois and Ida B. Wells in the 1890s and 1900s, across the 20th century, and into the 21st, with each stage introducing fresh categories of analysis amid moments of dramatic civil rights protest. The first stage was heralded by the research of African American intellectuals such as Du Bois and Wells and by growing black demands for an end to discrimination in the late 19th century. Joining them in the early 20th century was a small group of social scientists whose case studies of lynching illuminated race relations in local communities or, from a very different vantage, treated lynchings as symptoms of the violence so common in American society. The push to end racial and gender segregation and the passage of civil rights laws in the 1960s and 1970s encouraged historians to reexamine lynchings from new perspectives, including gender, sexuality, religion, memory, and black community formation and resistance, stressing their centrality to modern southern history. The late 20th century saw a comparative turn. Historians evaluated lynching across America to identify common patterns of racial subjugation, but also to see how it was used to punish a wide range of Americans, including Asian Americans, Mexican Americans, and Native Americans. By 2000, the field shifted again, this time toward memorialization and community remembrance. Scholars and lawyers recalculated the total number of lynchings in America and found a large number of unrecorded killings, asked why so little was known about them, and created memorials to the victims. They demanded, too, that the causes and long-term consequences of the nation’s history of racial violence be discussed openly and taught in public schools. This effort is of particular resonance in 2020 as America confronts rising protests over a culture of mass incarceration and police brutality that disproportionately affects men and women of color. Indeed, the historical study of lynching has never been so vital as it is in the early 21st century.

Article

Slavery in North American Cities  

Leslie Harris

The patterns of urban slavery in colonial North American and pre-Civil War US cities reveal the ways in which individual men and women, as well as businesses, institutions, and governmental bodies, employed slave labor and readily adapted the system of slavery to their economic needs and desires. Colonial cities east and west of the Mississippi River, founded initially as military forts, trading posts, and maritime ports, relied on African and Native American slave labor from their beginnings. The importance of slave labor increased in Anglo-American East Coast urban settings in the 18th century as the number of enslaved Africans increased in these colonies, particularly in response to the growth of the tobacco, wheat, and rice industries in the southern colonies. The focus on African slavery led most Anglo-American colonies to outlaw the enslavement of Native Americans, and urban slavery on the East Coast became associated almost solely with people of African descent. In addition, these cities became central nodes in the circum-Atlantic transportation and sale of enslaved people, slave-produced goods, and provisions for slave colonies whose economies centered on plantation goods. West of the Mississippi, urban enslavement of Native Americans, Mexicans, and even a few Europeans continued through the 19th century. As the thirteen British colonies transitioned to the United States during and after the Revolutionary War, three different directions emerged regarding the legal status of slavery, each of which shaped the standing of slavery and of people of African descent in cities. The gradual emancipation of enslaved people in states north of Delaware led to the creation of the so-called free states, with large numbers of free blacks moving into cities to take full advantage of freedom and the possibility of creating family and community. Although antebellum northern cities were located within areas where legalized slavery had ended, these cities retained economic and political ties to southern slavery. At the same time, the radical antislavery movement developed in Philadelphia, Boston, and New York, making northern cities the site of political conflicts between pro- and antislavery forces. In the Chesapeake, as the tobacco economy declined, slave owners manumitted enslaved blacks for whom they did not have enough work, creating large groups of free blacks in cities. But these states began to participate heavily in the domestic slave trade, with important businesses located in cities. And in the Deep South, the recommitment to slavery following the Louisiana Purchase and the emergence of the cotton economy led to the creation of a string of wealthy port cities critical to the transportation of slaves and goods. These cities were situated in local economic geographies that connected rural plantations to urban settings and in national and international economies of exchange of raw and finished goods that fueled industries throughout the Atlantic world. The vast majority of enslaved people in the antebellum South worked on rural farms, but slave labor was a key part of the labor force in southern cities. Only after the Civil War did slavery and cities become separate in the minds of Americans, as postwar whites north and south created a mythical South in which romanticized antebellum cotton plantations became the primary symbol of American slavery, regardless of the long history of slavery that preceded their existence.

Article

The Nineteenth-Century South in Film  

Matthew Christopher Hulbert

Representations of the 19th-century South on film have been produced in America from the Silent Era to the present. These movies include some of the most critically acclaimed and influential in American cinematic history—Gone with the Wind (1939), Glory (1989), 12 Years a Slave (2013)—and have produced some of the most iconic onscreen characters—Scarlett O’Hara, Josey Wales, Uncle Remus, Django Freeman—and onscreen moments—Rhett Butler not giving a damn, Mede boiling to death in a giant cauldron—in all of American popular culture. Depictions of the 19th-century South on film have also accounted for some of American film’s most notorious offerings—see the section entitled Anti-Slavery: Blaxploitation—and some of its biggest financial disappointments, such as Raintree County (1957) and Gods and Generals (2003). The Birth of a Nation (1915) and Gone with the Wind (1939) set standards for how southerners and other Americans would imagine the 19th-century South, and subsequent films have been responding ever since. Prior to the apex of the Civil Rights Movement in the 1950s and 1960s, Lost Cause themes dominated at the box office. After integration, the Civil Rights Act (1964), the Voting Rights Act (1965), and the assassinations of Malcolm X, Martin Luther King Jr., and Robert Kennedy, movies about the 19th-century South gradually shifted toward African American and female protagonists. Films also became increasingly graphic, violent, and sexualized in the late 1960s and 1970s as the pendulum swung fully away from the moonlight-and-magnolias, pro-slavery narratives of Gone with the Wind. In the 1990s, Hollywood began to carve out a middle position; however, neither extreme—exemplified by The Birth of a Nation and Mandingo, respectively—ever completely disappeared. Filmic coverage of the antebellum (1820–1860) and war years (1861–1865) dominates portrayals of the 19th-century South. These movies home in on major themes involving the legacies of slavery and the Civil War, American territorial expansion, and American exceptionalism. Moreover, the South is habitually depicted as distinct from the rest of the nation—for its hospitality, pace of living, race relations, mysteriousness, exoticism—and southerners are represented as innately more violent than their northern counterparts. Generally, the messaging of these films has been untethered from contemporary academic interpretations of the region, slavery, or the Civil War—yet their scripts and visuals have played, and continue to play, an outsized role in how Americans imagine the South and use the South to forge regional and national identities.

Article

The Lumbee Tribe of North Carolina  

Malinda Maynor Lowery

The Lumbee tribe of North Carolina, with approximately 55,000 enrolled members, is the largest Indian community east of the Mississippi River. Lumbee history serves as a window into the roles that Native people have played in the struggle to implement the founding principles of the United States, not just as “the First Americans,” but as members of their own nations, operating in their own communities’ interests. Seen through the perspectives of Native nations, the United States is not simply on a quest to expand rights for individuals. Surviving Native nations like the Lumbees, who have their own unique claims on this land and its ruling government, are forcing Americans to confront the ways in which their stories, their defining moments, and their founding principles are flawed and inadequate. We know the forced removals, the massacres, and the protests that Native people have lodged against injustice, yet such knowledge is not sufficient to understand American history. Lumbee history provides a way to honor, and complicate, American history by focusing not just on the dispossession and injustice visited upon Native peoples, but on how and why Native survival matters. Native nations are doing the same work as the American nation—reconstituting communities, thriving, and finding a shared identity with which to achieve justice and self-determination. Since the late 19th century, Lumbee Indians have used segregation, war, and civil rights to maintain a distinct identity in the biracial South. The Lumbees’ survival as a people, a race, and a tribal nation shows that their struggle has revolved around autonomy, or the ability to govern their own affairs. They have sought local, state, and federal recognition to support that autonomy, but doing so has entangled the processes of survival with outsiders’ ideas about what constitutes a legitimate Lumbee identity. Lumbees continue to adapt to the constraints imposed on them by outsiders, strengthening their community ties through the process of adaptation itself. Lumbee people find their cohesion in the relentless fight for self-determination. Always, that struggle has mattered more than winning or losing a single battle.

Article

Food in 19th-Century American Cities  

Cindy R. Lobel

Over the course of the 19th century, American cities developed from small seaports and trading posts into large metropolises. Not surprisingly, foodways and other areas of daily life changed accordingly. In 1800, the dietary habits of urban Americans were similar to those of the colonial period. Food provisioning was very local. Farmers, hunters, fishermen, and dairymen from a few miles away brought food by rowboat, ferryboat, and horse cart to centralized public markets within established cities. Dietary options were seasonal as well as regional. Few public dining options existed outside of taverns, which offered lodging as well as food. Most Americans, even in urban areas, ate their meals at home, in dwellings often attached to their workshops, countinghouses, and offices. These patterns changed significantly as the 19th century progressed, thanks largely to demographic changes and technological developments. By the turn of the 20th century, urban Americans relied on a food-supply system that was highly centralized and in the throes of industrialization. Cities developed complex restaurant sectors, and majority-immigrant populations dramatically shaped and reshaped cosmopolitan food cultures. Furthermore, with growing populations, lax regulation, and corrupt political practices in many cities, issues arose periodically concerning the safety of the food supply. In sum, the roots of today’s urban food systems were laid down over the course of the 19th century.