
Article

The period from 1900 to 1945 was characterized by both surprising continuity and dramatic change in southern agriculture. Unlike the rest of the nation, which urbanized and industrialized at a rapid pace in the late nineteenth century, the South remained overwhelmingly rural and poor from the 1880s through the 1930s. But by 1945, the region was beginning to urbanize and industrialize into a recognizably modern South, with a population concentrated in urban centers, industries taking hold, and agriculture following the larger-scale, mechanized trend common in other farming regions of the country. Three overlapping factors explain this long lag followed by rapid transformation. First, the cumulative effects of two centuries of land-extensive, staple crop agriculture and white supremacy had sapped the region of much of its fertility and limited its options for prosperity. Second, in response to this “problem South,” generations of reformers sought to modernize the South, along with other rural areas around the world. These piecemeal efforts became the foundation for the South’s dramatic transformation by federal policy known as the New Deal. Third, poor rural southerners, both black and white, left the countryside in increasing numbers. Coupled with the labor demands created by two major military conflicts, World War I and World War II, this movement aided and abetted the mechanization of agriculture and the depopulation of the rural South.

Article

Brooke Bauer

The Catawba Indian Nation of the 1750s developed from the integration of diverse Piedmont Indian people who belonged to and lived in autonomous communities along the Catawba River of North and South Carolina. Catawban-speaking Piedmont Indians experienced many processes of coalescence, whereby thinly populated groups joined the militarily strong Iswą Indians (Catawba proper) for protection and survival. Over twenty-five groups of Indians merged with the Iswą, creating an alliance or confederation of tribal communities. They all worked together, building a unified community through kinship, traditional customs, and a shared history to form a nation, despite the effects of colonialism, which included European settlement, Indian slavery, warfare, disease, land loss, and federal termination. American settler colonialism worked to erase and exterminate Native societies through biological warfare (intentional or not), military might, seizure of Native land, and assimilation. In spite of these challenges, the Catawbas’ nation-building efforts were constant, but in 1960 the federal government terminated its relationship with the Nation. In the 1970s, the Catawba Indian Nation filed suit to reclaim its land and its federal recognition status. The Nation received federal recognition in 1993 and became the only federally recognized tribe in the state of South Carolina. The Nation has land seven miles east of the city of Rock Hill along the Catawba River. Tribal citizenship consists of 3,400 Catawbas, including 2,400 citizens of voting age. The tribe holds elections every four years to fill five executive positions—Chief, Assistant Chief, Secretary/Treasurer, and two at-large positions. Scholarship on Southeastern Indians focuses less on the history of the Catawba Indian Nation and more on the historical narratives of the Five Civilized Tribes, which obscures the role Catawbas played in the development of the South.
Finally, a comprehensive Catawba Nation history explains how the people became Catawba and, through persistence, ensured the survival of the Nation and its people.

Article

Distinctive patterns of daily life defined the Jim Crow South. Contrary to many observers’ emphasis on de jure segregation—meaning racial separation demanded by law—neither law nor the physical separation of blacks and whites was at the center of the early 20th-century South’s social system. Instead, separation, whether by law or custom, was one of multiple tools whites used to subordinate and exclude blacks and to maintain notions of white racial purity. In turn, these notions themselves varied over time and across jurisdictions, at least in their details, as elites tried repeatedly to establish who was “white,” who was “black,” and how the legal fictions they created would apply to Native Americans and others who fit neither category. Within this complex multiracial world of the South, whites’ fundamental commitment to keeping blacks “in their place” manifested most routinely in day-to-day social dramas, often described in terms of racial “etiquette.” The black “place” in question was socially but not always physically distant from whites, and the increasing number of separate, racially marked spaces and actual Jim Crow laws was a development over time that became most pronounced in urban areas. It was a development that reveals blacks’ determination to resist racial oppression and whites’ perceived need to shore up a supposedly natural order that had, in fact, always been enforced by violence as well as political and economic power. Black resistance took many forms, from individual, covert acts of defiance to organized political movements. Whether in response to African Americans’ continued efforts to vote or their early 20th-century boycotts of segregated streetcars or World War I-era patterns of migration that threatened to deplete the agricultural labor force, whites found ways to counter blacks’ demands for equal citizenship and economic opportunity whenever and wherever they appeared. 
In the rural South, where the majority of black Southerners remained economically dependent on white landowners, a “culture of personalism” characterized daily life within a paternalistic model of white supremacy that was markedly different from urban—and largely national, not merely southern—racial patterns. Thus, distinctions between rural and urban areas and issues of age and gender are critical to understanding the Jim Crow South. Although schools were rigorously segregated, preadolescent children could be allowed greater interracial intimacy in less official settings. Puberty became a break point after which close contact, especially between black males and white females, was prohibited. All told, Jim Crow was an inconsistent and uneven system of racial distinction and separation whose great reach shaped the South’s landscape and the lives of all Southerners, including those who were neither black nor white.

Article

Cody R. Melcher and Michael Goldfield

The failure of labor unions to succeed in the American South, largely because national unions proved unable or unwilling to confront white supremacy head on, offers an important key to understanding post–World War II American politics, especially the rise of the civil rights movement. Looking at the 1930s and 1940s, it is clear that the failure was not the result of a cultural aversion to collective action on the part of white workers in the South, as several histories have suggested, but rather stemmed from the refusal of the conservative leadership in the Congress of Industrial Organizations (CIO) to organize an otherwise militant southern workforce composed of both whites and Blacks. These lost opportunities, especially among southern woodworkers and textile workers, contrast sharply with successful interracial union drives among southern coal miners and steelworkers, especially in Alabama. Counterfactual examples of potentially durable civil rights unionism illustrate how the labor movement could have affected the civil rights movement and transformed politics had the South been unionized.

Article

Bacon’s Rebellion (1676–1677) was an uprising in the Virginia colony that its participants experienced as both a civil breakdown and a period of intense cosmic disorder. Although Thomas Hobbes had introduced his theory of state sovereignty a quarter century earlier, the secularizing connotations of his highly naturalized conceptualization of power had yet to make major inroads on a post-Reformation culture that was only gradually shifting from Renaissance providentialism to Enlightenment rationalism. Instead, the period witnessed a complicated interplay of providential beliefs and Hobbist doctrines. In the aftermath of the English civil war (1642–1651), this mingling of ideologies had prompted the Puritans’ own experimentation with Hobbes’s ideas, often in tandem with a Platonic spiritualism that was quite at odds with Hobbes’s own philosophical skepticism. The Restoration of 1660 had given an additional boost to Hobbism as his ideas won a number of prominent adherents in Charles II’s government. The intermingling of providentialism and Hobbism gave Bacon’s Rebellion its particular aura of heightened drama and frightening uncertainty. In the months before the uprising, the outbreak of a war on the colony’s frontier with the Doeg and Susquehannock peoples elicited fears in the frontier counties of a momentous showdown between faithful planters and God’s enemies. In contrast, Governor Sir William Berkeley’s establishmentarian Protestantism encouraged him to see the frontiersmen’s vigilantism as impious, and the government’s more measured response to the conflict as inherently godlier because tied to time-tested hierarchies and institutions. Greatly complicating this already confusing scene, the colony also confronted a further destabilizing force in the form of the new Hobbist politics emerging from the other side of the ocean. 
In addition to a number of alarming policies emanating from Charles II’s court in the 1670s that sought to enhance the English state’s supremacy over the colonies, Hobbes’s doctrines also informed the young Nathaniel Bacon Jr.’s stated rationale for leading frontiersmen against local Indian communities without Berkeley’s authorization. Drawing on the Hobbes-influenced civil war-era writings of his relation, the Presbyterian lawyer Nathaniel Bacon, the younger Bacon made the protection of the colony’s Christian brotherhood a moral priority that outweighed even the preservation of existing civil relations and public institutions. While Berkeley’s antagonism toward this Hobbesian argument led him to lash out forcibly against Bacon as a singularly great threat to Virginia’s commonwealth, it was ordinary Virginians who most consequentially resisted Bacon’s strange doctrines. Yet a division persisted. Whereas the interior counties firmly rejected Bacon’s Hobbism in favor of the colony’s more traditional bonds to God and king, the frontier counties remained more open to a Hobbesian politics that promised their protection.

Article

Jerry Watkins

Regional variation, race, gender presentation, and class differences mean that there are many “Gay Souths.” Same-sex desire has been a feature of the human experience since the beginning, but the meanings, expressions, and ability to organize one’s life around desire have shifted profoundly since the invention of sexuality in the mid-19th century. World War II represented a key transition in gay history, as it gave many people a language for their desires. During the Cold War, government officials conflated sex, race, and gender transgression with subversion, and state committees punished accordingly. These forces profoundly shaped gay social life, and rather than moving in a straight line from closet to liberation, gays in the South have meandered. Movement rather than stasis, circulation rather than congregation, and the local rather than the stranger, as well as creative uses of space and place, mean that the gay South is distinct from the rest of the country, though not wholly unique.

Article

Emancipation celebrations in the United States have been important and complicated moments of celebration and commemoration. Since the end of the slave trade in 1808 and the enactment of the British Emancipation Act in 1834, people of African descent throughout the Atlantic world have gathered, often in festival form, to remember and to use that memory for more promising futures. In the United States, emancipation celebrations exploded after the Civil War, when each local community celebrated its own experience of emancipation. For many, the commemoration took the form of a somber church service, Watch Night, which recognized the signing of the Emancipation Proclamation on January 1, 1863. Juneteenth, which recognized the end of slavery in Texas on June 19, 1865, became one of the most vibrant and longstanding celebrations. Although many emancipation celebrations disappeared after World War I, Juneteenth remained a celebration in most of Texas through the late 1960s, when it disappeared from all cities in the state. However, because of the Second Great Migration, Texans transplanted in Western cities continued the celebration in their new communities far from Texas. In Texas, Juneteenth was resurrected in 1979, when state representative, later Congressman, Al Edwards successfully sponsored a bill to make Juneteenth a state holiday and campaigned to spread Juneteenth throughout the country. This grassroots movement brought Juneteenth resolutions to forty-six states and street festivals to hundreds of neighborhoods. Juneteenth’s remarkable post-1980 spread has given it great resonance in popular culture as well, even becoming the focus of two major television episodes in 2016 and 2017.

Article

Latinas/os were present in the American South long before the founding of the United States of America, yet knowledge about their southern communities in different places and time periods is deeply uneven. In fact, regional themes important throughout the South clarify the dynamics that shaped Latinas/os’ lives, especially race, ethnicity, and the colorline; work and labor; and migration and immigration. Ideas about racial difference, in particular, reflected specifics of place, and intersections of local, regional, and international endeavors and movements of people and resources. Accordingly, Latinas/os’ position and treatment varied across the South. They first worked in agricultural fields, picking cotton and oranges and harvesting tobacco, and then in a variety of industries, especially poultry and swine processing and packing. The late 20th century saw the rapid growth of Latinas/os in southern states due to changing migration and immigration patterns that moved from traditional states of reception to new destinations in rural, suburban, and urban locales with limited histories with Latinas/os or with substantial numbers of immigrants in general.

Article

On December 20, 1803, residents of New Orleans gathered at the Place d’Armes in the city center to watch as the French flag was lowered and the flag of the United States was raised in its place. Toasts were made to the US president, the French First Consul, and the Spanish king (whose flag had been lowered in a similar ceremony just twenty days earlier), and the celebrations continued throughout the night. The following day, however, began the process of determining just what it meant now that Louisiana was a part of the United States, initiating the first great test of the United States’ ability to expand its borders and incorporate both territories and peoples. The treaty ratifying the transfer, signed in Paris the previous April 30th, promised that “the inhabitants of the ceded territory shall be incorporated in the Union of the United States” where they would experience “the enjoyment of all these rights, advantages and immunities of citizens of the United States.” These inhabitants included thousands of people of French and Spanish descent, several thousand slaves of African descent, and about fifteen hundred free people of at least partial African ancestry; most of these inhabitants spoke French or (far fewer) Spanish and practiced Catholicism. In addition, the territory was home to tens of thousands of indigenous peoples, many of whom still lived on traditional territories and under their own sovereignty. For a few inhabitants of what would become the Territory of Orleans and later the state of Louisiana, incorporation did lead to “the enjoyment of all these rights” and gave some small grain of truth to Thomas Jefferson’s hope that the trans-Mississippi region would undergird the United States as an “empire of liberty,” although even for Europeans of French and Spanish ancestry, the process was neither easy nor uncontested.
For most, however, incorporation led to the expansion of the United States as an empire of slavery, one built upon the often violent dispossession of native peoples of their lands and the expropriated labor of enslaved peoples of African descent.

Article

The capture, adoption, and/or enslavement of enemies in North American warfare long predated the European invasion of the 16th century. In every region and among nearly every nation of Native North America, captive-taking continued after the arrival of the Spanish, English, and French and accelerated in the 18th century as a result of the opportunities and pressures that colonialism brought to bear on indigenous peoples. Although the famous narratives of Indian captivity were written by people of European descent, the majority of people who were taken and adopted or enslaved by Native Americans were themselves Native American women, girls, and boys. One scholar estimates that perhaps as many as 2.5 to 5 million Indigenous slaves were owned by Europeans in the Western hemisphere from 1492 to 1900; this estimate excludes the millions more who were retained within other Indigenous communities. Within these Native American communities, captives served a variety of purposes along a continuum: depending on their age and sex, they might be adopted fully into a new kinship network, or they might be ritually executed. Most captive adults seem to have endured fates in between these dramatic poles: they might be marked as “adopted slaves” and set to the most tedious and repetitive work; they might be traded or given as gifts for profit or diplomacy; they might be subjected to coerced sex; or they might marry a captor and have children who were full kin members of their new community. Most would probably experience more than one of these fates. In the early 21st century, important scholarship on Native American captivity has emphasized its similarities to African slavery and how the African slave trade influenced Native American captive raiding, trading, and enslavement in the colonial era and in the early United States. But there were two important, and possibly interrelated, differences between these two slaveries.
First, unlike the adult male African captives who were preferred by Europeans for enslavement in North America, most captives taken by other Native Americans were women and children. Second, this Indigenous slavery was not heritable, although the captives themselves were frequently marked or even mutilated to signify their status as outsiders, or not-kin, in a world defined by kinship ties. Although the differences of intersecting European and Indigenous cultures, chronology, and context made for widely disparate experiences in Indian captivity and slavery over four centuries, one constant across time and space is that captive-taking seems to have been intended to grow the captors’ populations as well as deprive their enemies of productive and reproductive labor. The appropriation of girls’ and women’s sexuality and reproductive power became the means by which female captives might suffer intensely as well as possibly improve their standing and their children’s futures.

Article

W. Fitzhugh Brundage

Rapid and far-reaching environmental, economic, and social transformations marked the New South (1880–1910). Substantial industrialization and urbanization followed the expansion of rail networks across the region, and produced unprecedented changes in daily life for both urban and rural residents. White southern elites embraced these innovations and worked to ensure that state governments evolved in order to advance them. One of their most significant endeavors was the institutionalization of white supremacy in virtually every facet of public life. Black and white voluntary organizations complemented, and sometimes contested, the emerging economic and social order in the New South. Similarly, while many contemporary representations of the region in national culture trivialized the scale and costs of the changes underway, some artists offered revelatory portraits of a region consumed by upheaval.

Article

Matthew Christopher Hulbert

Representations of the 19th-century South on film have been produced in America from the Silent Era to the present. These movies include some of the most critically acclaimed and influential in American cinematic history—Gone with the Wind (1939), Glory (1989), 12 Years a Slave (2013)—and have produced some of the most iconic onscreen characters—Scarlett O’Hara, Josey Wales, Uncle Remus, Django Freeman—and onscreen moments—Rhett Butler not giving a damn, Mede boiling to death in a giant cauldron—in all of American popular culture. Depictions of the 19th-century South on film have also accounted for some of American film’s most notorious offerings—see the section entitled Anti-Slavery: Blaxploitation—and some of its biggest financial disappointments, such as Raintree County (1957) or Gods and Generals (2003). The Birth of a Nation (1915) and Gone with the Wind (1939) set standards for how southerners and other Americans would imagine the 19th-century South, and subsequent films have been responding ever since. Prior to the apex of the Civil Rights Movement in the 1950s and 1960s, Lost Cause themes dominated at the box office. After integration, the Civil Rights Act (1964), the Voting Rights Act (1965), and the assassinations of Malcolm X, Martin Luther King Jr., and Robert Kennedy, movies about the 19th-century South gradually shifted toward African American and female protagonists. Films also became increasingly graphic, violent, and sexualized in the late 1960s and 1970s as the pendulum swung fully away from the moonlight-and-magnolias, pro-slavery narratives of Gone with the Wind. In the 1990s, Hollywood began to carve out a middle position; however, neither extreme—exemplified by The Birth of a Nation and Mandingo, respectively—ever completely disappeared. Filmic coverage of the antebellum (1820–1860) and war years (1861–1865) dominates portrayals of the 19th-century South.
These movies home in on major themes involving the legacy of slavery in America, the legacy of the Civil War, American territorial expansion, and American exceptionalism. Moreover, the South is habitually depicted as unique compared to the rest of the nation—for its hospitality, pace of living, race relations, mysteriousness, exoticism—and southerners are represented as innately more violent than their northern counterparts. Generally, the messaging of these films has been untethered from contemporary academic interpretations of the region, slavery, or the Civil War—yet their scripts and visuals have played, and continue to play, an outsized role in how Americans imagine the South and use the South to forge regional and national identities.

Article

John Giggie and Emma Jackson Pepperman

The professional study of lynching and its tragic history, especially its unique American character, depth, and dynamics, evolved in critically important ways across the 20th century and into the 21st, beginning with the pioneering scholarship of W. E. B. Du Bois and Ida B. Wells in the 1890s and 1900s; each stage introduced fresh categories of analysis amid moments of dramatic civil rights protest. The first stage was heralded by pioneering research by African American intellectuals, such as Du Bois and Wells, and growing black demands for an end to discrimination in the late 19th century. Joining them in the early 20th century was a small group of social scientists whose case studies of lynching illuminated race relations in local communities or, from a very different vantage, saw them as symptoms of the violence so common in American society. The push to end racial and gender segregation and the passage of civil rights laws in the 1960s and 1970s encouraged historians to review lynchings from new perspectives, including gender, sexuality, religion, memory, and black community formation and resistance, stressing their centrality to modern southern history. The late 20th century saw a comparative turn. Historians evaluated lynching across America to identify common patterns of racial subjugation, but also to see how it was used to punish a wide range of Americans, including Asian Americans, Mexican Americans, and Native Americans. By 2000, the field shifted again, this time toward memorialization and community remembrance. Scholars and lawyers recalculated the total number of lynchings in America and found a large number of unrecorded killings, asked why so little was known about them, and created memorials to the victims. They demanded, too, that the causes and long-term consequences of the nation’s history of racial violence be discussed openly and taught in public schools.
This effort is of particular resonance in 2020 as America confronts rising protests over a culture of mass incarceration and police brutality that disproportionately affects men and women of color. Indeed, the historical study of lynching has never been so vital as it is in the early 21st century.

Article

Simon Balto and Max Felker-Kantor

The relationship between policing and crime in American history has been tenuous at best. In fact, policing and crime are imperfectly correlated. Crime is understood as a socially constructed category that varies over time and space. Crime in the American city was produced by the actions of police officers on the street and the laws passed by policymakers that made particular behaviors, often ones associated with minoritized people, into something called “crime.” Police create a statistical narrative about crime through the behaviors and activities they choose to target as “crime.” As a result, policing the American city has functionally reinforced the nation’s dominant racial and gender hierarchies as much as (or more than) it has served to ensure public safety or reduce crime. Policing and the production of crime in the American city have been broadly shaped by three interrelated historical processes: racism, xenophobia, and capitalism. As part of these processes, policing took many forms across space and time. From origins in the slave patrols in the South, settler colonial campaigns of elimination in the West, and efforts to put down striking workers in the urban North, the police evolved into the modern, professional forces familiar to many Americans in the early 21st century. The police, quite simply, operated to uphold a status quo based on unequal and hierarchical racial, ethnic, and economic orders. Tracing the history of policing and crime from the colonial era to the present demonstrates the ways that policing has evolved through a dialectic of crisis and reform. Moments of protest and unrest routinely exposed the ways policing was corrupt, violent, and brutal, and did little to reduce crime in American cities. In turn, calls for reform produced “new” forms of policing (what was often referred to as professionalization in the early and mid-20th century and community policing in the 21st).
But these reforms did not address the fundamental role or power of police in society. Rather, these reforms often expanded it, producing new crises, new protests, and still more “reforms,” in a seemingly endless feedback loop. From the vantage point of the 21st century, this evolution demonstrates the inability of reform or professionalization to address the fundamental role of police in American society. In short, it is a history that demands a rethinking of the relationship between policing and crime, the social function of the police, and how to achieve public safety in American cities.

Article

Katherine R. Jewell

The term “Sunbelt” connotes a region defined by its environment. “Belt” suggests the broad swath of states stretching from the Atlantic coast across Texas and Oklahoma and the Southwest to southern California. “Sun” suggests its temperate—even hot—climate. Yet in contrast to the industrial northeastern and midwestern Rust Belt, or perhaps “Frost Belt,” the term’s emergence at the end of the 1960s evoked an optimistic, opportunistic brand. Free from snowy winters, cooled by air conditioners, and beckoning with Florida’s sandy beaches and California’s surf, the Sunbelt states drew more Americans in the 1950s and 1960s than did the deindustrializing centers of the North and East. But the term “Sunbelt” also captures an emerging political culture that defies regional boundaries. The term originated more from a diagnosis of this political climate than from an environmental one, a diagnosis associated with the new patterns of migration in the mid-20th century. The term defined a new regional identity in politics, economics, policy, demographics, and society, as well as environment. The Sunbelt received federal money in an unprecedented manner, particularly because of rising Cold War defense spending on research and military bases, and its urban centers grew in patterns unlike those in the old Northeast and Midwest, thanks to the policy innovations wrought by local boosters, business leaders, and politicians, which defined the politics associated with the region after the 1970s. Yet scholars continue to debate whether the Sunbelt’s emergence reflected a new regional identity or something else.

Article

Lorien Foote

Soldiers enlisted in the Union Army from every state in the Union and the Confederacy. The initial volunteers were motivated to preserve the accomplishments of the American Revolution and save the world’s hope that democratic government could survive. They were influenced by their culture’s ideals of manhood and republican ideals of the citizen soldier. They served in regiments that retained close ties with their sending communities throughout the war. Recruits faced a difficult adjustment period when their units were mustered into the US Army. The test of battle taught soldiers to value some drills and discipline, but many soldiers insisted that officers respect their independence and equality. Soldiers successfully resisted many aspects of formal military discipline. Army life exposed conflicts between soldiers who sought to create moral regiments and soldiers who displayed manliness through fighting and drinking. Establishing honor before peers was an important component of soldier life. Effective soldiering involved enduring the boredom and disease of camp, the rigors of marching, and the terror of battle. To survive, soldiers formed close bonds with their comrades, mastered self-care techniques to stay healthy, applied skills learned from their civilian occupations on the battlefield, and remained connected to their families and communities. Conscription changed the character of the Union Army. Officers tightened discipline over the influx of lower-class “roughs.” Union soldiers generally demonized their enemies as inferior barbarians. Because of their interaction with slaves in the South, Union soldiers quickly shifted their support to emancipation. Although Christianity and ideals of civilized behavior placed some restraints on Union soldiers when they encountered southerners, they supported and implemented hard war measures against the South’s population and resources, and treated guerrillas and their supporters with particular brutality. 
In the election of 1864, Union soldiers voted to fight until the Confederacy was defeated.

Article

With unique aboveground tombs, massive walls of burial vaults, and a density of historic funerary structures found nowhere else in the United States, the cemeteries of New Orleans are among the most fascinating and historic aspects of the city. The cemeteries reflect the unique climate, history, and culture of New Orleans. Although New Orleans cemeteries share characteristics with burial grounds in Mediterranean and many Latin American countries, such historic “cities of the dead” are rare in the United States. Four major factors guided the evolution of the New Orleans cemetery: (a) the high South Louisiana water table; (b) a need to conserve land in a growing city surrounded by water; (c) French, Spanish, and Caribbean traditions of aboveground burial and tomb building; and (d) neoclassical and Victorian architectural fashions that prevailed during the 19th century, the period during which the cemeteries as we know them developed. New Orleans’ burial traditions contrasted with the predominantly underground interments in the cemeteries of northern Europe, England, and the United States apart from the Gulf Coast. Because of this, tourists often marvel at the exotic nature of the historic New Orleans cemeteries, expressing many of the same impressions and reactions to their architecture, layout, and general character as their 19th-century forebears. New Orleanians also value their unique historic cemeteries, most of which are still active burial grounds.

Article

Jews began to arrive in the present-day South during the late 17th century and established community institutions in Charleston, South Carolina, and Savannah, Georgia, in the colonial era. These communities, along with Richmond, Virginia, accounted for a sizable minority of American Jews during the early 19th century. As Jewish migration to the United States increased, northern urban centers surpassed southern cities as national centers of Jewish life, although a minority of American Jews continued to make their way to southern market hubs in the mid-19th century. From Reconstruction through the “New South” era, Jews played a visible role in the development of the region’s commercial economy, and they organized Jewish institutions wherever they settled in sufficient numbers. In many respects, Jewish experiences in the South mirrored national trends. Jewish life developed similarly in small towns, whether in Georgia, Wisconsin, or California. Likewise, relationships between acculturated Jews and east European newcomers in the late 19th and early 20th centuries played out according to similar dynamics regardless of region. Perhaps the most distinctive feature of Jewish life in the South resulted from Jewish encounters with the region’s particular history of race and racism. The “classical” era of the Civil Rights movement highlights this fact, as southern Jews faced both heightened scrutiny from southern segregationists and frustration from northern coreligionists who supported the movement. Since the 1970s, overall trends in southern history have once again led to changes in the landscape of southern Jewry. Among other factors, the continued migration from rural to urban areas undermined the customer base for once-ubiquitous small-town Jewish retail businesses, and growing urban centers have attracted younger generations of Jewish professionals from both inside and outside the region. Consequently, the 21st-century Jewish South features fewer of the small-town communities that once typified the region, and its larger Jewish centers are not as identifiably “southern” as they once were.

Article

The only youth-led national civil rights organization in the 1960s in the United States, the Student Nonviolent Coordinating Committee (SNCC), grew out of sit-ins, with the base of its early membership coming from Black colleges. It became one of the most militant civil rights groups, pushing older organizations to become more aggressive. Under the tutelage of the experienced activist Ella Baker, it emphasized developing leadership in “ordinary” people. Its early years were dominated by direct action campaigns against White supremacy in the urban and Upper South, while internally, SNCC strove to actualize the Beloved Community. Later it specialized in grassroots community organizing and voter registration in dangerous areas of the Deep South. Its Freedom Summer campaign played a significant role in radicalizing young activists. SNCC, in general, acted as a training ground and model for other forms of youth activism. Notwithstanding its own issues with chauvinism, SNCC was open to leadership from women in a way that few social change organizations of the time were.

Article

During the American Civil War, Union and Confederate commanders made the capture and destruction of enemy cities a central feature of their military campaigns. They did so for two reasons. First, most mid-19th-century cities had factories, foundries, and warehouses within their borders, churning out and storing war materiel; military officials believed that if they interrupted or incapacitated the enemy’s ability to arm or clothe themselves, the war would end. Second, it was believed that the widespread destruction of property—especially in major or capital cities—would also damage civilians’ morale, undermining their political convictions and decreasing their support for the war effort. Both Union and Confederate armies bombarded and burned cities with these goals in mind. Sometimes they fought battles on city streets, but more often Union troops initiated long-term sieges in order to capture Confederate cities and demoralize their inhabitants. Soldiers on both sides were motivated by vengeance when they set fire to city businesses and homes; these acts were controversial, as was defensive burning—the deliberate destruction of one’s own urban center in order to keep its war materiel out of the hands of the enemy. Urban destruction, particularly long-term sieges, took a psychological toll on (mostly southern) city residents. Many were wounded, lost property, or became refugees. Because of this, the destruction of cities during the American Civil War provoked widespread discussions about the nature of “civilized warfare” and the role that civilians played in military strategy. Both soldiers and civilians tried to make sense of the destruction of cities in writing, as well as in illustrations and photographs; images in particular shaped both northern and southern memories of the war and its costs.