Since many North American indigenous societies also built and inhabited towns, America was not an entirely rural continent before the arrival of Europeans. Nevertheless, when Europeans set out to colonize their “wilderness,” they arrived with a practical and ideological commitment to recreating cities of the sort with which they were familiar on their home continent. The result of their ambitions was the rapid founding and development of European-style cities, the vast majority of which clustered on large bodies of water, either directly on the Atlantic Ocean or on the seas and river estuaries adjacent to it. The pace of city expansion was closely linked to the levels of support for cities among colonists and an economic environment that stimulated urban growth. Some cities grew faster than others, but by the middle of the 18th century even Virginia and Maryland, the most rural colonies, had towns that played a critical cultural, political, and economic role in society. By the revolutionary era, the centrality of North America’s seaports was cemented by their status as crucibles of the conflict. The issue of which seaport was the new United States’ premier city was contested, but the importance of cities to North American society was no longer debated.
Steven K. Green
Separation of church and state has long been viewed as a cornerstone of American democracy. At the same time, the concept has remained highly controversial in the popular culture and law. Much of the debate over the application and meaning of the phrase focuses on its historical antecedents. This article briefly examines the historical origins of the concept and its subsequent evolutions in the nineteenth century.
In the seventy years since the end of World War II (1939–1945), postindustrialization—the exodus of manufacturing and growth of finance and services—has radically transformed the economy of North American cities. Metropolitan areas are increasingly home to transnational firms that administer dispersed production networks that span the world. A few major global centers host large banks that coordinate flows of finance capital necessary not only for production, but also increasingly for education, infrastructure, municipal government, housing, and nearly every other aspect of life. In cities of the global north, fewer workers produce goods and more produce information, entertainment, and experiences. Women have steadily entered the paid workforce, where they often do the feminized work of caring for children and the ill, cleaning homes, and preparing meals. Like the Gilded Age city, the postindustrial city creates immense social divisions, injustices, and inequalities: penthouses worth millions and rampant homelessness, fifty-dollar burgers and an epidemic of food insecurity, and unparalleled wealth and long-standing structural unemployment all exist side by side. The key features of the postindustrial service economy are the increased concentration of wealth, the development of a privileged and celebrated workforce of professionals, and an economic system reliant on hyperexploited service workers whose availability is conditioned by race, immigration status, and gender.
Both sexuality and religion are terms as vexatious to define as they can be alluring to pursue. In the contemporary period, figuring out one’s sexual feelings, identity, and preferences has become a signal aspect of self-formation. Understanding one’s religious feelings, identity, and preferences may seem less pressing, but is certainly no less complicated. Both terms cause no small amount of confusion. Clearing up some of this confusion requires speaking frankly about delicate matters, and also speaking flatly about enormously complex experiences. Popular media coverage of ecclesiastical sex scandals in America suggests that people enjoy hearing about the profanation of religious duty. Despite the observed, inferred, and accused sexuality in American religious history, or maybe because of it, eroticism suffuses narrative accounts of American religious history and descriptions of religious actors. In U.S. history, sexuality has often been a key lens through which we have understood the nature of religion, the leaders of religions, and the reason for religious commitment.
Anne Sarah Rubin
Sherman’s March, more accurately known as the Georgia and Carolinas Campaigns, cut a swath across three states in 1864–1865. It was one of the most significant campaigns of the war, making Confederate civilians “howl” as farms and plantations were stripped of everything edible and all their valuables. Outbuildings, and occasionally homes, were burned, railroads were destroyed, and enslaved workers were emancipated. Long after the war ended, Sherman’s March continued to shape Americans’ memories as one of the most symbolically powerful aspects of the Civil War.
Sherman’s March began with the better-known March to the Sea, which started in Atlanta on November 15, 1864, and concluded in Savannah on December 22 of the same year. Sherman’s men proceeded through South Carolina and North Carolina in February, March, and April of 1865. The study of this military campaign illuminates the relationships between Sherman’s soldiers and Southern white civilians, especially women, and African Americans. Sherman’s men were often uncomfortable with their role as an army of liberation, and African Americans, in particular, found the March to be a double-edged sword.
The American Revolution was an episode in a transatlantic outcry against the corruption of the British balance of power and liberty institutionalized in the Glorious Revolution of 1688–1689. English speakers during the 18th century reflected on this constitutional crisis within a larger conversation about the problem of human governance. Although many people excluded from Parliament supported political reform, if not revolution, they also sought remedies for the perversion of political power and influence in new forms of social power and influence. This article looks at the convergence of political and social discussions in a common discourse about the nature of power and the ways in which human beings influenced each other. The first section outlines the meanings of power and influence in British politics. The second section uses the novelist Sarah Fielding’s Remarks on Clarissa (1749) to delineate revolutionary notions about social power and influence. The third section turns to the speeches and writings of Edmund Burke in the run-up to the American Revolution to look at how English speakers deployed notions of social power to advocate for political reform.
Christopher W. Schmidt
One of the most significant protest campaigns of the civil rights era, the lunch counter sit-in movement began on February 1, 1960, when four young African American men sat down at the whites-only lunch counter of the Woolworth store in Greensboro, North Carolina. Refused service, the four college students sat quietly until the store closed. They continued their protest on the following days, each day joined by more fellow students. Students in other southern cities learned what was happening and started their own demonstrations, and in just weeks, lunch counter sit-ins were taking place across the South. By the end of the spring, tens of thousands of black college and high school students, joined in some cases by sympathetic white students, had joined the sit-in movement. Several thousand went to jail for their efforts after being arrested on charges of trespass, disorderly conduct, or whatever other laws southern police officers believed they could use against the protesters.
The sit-ins arrived at a critical juncture in the modern black freedom struggle. The preceding years had brought major breakthroughs, such as the Supreme Court’s Brown v. Board of Education school desegregation ruling in 1954 and the successful Montgomery bus boycott of 1955–1956, but by 1960, activists were struggling to develop next steps. The sit-in movement energized and transformed the struggle for racial equality, moving the leading edge of the movement from the courtrooms and legislative halls to the streets and putting a new, younger generation of activists on the front lines. It gave birth to the Student Nonviolent Coordinating Committee, one of the most important activist groups of the 1960s. It directed the nation’s attention to the problem of racial discrimination in private businesses that served the public, pressured business owners in scores of southern cities to open their lunch counters to African American customers, and set in motion a chain of events that would culminate in the Civil Rights Act of 1964, which banned racial discrimination in public accommodations across the nation.
The tall building—the most popular and conspicuous emblem of the modern American city—stands as an index of economic activity, civic aspirations, and urban development. Enmeshed in the history of American business practices and the maturation of corporate capitalism, the skyscraper is also a cultural icon that performs genuine symbolic functions. Viewed individually or arrayed in a “skyline,” tall buildings invite attention to their spectacular or superlative aspects. Their patrons have searched for the architectural symbols that would project a positive public image, yet the height and massing of skyscrapers were determined as much by prosaic financial calculations as by symbolic pretense. Historically, the production of tall buildings was linked to the broader flux of economic cycles, access to capital, land values, and regulatory frameworks that curbed the self-interests of individual builders in favor of public goods such as light and air. The tall building looms large for urban geographers seeking to chart the shifting terrain of the business district and for social historians of the city who examine the skyscraper’s gendered spaces and labor relations. If tall buildings provide one index of the urban and regional economy, they are also economic activities in and of themselves and thus linked to the growth of professions required to plan, finance, design, construct, market, and manage these mammoth collective objects—and all have vied for control over the ultimate result. Practitioners have debated the tall building’s external expression as the design challenge of the façade became more acute with the advent of the curtain wall attached to a steel frame, eventually dematerializing entirely into sheets of reflective glass. The tall building also reflects prevailing paradigms in urban design, from the retail arcades of 19th-century skyscrapers to the blank plazas of postwar corporate modernism.
The patterns of urban slavery in North American and pre-Civil War US cities reveal the ways in which individual men and women, as well as businesses, institutions, and governmental bodies employed slave labor and readily adapted the system of slavery to their economic needs and desires. Colonial cities east and west of the Mississippi River, founded initially as military forts, trading posts, and maritime ports, relied on African and Native American slave labor from their beginnings. The importance of slave labor increased in Anglo-American East Coast urban settings in the 18th century as the number of enslaved Africans increased in these colonies, particularly in response to the growth of the tobacco, wheat, and rice industries in the southern colonies. The focus on African slavery led most Anglo-American colonies to outlaw the enslavement of Native Americans, and urban slavery on the East Coast became associated almost solely with people of African descent. In addition, these cities became central nodes in the circum-Atlantic transportation and sale of enslaved people, slave-produced goods, and provisions for slave colonies whose economies centered on plantation goods. West of the Mississippi, urban enslavement of Native Americans, Mexicans, and even a few Europeans continued through the 19th century.
As the thirteen British colonies transitioned to the United States during and after the Revolutionary War, three different directions emerged regarding the status of slavery, each of which shaped the lives of enslaved and free people of African descent in cities. The gradual emancipation of enslaved people in states north of Delaware led to the creation of the so-called free states, with large numbers of free blacks moving into cities to take full advantage of freedom and the possibility of creating family and community. Although antebellum northern cities were located within areas where legalized slavery ended, these cities retained economic and political ties to southern slavery. At the same time, the radical antislavery movement developed in Philadelphia, Boston, and New York. Thus, Northern cities were the site of political conflicts between pro- and antislavery forces. In the Chesapeake, as the tobacco economy declined, slave owners manumitted enslaved blacks for whom they did not have enough work, creating large groups of free blacks in cities. But these states began to participate heavily in the domestic slave trade, with important businesses located in cities. And in the Deep South, the recommitment to slavery following the Louisiana Purchase and the emergence of the cotton economy led to the creation of a string of wealthy port cities critical to the transportation of slaves and goods. These cities were situated in local economic geographies that connected rural plantations to urban settings and in national and international economies of exchange of raw and finished goods that fueled industries throughout the Atlantic world. The vast majority of enslaved people employed in the antebellum South worked on rural farms, but slave labor was a key part of the labor force in southern cities.
Only after the Civil War did slavery and cities become separate in the minds of Americans, as postwar whites north and south created a mythical South in which romanticized antebellum cotton plantations became the primary symbol of American slavery, regardless of the long history of slavery that preceded their existence.
Christian J. Koot
Smuggling was a regular feature of the economy of colonial British America in the 17th and 18th centuries. Though the very nature of illicit commerce means that the extent of this trade is incalculable, a wide variety of British and colonial sources testify to the ability of merchants to trade where they pleased and to avoid paying duties in the process. Together admiralty proceedings, merchant correspondence and account books, customs reports, and petitions demonstrate that illicit trade enriched individuals and allowed settlers to shape their colonies’ development. Smuggling formed in resistance to British economic and political control. British authorities attempted to harness the trade of their Atlantic colonies by employing a series of laws that restricted overseas commerce (often referred to as the Navigation Acts). This legislation created the opportunity for illicit trade by raising the costs of legal trade. Hampered by insufficient resources, thousands of miles of coastline, and complicit local officials, British customs agents could not prevent smuggling. Economic self-interest and the pursuit of profit certainly motivated smugglers, but because it was tied to a larger transatlantic debate about the proper balance between regulation and free trade, smuggling was also a political act. Through smuggling colonists rejected what they saw as capricious regulations designed to enrich Britain at their expense.
Janine Giordano Drake
The term “Social Gospel” was coined by ministers and other well-meaning American Protestants with the intention of encouraging the urban and rural poor to understand that Christ cared about them and saw their struggles. The second half of the 19th century saw a rise of both domestic and international missionary fervor. Church and civic leaders feared a future in which freethinkers, agnostics, atheists, and other skeptics dominated spiritual life and well-educated ministers were marginal to American culture. They grew concerned with the rising number of independent and Pentecostal churches without extensive theological training or denominational authority. American Protestants especially feared that immigrant religious and cultural traditions, including Roman Catholicism, Judaism, and Eastern Orthodox Christianity, were not quintessentially American. Most of all, they worried that those belief systems could not promote what they saw as the traditional American values and mores central to the nation.
However, at least on the surface, the Social Gospel did not dwell on extinguishing ideas or traditions. Rather, as was typical of the Progressive Era, it forwarded a wide-ranging set of visions that emphasized scientific and professional expertise, guided by Christian ethics, to solve social and political problems. It fostered an energetic culture of conferences, magazines, and paperback books dedicated to reforming the nation. Books and articles unpacked social surveys that sorted through possible solutions to urban and rural poverty and reported on productive relationships between churches and municipal governments. Pastoral conferences often focused on planning revivals in urban auditoriums, churches, stadiums, or the open air, where participants not only were confronted with old-fashioned gospel messages but with lectures on what Christians could do to improve their communities.
The Social Gospel’s theological turn stressed the need for both individual redemption from sinful behavior, and the redemption of whole societies from damaged community relationships. Revivalists not only entreated listeners to reject personal habits like drinking, smoking, chewing tobacco, gambling, theater-going, and extramarital sex. They also encouraged listeners to replace the gathering space of the saloon with churches, schools, and public parks. Leaders usually saw themselves redeeming the “social sin” that produced impoverished neighborhoods, low-wage jobs, preventable diseases, and chronic unemployment and offering alternatives that kept businesses intact. In the Social Creed of the Churches (1908), ministers across the denominations proposed industrial reforms limiting work hours and improving working conditions, as well as government regulations setting a living wage and providing protection for the injured, sick, and elderly. Sometimes, Social Gospel leaders defended collective bargaining and built alliances with labor leaders. At other times, they proposed palliative solutions that would instill Christian “brotherhood” on the shop floor and render unions unnecessary. This wavering on principles produced complicated and sometimes tense relationships among union leaders, workers, and Social Gospel leaders.
Elements of the Social Gospel movement have carried even into the 21st century, leading some historians to challenge the idea that the movement died with the close of the Great War. The American Civil Liberties Union and Fellowship of Reconciliation, for example, did not lose any time in keeping alive the Social Gospel’s commitments to protecting the poor and defenseless. However, the rise of “premillennial dispensationalist” theology and the general disillusionment produced by the war’s massive casualties marked a major turning point, if not an endpoint, to the Social Gospel’s influence as a well-funded, Protestant evangelical force. The brutality of the war undermined American optimism—much of it fueled by Social Gospel thinking—about creating a more just, prosperous, and peaceful world. Meanwhile, attorney general A. Mitchell Palmer’s campaign against alleged anarchists and Bolsheviks immediately after the war—America’s first “Red Scare”—targeted a large number of labor and religious organizations with the accusation that socialist ideas were undemocratic and un-American. By the 1920s, many Social Gospel leaders had distanced themselves from the organized working classes. They either accepted new arrangements for harmonizing the interests of labor and capital or took their left-leaning political ideals underground.
Since the social sciences began to emerge as scholarly disciplines in the last quarter of the 19th century, they have frequently offered authoritative intellectual frameworks that have justified, and even shaped, a variety of U.S. foreign policy efforts. They played an important role in U.S. imperial expansion in the late 19th and early 20th centuries. Scholars devised racialized theories of social evolution that legitimated the confinement and assimilation of Native Americans and endorsed civilizing schemes in the Philippines, Cuba, and elsewhere. As attention shifted to Europe during and after World War I, social scientists working at the behest of Woodrow Wilson attempted to engineer a “scientific peace” at Versailles. The desire to render global politics the domain of objective, neutral experts intensified during World War II and the Cold War. After 1945, the social sciences became increasingly central players in foreign affairs, offering intellectual frameworks—like modernization theory—and bureaucratic tools—like systems analysis—that shaped U.S. interventions in developing nations, guided nuclear strategy, and justified the increasing use of the U.S. military around the world.
Throughout these eras, social scientists often reinforced American exceptionalism—the notion that the United States stands at the pinnacle of social and political development, and as such has a duty to spread liberty and democracy around the globe. The scholarly embrace of conventional political values was not the result of state coercion or financial co-optation; by and large social scientists and policymakers shared common American values. But other social scientists used their knowledge and intellectual authority to critique American foreign policy. The history of the relationship between social science and foreign relations offers important insights into the changing politics and ethics of expertise in American public policy.
K. Tsianina Lomawaima
In 1911, a group of American Indian intellectuals organized what would become known as the Society of American Indians, or SAI. SAI members convened in annual meetings between 1911 and 1923, and for much of that period the Society’s executive offices were a hub for political advocacy, lobbying Congress and the Office of Indian Affairs (OIA), publishing a journal, offering legal assistance to Native individuals and tribes, and maintaining an impressively voluminous correspondence across the country with American Indians, “Friends of the Indian” reformers, political allies, and staunch critics. Notable Native activists, clergy, entertainers, professionals, speakers, and writers—as well as Native representatives from on- and off-reservation communities—were active in the Society. They worked tirelessly to meet daunting, unrealistic expectations, principally to deliver a unified voice of Indian “public opinion” and to pursue controversial political goals without appearing too radical, especially obtaining U.S. citizenship for Indian individuals and allowing Indian nations to access the U.S. Court of Claims. They maintained their myriad activities with scant financial resources solely through the unpaid labor of dedicated Native volunteers. By 1923, the challenges exhausted the Society’s substantial human and minuscule financial capital. The Native “soul of unity” demanded by non-Native spectators and hoped for by SAI leaders could no longer hold the center, and the SAI dissolved. Their work was not in vain, but citizenship and the ability to file claims materialized in circumscribed forms. In 1924 Congress passed the Indian Citizenship Act, granting birthright citizenship to American Indians, but citizenship for Indians was deemed compatible with continued wardship status. In 1946 Congress established an Indian Claims Commission, not a court, and successful claims could only result in monetary compensation, not regained lands.
Soldiers enlisted in the Union Army from every state in the Union and the Confederacy. The initial volunteers were motivated to preserve the accomplishments of the American Revolution and save the world’s hope that democratic government could survive. They were influenced by their culture’s ideals of manhood and republican ideals of the citizen soldier. They served in regiments that retained close ties with their sending communities throughout the war.
Recruits faced a difficult adjustment period when their units were mustered into the US Army. The test of battle taught soldiers to value some drills and discipline, but many soldiers insisted that officers respect their independence and equality. Soldiers successfully resisted many aspects of formal military discipline. Army life exposed conflicts between soldiers who sought to create moral regiments and soldiers who displayed manliness through fighting and drinking. Establishing honor before peers was an important component of soldier life. Effective soldiering involved enduring the boredom and disease of camp, the rigors of marching, and the terror of battle. To survive, soldiers formed close bonds with their comrades, mastered self-care techniques to stay healthy, applied skills learned from their civilian occupations on the battlefield, and remained connected to their families and communities. Conscription changed the character of the Union Army. Officers tightened discipline over the influx of lower-class “roughs.”
Union soldiers generally demonized their enemies as inferior barbarians. Because of their interaction with slaves in the South, Union soldiers quickly shifted their support to emancipation. Although Christianity and ideals of civilized behavior placed some restraints on Union soldiers when they encountered southerners, they supported and implemented hard war measures against the South’s population and resources, and treated guerrillas and their supporters with particular brutality. In the election of 1864, Union soldiers voted to fight until the Confederacy was defeated.
Chia Youyee Vang
In geopolitical terms, the Asian sub-region Southeast Asia consists of ten countries that are organized under the Association of Southeast Asian Nations (ASEAN). Current member nations include Brunei Darussalam, Kingdom of Cambodia, Republic of Indonesia, Lao People’s Democratic Republic (Laos), Malaysia, Republic of the Union of Myanmar (formerly Burma), Republic of the Philippines, Singapore, Kingdom of Thailand, and Socialist Republic of Vietnam. The term Southeast Asian Americans has been shaped largely by the flow of refugees from the American War in Vietnam; however, Americans with origins in Southeast Asia have much more diverse migration and settlement experiences that are intricately tied to the complex histories of colonialism, imperialism, and war from the late 19th through the end of the 20th century. A commonality across Southeast Asian American groups today is that their immigration history resulted primarily from the political and military involvement of the United States in the region, aimed at building the United States as a global power. From Filipinos during the Spanish-American War in 1898 to Vietnamese, Cambodian, Lao, and Hmong refugees from the American War in Vietnam, military interventions generated migration flows that, once begun, became difficult to stop. Complicating this history is its role in supporting the international humanitarian apparatus by creating the possibility for displaced people to seek refuge in the United States. Additionally, the relationships between the United States, Malaysia, Indonesia, and Singapore are different from those of other Southeast Asian countries involved in the Vietnam War. Consequently, today’s Southeast Asian Americans are heterogeneous with varying levels of acculturation to U.S. society.
During the 1890s, the word segregation became the preferred term for the practice of coercing different groups of people, especially those designated by race, to live in separate and unequal urban residential neighborhoods. In the southern states of the United States, segregationists imported the word—originally used in the British colonies of Asia—to describe Jim Crow laws, and, in 1910, whites in Baltimore passed a “segregation ordinance” mandating separate black and white urban neighborhoods. Copycat legislation sprang up in cities across the South and the Midwest. But in 1917, a multiracial team of lawyers from the fledgling National Association for the Advancement of Colored People (NAACP) mounted a successful legal challenge to these ordinances in the U.S. Supreme Court—even as urban segregation laws were adopted in other places in the world, most notably in South Africa. The collapse of the movement for legislated racial segregation in the United States occurred just as African Americans began migrating in large numbers into cities in all regions of the United States, resulting in waves of anti-black mob violence. Segregationists were forced to rely on nonstatutory or formally nonracial techniques. In Chicago, an alliance of urban reformers and real estate professionals invented alternatives to explicitly racist segregation laws. The practices they promoted nationwide created one of the most successful forms of urban racial segregation in world history, rivaling and finally outliving South African apartheid. Understanding how this system came into being and how it persists today requires understanding both how the Chicago segregationists were connected to counterparts elsewhere in the world and how they adapted practices of city-splitting to suit the peculiarities of racial politics in the United States.
Peter C. Baldwin
Today the term nightlife typically refers to social activities in urban commercial spaces—particularly drinking, dancing, dining, and listening to live musical performances. This was not always so. Cities in the 18th and early 19th centuries knew relatively limited nightlife, most of it occurring in drinking places for men. Theater attracted mixed-gender audiences but was sometimes seen as disreputable in both its content and the character of the audience. Theater owners worked to shed this negative reputation starting in the mid-19th century, while nightlife continued to be tainted by the profusion of saloons, brothels, and gambling halls. Gradual improvements in street lighting and police protection encouraged people to go out at night, as did growing incomes and decreasing hours of labor. Nightlife attracted more women in the decades around 1900 as it expanded and diversified. Dance halls, vaudeville houses, movie theaters, restaurants, and cabarets thrived in the electrified “bright lights” districts of central cities. Commercial entertainment contracted again in the 1950s and 1960s as Americans spent more of their evening leisure hours watching television and began to regard urban public spaces with suspicion. Still, nightlife is viewed as an important component of urban economic life and is actively promoted by many municipal governments.
Over the first half of the 20th century, Rabbi Stephen S. Wise (1874–1949) devoted himself to solving the most controversial social and political problems of his day: corruption in municipal politics, abuse of industrial workers, women’s second-class citizenship, nativism and racism, and global war. He considered his activities an effort to define “Americanism” and apply its principles toward humanity’s improvement. On the one hand, Wise joined a long tradition of American Christian liberals committed to seeing their fellow citizens as their equals and to grounding this egalitarianism in their religious beliefs. On the other hand, he was in the vanguard of the Jewish Reform, or what he referred to as the Liberal Judaism movement, with its commitment to apply Jewish moral teachings to improve the world. His life’s work demonstrated that the two—liberal democracy and Liberal Judaism—went hand in hand. And while concerned with equality and justice, Wise’s Americanism had a democratic elitist character. His advocacy to engage the public on the meaning of citizenship and the role of the state relied on his own Jewish, male, and economically privileged perspective as well as those of an elite circle of political and business leaders, intellectual trendsetters, social scientists, philanthropists, labor leaders, and university faculty. In doing so, Wise drew on Jewish liberal teachings, transformed America’s liberal tradition, and helped to remake America’s national understanding of itself.
Conceptions of what constitutes a street gang or a youth gang have varied since the seminal sociological studies on these entities in the 1920s. Organizations of teenage youths and young adults in their twenties, congregating in public spaces and acting collectively, were fixtures of everyday life in American cities throughout the 20th century. While few studies historicize gangs in their own right, historians in a range of subfields cast gangs as key actors in critical dimensions of the American urban experience: the formation and defense of ethno-racial identities and communities; the creation and maintenance of segregated metropolitan spaces; the shaping of gender norms and forms of sociability in working-class districts; the structuring of contentious political mobilization challenging police practices and municipal policies; the evolution of underground and informal economies and organized crime activities; and the epidemic of gun violence that spread through minority communities in many major cities at the end of the 20th and beginning of the 21st centuries.
Although groups of white youths patrolling the streets of working-class neighborhoods and engaging in acts of defensive localism were commonplace in the urban Northeast, Mid-Atlantic, and Midwest states by the mid-19th century, street gangs exploded onto the urban landscape in the early 20th century as a consequence of massive demographic changes related to the wave of immigration from Europe, Asia, and Latin America and the migration of African Americans from the South. As immigrants and migrants moved into urban working-class neighborhoods and industrial workplaces, street gangs proliferated at the boundaries of ethno-racially defined communities, shaping the context within which immigrant and second-generation youths negotiated Americanization and learned the meanings of race and ethnicity. Although social workers in some cities noted the appearance of some female gangs by the 1930s, the milieu of youth gangs during this era was male dominated, and codes of honor and masculinity were often at stake in increasingly violent clashes over territory and resources like parks and beaches.
The interplay of race, ethnicity, and masculinity continued to shape the world of gangs in the 1940s and 1950s, when white male gangs claiming to defend the whiteness of their communities used terror tactics to reinforce the boundaries of ghettos and barrios in many cities. Such aggressions spurred the formation of fighting gangs in black and Latino neighborhoods, where youths entered into at times deadly combat against their aggressors but also fought for honor, respect, and status with rivals within their communities. In the 1960s and 1970s, with civil rights struggles and ideologies of racial empowerment circulating through minority neighborhoods, some of these same gangs, often with the support of community organizers affiliated with political organizations like the Black Panther Party, turned toward defending the rights of their communities and participating in contentious politics. However, such projects were cut short by the fierce repression of gangs in minority communities by local police forces, working at times in collaboration with the Federal Bureau of Investigation. By the mid-1970s, following the withdrawal of the Black Panthers and other mediating organizations from cities like Chicago and Los Angeles, so-called “super-gangs” claiming the allegiance of thousands of youths began federating into opposing camps—“People” against “Folks” in Chicago, “Crips” against “Bloods” in LA—to wage war for control of emerging drug markets. In the 1980s and 1990s, with minority communities dealing with high unemployment, cutbacks in social services, failing schools, hyperincarceration, drug trafficking, gun violence, and toxic relations with increasingly militarized police forces waging local “wars” against drugs and gangs, gangs proliferated in cities throughout the urban Sun Belt. Their prominence within popular and political discourse nationwide made them symbols of the urban crisis and of the cultural deficiencies that some believed had caused it.
Stephen H. Norwood
Strikebreakers have been drawn from many parts of the American population, most notably the permanently and seasonally unemployed and underemployed. Excluded from a vast range of occupations and shunned by many trade unions, African Americans constituted another potential pool of strikebreakers, especially during the early decades of the 20th century. During the first quarter of the 20th century, college students enthusiastically volunteered for strikebreaking, both because of their generally pro-business outlook and because of a desire to test their manhood in violent clashes.
A wide array of private and government forces has suppressed strikes. Beginning in the late 19th century, private detective agencies supplied guards who protected company property against strikers, sometimes assaulting them. During the early 20th century, several firms emerged that supplied strikebreakers and guards at companies’ request, drawing on what amounted to private armies of thousands of men. The largest of these operated nationally.
On many occasions the state itself intervened to break strikes. Like some strikebreaking firms, state militiamen deployed advanced weaponry against strikers and their sympathizers, including machine guns. Presidents Hayes and Cleveland called out federal troops to break the 1877 and 1894 interregional railroad strikes. In 1905, Pennsylvania established an elite mounted force, modeled on the British constabulary patrols in Ireland, to suppress coal miners’ strikes.
Corporations directly intervened to break strikes, building weapons arsenals, including large supplies of tear gas, that they distributed to police forces. They initiated “back to work” movements to destroy strikers’ morale and used their considerable influence with the media to propagandize in the press and on the radio. Corporations, of course, discharged strikers, often permanently.
In the highly bureaucratized society of the late 20th and early 21st centuries, which stigmatized public displays of anger, management turned to new “union avoidance” firms to break strikes. These firms emphasized legal and psychological methods rather than violence. They advised employers on how to blur the line between management and labor, defame union leaders and activists, and sow discord among strikers.