The American Revolution was an episode in a transatlantic outcry against the corruption of the British balance of power and liberty institutionalized in the Glorious Revolution of 1688–1689. English speakers during the 18th century reflected on this constitutional crisis within a larger conversation about the problem of human governance. Although many people excluded from Parliament supported political reform, if not revolution, they also sought remedies for the perversion of political power and influence in new forms of social power and influence. This article looks at the convergence of political and social discussions in a common discourse about the nature of power and the ways in which human beings influenced each other. The first section outlines the meanings of power and influence in British politics. The second section uses the novelist Sarah Fielding’s Remarks on Clarissa (1749) to delineate revolutionary notions about social power and influence. The third section turns to the speeches and writings of Edmund Burke in the run-up to the American Revolution to look at how English speakers deployed notions of social power to advocate for political reform.
Christopher W. Schmidt
One of the most significant protest campaigns of the civil rights era, the lunch counter sit-in movement began on February 1, 1960, when four young African American men sat down at the whites-only lunch counter of the Woolworth store in Greensboro, North Carolina. Refused service, the four college students sat quietly until the store closed. They continued their protest on the following days, each day joined by more fellow students. Students in other southern cities learned what was happening and started their own demonstrations, and in just weeks, lunch counter sit-ins were taking place across the South. By the end of the spring, tens of thousands of black college and high school students, joined in some cases by sympathetic white students, had joined the sit-in movement. Several thousand went to jail for their efforts after being arrested on charges of trespass, disorderly conduct, or whatever other laws southern police officers believed they could use against the protesters.
The sit-ins arrived at a critical juncture in the modern black freedom struggle. The preceding years had brought major breakthroughs, such as the Supreme Court’s Brown v. Board of Education school desegregation ruling in 1954 and the successful Montgomery bus boycott of 1955–1956, but by 1960, activists were struggling to develop next steps. The sit-in movement energized and transformed the struggle for racial equality, moving the leading edge of the movement from the courtrooms and legislative halls to the streets and putting a new, younger generation of activists on the front lines. It gave birth to the Student Nonviolent Coordinating Committee, one of the most important activist groups of the 1960s. It directed the nation’s attention to the problem of racial discrimination in private businesses that served the public, pressured business owners in scores of southern cities to open their lunch counters to African American customers, and set in motion a chain of events that would culminate in the Civil Rights Act of 1964, which banned racial discrimination in public accommodations across the nation.
The tall building—the most popular and conspicuous emblem of the modern American city—stands as an index of economic activity, civic aspirations, and urban development. Enmeshed in the history of American business practices and the maturation of corporate capitalism, the skyscraper is also a cultural icon that performs genuine symbolic functions. Viewed individually or arrayed in a “skyline,” tall buildings invite a focus on their spectacular or superlative aspects. Their patrons have searched for the architectural symbols that would project a positive public image, yet the height and massing of skyscrapers were determined as much by prosaic financial calculations as by symbolic pretense. Historically, the production of tall buildings was linked to the broader flux of economic cycles, access to capital, land values, and regulatory frameworks that curbed the self-interests of individual builders in favor of public goods such as light and air. The tall building looms large for urban geographers seeking to chart the shifting terrain of the business district and for social historians of the city who examine the skyscraper’s gendered spaces and labor relations. If tall buildings provide one index of the urban and regional economy, they are also economic activities in and of themselves and thus linked to the growth of professions required to plan, finance, design, construct, market, and manage these mammoth collective objects—and all have vied for control over the ultimate result. Practitioners have debated the tall building’s external expression as the design challenge of the façade became more acute with the advent of the curtain wall attached to a steel frame, eventually dematerializing entirely into sheets of reflective glass. The tall building also reflects prevailing paradigms in urban design, from the retail arcades of 19th-century skyscrapers to the blank plazas of postwar corporate modernism.
The patterns of urban slavery in North American and pre-Civil War US cities reveal the ways in which individual men and women, as well as businesses, institutions, and governmental bodies, employed slave labor and readily adapted the system of slavery to their economic needs and desires. Colonial cities east and west of the Mississippi River, founded initially as military forts, trading posts, and maritime ports, relied on African and Native American slave labor from their beginnings. The importance of slave labor increased in Anglo-American East Coast urban settings in the 18th century as the number of enslaved Africans increased in these colonies, particularly in response to the growth of the tobacco, wheat, and rice industries in the southern colonies. The focus on African slavery led most Anglo-American colonies to outlaw the enslavement of Native Americans, and urban slavery on the East Coast became associated almost solely with people of African descent. In addition, these cities became central nodes in the circum-Atlantic transportation and sale of enslaved people, slave-produced goods, and provisions for slave colonies whose economies centered on plantation goods. West of the Mississippi, urban enslavement of Native Americans, Mexicans, and even a few Europeans continued through the 19th century.
As the thirteen British colonies transitioned to the United States during and after the Revolutionary War, three different directions emerged regarding the status of slavery, which would affect the status of slavery and people of African descent in cities. The gradual emancipation of enslaved people in states north of Delaware led to the creation of the so-called free states, with large numbers of free blacks moving into cities to take full advantage of freedom and the possibility of creating family and community. Although antebellum northern cities were located within areas where legalized slavery ended, these cities retained economic and political ties to southern slavery. At the same time, the radical antislavery movement developed in Philadelphia, Boston, and New York. Thus, Northern cities were the site of political conflicts between pro- and antislavery forces. In the Chesapeake, as the tobacco economy declined, slave owners manumitted enslaved blacks for whom they did not have enough work, creating large groups of free blacks in cities. But these states began to participate heavily in the domestic slave trade, with important businesses located in cities. And in the Deep South, the recommitment to slavery following the Louisiana Purchase and the emergence of the cotton economy led to the creation of a string of wealthy port cities critical to the transportation of slaves and goods. These cities were situated in local economic geographies that connected rural plantations to urban settings and in national and international economies of exchange of raw and finished goods that fueled industries throughout the Atlantic world. The vast majority of enslaved people employed in the antebellum South worked on rural farms, but slave labor was a key part of the labor force in southern cities. 
Only after the Civil War did slavery and cities become separate in the minds of Americans, as postwar whites north and south created a mythical South in which romanticized antebellum cotton plantations became the primary symbol of American slavery, regardless of the long history of slavery that preceded their existence.
Canada has sometimes been called the United States’ attic: a useful feature, but one easily forgotten. Of all countries, it has historically resembled the United States the most closely, in terms of culture, geography, economy, society, politics, ideology and, especially, history. A shared culture—literary, social, legal, and political—is a crucial factor in Canadian-American relations. Geography is at least as important. It provides the United States with strategic insulation to the north and enhances geographic isolation to the east and west. North-south economic links are inevitable and very large. It has been a major recipient of American investment, and for most of the time since 1920 has been the United States’ principal trading partner. Prosperous and self-sufficient, it has seldom required American aid. There have been no overtly hostile official encounters since the end of the War of 1812, partly because many Americans tended to believe that Canadians would join the republic; when that did not occur, the United States accepted an independent but friendly Canada as a permanent, useful, and desirable neighbor—North America’s attic. The insulation the attic provided was a common belief in the rule of law, both domestic and international; liberal democracy; a federal constitution; liberal capitalism; and liberal international trade regimes.
That said, the United States, with its large population, huge economy, and military power, insulates Canada from hostile external forces. An attack on Canada from outside the continent is hard to imagine without a simultaneous attack on the United States. Successive American and Canadian governments have reaffirmed the political status quo while favoring mutually beneficial economic and military linkages—bilateral and multilateral. Relations have traditionally been grounded in a negotiating style that is evidence-based, proceeding issue by issue. A sober diplomatic and political context sometimes frames irritations and exclamations, but even these have usually been defined and limited by familiarity. For example, there has always been anti-Americanism in Canada. Most often it consists of sentiments derived from the United States itself, channeled by cultural similarities. No American idea, good or bad, from liberalism to populism, fails to find an echo in Canada. How loud or how soft the echo makes the difference.
Christian J. Koot
Smuggling was a regular feature of the economy of colonial British America in the 17th and 18th centuries. Though the very nature of illicit commerce means that the extent of this trade is incalculable, a wide variety of British and colonial sources testify to the ability of merchants to trade where they pleased and to avoid paying duties in the process. Together admiralty proceedings, merchant correspondence and account books, customs reports, and petitions demonstrate that illicit trade enriched individuals and allowed settlers to shape their colonies’ development. Smuggling developed as a form of resistance to British economic and political control. British authorities attempted to harness the trade of their Atlantic colonies by employing a series of laws that restricted overseas commerce (often referred to as the Navigation Acts). This legislation created the opportunity for illicit trade by raising the costs of legal trade. Hampered by insufficient resources, thousands of miles of coastline, and complicit local officials, British customs agents could not prevent smuggling. Economic self-interest and the pursuit of profit certainly motivated smugglers, but because it was tied to a larger transatlantic debate about the proper balance between regulation and free trade, smuggling was also a political act. Through smuggling colonists rejected what they saw as capricious regulations designed to enrich Britain at their expense.
Janine Giordano Drake
The term “Social Gospel” was coined by ministers and other well-meaning American Protestants with the intention of encouraging the urban and rural poor to understand that Christ cared about them and saw their struggles. The second half of the 19th century saw a rise of both domestic and international missionary fervor. Church and civic leaders feared a future in which freethinkers, agnostics, atheists, and other skeptics dominated spiritual life and well-educated ministers were marginal to American culture. They grew concerned with the rising number of independent and Pentecostal churches whose leaders lacked extensive theological training or denominational authority. American Protestants especially feared that immigrant religious and cultural traditions, including Roman Catholicism, Judaism, and Eastern Orthodox Christianity, were not quintessentially American. Most of all, they worried that those belief systems could not promote what they saw as the traditional American values and mores central to the nation.
However, at least on the surface, the Social Gospel did not dwell on extinguishing ideas or traditions. Rather, as was typical of the Progressive Era, it forwarded a wide-ranging set of visions that emphasized scientific and professional expertise, guided by Christian ethics, to solve social and political problems. It fostered an energetic culture of conferences, magazines, and paperback books dedicated to reforming the nation. Books and articles unpacked social surveys that sorted through possible solutions to urban and rural poverty and reported on productive relationships between churches and municipal governments. Pastoral conferences often focused on planning revivals in urban auditoriums, churches, stadiums, or the open air, where participants not only were confronted with old-fashioned gospel messages but with lectures on what Christians could do to improve their communities.
The Social Gospel’s theological turn stressed the need both for individual redemption from sinful behavior and for the redemption of whole societies from damaged community relationships. Revivalists not only entreated listeners to reject personal habits like drinking, smoking, chewing tobacco, gambling, theater-going, and extramarital sex. They also encouraged listeners to replace the gathering space of the saloon with churches, schools, and public parks. Leaders usually saw themselves redeeming the “social sin” that produced impoverished neighborhoods, low-wage jobs, preventable diseases, and chronic unemployment and offering alternatives that kept businesses intact. In the Social Creed of the Churches (1908), ministers across the denominations proposed industrial reforms limiting work hours and improving working conditions, as well as government regulations setting a living wage and providing protection for the injured, sick, and elderly. Sometimes, Social Gospel leaders defended collective bargaining and built alliances with labor leaders. At other times, they proposed palliative solutions that would instill Christian “brotherhood” on the shop floor and render unions unnecessary. This wavering on principles produced complicated and sometimes tense relationships among union leaders, workers, and Social Gospel leaders.
Elements of the Social Gospel movement have carried even into the 21st century, leading some historians to challenge the idea that the movement died with the close of the Great War. The American Civil Liberties Union and Fellowship of Reconciliation, for example, did not lose any time in keeping alive the Social Gospel’s commitments to protecting the poor and defenseless. However, the rise of “premillennial dispensationalist” theology and the general disillusionment produced by the war’s massive casualties marked a major turning point, if not an endpoint, to the Social Gospel’s influence as a well-funded, Protestant evangelical force. The brutality of the war undermined American optimism—much of it fueled by Social Gospel thinking—about creating a more just, prosperous, and peaceful world. Meanwhile, Attorney General A. Mitchell Palmer’s campaign against alleged anarchists and Bolsheviks immediately after the war—America’s first “Red Scare”—targeted a large number of labor and religious organizations with the accusation that socialist ideas were undemocratic and un-American. By the 1920s, many Social Gospel leaders had distanced themselves from the organized working classes. They either accepted new arrangements for harmonizing the interests of labor and capital or took their left-leaning political ideals underground.
Since the social sciences began to emerge as scholarly disciplines in the last quarter of the 19th century, they have frequently offered authoritative intellectual frameworks that have justified, and even shaped, a variety of U.S. foreign policy efforts. They played an important role in U.S. imperial expansion in the late 19th and early 20th centuries. Scholars devised racialized theories of social evolution that legitimated the confinement and assimilation of Native Americans and endorsed civilizing schemes in the Philippines, Cuba, and elsewhere. As attention shifted to Europe during and after World War I, social scientists working at the behest of Woodrow Wilson attempted to engineer a “scientific peace” at Versailles. The desire to render global politics the domain of objective, neutral experts intensified during World War II and the Cold War. After 1945, the social sciences became increasingly central players in foreign affairs, offering intellectual frameworks—like modernization theory—and bureaucratic tools—like systems analysis—that shaped U.S. interventions in developing nations, guided nuclear strategy, and justified the increasing use of the U.S. military around the world.
Throughout these eras, social scientists often reinforced American exceptionalism—the notion that the United States stands at the pinnacle of social and political development, and as such has a duty to spread liberty and democracy around the globe. The scholarly embrace of conventional political values was not the result of state coercion or financial co-optation; by and large social scientists and policymakers shared common American values. But other social scientists used their knowledge and intellectual authority to critique American foreign policy. The history of the relationship between social science and foreign relations offers important insights into the changing politics and ethics of expertise in American public policy.
K. Tsianina Lomawaima
In 1911, a group of American Indian intellectuals organized what would become known as the Society of American Indians, or SAI. SAI members convened in annual meetings between 1911 and 1923, and for much of that period the Society’s executive offices were a hub for political advocacy, lobbying Congress and the Office of Indian Affairs (OIA), publishing a journal, offering legal assistance to Native individuals and tribes, and maintaining an impressively voluminous correspondence across the country with American Indians, “Friends of the Indian” reformers, political allies, and staunch critics. Notable Native activists, clergy, entertainers, professionals, speakers, and writers—as well as Native representatives from on- and off-reservation communities—were active in the Society. They worked tirelessly to meet daunting, unrealistic expectations, principally to deliver a unified voice of Indian “public opinion” and to pursue controversial political goals without appearing too radical, especially obtaining U.S. citizenship for Indian individuals and allowing Indian nations to access the U.S. Court of Claims. They maintained their myriad activities with scant financial resources solely through the unpaid labor of dedicated Native volunteers. By 1923, the challenges exhausted the Society’s substantial human and miniscule financial capital. The Native “soul of unity” demanded by non-Native spectators and hoped for by SAI leaders could no longer hold the center, and the SAI dissolved. Their work was not in vain, but citizenship and the ability to file claims materialized in circumscribed forms. In 1924 Congress passed the Indian Citizenship Act, granting birthright citizenship to American Indians, but citizenship for Indians was deemed compatible with continued wardship status. In 1946 Congress established an Indian Claims Commission, not a court, and successful claims could only result in monetary compensation, not regained lands.
Soldiers enlisted in the Union Army from every state in the Union and the Confederacy. The initial volunteers were motivated to preserve the accomplishments of the American Revolution and save the world’s hope that democratic government could survive. They were influenced by their culture’s ideals of manhood and republican ideals of the citizen soldier. They served in regiments that retained close ties with their sending communities throughout the war.
Recruits faced a difficult adjustment period when their units were mustered into the US Army. The test of battle taught soldiers to value some drills and discipline, but many soldiers insisted that officers respect their independence and equality. Soldiers successfully resisted many aspects of formal military discipline. Army life exposed conflicts between soldiers who sought to create moral regiments and soldiers who displayed manliness through fighting and drinking. Establishing honor before peers was an important component of soldier life. Effective soldiering involved enduring the boredom and disease of camp, the rigors of marching, and the terror of battle. To survive, soldiers formed close bonds with their comrades, mastered self-care techniques to stay healthy, applied skills learned from their civilian occupations on the battlefield, and remained connected to their families and communities. Conscription changed the character of the Union Army. Officers tightened discipline over the influx of lower-class “roughs.”
Union soldiers generally demonized their enemies as inferior barbarians. Because of their interaction with slaves in the South, Union soldiers quickly shifted their support to emancipation. Although Christianity and ideals of civilized behavior placed some restraints on Union soldiers when they encountered southerners, they supported and implemented hard war measures against the South’s population and resources, and treated guerrillas and their supporters with particular brutality. In the election of 1864, Union soldiers voted to fight until the Confederacy was defeated.
Chia Youyee Vang
In geopolitical terms, the Asian sub-region Southeast Asia consists of ten countries that are organized under the Association of Southeast Asian Nations (ASEAN). Current member nations include Brunei Darussalam, Kingdom of Cambodia, Republic of Indonesia, Lao People’s Democratic Republic (Laos), Malaysia, Republic of the Union of Myanmar (formerly Burma), Republic of the Philippines, Singapore, Kingdom of Thailand, and Socialist Republic of Vietnam. The term Southeast Asian Americans has been shaped largely by the flow of refugees from the American War in Vietnam; however, Americans with origins in Southeast Asia have much more diverse migration and settlement experiences that are intricately tied to the complex histories of colonialism, imperialism, and war from the late 19th through the end of the 20th century. A commonality across Southeast Asian American groups today is that their immigration history resulted primarily from the political and military involvement of the United States in the region, aimed at building the United States as a global power. From Filipinos during the Spanish-American War in 1898 to Vietnamese, Cambodian, Lao, and Hmong refugees from the American War in Vietnam, military interventions generated migration flows that, once begun, became difficult to stop. Complicating this history is its role in supporting the international humanitarian apparatus by creating the possibility for displaced people to seek refuge in the United States. Additionally, the relationships between the United States, Malaysia, Indonesia, and Singapore are different from those of other Southeast Asian countries involved in the Vietnam War. Consequently, today’s Southeast Asian Americans are heterogeneous with varying levels of acculturation to U.S. society.
The Spanish-American War is best understood as a series of linked conflicts. Those conflicts punctuated Madrid’s decline to a third-rank European state and marked the United States’ transition from a regional to an imperial power. The central conflict was a brief conventional war fought in the Caribbean and the Pacific between Madrid and Washington. Those hostilities were preceded and followed by protracted and costly guerrilla wars in Cuba and the Philippines. The Spanish-American War was the consequence of the protracted stalemate in the Spanish-Cuban War. The economic and humanitarian distress which accompanied the fighting made it increasingly difficult for the United States to remain neutral until a series of Spanish missteps and bad fortune in early 1898 hastened the American entry into the war. The US Navy quickly moved to eliminate or blockade the strongest Spanish squadrons in the Philippines and Cuba; Spain’s inability to contest American control of the sea in either theater was decisive and permitted successful American attacks on outnumbered Spanish garrisons in Santiago de Cuba, Puerto Rico, and Manila. The transfer of the Philippines, along with Cuba, Puerto Rico, and Guam, to the United States in the Treaty of Paris confirmed American imperialist appetites in the eyes of the Filipino nationalists, led by Emilio Aguinaldo, and contributed to tensions between the Filipino and American armies around and in Manila. Fighting broke out in February 1899, but the Filipino conventional forces were soon driven back from Manila and were utterly defeated by the end of the year. The Filipino forces that evaded capture re-emerged as guerrillas in early 1900, and for the next two and a half years the United States waged an increasingly severe anti-guerrilla war against Filipino irregulars.
Despite Aguinaldo’s capture in early 1901, fighting continued in a handful of provinces until the spring of 1902, when the last organized resistance to American governance ended in Samar and Batangas provinces.
During the 1890s, the word segregation became the preferred term for the practice of coercing different groups of people, especially those designated by race, to live in separate and unequal urban residential neighborhoods. In the southern states of the United States, segregationists imported the word—originally used in the British colonies of Asia—to describe Jim Crow laws, and, in 1910, whites in Baltimore passed a “segregation ordinance” mandating separate black and white urban neighborhoods. Copy-cat legislation sprang up in cities across the South and the Midwest. But in 1917, a multiracial team of lawyers from the fledgling National Association for the Advancement of Colored People (NAACP) mounted a successful legal challenge to these ordinances in the U.S. Supreme Court—even as urban segregation laws were adopted in other places in the world, most notably in South Africa. The collapse of the movement for legislated racial segregation in the United States occurred just as African Americans began migrating in large numbers into cities in all regions of the United States, resulting in waves of anti-black mob violence. Segregationists were forced to rely on nonstatutory or formally nonracial techniques. In Chicago, an alliance of urban reformers and real estate professionals invented alternatives to explicitly racist segregation laws. The practices they promoted nationwide created one of the most successful forms of urban racial segregation in world history, rivaling and finally outliving South African apartheid. Understanding how this system came into being and how it persists today requires understanding both how the Chicago segregationists were connected to counterparts elsewhere in the world and how they adapted practices of city-splitting to suit the peculiarities of racial politics in the United States.
Peter C. Baldwin
Today the term nightlife typically refers to social activities in urban commercial spaces—particularly drinking, dancing, dining, and listening to live musical performances. This was not always so. Cities in the 18th and early 19th centuries knew relatively limited nightlife, most of it occurring in drinking places for men. Theater attracted mixed-gender audiences but was sometimes seen as disreputable in both its content and the character of the audience. Theater owners worked to shed this negative reputation starting in the mid-19th century, while nightlife continued to be tainted by the profusion of saloons, brothels, and gambling halls. Gradual improvements in street lighting and police protection encouraged people to go out at night, as did growing incomes and decreasing hours of labor. Nightlife attracted more women in the decades around 1900 as it expanded and diversified. Dance halls, vaudeville houses, movie theaters, restaurants, and cabarets thrived in the electrified “bright lights” districts of central cities. Commercial entertainment contracted again in the 1950s and 1960s as Americans spent more of their evening leisure hours watching television and began to regard urban public spaces with suspicion. Still, nightlife is viewed as an important component of urban economic life and is actively promoted by many municipal governments.
Over the first half of the 20th century, Rabbi Stephen S. Wise (1874–1949) devoted himself to solving the most controversial social and political problems of his day: corruption in municipal politics, abuse of industrial workers, women’s second-class citizenship, nativism and racism, and global war. He considered his activities an effort to define “Americanism” and apply its principles toward humanity’s improvement. On the one hand, Wise joined a long tradition of American Christian liberals committed to seeing their fellow citizens as their equals and to grounding this egalitarianism in their religious beliefs. On the other hand, he was in the vanguard of the Jewish Reform, or what he referred to as the Liberal Judaism movement, with its commitment to apply Jewish moral teachings to improve the world. His life’s work demonstrated that the two—liberal democracy and Liberal Judaism—went hand in hand. And while concerned with equality and justice, Wise’s Americanism had a democratic elitist character. His advocacy to engage the public on the meaning of citizenship and the role of the state relied on his own Jewish, male, and economically privileged perspective as well as those of an elite circle of political and business leaders, intellectual trendsetters, social scientists, philanthropists, labor leaders, and university faculty. In doing so, Wise drew upon Jewish liberal teachings, transformed America’s liberal tradition, and helped to remake America’s national understanding of itself.
Conceptions of what constitutes a street gang or a youth gang have varied since the seminal sociological studies on these entities in the 1920s. Organizations of teenage youths and young adults in their twenties, congregating in public spaces and acting collectively, were fixtures of everyday life in American cities throughout the 20th century. While few studies historicize gangs in their own right, historians in a range of subfields cast gangs as key actors in critical dimensions of the American urban experience: the formation and defense of ethno-racial identities and communities; the creation and maintenance of segregated metropolitan spaces; the shaping of gender norms and forms of sociability in working-class districts; the structuring of contentious political mobilization challenging police practices and municipal policies; the evolution of underground and informal economies and organized crime activities; and the epidemic of gun violence that spread through minority communities in many major cities at the end of the 20th and beginning of the 21st centuries.
Although groups of white youths patrolling the streets of working-class neighborhoods and engaging in acts of defensive localism were commonplace in the urban Northeast, Mid-Atlantic, and Midwest states by the mid-19th century, street gangs exploded onto the urban landscape in the early 20th century as a consequence of massive demographic changes related to the wave of immigration from Europe, Asia, and Latin America and the migration of African Americans from the South. As immigrants and migrants moved into urban working-class neighborhoods and industrial workplaces, street gangs proliferated at the boundaries of ethno-racially defined communities, shaping the context within which immigrant and second-generation youths negotiated Americanization and learned the meanings of race and ethnicity. Although social workers in some cities noted the appearance of some female gangs by the 1930s, the milieu of youth gangs during this era was male dominated, and codes of honor and masculinity were often at stake in increasingly violent clashes over territory and resources like parks and beaches.
The interplay of race, ethnicity, and masculinity continued to shape the world of gangs in the 1940s and 1950s, when white male gangs claiming to defend the whiteness of their communities used terror tactics to reinforce the boundaries of ghettos and barrios in many cities. Such aggressions spurred the formation of fighting gangs in black and Latino neighborhoods, where youths entered into at times deadly combat against their aggressors but also fought for honor, respect, and status with rivals within their communities. In the 1960s and 1970s, with civil rights struggles and ideologies of racial empowerment circulating through minority neighborhoods, some of these same gangs, often with the support of community organizers affiliated with political organizations like the Black Panther Party, turned toward defending the rights of their communities and participating in contentious politics. However, such projects were cut short by the fierce repression of gangs in minority communities by local police forces, working at times in collaboration with the Federal Bureau of Investigation. By the mid-1970s, following the withdrawal of the Black Panthers and other mediating organizations from cities like Chicago and Los Angeles, so-called “super-gangs” claiming the allegiance of thousands of youths began federating into opposing camps—“People” against “Folks” in Chicago, “Crips” against “Bloods” in LA—to wage war for control of emerging drug markets. In the 1980s and 1990s, with minority communities dealing with high unemployment, cutbacks in social services, failing schools, hyperincarceration, drug trafficking, gun violence, and toxic relations with increasingly militarized police forces waging local “wars” against drugs and gangs, gangs proliferated in cities throughout the urban Sun Belt. Their prominence within popular and political discourse nationwide made them symbols of the urban crisis and of the cultural deficiencies that some believed had caused it.
Stephen H. Norwood
Strikebreakers have been drawn from many parts of the American population, most notably the permanently and seasonally unemployed and underemployed. Excluded from a vast range of occupations and shunned by many trade unions, African Americans constituted another potential pool of strikebreakers, especially during the early decades of the 20th century. During the first quarter of the 20th century, college students enthusiastically volunteered for strikebreaking, both because of their generally pro-business outlook and a desire to test their manhood in violent clashes.
A wide array of private and government forces has suppressed strikes. Beginning in the late 19th century, private detective agencies supplied guards who protected company property against strikers, sometimes assaulting them. During the early 20th century, several firms emerged that supplied strikebreakers and guards at companies’ request, drawing on what amounted to private armies of thousands of men. The largest of these operated nationally.
On many occasions the state itself intervened to break strikes. Like some strikebreaking firms, state militiamen deployed advanced weaponry against strikers and their sympathizers, including machine guns. Presidents Hayes and Cleveland called out federal troops to break the 1877 and 1894 interregional railroad strikes. In 1905, Pennsylvania established an elite mounted force, modeled on the British constabulary patrols in Ireland, to suppress coal miners’ strikes.
Corporations directly intervened to break strikes, building weapons arsenals, including large supplies of tear gas, that they distributed to police forces. They initiated “back to work” movements to destroy strikers’ morale and used their considerable influence with the media to propagandize in the press and on the radio. Corporations, of course, discharged strikers, often permanently.
In the highly bureaucratized society of the late 20th and early 21st centuries, which stigmatized public displays of anger, management turned to new “union avoidance” firms to break strikes. These firms emphasized legal and psychological methods rather than violence. They advised employers on how to blur the line between management and labor, defame union leaders and activists, and sow discord among strikers.
From the 1890s to World War I, progressive reformers in the United States called upon their local, state, and federal governments to revitalize American democracy and address the most harmful social consequences of industrialization. The emergence of an increasingly powerful administrative state, which intervened on behalf of the public welfare in the economy and society, generated significant levels of conflict. Some of the opposition came from conservative business interests, who denounced state labor laws and other market regulations as meddlesome interferences with liberty of contract. But the historical record of the Progressive Era also reveals a broad undercurrent of resistance from ordinary Americans, who fought for personal liberty against the growth of police power in such areas as public health administration and the regulation of radical speech. Their struggles in the streets, statehouses, and courtrooms of the United States in the early 20th century shaped the legal culture of the period and revealed the contested meaning of individual liberty in a new social age.
Ann Durkin Keating
Since the beginning of the 19th century, outlying areas of American cities have been home to a variety of settlements and enterprises with close links to urban centers. Beginning in the early 19th century, the increasing scale of business and industrial enterprises separated workplaces from residences. This allowed some urban dwellers to live at a distance from their place of employment and commute to work. Others lived in the shadow of factories located at some distance from the city center. Still others provided food or raw materials for urban residents and businesses. The availability of employment led to further suburban growth. Changing intracity transportation, including railroads, interurbans, streetcars, and cable cars, enabled people and businesses to locate beyond the limits of a walking city.
By the late 19th century, metropolitan areas across the United States included outlying farm centers, industrial towns, residential rail (or streetcar) suburbs, and recreational/institutional centers. With suburbs generally located along rail or ferry lines into the early 20th century, the physical development of metropolitan areas often resembled a hub and spokes. However, across metropolitan regions, suburbs had a great range of function and diversity of populations. With the advent of automobile commuting and the growing use of trucks to haul freight, suburban development took place between railroad lines, filling in the earlier hub-and-spoke pattern and creating a more continuously built-up area.
Although suburban settlements were integrally connected to their neighbors and within a metropolitan economy and society, independent suburban governments emerged to serve these outlying settlements and keep them separate. Developers often took the lead in providing differential services (and regulations). Suburban governments emerged as hybrid forms, serving relatively homogeneous populations by providing only some urban functions. Well before 1945, suburbs were home to a wide range of work and residents.
Becky Nicolaides and Andrew Wiese
Mass migration to suburban areas was a defining feature of American life after 1945. Before World War II, just 13% of Americans lived in suburbs. By 2010, however, suburbia was home to more than half of the U.S. population. The nation’s economy, politics, and society suburbanized in important ways. Suburbia shaped habits of car dependency and commuting, patterns of spending and saving, and experiences with issues as diverse as race and taxes, energy and nature, privacy and community. The owner-occupied, single-family home, surrounded by a yard and set in a neighborhood outside the urban core, came to define everyday experience for most American households, and in the world of popular culture and the imagination, suburbia was the setting for the American dream. The nation’s suburbs were an equally critical economic landscape, home to vital high-tech industries, retailing, “logistics,” and office employment. In addition, American politics rested on a suburban majority, and over several decades, suburbia incubated political movements across the partisan spectrum, from grassroots conservatism to centrist meritocratic individualism, environmentalism, feminism, and social justice. In short, suburbia was a key setting for postwar American life.
Even as suburbia grew in magnitude and influence, it also grew more diverse, coming to reflect a much broader cross-section of America itself. This encompassing shift unfolded across two key chronological stages in suburban history since 1945: the expansive, racialized, mass suburbanization of the postwar years (1945–1970) and an era of intensive social diversification and metropolitan complexity (since 1970). In the first period, suburbia witnessed the expansion of segregated white privilege, bolstered by government policies and exclusionary practices and reinforced by grassroots political movements. By the second period, suburbia came to house a broader cross-section of Americans, who brought with them a wide range of outlooks, lifeways, values, and politics. Suburbia became home to large numbers of immigrants, ethnic groups, African Americans, the poor, the elderly, and diverse family types. In the face of stubborn exclusionism by affluent suburbs, inequality persisted across metropolitan areas and manifested anew in proliferating poorer, distressed suburbs. Reform efforts sought to alleviate metro-wide inequality and promote sustainable development, using coordinated regional approaches. In recent years, the twin discourses of suburban crisis and suburban rejuvenation captured the continued complexity of America’s suburbs.