Christian J. Koot
Smuggling was a regular feature of the economy of colonial British America in the 17th and 18th centuries. Though the very nature of illicit commerce means that the extent of this trade is incalculable, a wide variety of British and colonial sources testify to the ability of merchants to trade where they pleased and to avoid paying duties in the process. Together admiralty proceedings, merchant correspondence and account books, customs reports, and petitions demonstrate that illicit trade enriched individuals and allowed settlers to shape their colonies’ development. Smuggling took shape as resistance to British economic and political control. British authorities attempted to harness the trade of their Atlantic colonies by employing a series of laws that restricted overseas commerce (often referred to as the Navigation Acts). This legislation created the opportunity for illicit trade by raising the costs of legal trade. Hampered by insufficient resources, thousands of miles of coastline, and complicit local officials, British customs agents could not prevent smuggling. Economic self-interest and the pursuit of profit certainly motivated smugglers, but because it was tied to a larger transatlantic debate about the proper balance between regulation and free trade, smuggling was also a political act. Through smuggling colonists rejected what they saw as capricious regulations designed to enrich Britain at their expense.
Janine Giordano Drake
The term “Social Gospel” was coined by ministers and other well-meaning American Protestants with the intention of encouraging the urban and rural poor to understand that Christ cared about them and saw their struggles. The second half of the 19th century saw a rise of both domestic and international missionary fervor. Church and civic leaders feared a future in which freethinkers, agnostics, atheists, and other skeptics dominated spiritual life and well-educated ministers were marginal to American culture. They grew concerned with the rising number of independent and Pentecostal churches whose leaders lacked extensive theological training or denominational authority. American Protestants especially feared that immigrant religious and cultural traditions, including Roman Catholicism, Judaism, and Eastern Orthodox Christianity, were not quintessentially American. Most of all, they worried that those belief systems could not promote what they saw as the traditional American values and mores central to the nation.
However, at least on the surface, the Social Gospel did not dwell on extinguishing ideas or traditions. Rather, as was typical of the Progressive Era, it forwarded a wide-ranging set of visions that emphasized scientific and professional expertise, guided by Christian ethics, to solve social and political problems. It fostered an energetic culture of conferences, magazines, and paperback books dedicated to reforming the nation. Books and articles unpacked social surveys that sorted through possible solutions to urban and rural poverty and reported on productive relationships between churches and municipal governments. Pastoral conferences often focused on planning revivals in urban auditoriums, churches, stadiums, or the open air, where participants not only were confronted with old-fashioned gospel messages but with lectures on what Christians could do to improve their communities.
The Social Gospel’s theological turn stressed the need for both individual redemption from sinful behavior and the redemption of whole societies from damaged community relationships. Revivalists not only entreated listeners to reject personal habits like drinking, smoking, chewing tobacco, gambling, theater-going, and extramarital sex. They also encouraged listeners to replace the gathering space of the saloon with churches, schools, and public parks. Leaders usually saw themselves redeeming the “social sin” that produced impoverished neighborhoods, low-wage jobs, preventable diseases, and chronic unemployment and offering alternatives that kept businesses intact. In the Social Creed of the Churches (1908), ministers across the denominations proposed industrial reforms limiting work hours and improving working conditions, as well as government regulations setting a living wage and providing protection for the injured, sick, and elderly. Sometimes, Social Gospel leaders defended collective bargaining and built alliances with labor leaders. At other times, they proposed palliative solutions that would instill Christian “brotherhood” on the shop floor and render unions unnecessary. This wavering on principles produced complicated and sometimes tense relationships among union leaders, workers, and Social Gospel leaders.
Elements of the Social Gospel movement have carried even into the 21st century, leading some historians to challenge the idea that the movement died with the close of the Great War. The American Civil Liberties Union and Fellowship of Reconciliation, for example, did not lose any time in keeping alive the Social Gospel’s commitments to protecting the poor and defenseless. However, the rise of “premillennial dispensationalist” theology and the general disillusionment produced by the war’s massive casualties marked a major turning point, if not an endpoint, to the Social Gospel’s influence as a well-funded, Protestant evangelical force. The brutality of the war undermined American optimism—much of it fueled by Social Gospel thinking—about creating a more just, prosperous, and peaceful world. Meanwhile, attorney general A. Mitchell Palmer’s campaign against alleged anarchists and Bolsheviks immediately after the war—America’s first “Red Scare”—targeted a large number of labor and religious organizations with the accusation that socialist ideas were undemocratic and un-American. By the 1920s, many Social Gospel leaders had distanced themselves from the organized working classes. They either accepted new arrangements for harmonizing the interests of labor and capital or took their left-leaning political ideals underground.
Since the social sciences began to emerge as scholarly disciplines in the last quarter of the 19th century, they have frequently offered authoritative intellectual frameworks that have justified, and even shaped, a variety of U.S. foreign policy efforts. They played an important role in U.S. imperial expansion in the late 19th and early 20th centuries. Scholars devised racialized theories of social evolution that legitimated the confinement and assimilation of Native Americans and endorsed civilizing schemes in the Philippines, Cuba, and elsewhere. As attention shifted to Europe during and after World War I, social scientists working at the behest of Woodrow Wilson attempted to engineer a “scientific peace” at Versailles. The desire to render global politics the domain of objective, neutral experts intensified during World War II and the Cold War. After 1945, the social sciences became increasingly central players in foreign affairs, offering intellectual frameworks—like modernization theory—and bureaucratic tools—like systems analysis—that shaped U.S. interventions in developing nations, guided nuclear strategy, and justified the increasing use of the U.S. military around the world.
Throughout these eras, social scientists often reinforced American exceptionalism—the notion that the United States stands at the pinnacle of social and political development, and as such has a duty to spread liberty and democracy around the globe. The scholarly embrace of conventional political values was not the result of state coercion or financial co-optation; by and large social scientists and policymakers shared common American values. But other social scientists used their knowledge and intellectual authority to critique American foreign policy. The history of the relationship between social science and foreign relations offers important insights into the changing politics and ethics of expertise in American public policy.
K. Tsianina Lomawaima
In 1911, a group of American Indian intellectuals organized what would become known as the Society of American Indians, or SAI. SAI members convened in annual meetings between 1911 and 1923, and for much of that period the Society’s executive offices were a hub for political advocacy, lobbying Congress and the Office of Indian Affairs (OIA), publishing a journal, offering legal assistance to Native individuals and tribes, and maintaining an impressively voluminous correspondence across the country with American Indians, “Friends of the Indian” reformers, political allies, and staunch critics. Notable Native activists, clergy, entertainers, professionals, speakers, and writers—as well as Native representatives from on- and off-reservation communities—were active in the Society. They worked tirelessly to meet daunting, unrealistic expectations, principally to deliver a unified voice of Indian “public opinion” and to pursue controversial political goals without appearing too radical, especially obtaining U.S. citizenship for Indian individuals and allowing Indian nations to access the U.S. Court of Claims. They maintained their myriad activities with scant financial resources solely through the unpaid labor of dedicated Native volunteers. By 1923, the challenges exhausted the Society’s substantial human and minuscule financial capital. The Native “soul of unity” demanded by non-Native spectators and hoped for by SAI leaders could no longer hold the center, and the SAI dissolved. Their work was not in vain, but citizenship and the ability to file claims materialized in circumscribed forms. In 1924 Congress passed the Indian Citizenship Act, granting birthright citizenship to American Indians, but citizenship for Indians was deemed compatible with continued wardship status. In 1946 Congress established an Indian Claims Commission, not a court, and successful claims could only result in monetary compensation, not regained lands.
Soldiers enlisted in the Union Army from every state in the Union and the Confederacy. The initial volunteers were motivated to preserve the accomplishments of the American Revolution and save the world’s hope that democratic government could survive. They were influenced by their culture’s ideals of manhood and republican ideals of the citizen soldier. They served in regiments that retained close ties with their sending communities throughout the war.
Recruits faced a difficult adjustment period when their units were mustered into the US Army. The test of battle taught soldiers to value some drills and discipline, but many soldiers insisted that officers respect their independence and equality. Soldiers successfully resisted many aspects of formal military discipline. Army life exposed conflicts between soldiers who sought to create moral regiments and soldiers who displayed manliness through fighting and drinking. Establishing honor before peers was an important component of soldier life. Effective soldiering involved enduring the boredom and disease of camp, the rigors of marching, and the terror of battle. To survive, soldiers formed close bonds with their comrades, mastered self-care techniques to stay healthy, applied skills learned from their civilian occupations on the battlefield, and remained connected to their families and communities. Conscription changed the character of the Union Army. Officers tightened discipline over the influx of lower-class “roughs.”
Union soldiers generally demonized their enemies as inferior barbarians. Because of their interaction with slaves in the South, Union soldiers quickly shifted their support to emancipation. Although Christianity and ideals of civilized behavior placed some restraints on Union soldiers when they encountered southerners, they supported and implemented hard war measures against the South’s population and resources, and treated guerrillas and their supporters with particular brutality. In the election of 1864, Union soldiers voted to fight until the Confederacy was defeated.
Chia Youyee Vang
In geopolitical terms, the Asian sub-region Southeast Asia consists of ten countries that are organized under the Association of Southeast Asian Nations (ASEAN). Current member nations include Brunei Darussalam, Kingdom of Cambodia, Republic of Indonesia, Lao People’s Democratic Republic (Laos), Malaysia, Republic of the Union of Myanmar (formerly Burma), Republic of the Philippines, Singapore, Kingdom of Thailand, and Socialist Republic of Vietnam. The term Southeast Asian Americans has been shaped largely by the flow of refugees from the American War in Vietnam; however, Americans with origins in Southeast Asia have much more diverse migration and settlement experiences that are intricately tied to the complex histories of colonialism, imperialism, and war from the late 19th through the end of the 20th century. A commonality across Southeast Asian American groups today is that their immigration history resulted primarily from the political and military involvement of the United States in the region, aimed at building the United States as a global power. From Filipinos during the Spanish-American War in 1898 to Vietnamese, Cambodian, Lao, and Hmong refugees from the American War in Vietnam, military interventions generated migration flows that, once begun, became difficult to stop. Complicating this history is its role in supporting the international humanitarian apparatus by creating the possibility for displaced people to seek refuge in the United States. Additionally, the relationships between the United States, Malaysia, Indonesia, and Singapore are different from those of other Southeast Asian countries involved in the Vietnam War. Consequently, today’s Southeast Asian Americans are heterogeneous with varying levels of acculturation to U.S. society.
The Spanish-American War is best understood as a series of linked conflicts. Those conflicts punctuated Madrid’s decline to a third-rank European state and marked the United States’ transition from a regional to an imperial power. The central conflict was a brief conventional war fought in the Caribbean and the Pacific between Madrid and Washington. Those hostilities were preceded and followed by protracted and costly guerrilla wars in Cuba and the Philippines. The Spanish-American War was the consequence of the protracted stalemate in the Spanish-Cuban War. The economic and humanitarian distress which accompanied the fighting made it increasingly difficult for the United States to remain neutral until a series of Spanish missteps and bad fortune in early 1898 hastened the American entry to the war. The US Navy quickly moved to eliminate or blockade the strongest Spanish squadrons in the Philippines and Cuba; Spain’s inability to contest American control of the sea in either theater was decisive and permitted successful American attacks on outnumbered Spanish garrisons in Santiago de Cuba, Puerto Rico, and Manila. The transfer of the Philippines, along with Cuba, Puerto Rico, and Guam, to the United States in the Treaty of Paris confirmed American imperialist appetites in the eyes of the Filipino nationalists, led by Emilio Aguinaldo, and contributed to tensions between the Filipino and American armies around and in Manila. Fighting broke out in February 1899, but the Filipino conventional forces were soon driven back from Manila and were utterly defeated by the end of the year. The Filipino forces that evaded capture re-emerged as guerrillas in early 1900, and for the next two and a half years the United States waged an increasingly severe anti-guerrilla war against Filipino irregulars. Despite Aguinaldo’s capture in early 1901, fighting continued in a handful of provinces until the spring of 1902, when the last organized resistance to American governance ended in Samar and Batangas provinces.
During the 1890s, the word segregation became the preferred term for the practice of coercing different groups of people, especially those designated by race, to live in separate and unequal urban residential neighborhoods. In the southern states of the United States, segregationists imported the word—originally used in the British colonies of Asia—to describe Jim Crow laws, and, in 1910, whites in Baltimore passed a “segregation ordinance” mandating separate black and white urban neighborhoods. Copy-cat legislation sprang up in cities across the South and the Midwest. But in 1917, a multiracial team of lawyers from the fledgling National Association for the Advancement of Colored People (NAACP) mounted a successful legal challenge to these ordinances in the U.S. Supreme Court—even as urban segregation laws were adopted in other places in the world, most notably in South Africa. The collapse of the movement for legislated racial segregation in the United States occurred just as African Americans began migrating in large numbers into cities in all regions of the United States, resulting in waves of anti-black mob violence. Segregationists were forced to rely on nonstatutory or formally nonracial techniques. In Chicago, an alliance of urban reformers and real estate professionals invented alternatives to explicitly racist segregation laws. The practices they promoted nationwide created one of the most successful forms of urban racial segregation in world history, rivaling and finally outliving South African apartheid. Understanding how this system came into being and how it persists today requires understanding both how the Chicago segregationists were connected to counterparts elsewhere in the world and how they adapted practices of city-splitting to suit the peculiarities of racial politics in the United States.
Peter C. Baldwin
Today the term nightlife typically refers to social activities in urban commercial spaces—particularly drinking, dancing, dining, and listening to live musical performances. This was not always so. Cities in the 18th and early 19th centuries knew relatively limited nightlife, most of it occurring in drinking places for men. Theater attracted mixed-gender audiences but was sometimes seen as disreputable in both its content and the character of the audience. Theater owners worked to shed this negative reputation starting in the mid-19th century, while nightlife continued to be tainted by the profusion of saloons, brothels, and gambling halls. Gradual improvements in street lighting and police protection encouraged people to go out at night, as did growing incomes and decreasing hours of labor. Nightlife attracted more women in the decades around 1900 as it expanded and diversified. Dance halls, vaudeville houses, movie theaters, restaurants, and cabarets thrived in the electrified “bright lights” districts of central cities. Commercial entertainment contracted again in the 1950s and 1960s as Americans spent more of their evening leisure hours watching television and began to regard urban public spaces with suspicion. Still, nightlife is viewed as an important component of urban economic life and is actively promoted by many municipal governments.
Over the first half of the 20th century, Rabbi Stephen S. Wise (1874–1949) devoted himself to solving the most controversial social and political problems of his day: corruption in municipal politics, abuse of industrial workers, women’s second-class citizenship, nativism and racism, and global war. He considered his activities an effort to define “Americanism” and apply its principles toward humanity’s improvement. On the one hand, Wise joined a long tradition of American Christian liberals committed to seeing their fellow citizens as their equals and to grounding this egalitarianism in their religious beliefs. On the other hand, he was in the vanguard of the Jewish Reform, or what he referred to as the Liberal Judaism movement, with its commitment to apply Jewish moral teachings to improve the world. His life’s work demonstrated that the two—liberal democracy and Liberal Judaism—went hand in hand. And while concerned with equality and justice, Wise’s Americanism had a democratic elitist character. His advocacy to engage the public on the meaning of citizenship and the role of the state relied on his own Jewish, male, and economically privileged perspective as well as those of an elite circle of political and business leaders, intellectual trendsetters, social scientists, philanthropists, labor leaders, and university faculty. In doing so, Wise drew upon Jewish liberal teachings, transformed America’s liberal tradition, and helped to remake America’s national understanding of itself.
Conceptions of what constitutes a street gang or a youth gang have varied since the seminal sociological studies on these entities in the 1920s. Organizations of teenage youths and young adults in their twenties, congregating in public spaces and acting collectively, were fixtures of everyday life in American cities throughout the 20th century. While few studies historicize gangs in their own right, historians in a range of subfields cast gangs as key actors in critical dimensions of the American urban experience: the formation and defense of ethno-racial identities and communities; the creation and maintenance of segregated metropolitan spaces; the shaping of gender norms and forms of sociability in working-class districts; the structuring of contentious political mobilization challenging police practices and municipal policies; the evolution of underground and informal economies and organized crime activities; and the epidemic of gun violence that spread through minority communities in many major cities at the end of the 20th and beginning of the 21st centuries.
Although groups of white youths patrolling the streets of working-class neighborhoods and engaging in acts of defensive localism were commonplace in the urban Northeast, Mid-Atlantic, and Midwest states by the mid-19th century, street gangs exploded onto the urban landscape in the early 20th century as a consequence of massive demographic changes related to the wave of immigration from Europe, Asia, and Latin America and the migration of African Americans from the South. As immigrants and migrants moved into urban working-class neighborhoods and industrial workplaces, street gangs proliferated at the boundaries of ethno-racially defined communities, shaping the context within which immigrant and second-generation youths negotiated Americanization and learned the meanings of race and ethnicity. Although social workers in some cities noted the appearance of some female gangs by the 1930s, the milieu of youth gangs during this era was male dominated, and codes of honor and masculinity were often at stake in increasingly violent clashes over territory and resources like parks and beaches.
The interplay of race, ethnicity, and masculinity continued to shape the world of gangs in the 1940s and 1950s, when white male gangs claiming to defend the whiteness of their communities used terror tactics to reinforce the boundaries of ghettos and barrios in many cities. Such aggressions spurred the formation of fighting gangs in black and Latino neighborhoods, where youths entered into at times deadly combat against their aggressors but also fought for honor, respect, and status with rivals within their communities. In the 1960s and 1970s, with civil rights struggles and ideologies of racial empowerment circulating through minority neighborhoods, some of these same gangs, often with the support of community organizers affiliated with political organizations like the Black Panther Party, turned toward defending the rights of their communities and participating in contentious politics. However, such projects were cut short by the fierce repression of gangs in minority communities by local police forces, working at times in collaboration with the Federal Bureau of Investigation. By the mid-1970s, following the withdrawal of the Black Panthers and other mediating organizations from cities like Chicago and Los Angeles, so-called “super-gangs” claiming the allegiance of thousands of youths began federating into opposing camps—“People” against “Folks” in Chicago, “Crips” against “Bloods” in LA—to wage war for control of emerging drug markets. In the 1980s and 1990s, with minority communities dealing with high unemployment, cutbacks in social services, failing schools, hyperincarceration, drug trafficking, gun violence, and toxic relations with increasingly militarized police forces waging local “wars” against drugs and gangs, gangs proliferated in cities throughout the urban Sun Belt. Their prominence within popular and political discourse nationwide made them symbols of the urban crisis and of the cultural deficiencies that some believed had caused it.
Stephen H. Norwood
Strikebreakers have been drawn from many parts of the American population, most notably the permanently and seasonally unemployed and underemployed. Excluded from a vast range of occupations and shunned by many trade unions, African Americans constituted another potential pool of strikebreakers, especially during the early decades of the 20th century. During the first quarter of the 20th century, college students enthusiastically volunteered for strikebreaking, both because of their generally pro-business outlook and a desire to test their manhood in violent clashes.
A wide array of private and government forces has suppressed strikes. Beginning in the late 19th century, private detective agencies supplied guards who protected company property against strikers, sometimes assaulting them. During the early 20th century, several firms emerged that supplied strikebreakers and guards at companies’ request, drawing on what amounted to private armies of thousands of men. The largest of these operated nationally.
On many occasions the state itself intervened to break strikes. Like some strikebreaking firms, state militiamen deployed advanced weaponry, including machine guns, against strikers and their sympathizers. Presidents Hayes and Cleveland called out federal troops to break the 1877 and 1894 interregional railroad strikes. In 1905, Pennsylvania established an elite mounted force, modeled on the British constabulary patrols in Ireland, to suppress coal miners’ strikes.
Corporations directly intervened to break strikes, building weapons arsenals, including large supplies of tear gas, that they distributed to police forces. They initiated “back to work” movements to destroy strikers’ morale and used their considerable influence with the media to propagandize in the press and on the radio. Corporations, of course, discharged strikers, often permanently.
In the highly bureaucratized society of the late 20th and early 21st centuries that stigmatized public displays of anger, management turned to new “union avoidance” firms to break strikes. These firms emphasized legal and psychological methods rather than violence. They advised employers on how to blur the line between management and labor, defame union leaders and activists, and sow discord among strikers.
From the 1890s to World War I, progressive reformers in the United States called upon their local, state, and federal governments to revitalize American democracy and address the most harmful social consequences of industrialization. The emergence of an increasingly powerful administrative state, which intervened on behalf of the public welfare in the economy and society, generated significant levels of conflict. Some of the opposition came from conservative business interests, who denounced state labor laws and other market regulations as meddlesome interferences with liberty of contract. But the historical record of the Progressive Era also reveals a broad undercurrent of resistance from ordinary Americans, who fought for personal liberty against the growth of police power in such areas as public health administration and the regulation of radical speech. Their struggles in the streets, statehouses, and courtrooms of the United States in the early 20th century shaped the legal culture of the period and revealed the contested meaning of individual liberty in a new social age.
Ann Durkin Keating
Since the beginning of the 19th century, outlying areas of American cities have been home to a variety of settlements and enterprises with close links to urban centers. Beginning in the early 19th century, the increasing scale of business and industrial enterprises separated workplaces from residences. This allowed some urban dwellers to live at a distance from their place of employment and commute to work. Others lived in the shadow of factories located at some distance from the city center. Still others provided food or raw materials for urban residents and businesses. The availability of employment led to further suburban growth. Changing intracity transportation, including railroads, interurbans, streetcars, and cable cars, enabled people and businesses to locate beyond the limits of a walking city.
By the late 19th century, metropolitan areas across the United States included outlying farm centers, industrial towns, residential rail (or streetcar) suburbs, and recreational/institutional centers. With suburbs generally located along rail or ferry lines into the early 20th century, the physical development of metropolitan areas often resembled a hub and spokes. However, across metropolitan regions, suburbs had a great range of function and diversity of populations. With the advent of automobile commutation and the growing use of trucks to haul freight, suburban development took place between railroad lines, filling in the earlier hub-and-spokes pattern to create a more continuous built-up area.
Although suburban settlements were integrally connected to their neighbors and within a metropolitan economy and society, independent suburban governments emerged to serve these outlying settlements and keep them separate. Developers often took the lead in providing differential services (and regulations). Suburban governments emerged as hybrid forms, serving relatively homogeneous populations by providing only some urban functions. Well before 1945, suburbs were home to a wide range of work and residents.
Becky Nicolaides and Andrew Wiese
Mass migration to suburban areas was a defining feature of American life after 1945. Before World War II, just 13% of Americans lived in suburbs. By 2010, however, suburbia was home to more than half of the U.S. population. The nation’s economy, politics, and society suburbanized in important ways. Suburbia shaped habits of car dependency and commuting, patterns of spending and saving, and experiences with issues as diverse as race and taxes, energy and nature, privacy and community. The owner-occupied, single-family home, surrounded by a yard, and set in a neighborhood outside the urban core came to define everyday experience for most American households, and in the world of popular culture and the imagination, suburbia was the setting for the American dream. The nation’s suburbs were an equally critical economic landscape, home to vital high-tech industries, retailing, “logistics,” and office employment. In addition, American politics rested on a suburban majority, and over several decades, suburbia incubated political movements across the partisan spectrum, from grassroots conservatism, to centrist meritocratic individualism, environmentalism, feminism, and social justice. In short, suburbia was a key setting for postwar American life.
Even as suburbia grew in magnitude and influence, it also grew more diverse, coming to reflect a much broader cross-section of America itself. This encompassing shift marked two key chronological stages in suburban history since 1945: the expansive, racialized, mass suburbanization of the postwar years (1945–1970) and an era of intensive social diversification and metropolitan complexity (since 1970). In the first period, suburbia witnessed the expansion of segregated white privilege, bolstered by government policies and exclusionary practices, and reinforced by grassroots political movements. By the second period, suburbia came to house a broader cross section of Americans, who brought with them a wide range of outlooks, lifeways, values, and politics. Suburbia became home to large numbers of immigrants, ethnic groups, African Americans, the poor, the elderly, and diverse family types. In the face of stubborn exclusionism by affluent suburbs, inequality persisted across metropolitan areas and manifested anew in proliferating poorer, distressed suburbs. Reform efforts sought to alleviate metro-wide inequality and promote sustainable development, using coordinated regional approaches. In recent years, the twin discourses of suburban crisis and suburban rejuvenation captured the continued complexity of America’s suburbs.
Since the turn of the 20th century, teachers have tried to find a balance between bettering their own career prospects as workers and educating their students as public servants. To reach a workable combination, teachers have utilized methods drawn from union movements, the militant and labor-conscious approach favored by the American Federation of Teachers (AFT), as well as from professional organizations, the tradition from which the National Education Association (NEA) arose. Because teachers lacked the federally guaranteed labor rights that private-sector workers enjoyed after Congress passed the National Labor Relations Act in 1935, their fortunes—in terms of collective bargaining rights, control over classroom conditions, pay, and benefits—often remained tied to the broader public-sector labor movement and to state rather than federal law.
Opponents of teacher unionization consistently charged that, as public servants paid by tax revenues, teachers and other public employees should not be allowed to form unions. Further, because women constituted the vast majority of teachers and union organizing was often regarded as a “manly” domain, the opposition’s approach worked quite well, successfully preventing teachers from gaining widespread union recognition. But by the late 1960s and early 1970s, thanks to an improved economic climate and invigoration from the women’s movement, civil rights struggles, and the New Left, both AFT and NEA teacher unionism surged forward, infused with a powerful militancy devoted to strikes and other political action, and appeared poised to capture federal collective bargaining rights. Their newfound assertiveness proved ill-timed, however.
After the economic problems of the mid-1970s, opponents of teacher unions once again seized the opportunity to portray teacher unions and other public-sector unions as greedy and privileged interest groups functioning at the public’s expense. President Ronald Reagan accentuated this point when he fired all of the more than 10,000 striking air traffic controllers during the 1981 Professional Air Traffic Controllers Organization (PATCO) strike. Facing such opposition, teacher unions—and public-sector unions in general—shifted their efforts away from strikes and toward endorsing political candidates and lobbying governments to pass favorable legislation.
Given these constraints, public-sector unions enjoyed a large degree of success in the 1990s through the early 2000s, even as private-sector union membership plunged to less than 10 percent of the workforce. After the Great Recession of 2008, however, austerity politics targeted teachers and other public-sector workers and renewed political confrontations surrounding the legitimacy of teacher unions.
During the latter half of the 20th century, the International Brotherhood of Teamsters (IBT) represented the labor organization most readily recognized by the American public. Rooted in the vital sectors of transportation, warehousing, and distribution, the Teamsters became one of the largest unions in the United States, wielded considerable economic leverage, and used this leverage to improve the lives of low-wage workers across a broad swath of occupations and industries. The union’s reputation for militancy and toughness reached its apotheosis in the controversial career of Jimmy Hoffa, the Teamsters’ most prominent post-World War II leader. Under the leadership of Hoffa and his immediate predecessor Dave Beck, the union (with some notable exceptions) embraced a business ethos, often engaged in collusive and corrupt practices, and came to symbolize the labor movement’s squandered potential as a transformational social force. Fear of Hoffa and his associations with underworld figures provoked an intense backlash, resulting in the IBT’s 1957 expulsion from the AFL-CIO, concerted legal and legislative action aimed at curbing Teamster influence, and a lingering public perception that the union was a hopelessly corrupt and malign force.
Hoffa’s unsolved disappearance in 1975 cemented the Teamsters’ image as a suspect institution, and analysts of the IBT have often offered either superficial or sensational accounts of the organization’s history and operations. With the deregulation of the trucking industry in the 1980s, the IBT suffered serious losses in market share and membership that eclipsed many of the union’s crowning collective bargaining achievements. A series of lackluster, corrupt leaders who followed Hoffa as union president proved unable to counter these developments, triggering the rise of an aggressive internal reform movement (Teamsters for a Democratic Union), federal intervention and monitoring, and the election of a reform slate in 1991 that assumed leadership of the union. However, since the union’s victory in an epic strike against United Parcel Service in 1997, the Teamsters have struggled to regain their ability to assert working-class power, especially within the private sector transportation industry, where they once exercised nearly unchallenged hegemony.
Timothy James LeCain
Technology and environmental history are both relatively young disciplines among Americanists, and during their early years they developed as distinctly different and even antithetical fields, at least in topical terms. Historians of technology initially focused on human-made and presumably “unnatural” technologies, whereas environmental historians focused on nonhuman and presumably “natural” environments. However, in more recent decades, both disciplines have moved beyond this oppositional framing. Historians of technology increasingly came to view anthropogenic artifacts such as cities, domesticated animals, and machines as extensions of the natural world rather than its antithesis. Even the British and American Industrial Revolutions constituted not a distancing of humans from nature, as some scholars have suggested, but rather a deepening entanglement with the material environment. At the same time, many environmental historians were moving beyond the field’s initial emphasis on the ideal of an American and often Western “wilderness” to embrace a concept of the environment as including humans and productive work. Nonetheless, many environmental historians continued to emphasize the independent agency of the nonhuman environment of organisms and things. This insistence that not everything could be reduced to human culture remained the field’s most distinctive feature.
Since the turn of the millennium, the two fields have increasingly come together in a variety of synthetic approaches, including Actor Network Theory, envirotechnical analysis, and neomaterialist theory. As the influence of the cultural turn has waned, the environmental historians’ emphasis on the independent agency of the nonhuman has come to the fore, gaining wider influence as it is applied to the dynamic “nature” or “wildness” that some scholars argue exists within both the technological and natural environments. The foundational distinctions between the history of technology and environmental history may now be giving way to more materially rooted attempts to understand how a dynamic hybrid environment helps to create human history in all of its dimensions—cultural, social, and biological.
Michael A. Krysko
Technology is ubiquitous in the history of US foreign relations. Throughout US history, technology has played an essential role in how a wide array of Americans have traveled to and from, learned about, understood, recorded and conveyed information about, and attempted to influence, benefit from, and exert power over other lands and peoples. The challenge for the historian is not to find where technology intersects with the history of US foreign relations, but to place a focus on technology without falling prey to deterministic assumptions about the inevitability of the global power and influence—or lack thereof—that the United States has exerted through the technology it has wielded.
“Foreign relations” and “technology” are, in fact, two terms with extraordinarily broad connotations. “Foreign relations” is not synonymous with “diplomacy,” but encompasses all aspects and arenas of American engagement with the world. “Technology” is itself “an unusually slippery term,” notes prominent technology historian David Nye, and can refer to simple tools, more complex machines, and even more complicated and expansive systems on which the functionality of many other innovations depends. Furthermore, processes of technological innovation, proliferation, and patterns of use are shaped by a dizzying array of influences embedded within the larger surrounding context, including but by no means limited to politics, economics, laws, culture, international exchanges, and environment. While some of the variables that have shaped how the United States has deployed its technological capacities were indeed distinctly American, others arose outside the United States and lay beyond any American ability to control. A technology-focused rendering of US foreign relations and global ascendancy is not, therefore, a narrative of uninterrupted progress and achievement, but an accounting of both successes and failures that illuminate how surrounding contexts and decisions have variably shaped, encouraged, and limited the technology and power Americans have wielded.
Described as a “chief among chiefs” by the British, and by his arch-rival, William Henry Harrison, as “one of those uncommon geniuses which spring up occasionally to produce revolutions and overturn the established order of things,” Tecumseh impressed all who knew him. Lauded for his oratory, military and diplomatic skills, and, ultimately, his humanity, Tecumseh presided over the greatest Indian resistance movement that had ever been assembled in the eastern half of North America. His genius lay in his ability to fully articulate religious, racial, and cultural ideals born out of his people’s existence on fault lines between competing empires and Indian confederacies. Known as “southerners” by their Algonquian relatives, the Shawnees had a history of migrating between worlds. Tecumseh and his brother, Tenskwatawa, converted this inheritance into a widespread social movement in the first decade and a half of the 19th century, when more than a thousand warriors from many different tribes heeded their call to halt American expansion along the border of what is now Ohio and Indiana. Tecumseh articulated a vision of intertribal, pan-Indian unity based on revitalization and reform, and his ambitions very nearly rewrote early American history.