Article

Ansley T. Erickson

“Urban infrastructure” calls to mind railways, highways, and sewer systems. Yet the school buildings—red brick, limestone, or concrete, low-slung, turreted, or glass-fronted—that hold and seek to shape the city’s children are ubiquitous forms of infrastructure as well. Schools occupy one of the largest line items in a municipal budget, and as many as a fifth of a city’s residents spend the majority of their waking hours in school classrooms, hallways, and gymnasiums. In the 19th and 20th centuries urban educational infrastructure grew, supported by a developing consensus for publicly funded and publicly governed schools (if rarely fully accessible to all members of the public). Even before state commitment to other forms of social welfare, from pensions to public health, and to other infrastructure, from transit to fire protection, schooling was a government function. This commitment to public education ultimately was national, but schools in cities had their own story. Schooling in the United States is chiefly a local affair: constitutional responsibility for education lies with the states; power is then further decentralized as states entrust decisions about school function and funding to school districts. School districts can be as small as a single town or a part of a city. Such localism is one reason that it is possible to speak about schools in U.S. cities as having a particular history, determined as much by the specificities of urban life as by national questions of citizenship, economy, religion, and culture. While city schools have been distinct, they have also been nationally influential. Urban scale both allowed for and demanded the most extensive educational system-building. Urban growth and diversity galvanized innovation, via exploration in teaching methods, curriculum, and understandings of children and communities. And it generated intense conflict. Throughout U.S. history, urban residents from myriad social, political, religious, and economic positions have struggled to define how schools would operate, for whom, and who would decide. During the 19th and 20th centuries, U.S. residents struggled over the purposes, funding, and governance of schools in cities shaped by capitalism, nativism, and white supremacy. They built a commitment to schooling as a public function of their cities, with many compromises and exclusions. In the 21st century, old struggles re-emerged in new forms, perhaps raising the question of whether schools will continue as public, urban infrastructure.

Article

American Indian activism after 1945 was as much a part of the larger, global decolonization movement rooted in centuries of imperialism as it was a direct response to the ethos of civic nationalism and integration that had gained momentum in the United States following World War II. This ethos manifested itself in the disastrous federal policies of termination and relocation, which sought to end federal services to recognized Indian tribes and encourage Native people to leave reservations for cities. In response, tribal leaders from throughout Indian Country formed the National Congress of American Indians (NCAI) in 1944 to litigate and lobby for the collective well-being of Native peoples. The NCAI was the first intertribal organization to embrace the concepts of sovereignty, treaty rights, and cultural preservation—principles that continue to guide Native activists today. As American Indian activism grew increasingly militant in the late 1960s and 1970s, civil disobedience, demonstrations, and takeovers became the preferred tactics of “Red Power” organizations such as the National Indian Youth Council (NIYC), the Indians of All Tribes, and the American Indian Movement (AIM). At the same time, others established more focused efforts that employed less confrontational methods. For example, the Native American Rights Fund (NARF) served as a legal apparatus that represented Native nations, using the courts to protect treaty rights and expand sovereignty; the Council of Energy Resource Tribes (CERT) sought to secure greater returns on the mineral wealth found on tribal lands; and the American Indian Higher Education Consortium (AIHEC) brought Native educators together to work for greater self-determination and culturally rooted curricula in Indian schools. While the more militant of these organizations and efforts have withered, those that have worked through established channels have grown and flourished. Such efforts will no doubt continue into the foreseeable future so long as the state of Native nations remains uncertain.

Article

The reproductive experiences of women and girls in the 20th-century United States followed historical patterns shaped by the politics of race and class. Laws and policies governing reproduction generally regarded white women as legitimate reproducers and potentially fit mothers and defined women of color as unfit for reproduction and motherhood; regulations provided for rewards and punishments accordingly. In addition, public policy and public rhetoric defined “population control” as the solution to a variety of social and political problems in the United States, including poverty, immigration, the “quality” of the population, environmental degradation, and “overpopulation.” Throughout the century, nonetheless, women, communities of color, and impoverished persons challenged official efforts, at times reducing or even eliminating barriers to reproductive freedom and community survival. Between 1900 and 1930, decades marked by increasing urbanization, industrialization, and immigration, eugenic fears of “race suicide” (concerns that white women were not having enough babies) fueled a reproductive control regime that pressured middle-class white women to reproduce robustly. At the same time, the state enacted anti-immigrant laws, undermined the integrity of Native families, and protected various forms of racial segregation and white supremacy, all of which attacked the reproductive dignity of millions of women. Also in these decades, many African American women escaped the brutal and sexually predatory Jim Crow culture of the South, and middle-class white women gained greater sexual freedom and access to reproductive health care, including contraceptive services. During the Great Depression, the government devised the Aid to Dependent Children program to provide destitute “worthy” white mothers with aid while often denying such support to women of color forced to subordinate their motherhood to agricultural and domestic labor. Following World War II, as the Civil Rights movement gathered form, focus, and adherents, and as African American and other women of color claimed their rights to motherhood and social provision, white policymakers railed against “welfare queens” and defined motherhood as a class privilege, suitable only for those who could afford to give their children “advantages.” The state, invoking the “population bomb,” fought to reduce the birth rates of poor women and women of color through sterilization and mandatory contraception, among other strategies. Between 1960 and 1980, white feminists employed the consumerist language of “choice” as part of the campaign for legalized abortion, even as Native, black, Latina, immigrant, and poor women struggled to secure the right to give birth to and raise their children with dignity and safety. The last decades of the 20th century saw severe cuts in social programs designed to aid low-income mothers and their children, cuts to funding for public education and housing, court decisions that dramatically reduced poor women’s access to reproductive health care, including abortion, and the emergence of a powerful, often violent, anti-abortion movement. In response, in 1994 a group of women of color activists articulated the theory of reproductive justice, splicing together “social justice” and “reproductive rights.” The resulting Reproductive Justice movement, which would become increasingly influential in the 21st century, defined reproductive health, rights, and justice as human rights due to all persons and articulated what each individual requires to achieve these rights: the right not to have children, the right to have children, and the right to the social, economic, and environmental conditions necessary to raise children in healthy, peaceful, and sustainable households and communities.

Article

Emily Suzanne Clark

Religion and race provide rich categories of analysis for American history. Neither category is stable. They change, shift, and develop in light of historical and cultural contexts. Religion has played a vital role in the construction, deconstruction, and transgression of racial identities and boundaries. Race is a social concept and a means of classifying people. The “natural” and “inherent” differences between races are human constructs, social taxonomies created by cultures. In American history, the construction of racial identities and racial differences began with the initial encounters between Europeans, Native Americans, and Africans. Access to and use of religious and political power have shaped how race has been conceived in American history. Racial categories and religious affiliations influenced how groups regarded each other throughout American history, with developments in the colonial period offering prime examples. Enslavement of Africans and their descendants, as well as conquered Native Americans, displayed the power of white Protestants. Even 19th-century American anti-Catholicism and anti-Mormonism intersected with racial identifications. At the same time, just as religion has supported racial domination in American history, it also has inspired calls for self-determination among racial minorities, most notably in the 20th century. With the long shadow of slavery, the power of white supremacy, the emphasis on Native sovereignty, and the civil rights movement, much of the story of religion and race in American history focuses on Americans white, black, and red. However, this is not the whole story. Mexican-Americans and Latinx immigrants have brought Catholic and transnational connections, but their presence has also prompted xenophobia. Additionally, white Americans sought to restrict the arrival of Asian immigrants both legally and culturally. With the passage of the Immigration and Nationality Act of 1965, the religious, racial, and ethnic diversity of the United States increased further. This religious and racial pluralism in many ways reflects the diversity of America, as does the conflict that comes with it.

Article

A fear of foreignness shaped the immigration and foreign policies of the United States up to the end of World War II. US leaders perceived nonwhite peoples of Latin America, Asia, and Europe as racially inferior, and feared that contact with them, even annexation of their territories, would invite their foreign mores, customs, and ideologies into US society. This belief in nonwhite peoples’ foreignness also influenced US immigration policy, as Washington codified laws that prohibited the immigration of nonwhite peoples to the United States, even as immigration was deemed a net gain for a US economy that was rapidly industrializing from the late 19th century through the first half of the 20th century. Ironically, this fear of foreignness fostered an aggressive US foreign policy throughout much of this period, as US leaders feared that European intervention in Latin America, for example, would undermine the United States’ regional hegemony. The fear of foreignness that seemed to oblige the United States to shore up its national security interests vis-à-vis European empires also demanded US intervention in the internal affairs of nonwhite nations. For US leaders, fear of foreignness was a two-sided coin: European aggression was encouraged by the internal instability of nonwhite nations, and nonwhite nations were unstable—and hence ripe for the picking by Europe’s empires—because their citizens were deemed racially inferior. To forestall both of these simultaneous foreign threats, the United States increasingly embedded itself in the political and economic affairs of foreign nations. This irony of opportunity (territorial acquisitions and immigrants who fed US labor markets) and fear (European encroachment and the supposed racial inferiority of nonwhite peoples) lay at the root of the immigration and foreign policies of the United States up to 1945.

Article

Asian women, the immigrant generation, entered Hawai’i, when it was a kingdom and subsequently a US territory, and the western continental United States from the 1840s to the 1930s as part of a global movement of people escaping imperial wars, colonialism, and homeland disorder. Most were wives or picture brides from China, Japan, Korea, the Philippines, and South Asia, joining menfolk who had gone overseas to work and escape poverty and strife. Women also arrived independently, some settling on the East Coast. US immigration laws restricting the entry of Asian male laborers also limited the entry of Asian women. Asian women were critical for establishing Asian American families and ensuring such households’ survival and social mobility. They worked on plantations, in agricultural fields and canneries, and as domestics and seamstresses, and helped operate family businesses, all while doing housework, raising children, and navigating cultural differences. Their activities gave women more power in their families than tradition had allowed and shifted gender roles toward more egalitarian households. Women’s organizations and women’s leadership, ideas, and skills contributed to ethnic community formation. Second-generation (US-born) Asian American women grew up in the late 19th and early 20th centuries and negotiated generational as well as cultural differences. Some were of mixed race, biracial or multiracial. Denied participation in many aspects of American youth culture, they formed ethnic-based clubs and organizations and held social activities that mirrored those of mainstream society. Some attended college. A few broke new ground professionally. Asian and Asian American women were diverse in national origin, class, and location. Both generations faced race and gender boundaries in education, employment, and public spaces, and they were active in civic affairs to improve their lives and their communities’ well-being. Across America, they marched, made speeches, and raised funds to free their homelands from foreign occupation, and they fought for racial and gender equality in the courts, workplaces, and elsewhere.

Article

Distinctive patterns of daily life defined the Jim Crow South. Contrary to many observers’ emphasis on de jure segregation—meaning racial separation demanded by law—neither law nor the physical separation of blacks and whites was at the center of the early 20th-century South’s social system. Instead, separation, whether by law or custom, was one of multiple tools whites used to subordinate and exclude blacks and to maintain notions of white racial purity. In turn, these notions themselves varied over time and across jurisdictions, at least in their details, as elites tried repeatedly to establish who was “white,” who was “black,” and how the legal fictions they created would apply to Native Americans and others who fit neither category. Within this complex multiracial world of the South, whites’ fundamental commitment to keeping blacks “in their place” manifested most routinely in day-to-day social dramas, often described in terms of racial “etiquette.” The black “place” in question was socially but not always physically distant from whites, and the increasing number of separate, racially marked spaces and actual Jim Crow laws was a development over time that became most pronounced in urban areas. It was a development that reveals blacks’ determination to resist racial oppression and whites’ perceived need to shore up a supposedly natural order that had, in fact, always been enforced by violence as well as political and economic power. Black resistance took many forms, from individual, covert acts of defiance to organized political movements. Whether in response to African Americans’ continued efforts to vote or their early 20th-century boycotts of segregated streetcars or World War I-era patterns of migration that threatened to deplete the agricultural labor force, whites found ways to counter blacks’ demands for equal citizenship and economic opportunity whenever and wherever they appeared. In the rural South, where the majority of black Southerners remained economically dependent on white landowners, a “culture of personalism” characterized daily life within a paternalistic model of white supremacy that was markedly different from urban—and largely national, not merely southern—racial patterns. Thus, distinctions between rural and urban areas and issues of age and gender are critical to understanding the Jim Crow South. Although schools were rigorously segregated, preadolescent children could be allowed greater interracial intimacy in less official settings. Puberty became a break point after which close contact, especially between black males and white females, was prohibited. All told, Jim Crow was an inconsistent and uneven system of racial distinction and separation whose great reach shaped the South’s landscape and the lives of all Southerners, including those who were neither black nor white.

Article

“Working-Class Environmentalism in America” traces working Americans’ efforts to protect the environment from antebellum times to the present. Antebellum topics include African American slaves’ environmental ethos; aesthetic nature appreciation by the Lowell, Massachusetts, “mill girls” working in New England’s first textile factories; and Boston’s 1840s fight for safe drinking water. Late-19th-century topics include working-class support for creating urban parks, workers’ early efforts to confront urban pollution and the “smoke nuisance,” and the exploration of conservationist ideas and policies by New England small farmers and fishermen. In the early 20th century, working-class youth, including immigrants and African Americans, participated in the youth camping movement and the Boy Scouts and Girl Scouts of America, while working-class adults and their families, enjoying new automobility and two-day weekends, discovered picnicking, car-camping, and sport hunting and fishing in newly created wilderness preserves. Workers also learned of toxic dangers to workplace safety and health from the shocking stories of the 1920s New Jersey “radium girls” and tetraethyl lead factory workers, and from the 1930s Midwestern miners who went on strike over deadly silicosis. In the 1930s, the United States rediscovered natural resource conservation when the Civilian Conservation Corps (CCC) employed millions of working-class youth, and lumber workers advocated federal regulation of timber harvesting. Postwar America saw the United Auto Workers (UAW), United Steelworkers (USWA), Oil, Chemical and Atomic Workers (OCAW), American Federation of Labor and Congress of Industrial Organizations (AFL-CIO), and other labor unions lobbying for wilderness and wildlife preservation and for workplace and community health, and fighting air and water pollution, while the United Farm Workers (UFW) fought reckless pesticide use and dissidents within the United Mine Workers (UMW) sought to ban surface coal mining. Radical organizations explored minority community environmentalism and interracial cooperation on environmental reform. Following the post-1970s nationwide conservative retrenchment, working-class activists and communities of color fought toxic wastes, explored environmental justice and environmental racism at places like Love Canal, New York, and Warren County, North Carolina, and formed the Blue-Green Alliance with environmentalists.