1-20 of 103 Results for: 20th Century: Pre-1945

Article

During the Holocene, the present geological epoch, an increasing portion of humans began to manipulate the reproduction of plants and animals in a series of environmental practices known as agriculture. No other ecological relationship sustains as many humans as farming; no other has transformed the landscape to the same extent. The domestication of plants by American Indians followed the end of the last glacial maximum (the Ice Age). About eight thousand years ago, the first domesticated maize and squash arrived from central Mexico, spreading to every region and as far north as the subarctic boreal forest. The incursion of Europeans into North America set off widespread deforestation, soil depletion, and the spread of settlement, followed by the introduction of industrial machines and chemicals. A series of institutions sponsored publicly funded research into fertilizers and insecticides. By the late 19th century, writers and activists criticized the technological transformation of farming as destructive to the environment and rural society. During the 20th century, wind erosion contributed to the depopulation of much of the Great Plains. Vast projects in environmental engineering transformed deserts into highly productive regions of intensive fruit and vegetable production. Throughout much of the 19th and 20th centuries, access to land remained limited to whites, with American Indians, African Americans, Latinas/os, Chinese, and peoples of other ethnicities attempting to gain farms or hold on to the land they owned. Two broad periods describe the history of agriculture and the environment in that portion of North America that became the United States. In the first, the environment dominated, forcing humans to adapt at the end of thousands of years of extreme climate variability. In the second, institutional and technological change became more significant, though the environment remained a constant factor against which American agriculture took shape. A related historical pattern within this shift was the capitalist transformation of the United States. For thousands of years, households sustained themselves and exchanged some of what they produced for money. But during the 19th century, for a majority of American farmers, commodities became the entire purpose of agriculture, transforming environments to reflect commercial opportunity.

Article

On the eve of World War II many Americans were reluctant to see the United States embark on overseas involvements. Yet the Japanese attack on the U.S. Pacific fleet at Pearl Harbor on December 7, 1941, seemingly united the nation in determination to achieve total victory in Asia and Europe. Underutilized industrial plants expanded to full capacity producing war materials for the United States and its allies. The armed services and war work absorbed the unemployed. Many Americans’ standard of living improved, and the United States became the wealthiest nation in world history. Over time, this proud record became magnified into the “Good War” myth that has distorted America’s very real achievement. As the era of total victories receded and the United States went from leading creditor to debtor nation, the 1940s appeared as a golden age when everything worked better, people were united, and the United States saved the world for democracy (an exaggeration that ignored the huge contributions of America’s allies, including the British Empire, the Soviet Union, and China). In fact, during World War II the United States experienced marked class, sex and gender, and racial tensions. Groups such as gays made some social progress, but the poor, especially many African Americans, were left behind. After being welcomed into the workforce, women were pressured to go home when veterans returned looking for jobs in late 1945–1946, losing many of the gains they had made during the conflict. Wartime prosperity stunted the development of a welfare state; universal medical care and social security were cast as unnecessary. Combat had been a horrific experience, leaving many casualties with major physical or emotional wounds that took years to heal. Like all major global events, World War II was complex and nuanced, and it requires careful interpretation.

Article

The first forty years of cinema in the United States, from the development and commercialization of modern motion picture technology in the mid-1890s to the full blossoming of sound-era Hollywood during the early 1930s, represents one of the most consequential periods in the history of the medium. It was a time of tremendous artistic and economic transformation, including but not limited to the storied transition from silent motion pictures to “the talkies” in the late 1920s. Though the nomenclature of the silent era implies a relatively unified period in film history, the years before the transition to sound saw a succession of important changes in film artistry and its means of production, and film historians generally regard the epoch as divided into at least three separate and largely distinct temporalities. During the period of early cinema, which lasted about a decade from the medium’s emergence in the mid-1890s through the middle years of the new century’s first decade, motion pictures existed primarily as a novelty amusement presented in vaudeville theatres and carnival fairgrounds. Film historians Tom Gunning and André Gaudreault have famously defined the aesthetic of this period as a “cinema of attractions,” in which the technology of recording and reproducing the world, along with the new ways in which it could frame, orient, and manipulate time and space, marked the primary concerns of the medium’s artists and spectators. A transitional period followed from around 1907 to the later 1910s when changes in the distribution model for motion pictures enabled the development of purpose-built exhibition halls and led to a marked increase in demand for the entertainment. On a formal and artistic level, the period saw a rise in the prominence of the story film and widespread experimentation with new techniques of cinematography and editing, many of which would become foundational to later cinematic style. The era also witnessed the introduction and growing prominence of feature-length filmmaking over narrative shorts. The production side was marked by intensifying competition between the original American motion picture studios based in and around New York City, several of which attempted to cement their influence by forming an oligopolistic trust, and a number of upstart “independent” West Coast studios located around Los Angeles. Both the artistic and production trends of the transitional period came to a head during the classical era that followed, when the visual experimentation of the previous years consolidated into the “classical style” favored by the major studios, and the competition between East Coast and West Coast studios resolved definitively in favor of the latter. This was the era of Hollywood’s ascendance over domestic filmmaking in the United States and its growing influence over worldwide film markets, due in part to the decimation of the European film industry during World War I. After nearly a decade of dominance, the Hollywood studio system was so refined that the advent of marketable synchronized sound technology around 1927 produced relatively few upheavals among the coterie of top studios. Rather, the American film industry managed to reorient itself around the production of talking motion pictures so swiftly that silent film production in the United States had effectively ceased at any appreciable scale by 1929. 
Artistically, the early years of “the talkies” proved challenging, as filmmakers struggled with the imperfections of early recording technology and the limitations they imposed on filmmaking practice. But filmgoing remained popular in the United States even during the depths of the Great Depression, and by the early 1930s a combination of improved technology and artistic adaptation led to such a marked increase in quality that many film historians regard the period as the beginning of Hollywood’s Golden Era. With a new voluntary production code put in place to respond to criticism of immorality in Hollywood fare, the American film industry was poised by the early 1930s to solidify its prominent position in American cultural life.

Article

The first half of the 20th century saw extraordinary changes in the ways Americans produced, procured, cooked, and ate food. Exploding food production easily outstripped population growth in this era as intensive plant and animal breeding, the booming use of synthetic fertilizers and pesticides, and technological advances in farm equipment all resulted in dramatically greater yields on American farms. At the same time, a rapidly growing transportation network of refrigerated ships, railroads, and trucks hugely expanded the reach of different food crops and increased the variety of foods consumers across the country could buy, even as food imports from other countries soared. Meanwhile, new technologies, such as mechanical refrigeration, reliable industrial canning, and, by the end of the era, frozen foods, subtly encouraged Americans to eat less locally and seasonally than ever before. Yet as American food became more abundant and more affordable, diminishing want and suffering, it also contributed to new problems, especially rising body weights and mounting rates of cardiac disease. American taste preferences themselves changed throughout the era as more people came to expect stronger flavors, grew accustomed to the taste of industrially processed foods, and sampled so-called “foreign” foods, which played an enormous role in defining 20th-century American cuisine. Food marketing exploded, and food companies invested ever greater sums in print and radio advertising and eye-catching packaging. At home, a range of appliances made cooking easier, and modern grocery stores and increasing car ownership made it possible for Americans to shop for food less frequently. Home economics provided Americans, especially girls and women, with newly scientific and managerial approaches to cooking and home management, and Americans as a whole increasingly approached food through the lens of science. Virtually all areas related to food saw fundamental shifts in the first half of the 20th century, from agriculture to industrial processing, from nutrition science to weight-loss culture, from marketing to transportation, and from kitchen technology to cuisine. Not everything about food changed in this era, but the rapid pace of change likely magnified the sense of transformation for the many Americans who experienced it.

Article

Early 20th century American labor and working-class history is a subfield of American social history that focuses attention on the complex lives of working people in a rapidly changing global political and economic system. Once focused closely on institutional dynamics in the workplace and electoral politics, labor history has expanded and refined its approach to include questions about the families, communities, identities, and cultures workers have developed over time. With a critical eye on the limits of liberal capitalism and democracy for workers’ welfare, labor historians explore individual and collective struggles against exclusion from opportunity, as well as accommodation to political and economic contexts defined by rapid and volatile growth and deep inequality. Particularly important are the ways that workers both defined and were defined by differences of race, gender, ethnicity, class, and place. Individual workers and organized groups of working Americans both transformed and were transformed by the main struggles of the industrial era, including conflicts over the place of former slaves and their descendants in the United States, mass immigration and migrations, technological change, new management and business models, the development of a consumer economy, the rise of a more active federal government, and the evolution of popular culture. The period between 1896 and 1945 saw a crucial transition in the labor and working-class history of the United States. At its outset, Americans were working many more hours a day than the eight for which they had fought hard in the late 19th century. On average, Americans labored fifty-four to sixty-three hours per week in dangerous working conditions (approximately 35,000 workers died in accidents annually at the turn of the century). By 1920, half of all Americans lived in growing urban neighborhoods, and for many of them chronic unemployment, poverty, and deep social divides had become a regular part of life. Workers had little power in either the Democratic or Republican party. They faced a legal system that gave them no rights at work but the right to quit, judges who took the side of employers in the labor market by issuing thousands of injunctions against even nonviolent workers’ organizing, and vigilantes and police forces that did not hesitate to repress dissent violently. The ranks of organized labor were shrinking in the years before the economy began to recover in 1897. Dreams of a more democratic alternative to wage labor and corporate-dominated capitalism had been all but destroyed. Workers struggled to find their place in an emerging consumer-oriented culture that assumed everyone ought to strive for the often unattainable, and not necessarily desirable, marks of middle-class respectability. Yet American labor emerged from World War II with the main sectors of the industrial economy organized, with greater earning potential than any previous generation of American workers, and with unprecedented power as an organized interest group that could appeal to the federal government to promote its welfare. Though American workers as a whole had made no grand challenge to the nation’s basic corporate-centered political economy in the preceding four and one-half decades, they entered the postwar world with a greater level of power, and a bigger share in the proceeds of a booming economy, than anyone could have imagined in 1896. 
The labor and working-class history of the United States between 1900 and 1945, then, is the story of how working-class individuals, families, and communities—members of an extremely diverse American working class—managed to carve out positions of political, economic, and cultural influence, even as they remained divided among themselves, dependent upon corporate power, and increasingly invested in an individualistic, competitive, acquisitive culture.

Article

The story of mass culture from 1900 to 1945 is the story of its growth and increasing centrality to American life. Sparked by the development of such new media as radio, the phonograph, and cinema, which required less literacy and formal education, and by the commodification of leisure pursuits, mass culture extended its purview to nearly the entire nation by the end of the Second World War. In the process, it became one way in which immigrant and second-generation Americans could learn about the United States and stake a claim to participation in civic and social life. Mass culture characteristically consisted of artifacts that stressed pleasure, sensation, and glamor rather than, as had previously been the case, eternal and ethereal beauty, moral propriety, and personal transcendence. It had the power to determine acceptable values and beliefs and to define the qualities and characteristics of social groups. The constant and graphic stimulation that mass culture provided led many custodians of culture to worry about a breakdown in social morality that would surely follow. As a result, they formed regulatory agencies and watchdogs to monitor the mass culture available on the market. Other critics charged the regime of mass culture with inducing homogenization of belief and practice and contributing to passive acceptance of the status quo. The spread of mass culture did not terminate regional, class, or racial cultures; indeed, mass culture artifacts often borrowed from them. Nor did marginalized groups accept stereotypical portrayals; rather, they worked to expand the possibilities of prevailing ones and to provide alternatives.

Article

Radio debuted as a wireless alternative to telegraphy in the late 19th century. At its inception, wireless technology could only transmit signals and was incapable of broadcasting actual voices. During the 1920s, however, it transformed into a medium identified primarily with entertainment and informational broadcasting. The commercialization of American broadcasting, which included the establishment of national networks and reliance on advertising to generate revenue, became the so-called American system of broadcasting. This transformation demonstrates how technology is shaped by the dynamic forces of the society in which it is embedded. Broadcasting’s aural attributes also engaged listeners in a way that distinguished it from other forms of mass media. Cognitive processes triggered by the disembodied voices and sounds emanating from radio’s loudspeakers illustrate how listeners, grounded in particular social, cultural, economic, and political contexts, made sense of and understood the content with which they were engaged. Through the 1940s, difficulties in expanding the international radio presence of the United States further highlight the significance of surrounding contexts in shaping the technology and in promoting (or discouraging) listener engagement with programming content.

Article

As places of dense habitation, cities have always required coordination and planning. City planning has involved the design and construction of large-scale infrastructure projects to provide basic necessities such as a water supply and drainage. By the 1850s, immigration and industrialization were fueling the rise of big cities, creating immense, collective problems of epidemics, slums, pollution, gridlock, and crime. From the 1850s to the 1900s, both local governments and utility companies responded to this explosive physical and demographic growth by constructing a “networked city” of modern technologies such as gaslight, telephones, and electricity. Building the urban environment also became a wellspring of innovation in science, medicine, and administration. In 1909–1910, a revolutionary idea—comprehensive city planning—opened a new era of professionalization and institutionalization in the planning departments of city halls and universities. Over the next thirty-five years, however, wars and depression limited their influence. The period from 1945 to 1965, in contrast, represents the golden age of formal planning. During this unprecedented period of peace and prosperity, academically trained experts played central roles in the modernization of the inner cities and the sprawl of the suburbs. But the planners’ clean-sweep approach to urban renewal and the massive destruction caused by highway construction provoked a revolt of the grassroots. Beginning in the Watts district of Los Angeles in 1965, mass uprisings escalated over the next three years into a national crisis of social disorder, racial and ethnic inequality, and environmental injustice. The postwar consensus of theory and practice was shattered, replaced by a fragmented profession ranging from defenders of top-down systems of computer-generated simulations to proponents of advocacy planning from the bottom up. Since the late 1980s, the ascendancy of public-private partnerships in building the urban environment has favored the planners promoting systems approaches, who promise a future of high-tech “smart cities” under their complete control.

Article

Judy Yung and Erika Lee

The Angel Island Immigration Station (1910–1940), located in San Francisco Bay, was one of twenty-four ports of entry established by the U.S. government to process and detain immigrants entering and leaving the country. Although popularly called the “Ellis Island of the West,” the Angel Island station was in fact quite different from its counterpart in New York. Ellis Island was built in 1892 to welcome European immigrants and to enforce immigration laws that restricted but did not exclude European immigrants. In contrast, as the primary gateway for Chinese and other Asian immigrants, the Angel Island station was built in 1910 to better enforce discriminatory immigration policies that targeted Asians for exclusion. Chinese immigrants, in particular, were subjected to longer physical exams, interrogations, and detentions than any other immigrant group. Out of frustration, anger, and despair, many of them wrote and carved Chinese poems into the barrack walls. In 1940, a fire destroyed the administration building, and the immigration station was moved back to San Francisco. In 1963, the abandoned site became part of the state park system, and the remaining buildings were slated for demolition. Thanks to the collective efforts of Asian American activists and descendants of former detainees, the U.S. Immigration Station at Angel Island was designated a National Historic Landmark in 1997, and the immigration site, including the Chinese poetry on the barrack walls, was preserved and transformed into a museum of Pacific immigration for visitors.

Article

Anna May Wong (January 3, 1905–February 3, 1961) was the first Chinese American movie star and the first Asian American actress to gain international recognition. Wong broke the codes of yellowface in both American and European cinema to become one of the major global actresses of Asian descent between the world wars. She made close to sixty films that circulated around the world and in 1951 starred in her own television show, The Gallery of Madame Liu-Tsong, produced by the now-defunct DuMont Network. Examining Wong’s career is particularly fruitful because of race’s centrality to the motion pictures’ construction of the modern American nation-state, as well as its significance within the global circulation of moving images. Born near Los Angeles’s Chinatown, Wong began acting in films at an early age. During the silent era, she starred in films such as The Toll of the Sea (1922), one of the first two-color Technicolor films, and The Thief of Bagdad (1924). Frustrated by Hollywood roles, Wong left for Europe in the late 1920s, where she starred in several films and plays, including Piccadilly (1929) and A Circle of Chalk (1929) opposite Laurence Olivier. Wong traveled between the United States and Europe for film and stage work. In 1935 she protested Metro-Goldwyn-Mayer’s refusal to consider her for the leading role of O-Lan in the Academy Award–winning film The Good Earth (1937). Wong then paid her one and only visit to China. In the late 1930s, she starred in several B films such as King of Chinatown (1939), graced the cover of the mass-circulation American magazine Look, and traveled to Australia. In 1961, Wong died of Laennec’s cirrhosis, a disease typically stemming from alcoholism. Yet, as her legacy shows, for a brief moment a glamorous Chinese American woman occupied a position of transnational importance.

Article

Between 1880 and 1924, an estimated half million Arab migrants left the Ottoman Empire to live and work in the Americas. Responding to new economic forces linking the Mediterranean and Atlantic capitalist economies to one another, Arab migrants entered the manufacturing industries of the settler societies they inhabited, including industrial textiles, small-scale commerce (peddling), heavy machining, and migrant services associated with continued immigration from the Middle East. The Ottoman Empire enacted few policies to halt emigration from Syria, Mount Lebanon, and Palestine, instead facilitating a remittance economy that enhanced the emerging cash economies of the Arab world. After 1920, the French Mandate in Syria and Lebanon moved to limit new migration to the Americas, working together with increasingly restrictive immigration regimes in the United States, Argentina, and Brazil to halt Arab labor immigration. Using informal archives, the Arab American press, and the records of diasporic mutual aid and philanthropic societies, new research in Arab American migration illustrates how migrants managed a transnational labor economy and confronted challenges presented by American nativism, travel restriction, and interwar deportations.

Article

Akram Fouad Khater

Between 1880 and 1940, more than 130,000 Arabs immigrated to the United States as part of the Great Migration of the long 19th century. They lived and worked across the breadth of the United States, fought its many wars, and were engaged in the transformative debates about labor, race, gender, and citizenship that raged throughout this time period. As they struggled to carve out a place in “Amirka,” they encountered and fought efforts to racialize them as the uncivilized and undesirable “Other.” Their struggles not only contributed to shaping the United States and its immigration policies, but also confronted them with the conundrum of how to belong: to accept and seek admission into the existing system delineated by race, gender, and class, or to challenge the premises of that system. While there was not a singular response from this diverse community, the majority opted to fight for a place in “white” America even if in return this rendered them a liminal ethnicity.

Article

Jennifer Hoyt

Relations between the United States and Argentina can be best described as a cautious embrace punctuated by moments of intense frustration. Although never the center of U.S.–Latin American relations, Argentina has attempted to create a position of influence in the region. As a result, the United States has worked with Argentina and other nations of the Southern Cone—the region of South America that comprises Uruguay, Paraguay, Argentina, Chile, and southern Brazil—on matters of trade and economic development as well as hemispheric security and leadership. While Argentina has attempted to assert its position as one of Latin America’s most developed nations and therefore a regional leader, the equal partnership sought from the United States never materialized for the Southern Cone nation. Instead, competition for markets and U.S. interventionist and unilateral tendencies kept Argentina from attaining the influence and wealth it so desired. At the same time, the United States saw Argentina as an unreliable ally too sensitive to the pull of its volatile domestic politics. The two nations enjoyed moments of cooperation in World War I, the Cold War, and the 1990s, when Argentine leaders could balance this particular external partnership with internal demands. Yet at these times Argentine leaders found themselves walking a fine line as detractors back home saw cooperation with the United States as a violation of their nation’s sovereignty and autonomy. There has always been potential for a productive partnership, but each side’s intransigence and unique concerns limited this relationship’s accomplishments and led to a historical imbalance of power.

Article

Asian women, the immigrant generation, entered Hawai’i (first a kingdom and subsequently a US territory) and the western continental United States from the 1840s to the 1930s as part of a global movement of people escaping imperial wars, colonialism, and homeland disorder. Most were wives or picture brides from China, Japan, Korea, the Philippines, and South Asia, joining menfolk who worked overseas to escape poverty and strife. Women also arrived independently, some settling on the East Coast. US immigration laws restricting the entry of Asian male laborers also limited Asian women. Asian women were critical for establishing Asian American families and ensuring such households’ survival and social mobility. They worked on plantations, in agricultural fields and canneries, as domestics and seamstresses, and helped operate family businesses, while doing housework, raising children, and navigating cultural differences. Their activities gave women more power in their families than tradition had allowed and shifted gender roles toward more egalitarian households. Women’s organizations, leadership, ideas, and skills contributed to ethnic community formation. Second-generation (US-born) Asian American women grew up in the late 19th and early 20th centuries and negotiated generational as well as cultural differences. Some were mixed race, namely, biracial or multiracial. Denied participation in many aspects of American youth culture, they formed ethnic-based clubs and organizations and held social activities that mirrored mainstream society. Some attended college. A few broke new ground professionally. Asian and Asian American women were diverse in national origin, class, and location. Both generations faced race and gender boundaries in education, employment, and public spaces, and they were active in civic affairs to improve their lives and their communities’ well-being. Across America, they marched, made speeches, and raised funds to free their homelands from foreign occupation and fought for racial and gender equality in the courts, workplaces, and elsewhere.

Article

Black internationalism describes the political culture and intellectual practice forged in response to slavery, colonialism, and white imperialism. It is a historical and ongoing collective struggle against racial oppression rooted in global consciousness. While the expression of black internationalism has certainly changed across time and place, black liberation through collaboration has been and remains its ultimate goal. Since the emergence of black internationalism as a result of the transatlantic slave trade and during the Age of Revolutions, black women such as the poet Phillis Wheatley and evangelist Rebecca Protten have been at its forefront. Their writings and activism espoused an Afro-diasporic, global consciousness and promoted the cause of universal emancipation. During the 19th century, black women internationalists included abolitionists, missionaries, and clubwomen. They built on the work of their predecessors while laying the foundations for succeeding black women internationalists in the early 20th century. By World War I, a new generation of black women activists and intellectuals remained crucial parts of the International Council of Women, an organization founded by white suffragists from the United States, and the Universal Negro Improvement Association, a global organization formally led by Jamaican pan-Africanist Marcus Garvey. But they also formed an independent organization, the International Council of Women of the Darker Races (ICWDR). Within and outside of the ICWDR, black women from Africa and the African Diaspora faced and challenged discrimination on the basis of their sex and race. Their activism and intellectual work set a powerful precedent for a subsequent wave of black internationalism shaped by self-avowed black feminists.

Article

Ana Elizabeth Rosas

On August 4, 1942, the Mexican and U.S. governments launched the bi-national guest worker program most commonly known as the Bracero Program. An estimated five million Mexican men between the ages of 19 and 45 separated from their families for three-to-nine-month contract cycles at a time, in anticipation of earning the prevailing U.S. wage this program had promised them. They labored in U.S. agriculture, railroad construction, and forestry, with hardly any employment protections or rights in place to support themselves and the families they had left behind in Mexico. The inhumane configuration and implementation of this program prevented most of these men and their families from meeting such goals. Instead, the labor exploitation and alienation that characterized their participation in this guest worker program paved the way for, at best, fragile family relationships. The program lasted twenty-two years and grew in its expanse despite its negative consequences; Mexican men and their families could not afford to settle for being unemployed in Mexico, nor could they pass up U.S. employment opportunities of any sort. The Mexican and U.S. governments’ persistently negligent management of the Bracero Program, coupled with their conveniently selective acknowledgement of the severity of the plight of Mexican women and men, consistently forced Mexican men and their families to shoulder the full extent of the program’s exploitative conditions and terms.

Article

David Blanke

The relationship between the car and the city remains complex and involves numerous private and public forces, innovations in technology, global economic fluctuations, and shifting cultural attitudes that only rarely consider the efficiency of the automobile as a long-term solution to urban transit. The advantages of privacy, speed, ease of access, and personal enjoyment that led many to first embrace the automobile were soon shared and accentuated by transit planners as the surest means to realize the long-held ideals of urban beautification, efficiency, and accessible suburbanization. The remarkable gains in productivity provided by industrial capitalism brought these dreams within reach and individual car ownership became the norm for most American families by the middle of the 20th century. Ironically, the success in creating such a “car country” produced the conditions that again congested traffic, raised questions about the quality of urban (and now suburban) living, and further distanced the nation from alternative transit options. The “hidden costs” of postwar automotive dependency in the United States became more apparent in the late 1960s, leading to federal legislation compelling manufacturers and transit professionals to address the long-standing inefficiencies of the car. This most recent phase coincides with a broader reappraisal of life in the city and a growing recognition of the material limits to mass automobility.

Article

Tyson Reeder

The United States has shared an intricate and turbulent history with Caribbean islands and nations since its inception. In its relations with the Caribbean, the United States has displayed the dueling tendencies of imperialism and anticolonialism that characterized its foreign policy with South America and the rest of the world. For nearly two and a half centuries, the Caribbean has stood at the epicenter of some of the US government’s most controversial and divisive foreign policies. After the American Revolution severed political ties between the United States and the British West Indies, US officials and traders hoped to expand their political and economic influence in the Caribbean. US trade in the Caribbean played an influential role in the events that led to the War of 1812. The Monroe Doctrine provided a blueprint for reconciling imperial ambitions in the Caribbean with anti-imperial sentiment. During the mid-19th century, Americans debated the propriety of annexing Caribbean islands, especially Cuba. After the Spanish-American War of 1898, the US government took an increasingly imperialist approach to its relations with the Caribbean, acquiring some islands as federal territories and augmenting its political, military, and economic influence in others. Contingents of the US population and government disapproved of such imperialistic measures, and beginning in the 1930s the US government softened, but did not relinquish, its influence in the Caribbean. Between the 1950s and the end of the Cold War, US officials wrestled with how to exert influence in the Caribbean in a postcolonial world. Since the end of the Cold War, the United States has intervened in Caribbean domestic politics to enhance democracy, continuing its oscillation between democratic and imperial impulses.

Article

The NAACP, established in 1909, was formed as an integrated organization to confront racism in the United States rather than seeing the issue as simply a southern problem. It is the longest-running civil rights organization and continues to operate today. The organization was originally called the National Negro Committee, but the name was changed to the NAACP on May 30, 1910. Organized to promote racial equality and integration, the NAACP pursued this goal via legal cases, political lobbying, and public campaigns. Early campaigns involved lobbying for national anti-lynching legislation, pursuing desegregation through the US Supreme Court in areas such as housing and higher education, and fighting for voting rights. The NAACP is renowned for the US Supreme Court case of Brown v. Board of Education (1954), which desegregated primary and secondary schools and is seen as a catalyst for the civil rights movement (1955–1968). It also engaged in public education, promoting African American achievements in education and the arts to counteract racial stereotypes. The organization published a monthly journal, The Crisis, and promoted African American art forms and culture as another means to advance equality. NAACP branches were established all across the United States and became a network of information, campaigning, and finance that underpinned activism. Youth groups and university branches mobilized younger members of the community. Women were also invaluable to the NAACP in local, regional, and national decision-making processes and campaigning. The organization sought to integrate African Americans and other minorities into the American social, political, and economic model as codified by the US Constitution.

Article

Chemical and biological weapons represent two distinct types of munitions that share some common policy implications. While chemical weapons and biological weapons are different in terms of their development, manufacture, use, and the methods necessary to defend against them, they are commonly united in matters of policy as “weapons of mass destruction,” along with nuclear and radiological weapons. Both chemical and biological weapons have the potential to cause mass casualties, require some technical expertise to produce, and can be employed effectively by both nation states and non-state actors. U.S. policies in the early 20th century were informed by preexisting taboos against poison weapons and the American Expeditionary Forces’ experiences during World War I. The United States promoted restrictions in the use of chemical and biological weapons through World War II, but increased research and development work at the outset of the Cold War. In response to domestic and international pressures during the Vietnam War, the United States drastically curtailed its chemical and biological weapons programs and began supporting international arms control efforts such as the Biological and Toxin Weapons Convention and the Chemical Weapons Convention. U.S. chemical and biological weapons policies significantly influence U.S. policies in the Middle East and the fight against terrorism.