Buddhist history in the United States traces to the mid-19th century, when early scholars and spiritual pioneers first introduced the subject to Americans, followed soon by the arrival of Chinese immigrants to the West Coast. Interest in Buddhism was significant during the late Victorian era, but practice was almost completely confined to Asian immigrants, who faced severe white prejudice and legal discrimination. The Japanese were the first to establish robust, long-lasting temple networks, though they, too, faced persecution, culminating in the 1942 incarceration of 120,000 Japanese Americans, a severe blow to American Buddhism. Outside the Japanese American community, Buddhism grew slowly in the earlier decades of the 20th century, but it began to take off in the 1960s, aided soon by the lifting of onerous immigration laws and the return of large-scale Asian immigration. By the end of the 20th century American Buddhism had become extremely diverse and complex, with clear evidence of permanence in Asian American and other communities.
The history of Calvinism in the United States is part of a much larger development, the globalization of western Christianity. American Calvinism owes its existence to the transplanting of European churches and religious institutions to North America, a process that began in the 16th century, first with Spanish and French Roman Catholics, and accelerated a century later when Dutch, English, Scottish, and German colonists and immigrants of diverse Protestant backgrounds settled in the New World. The initial variety of Calvinists in North America resulted from the different circumstances under which Protestantism emerged in Europe as a rival to the Roman Catholic Church, from the diverse civil governments that supported established Protestant churches, and from the various business sponsors that included the Christian ministry as part of imperial or colonial designs.
Once the British dominated the Eastern seaboard (roughly 1675), and after English colonists successfully fought for political independence (1783), Calvinism lost its variety. Beyond their separate denominations, English-speaking Protestants (whether English, Scottish, or Irish) created a plethora of interdenominational religious agencies for the purpose of establishing a Christian presence in an expanding American society. For these Calvinists, being Protestant went hand in hand with loyalty to the United States. Outside this pan-Protestant network of Anglo-American churches and religious institutions were ethnic-based Calvinist denominations caught between Old World ways of being Christian and American patterns of religious life. Over time, most Calvinist groups adapted to national norms, while some retained institutional autonomy for fear of compromising their faith.
Since 1970, when the United States entered an era sometimes called post-Protestant, Calvinist churches and institutions have either declined or become stagnant. But in certain academic, literary, and popular culture settings, Calvinism has for some Americans, whether connected or not to Calvinist churches, continued to be a source of sober reflection on human existence and of earnest belief and religious practice.
Cambodians entered the United States as refugees after a group of Cambodian Communists known as the Khmer Rouge, led by the French-educated Pol Pot, won a civil war that had raged from March 1970 to April 1975 and proceeded to rule the country with extraordinary brutality. In power from April 17, 1975, to January 7, 1979, they destroyed all the major institutions in the country. An estimated 1.7 million people out of an estimated total population of 7.9 million died from executions, hunger, disease, injuries, coerced labor, and exposure to the elements. The refuge-seekers came in three waves: (1) just before the Khmer Rouge takeover, (2) during the regime’s existence, and (3) after the regime was overthrown. Some former Khmer Rouge personnel, who had escaped to Vietnam because they opposed Pol Pot’s extremist ideology and savage practices, returned in late December 1978, accompanied by 120,000 Vietnamese troops, to topple the government of their former comrades. A second civil war then erupted along the Thai-Cambodian border pitting the rump Khmer Rouge against two groups of non-communist combatants. Though fighting among themselves, all three groups opposed the new Cambodian government that was supported and controlled by Vietnam. When hundreds of thousands of Cambodians, along with Laotians and Vietnamese, showed up at the Thai-Cambodian border to seek refuge in Thailand, the Thai government and military did not welcome them. Thailand treated the Cambodians especially harshly for reasons related to the Thai officials’ concerns about the internal security of their country.
Almost 158,000 Cambodians gained entry into the United States between 1975 and 1994, mainly as refugees but with a smaller number as immigrants and “humanitarian parolees.” Cambodian ethnic communities sprang up on American soil, many of them in locations chosen by the U.S. Office of Refugee Resettlement. By the time the 1990 U.S. census was taken, Cambodians could be found in all fifty states. The refugees encountered enormous difficulties adapting to life in the United States. Only about 5 percent of them, mostly educated people from the first wave of refugees who came in 1975 and who, therefore, did not experience the atrocities of the Khmer Rouge era, managed to find white-collar jobs, often serving as intermediaries between their compatriots and the larger American society. About 40 to 50 percent of the Cambodian newcomers who arrived in the second and third waves found employment in blue-collar occupations. The rest of the population has relied on welfare and other forms of public assistance. A significant portion of this last group is composed of households headed by women whose fathers, husbands, or sons the Khmer Rouge had killed. It is they who have had to struggle the hardest to keep themselves and their children alive. Many women had to learn to become the main breadwinners in their families even though they had never engaged in wage labor in their homeland. Large numbers of refugees have suffered from post-traumatic stress disorder but have received very little help to deal with the symptoms. Some children, lacking role models, have not done well academically and dropped out of school. Others have joined gangs. Despite myriad difficulties, Cambodians in the United States are determined to resuscitate their social institutions and culture that the Khmer Rouge had tried to destroy during their reign of terror. By reviving Cambodian classical dance, music, and other performing and visual arts, and by rebuilding institutions, particularly Buddhist temples, they are trying valiantly to transcend the tragedies that befell them in order to survive as a people and a culture.
The relationship between the car and the city remains complex and involves numerous private and public forces, innovations in technology, global economic fluctuations, and shifting cultural attitudes that only rarely consider the efficiency of the automobile as a long-term solution to urban transit. The advantages of privacy, speed, ease of access, and personal enjoyment that led many to first embrace the automobile were soon shared and accentuated by transit planners as the surest means to realize the long-held ideals of urban beautification, efficiency, and accessible suburbanization. The remarkable gains in productivity provided by industrial capitalism brought these dreams within reach and individual car ownership became the norm for most American families by the middle of the 20th century. Ironically, the success in creating such a “car country” produced the conditions that again congested traffic, raised questions about the quality of urban (and now suburban) living, and further distanced the nation from alternative transit options. The “hidden costs” of postwar automotive dependency in the United States became more apparent in the late 1960s, leading to federal legislation compelling manufacturers and transit professionals to address the long-standing inefficiencies of the car. This most recent phase coincides with a broader reappraisal of life in the city and a growing recognition of the material limits to mass automobility.
Carlos Montezuma was one of the most influential Indians of his day and a prominent leader among the Red Progressives of the late 19th and early 20th centuries. Born to Yavapai parents in central Arizona, he was kidnapped by O’odham (Pima) raiders at a young age, and sold soon after into the Indian slave trade that for centuries had engulfed the US-Mexico borderlands. Educated primarily at public schools in Illinois, Montezuma eventually went on to be the first Native American graduate of the University of Illinois (1884) and one of the first Native American doctors (Chicago Medical College, 1889). Montezuma was a lifelong friend of Richard Henry Pratt, the founder of the Carlisle Indian Industrial School, and he firmly believed in the importance of Indian education. He insisted that educated Indians like himself must serve as examples of what Indians were capable of achieving if given the opportunities. He became deeply involved in the pan-Indian reform movements of the day and was one of the founding members of the Society of American Indians. Montezuma had a rocky relationship with the group, however, because many in the organization found his calls for the immediate abolition of the Indian Bureau and an end to the reservation system difficult to accept. From 1916 to 1922, he published his own journal, Wassaja, in which he relentlessly assailed the Indian Bureau, the reservations, and anyone who stood in the way of Indian “progress.” But Montezuma’s most important work was as an advocate for his own people, the Yavapais of Fort McDowell, Arizona, and other Arizona Indian groups. He spent the final decade of his life working to protect their water, land, and culture, and eventually returned to his Arizona homelands to die, in 1923. Although he was largely forgotten by historians and scholars in the decades after his death, Carlos Montezuma is now correctly remembered as one of the most important figures in Native American history during the Progressive Era.
The Catholic Church has been a presence in the United States since the arrival of French and Spanish missionaries in the 16th and 17th centuries. The Spanish established a number of missions in what is now the western part of the United States; the most important French colony was New Orleans. Although they were a minority in the thirteen British colonies prior to the American Revolution, Catholics found ways to participate in communal forms of worship when no priest was available to celebrate Mass. John Carroll was appointed superior of the Mission of the United States of America in 1785. Four years later, Carroll was elected the first bishop in the United States; his diocese encompassed the entire country. The Catholic population of the United States began to grow during the first half of the 19th century primarily due to Irish and German immigration. Protestant America was often critical of the newcomers, believing one could not be a good Catholic and a good American at the same time. By 1850, Roman Catholicism was the largest denomination in the United States.
The number of Catholics arriving in the United States declined during the Civil War but began to increase after the cessation of hostilities. Catholic immigrants during the late 19th and early 20th centuries were primarily from southern and Eastern Europe, and they were not often welcomed by a church that was dominated by Irish and Irish American leaders. At the same time that the church was expanding its network of parishes, schools, and hospitals to meet the physical and spiritual needs of the new immigrants, other Catholics were determining how their church could speak to issues of social and economic justice. Dorothy Day, Father Charles Coughlin, and Monsignor John A. Ryan are three examples of practicing Catholics who believed that the principles of Catholicism could help to solve problems related to international relations, poverty, nuclear weapons, and the struggle between labor and capital.
In addition to changes resulting from suburbanization, the Second Vatican Council transformed Catholicism in the United States. Catholics experienced other changes as a decrease in the number of men and women entering religious life led to fewer priests and sisters staffing parochial schools and parishes. In the early decades of the 21st century, the church in the United States was trying to recover from the sexual abuse crisis. Visiting America in 2015, Pope Francis reminded Catholics of the important teachings of the church regarding poverty, justice, and climate change. It remains to be seen what impact his papacy will have on the future of Catholicism in the United States.
The central business district, often referred to as the “downtown,” was the economic nucleus of the American city in the 19th and 20th centuries. It stood at the core of urban commercial life, if not always the geographic center of the metropolis. Here was where the greatest number of offices, banks, stores, and service institutions were concentrated—and where land values and building heights reached their peaks. The central business district was also the most easily accessible point in a city, the place where public transit lines intersected and brought together masses of commuters from outlying as well as nearby neighborhoods. In the downtown, laborers, capitalists, shoppers, and tourists mingled together on bustling streets and sidewalks. Not all occupants enjoyed equal influence in the central business district. Still, as historian Jon C. Teaford explained in his classic study of American cities, the downtown was “the one bit of turf common to all,” the space where “the diverse ethnic, economic, and social strains of urban life were bound together, working, spending, speculating, and investing.”
The central business district was not a static place. Boundaries shifted, expanding and contracting as the city grew and the economy evolved. So too did the primary land uses. Initially a multifunctional space where retail, wholesale, manufacturing, and financial institutions crowded together, the central business district became increasingly segmented along commercial lines in the 19th century. By the early 20th century, rising real estate prices and traffic congestion drove most manufacturing and processing operations to the periphery. Remaining behind in the city center were the bulk of the nation’s offices, stores, and service institutions. As suburban growth accelerated in the mid-20th century, many of these businesses also vacated the downtown, following the flow of middle-class, white families. Competition with the suburbs drained the central business district of much of its commercial vitality in the second half of the 20th century. It also inspired a variety of downtown revitalization schemes that tended to reinforce inequalities of race and class.
In September 1962, the National Farm Workers Association (NFWA) held its first convention in Fresno, California, initiating a multiracial movement that would result in the creation of the United Farm Workers (UFW) and the first contracts for farm workers in the state of California. Led by Cesar Chavez, the union contributed a number of innovations to the art of social protest, including the most successful consumer boycott in the history of the United States. Chavez welcomed contributions from numerous ethnic and racial groups, men and women, young and old. For a time, the UFW was the realization of Martin Luther King Jr.’s beloved community—people from different backgrounds coming together to create a socially just world. During the 1970s, Chavez struggled to maintain the momentum created by the boycott as the state of California became more involved in adjudicating labor disputes under the California Agricultural Labor Relations Act (ALRA). Although Chavez and the UFW ultimately failed to establish a permanent, national union, their successes and strategies continue to influence movements for farm worker justice today.
By the end of the 19th century, the medical specialties of gynecology and obstetrics established a new trend in women’s healthcare. In the 20th century, more and more American mothers gave birth under the care of a university-trained physician. The transition from laboring and delivering with the assistance of female family, neighbors, and midwives to giving birth under medical supervision is one of the most defining shifts in the history of childbirth. By the 1940s, the majority of American mothers no longer expected to give birth at home, but instead traveled to hospitals, where they sought reassurance from medical experts as well as access to pain-relieving drugs and life-saving technologies. Infant feeding followed a similar trajectory. Traditionally, infant feeding in the West had been synonymous with breastfeeding, although alternatives such as wet nursing and the use of animal milks and broths had existed as well. By the early 20th century, the experiences of women changed in relation to sweeping historical shifts in immigration, urbanization, and industrialization, and so too did their abilities and interests in breastfeeding. Scientific study of infant feeding yielded increasingly safer substitutes for breastfeeding, and by the 1960s fewer than 1 in 5 mothers breastfed. In the 1940s and 1950s, however, mothers began to organize and to resist the medical management of childbirth and infant feeding. The formation of childbirth education groups helped spread information about natural childbirth methods and the first dedicated breastfeeding support organization, La Leche League, formed in 1956. By the 1970s, the trend toward medicalized childbirth and infant feeding that had defined the first half of the century was in significant flux. By the end of the 20th century, efforts to harmonize women’s interests in more “natural” motherhood experiences with the existing medical system led to renewed interest in midwifery, home birth, and birth centers. Despite the cultural shift in favor of fewer medical interventions, rates of cesarean sections climbed to new heights by the end of the 1990s. Similarly, although pressures on mothers to breastfeed mounted by the end of the century, the practice itself increasingly relied upon the use of technologies such as the breast pump. By the close of the century, women’s agency in pursuing more natural options proceeded in tension with the technological, social, medical, and political systems that continued to shape their options.
Boys and girls of European and African descent in Colonial America shared commonalities initially as unfree laborers, with promises of emancipation for all. However, as labor costs and demands changed, white servitude disappeared and slavery in perpetuity prevailed for the majority of blacks in the South following the American Revolution. Children were aware of differences in their legal status, social positions, life-changing opportunities, and vulnerabilities within an environment where blackness signaled slavery or the absence of liberty, and whiteness garnered license or freedom.
Slavery and freedom existed concomitantly, and relationships among children, even black ones, in North America were affected by time and place. Slave societies and societies with slaves determined the nature of interactions among enslaved and emancipated children. To be sure, few, if any, freed or free-born blacks lacked a relative or friend who was, or had once been, enslaved, especially in states where gradual emancipation laws liberated family members born after a specific date and left older relatives in thralldom. As a result, free blacks were never completely aloof from their enslaved contemporaries. And, freedom was more meaningful if and when enjoyed by all.
Just as interactions among enslaved and free black children varied, slaveholding children were sometimes benevolent and at other times brutal toward those they claimed as property. And, enslaved children did not always assume subservient positions under masters and mistresses in the making. Ultimately, fields of play rather than fields of labor fostered the fairest and most enjoyable moments among slaveholding and enslaved children.
Play days for enslaved girls and boys ended when they were mature enough to work outside their own abodes. As enslaved children entered the workplace, white boys of means, often within slaveholding families, engaged in formal studies, while white girls across classes received less formal education but honed skills associated with domestic arts.
The paths of white and black children diverged as they reached adolescence, but there were instances when they shared facets of literacy, sometimes surreptitiously, and developed genuine friendships that mitigated the harshness of slavery. Even so, the majority of unfree children survived the furies of bondage by inculcating behavior that was acceptable for both a slave and a child.
The Chinese were one of the few immigrant groups who brought with them a deep-rooted medical tradition. Chinese herbal doctors and herb stores appeared in California as soon as the Gold Rush began. Traditional Chinese medicine had a long history and was an important part of Chinese culture, and herbal medical knowledge and therapy were popular among Chinese immigrants. Chinese herbal doctors treated American patients as well; established herbal doctors had more white patients than Chinese patients, especially after the Chinese population declined under the Chinese Exclusion laws. Chinese herbal medicine attracted American patients in the late 19th and early 20th centuries because Western medicine could not cure many diseases and symptoms during that period. The thriving Chinese herbal medical business upset some doctors of Western medicine. The California State Board of Medical Examiners did not allow Chinese herbal doctors to practice as medical doctors and had them arrested for practicing without a medical license. Many Chinese herbal doctors managed to operate their businesses as merchants selling herbs, and they often defended their profession in court and in newspaper articles. The profession effectively came to an end after the People’s Republic of China was established in 1949 and the United States, acting under the Trading with the Enemy Act in December 1950, cut off imports of medicinal herbs from China.
Carol L. Higham
Comparing Catholic and Protestant missionaries in North America can be a herculean task. It means comparing many religious groups, at least five governments, and hundreds of groups of Indians. But missions to the Indians played important roles in social, cultural, and political changes for Indians, Europeans, and Americans from the very beginning of contact in the 1500s to the present. By comparing Catholic and Protestant missions to the Indians, this article provides a better understanding of the relationship between these movements and their functions in the history of borders and frontiers, including how the missions changed both European and Indian cultures.
John D. Fairfield
The City Beautiful movement arose in the 1890s in response to the accumulating dirt and disorder in industrial cities, which threatened economic efficiency and social peace. City Beautiful advocates believed that better sanitation, improved circulation of traffic, monumental civic centers, parks, parkways, public spaces, civic art, and the reduction of outdoor advertising would make cities throughout the United States more profitable and harmonious. Engaging architects and planners, businessmen and professionals, and social reformers and journalists, the City Beautiful movement expressed a boosterish desire for landscape beauty and civic grandeur, but also raised aspirations for a more humane and functional city. “Mean streets make mean people,” wrote the movement’s publicist and leading theorist, Charles Mulford Robinson, encapsulating the belief in positive environmentalism that drove the movement. Combining the parks and boulevards of landscape architect Frederick Law Olmsted with the neoclassical architecture of Daniel H. Burnham’s White City at the World’s Columbian Exposition in Chicago in 1893, the City Beautiful movement also encouraged a view of the metropolis as a delicate organism that could be improved by bold, comprehensive planning. Two organizations, the American Park and Outdoor Art Association (founded in 1897) and the American League for Civic Improvements (founded in 1900), provided the movement with a national presence. But the movement also depended on the work of civic-minded women and men in nearly 2,500 municipal improvement associations scattered across the nation. Reaching its zenith in Burnham’s remaking of Washington, D.C., and his coauthored Plan of Chicago (1909), the movement slowly declined in favor of the “City Efficient” and a more technocratic city-planning profession. Aside from a legacy of still-treasured urban spaces and structures, the City Beautiful movement contributed to a range of urban reforms, from civic education and municipal housekeeping to city planning and regionalism.
Claudrena N. Harold
The civil rights movement in the urban South transformed the political, economic, and cultural landscape of post–World War II America. Between 1955 and 1968, African Americans and their white allies relied on nonviolent direct action, political lobbying, litigation, and economic boycotts to dismantle the Jim Crow system. Not all but many of the movement’s most decisive political battles occurred in the cities of Montgomery and Birmingham, Alabama; Nashville and Memphis, Tennessee; Greensboro and Durham, North Carolina; and Atlanta, Georgia. In these and other urban centers, civil rights activists launched full-throttle campaigns against white supremacy, economic exploitation, and state-sanctioned violence against African Americans. Their fight for racial justice coincided with monumental changes in the urban South as the upsurge in federal spending in the region created unprecedented levels of economic prosperity in the newly forged “Sunbelt.”
A dynamic and multifaceted movement that encompassed a wide range of political organizations and perspectives, the black freedom struggle proved successful in dismantling legal segregation. The passage of the Civil Rights Act of 1964 and the Voting Rights Act of 1965 expanded black southerners’ economic, political, and educational opportunities. And yet, many African Americans continued to struggle as they confronted not just the long-term effects of racial discrimination and exclusion but also the new challenges engendered by deindustrialization and urban renewal as well as entrenched patterns of racial segregation in the public-school system.
American cities developed under relatively quiescent climatic conditions. A gradual rise in average global temperatures during the 19th and 20th centuries had a negligible impact on how urban Americans experienced the weather. Much more significant were the dramatic changes in urban form and social organization that mediated the relationship between routine weather fluctuations and the lives of city dwellers. Overcoming weather-related impediments to profit, comfort, and good health contributed to many aspects of urbanization, including population migration to Sunbelt locations, increased reliance on fossil fuels, and comprehensive re-engineering of urban hydrological systems. Other structural shifts such as sprawling development, intensification of the built environment, socioeconomic segregation, and the tight coupling of infrastructural networks were less directly responsive to weather conditions but nonetheless profoundly affected the magnitude and social distribution of weather-related risks. Although fatalities resulting from extreme meteorological events declined in the 20th century, the scale of urban disruption and property damage increased. In addition, social impacts became more concentrated among poorer Americans, including many people of color, as Hurricane Katrina tragically demonstrated in 2005. Through the 20th century, cities responded to weather hazards through improved forecasting and systematic planning for relief and recovery rather than alterations in metropolitan design. In recent decades, however, growing awareness and concern about climate change impacts have made volatile weather more central to urban planning.
James R. Barrett
The largest and most important revolutionary socialist organization in US history, the Communist Party USA was always a minority influence. It reached considerable size and influence, however, during the Great Depression and World War II years when it followed the more open line associated with the term “Popular Front.” In these years communists were much more flexible in their strategies and relations with other groups, though the party remained a hierarchical vanguard organization. It grew from a largely isolated sect dominated by unskilled and unemployed immigrant men in the 1920s to a socially diverse movement of nearly 100,000 based heavily on American-born men and women from the working and professional classes by the late 1930s and during World War II, exerting considerable influence in the labor movement and American cultural life. In these years, the Communist Party helped to build the industrial union movement, advanced the cause of African American civil rights, and laid the foundation for the postwar feminist movement. But the party was always prone to abrupt changes in line and vulnerable to attack as a sinister outside force because of its close adherence to Soviet policies and goals. Several factors contributed to its catastrophic decline in the 1950s: the increasingly antagonistic Cold War struggle between the Soviet Union and the United States; an unprecedented attack from employers and government at various levels—criminal cases and imprisonment, deportation, and blacklisting; and within the party itself, a turn back toward a more dogmatic version of Marxism-Leninism and a heightened atmosphere of factional conflict and purges.
Company towns can be defined as communities dominated by a single company, typically focused on one industry. Beyond that very basic definition, company towns varied in their essentials. Some were purpose-built by companies, often in remote areas convenient to needed natural resources. There, workers were often required to live in company-owned housing as a condition of employment. Others began as small towns with privately owned housing, usually expanding alongside a growing hometown corporation. Residences were shoddy in some company towns. In others, company-built housing may have been excellent, with indoor plumbing and central heating, and located close to such amenities as schools, libraries, perhaps even theaters.
Company towns played a key role in US economic and social development. Such places can be found across the globe, but America’s vast expanse of undeveloped land, generous stock of natural resources, tradition of social experimentation, and laissez-faire attitude toward business provided singular opportunities for the emergence of such towns, large and small, in many regions of the United States. Historians have identified as many as 2,500 such places.
A tour of company towns can serve as a survey of the country’s industrial development, from the first large-scale planned industrial community—the textile town of Lowell, Massachusetts—to Appalachian mining villages, Western lumber towns, and steelmaking principalities such as the mammoth development at Gary, Indiana. More recent office-park and high-tech industrial-park complexes probably do not qualify as company towns, although they have some similar attributes. Nor does a planned town such as Disney Corporation’s Celebration, Florida, qualify, despite close ties to a single corporation, because its residents do not necessarily work for Disney.
Company towns have generally tended toward one of two models. First, and perhaps most familiar, are total institutions—communities where one business exerts a Big Brother–ish grip over the population, controlling or even taking the place of government, collecting rent on company-owned housing, dictating buying habits (possibly at the company store), and even directing where people worship and how they may spend their leisure time. A second form consists of model towns—planned, ideal communities backed by companies that promised to share their bounty with workers and families. Several such places were carefully put together by experienced architects and urban planners. Such model company towns were marked by a paternalistic, watchful attitude toward the citizenry on the part of the company overlords.
Contagious diseases have long posed a public health challenge for cities, going back to the ancient world. Diseases traveled over trade routes from one city to another. Cities were also crowded and often dirty, ideal conditions for the transmission of infectious disease. The Europeans who settled North America quickly established cities, especially seaports, and contagious diseases soon followed. By the late 17th century, ports like Boston, New York, and Philadelphia experienced occasional epidemics, especially smallpox and yellow fever, usually introduced from incoming ships. Public health officials tried to prevent contagious diseases from entering the ports, most often by establishing a quarantine. These quarantines were occasionally effective, but more often the disease escaped into the cities. By the 18th century, city officials recognized an association between dirty cities and epidemic diseases. The appearance of a contagious disease usually occasioned a concerted effort to clean streets and remove garbage. These efforts by the early 19th century gave rise to sanitary reform to prevent infectious diseases. Sanitary reform went beyond cleaning streets and removing garbage, to ensuring clean water supplies and effective sewage removal. By the end of the century, sanitary reform had done much to clean the cities and reduce the incidence of contagious disease. The 20th century brought two new tools to public health: vaccination and antibiotics. Vaccination, first used against smallpox, was extended to numerous other infectious viral diseases, substantially reducing their incidence. Finally, the development of antibiotics against bacterial infections in the mid-20th century enabled physicians to cure infected individuals. Contagious disease remains a problem—witness AIDS—and public health authorities still rely on quarantine, sanitary reform, vaccination, and antibiotics to keep urban populations healthy.
In May 1861, three enslaved men who were determined not to be separated from their families ran to Fort Monroe, Virginia. Their flight led to the phenomenon of Civil War contraband camps. Contraband camps were refugee camps to which between four hundred thousand and five hundred thousand enslaved men, women, and children in the Union-occupied portions of the Confederacy fled to escape their owners by getting themselves to the Union Army. Army personnel had not envisioned overseeing a massive network of refugee camps. Responding to the interplay between the actions of the former slaves who fled to the camps, Republican legislation and policy, military orders, and real conditions on the ground, the army improvised. In the contraband camps, former slaves endured overcrowding, food and clothing shortages, poor sanitary conditions, and constant danger. They also gained the protection of the Union Army and access to the power of the US government as new, though unsteady, allies in the pursuit of their key interests, including education, employment, and the reconstitution of family, kin, and social life. The camps brought together actors who had previously had little to no contact with each other, exposed everyone involved to massive structural forces that were much larger than the human ability to control them, and led to unexpected outcomes. They produced a refugee crisis on US soil, affected the course and outcome of the Civil War, influenced the progress of wartime emancipation, and altered the relationship between the individual and the national government. Contraband camps were simultaneously humanitarian crises and incubators for a new relationship between African Americans and the US government.
Counterinsurgency (known as COIN) is a theory of war that seeks to describe a proven set of techniques that a government may use to defeat a violent, internal, organized challenge to its authority and legitimacy. The term is sometimes also used to describe the set of activities itself (e.g., “conducting counterinsurgency”). The term originates from the middle of the 20th century, when it emerged from officials in U.S. President John F. Kennedy’s administration, as well as from British and French thinkers and practitioners with whom these officials were consulting. The Kennedy Administration and its allies were grappling with how to deal with what they viewed as Soviet attempts to destabilize post-colonial governments in the Third World and bring those nascent countries into the Soviet orbit. Encouraged by British and French experience in post-colonial rebellions and prior experience of imperial policing, the Kennedy administration hoped to apply their lessons learned to Cold War problems, most notably the growing challenges in Vietnam.
Rebellions, “irregular warfare,” “guerrilla warfare,” or “small wars,” or for that matter, thinking about means to put them down, go back to the beginnings of organized conflict itself. But 20th-century thinkers were informed most especially by British and French theorists of the 19th and early 20th centuries, such as British Colonel Charles E. Callwell and the future Marshal of France, Hubert Lyautey. The most significant influence came from veterans of Britain’s “Emergency” in Malaya of 1948–1960, such as Sir Robert Thompson, and from David Galula, a veteran of France’s conflict in Algeria, which began in 1954. Though these theorists differ on a number of points and in emphasis, the intellectual paternity is clear.
At its heart, the premise of counterinsurgency theory is that rebellions can only be eliminated by gaining the support of the population. Because rebels can hide amongst the people, influence them, and convince “fence sitters” to join in an insurgency, the government can only succeed when the majority of the population rejects the rebels and their message, refuses to offer them assistance, and ultimately turns them over to the authorities. Counterinsurgency theorists often invoke an image from a work by Chinese leader Mao Zedong, On Guerrilla Warfare, in which he described the people as water and guerrilla fighters as fish swimming in it.
Theorists argued for decades (indeed, the argument goes on) about whether America’s war in Vietnam failed because the nation was unable or unwilling to fully implement proper counterinsurgency practices. When the U.S.-led wars in Iraq and Afghanistan in the 21st century began to falter, counterinsurgency and its proponents were once again center stage. Indeed, many maintain that, in 2007, the United States began to implement COIN, and that this turned the tide. But this argument remains in dispute, as do the theoretical and historical foundations of COIN more broadly.