Since the introduction of “Fordism” in the early 1910s, which emphasized technological improvements and maximizing productive efficiency, US autoworkers have struggled with repetitive, exhausting, often dangerous jobs. Yet beginning with Ford’s Five Dollar Day, introduced in 1914, auto jobs have also provided higher pay than most other wage work, attracting hundreds of thousands of people, especially to Detroit, Michigan, through the 1920s, and again from World War II until the mid-1950s. Successful unionization campaigns by the United Auto Workers (UAW) in the 1930s and early 1940s resulted in contracts that guaranteed particular wage increases, reduced the power of foremen, and created a process for resolving workplace conflicts. In the late 1940s and early 1950s UAW president Walter Reuther negotiated generous medical benefits and pensions for autoworkers. The volatility of the auto industry, however, often brought layoffs that undermined economic security. By the 1950s overproduction and automation contributed heavily to instability for autoworkers. The UAW officially supported racial and gender equality, but realities in auto plants and the makeup of union leadership often belied those principles. Beginning in the 1970s US autoworkers faced disruptions caused by high oil prices, foreign competition, and outsourcing to Mexico. Contract concessions at unionized plants began in the late 1970s and continued into the 2000s. By the end of the 20th century, many American autoworkers did not belong to the UAW because they were employed by foreign automakers, who built factories in the United States and successfully opposed unionization. For good reason, autoworkers who survived the industry’s turbulence and were able to retire with guaranteed pensions and medical care look back fondly on all that they gained from working in the industry under UAW contracts. 
Countless others left auto work permanently, and often reluctantly, amid periodic mass layoffs and the continuous loss of jobs to automation.
Sharon Ann Murphy
In creating a new nation, the United States also had to create a financial system from scratch. During the period from the Revolution to the Civil War, the country experimented with numerous options. Although the Constitution deliberately banned the issuance of paper money by either Congress or the states, states indirectly reclaimed this power by incorporating state-chartered banks with the ability to print banknotes. These provided Americans with a medium of exchange to facilitate trade and an expansionary money supply to meet the economic needs of a growing nation. The federal government likewise entered into the world of money and finance with the incorporation of the First and Second Banks of the United States. Not only did critics challenge the constitutionality of these banks, but contemporaries likewise debated whether any banking institutions promoted the economic welfare of the nation or if they instead introduced unnecessary instability into the economy. These debates became particularly heated during moments of crisis. Periods of war, including the Revolutionary War, the War of 1812, and the Civil War, highlighted the necessity of a robust financial system to support the military effort, while periods of economic panic such as the Panic of 1819, the Panics of 1837 and 1839, and the Panic of 1857 drew attention to the weaknesses inherent in this decentralized, largely unregulated system. Whereas Andrew Jackson succeeded in destroying the Second Bank of the United States during the Bank War, state-chartered commercial banks, savings banks, and investment banks still multiplied rapidly throughout the period. Numerous states introduced regulations intended to control the worst excesses of these banks, but the most comprehensive legislation occurred with the federal government’s Civil War-era Banking Acts, which created the first uniform currency for the nation.
Thomas J. Sugrue
Racism in the United States has long been a national problem, not a regional phenomenon. The long and well-documented history of slavery, Jim Crow laws, and racial violence in the South overshadows the persistent reality of racial discrimination, systemic segregation, and entrenched inequality north of the Mason-Dixon line. From the mid-19th century forward, African Americans and their allies mounted a series of challenges to racially separate schools, segregated public accommodations, racially divided workplaces, endemic housing segregation, and discriminatory policing. The northern civil rights movement expanded dramatically in the aftermath of the Great Migration of blacks northward and the intensification of segregation in northern hotels, restaurants, and theaters, workplaces, housing markets, and schools in the early 20th century. During the Great Depression and World War II, emboldened civil rights organizations engaged in protest, litigation, and lobbying efforts to undermine persistent racial discrimination and segregation. Their efforts resulted in legal and legislative victories against racially separate and unequal institutions, particularly workplaces and stores. But segregated housing and schools remained more impervious to change. By the 1960s, many black activists in the North grew frustrated with the pace of change, even as they succeeded in increasing black representation in elected office, in higher education, and in certain sectors of the economy. In the late 20th century, civil rights activists launched efforts to fight the ongoing problem of police brutality and the rise of the prison-industrial complex. And they pushed, mostly through the courts, for the protection of the fragile gains of the civil rights era. The black freedom struggle in the North remained incomplete in the face of persistent segregation, entrenched racism, and ongoing racial inequality in employment, education, income, and wealth.
Maxine Leeds Craig
Black beauty culture developed in the context of widespread disparagement of black men and women in images produced by whites, and black women’s exclusion from mainstream cultural institutions, such as beauty contests, which defined beauty standards on a national scale. Though mainstream media rarely represented black women as beautiful, black women’s beauty was valued within black communities. Moreover many black women used cosmetics, hair products and styling, and clothing to meet their communities’ standards for feminine appearance. At the beginning of the 20th century, the black press, which included newspapers, general magazines, and women’s magazines, showcased the beauty of black women. As early as the 1890s, black communities organized beauty contests that celebrated black women’s beauty and served as fora for debating definitions of black beauty. Still, generally, but not always, the black press and black women’s beauty pageants favored women with lighter skin tones, and many cosmetics firms that marketed to black women sold skin lighteners. The favoring of light skin was nonetheless debated and contested within black communities, especially during periods of heightened black political activism. In the 1910s and 1920s and later in the 1960s and 1970s, social movements fostered critiques of black aesthetics and beauty practices deemed Eurocentric. One focus of criticism was the widespread black practice of hair straightening—a critique that has produced an enduring association between hairstyles perceived as natural and racial pride. In the last decades of the 20th century and the beginning of the 21st, African migration and the transnational dissemination of information via the internet contributed to a creative proliferation of African American hairstyles. 
While such styles display hair textures associated with African American hair, and are celebrated as natural hairstyles, they generally require the use of hair products and may incorporate synthetic hair extensions.
Beauty culture provided an important vehicle for African American entrepreneurship at a time when racial discrimination barred black women from other opportunities and most national cosmetics companies ignored black women. Black women’s beauty-culture business activities included beauticians who provided hair care in home settings and the extremely successful nationwide and international brand of hair- and skin-care products developed in the first two decades of the 20th century by Madam C. J. Walker. Hair-care shops provided important places for sharing information and community organizing. By the end of the 20th century, a few black-owned hair-care and cosmetics companies achieved broad markets and substantial profitability, but most declined or disappeared as they faced increased competition from or were purchased by larger white-owned corporations.
Brandon R. Byrd
Black internationalism describes the political culture and intellectual practice forged in response to slavery, colonialism, and white imperialism. It is a historical and ongoing collective struggle against racial oppression rooted in global consciousness. While the expression of black internationalism has certainly changed across time and place, black liberation through collaboration has been and remains its ultimate goal.
Since the emergence of black internationalism as a result of the transatlantic slave trade and during the Age of Revolutions, black women such as the poet Phillis Wheatley and the evangelist Rebecca Protten have been at its forefront. Their writings and activism espoused an Afro-diasporic, global consciousness and promoted the cause of universal emancipation. During the 19th century, black women internationalists included abolitionists, missionaries, and clubwomen. They built on the work of their predecessors while laying the foundations for succeeding black women internationalists in the early 20th century. By World War I, a new generation of black women activists and intellectuals remained crucial parts of the International Council of Women, an organization founded by white suffragists from the United States, and the Universal Negro Improvement Association, a global organization formally led by Jamaican pan-Africanist Marcus Garvey. But they also formed an independent organization, the International Council of Women of the Darker Races (ICWDR).
Within and outside of the ICWDR, black women from Africa and the African Diaspora faced and challenged discrimination on the basis of their sex and race. Their activism and intellectual work set a powerful precedent for a subsequent wave of black internationalism shaped by self-avowed black feminists.
Ana Elizabeth Rosas
On August 4, 1942, the Mexican and U.S. governments launched a binational guest worker program, most commonly known as the Bracero Program. An estimated five million Mexican men between the ages of 19 and 45 separated from their families for three-to-nine-month contract cycles at a time, in anticipation of earning the prevailing U.S. wage the program had promised them. They labored in U.S. agriculture, railroad construction, and forestry, with hardly any employment protections or rights in place to support themselves and the families they had left behind in Mexico. The inhumane configuration and implementation of the program prevented most of these men and their families from attaining the economic security they sought. Instead, the labor exploitation and alienation that characterized the program paved the way for, at best, fragile family relationships. The program lasted twenty-two years and grew in scale despite its negative consequences: Mexican men and their families could not afford to settle for being unemployed in Mexico, nor could they pass up U.S. employment opportunities of any sort. The Mexican and U.S. governments’ persistently negligent management of the Bracero Program, coupled with their conveniently selective acknowledgment of the severity of the plight of Mexican women and men, consistently forced Mexican men and their families to shoulder the full extent of the program’s exploitative conditions and terms.
Buddhist history in the United States traces to the mid-19th century, when early scholars and spiritual pioneers first introduced the subject to Americans, followed soon by the arrival of Chinese immigrants to the West Coast. Interest in Buddhism was significant during the late Victorian era, but practice was almost completely confined to Asian immigrants, who faced severe white prejudice and legal discrimination. The Japanese were the first to establish robust, long-lasting temple networks, though they, too, faced persecution, culminating in the 1942 incarceration of 120,000 Japanese Americans, a severe blow to American Buddhism. Outside the Japanese American community, Buddhism grew slowly in the earlier decades of the 20th century, but it began to take off in the 1960s, aided soon by the lifting of onerous immigration laws and the return of large-scale Asian immigration. By the end of the 20th century American Buddhism had become extremely diverse and complex, with clear evidence of permanence in Asian American and other communities.
The history of Calvinism in the United States is part of a much larger development, the globalization of western Christianity. American Calvinism owes its existence to the transplanting of European churches and religious institutions to North America, a process that began in the 16th century, first with Spanish and French Roman Catholics, and accelerated a century later when Dutch, English, Scottish, and German colonists and immigrants of diverse Protestant backgrounds settled in the New World. The initial variety of Calvinists in North America was the result of the different circumstances under which Protestantism emerged in Europe as a rival to the Roman Catholic Church, to the diverse civil governments that supported established Protestant churches, and to the various business sponsors that included the Christian ministry as part of imperial or colonial designs.
Once the British dominated the Eastern seaboard (roughly 1675), and after English colonists successfully fought for political independence (1783), Calvinism lost its variety. Beyond their separate denominations, English-speaking Protestants (whether English, Scottish, or Irish) created a plethora of interdenominational religious agencies for the purpose of establishing a Christian presence in an expanding American society. For these Calvinists, being Protestant went hand in hand with loyalty to the United States. Outside this pan-Protestant network of Anglo-American churches and religious institutions were ethnic-based Calvinist denominations caught between Old World ways of being Christian and American patterns of religious life. Over time, most Calvinist groups adapted to national norms, while some retained institutional autonomy for fear of compromising their faith.
Since 1970, when the United States entered an era sometimes called post-Protestant, Calvinist churches and institutions have either declined or become stagnant. But in certain academic, literary, and popular culture settings, Calvinism has for some Americans, whether connected or not to Calvinist churches, continued to be a source for sober reflection on human existence and earnest belief and religious practice.
Cambodians entered the United States as refugees after a group of Cambodian Communists known as the Khmer Rouge, led by the French-educated Pol Pot, won a civil war that had raged from March 1970 to April 1975 and proceeded to rule the country with extraordinary brutality. In power from April 17, 1975, to January 7, 1979, they destroyed all the major institutions in the country. An estimated 1.7 million people out of a total population of approximately 7.9 million died from executions, hunger, disease, injuries, coerced labor, and exposure to the elements. The refuge-seekers came in three waves: (1) just before the Khmer Rouge takeover, (2) during the regime’s existence, and (3) after the regime was overthrown. Some former Khmer Rouge personnel, who had escaped to Vietnam because they opposed Pol Pot’s extremist ideology and savage practices, returned in late December 1978, accompanied by 120,000 Vietnamese troops, to topple the government of their former comrades. A second civil war then erupted along the Thai-Cambodian border, pitting the rump Khmer Rouge against two groups of non-communist combatants. Though fighting among themselves, all three groups opposed the new Cambodian government that was supported and controlled by Vietnam. When hundreds of thousands of Cambodians, along with Laotians and Vietnamese, showed up at the Thai-Cambodian border to seek refuge in Thailand, the Thai government and military did not welcome them. Thailand treated the Cambodians especially harshly for reasons related to Thai officials’ concerns about the internal security of their country.
Almost 158,000 Cambodians gained entry into the United States between 1975 and 1994, mainly as refugees but with a smaller number as immigrants and “humanitarian parolees.” Cambodian ethnic communities sprang up on American soil, many of them in locations chosen by the U.S. Office of Refugee Resettlement. By the time the 1990 U.S. census was taken, Cambodians could be found in all fifty states. The refugees encountered enormous difficulties adapting to life in the United States. Only about 5 percent of them, mostly educated people from the first wave of refugees who came in 1975 and who, therefore, did not experience the atrocities of the Khmer Rouge era, managed to find white-collar jobs, often serving as intermediaries between their compatriots and the larger American society. About 40 to 50 percent of the Cambodian newcomers who arrived in the second and third waves found employment in blue-collar occupations. The rest of the population has relied on welfare and other forms of public assistance. A significant portion of this last group is composed of households headed by women whose fathers, husbands, or sons the Khmer Rouge had killed. It is they who have had to struggle the hardest to keep themselves and their children alive. Many women had to learn to become the main breadwinners in their families even though they had never engaged in wage labor in their homeland. Large numbers of refugees have suffered from post-traumatic stress disorder but have received very little help in dealing with its symptoms. Some children, lacking role models, have not done well academically and have dropped out of school. Others have joined gangs. Despite myriad difficulties, Cambodians in the United States are determined to resuscitate the social institutions and culture that the Khmer Rouge tried to destroy during their reign of terror.
By reviving Cambodian classical dance, music, and other performing and visual arts, and by rebuilding institutions, particularly Buddhist temples, they are trying valiantly to transcend the tragedies that befell them in order to survive as a people and a culture.
The relationship between the car and the city remains complex and involves numerous private and public forces, innovations in technology, global economic fluctuations, and shifting cultural attitudes that only rarely consider the efficiency of the automobile as a long-term solution to urban transit. The advantages of privacy, speed, ease of access, and personal enjoyment that led many to first embrace the automobile were soon shared and accentuated by transit planners as the surest means to realize the long-held ideals of urban beautification, efficiency, and accessible suburbanization. The remarkable gains in productivity provided by industrial capitalism brought these dreams within reach and individual car ownership became the norm for most American families by the middle of the 20th century. Ironically, the success in creating such a “car country” produced the conditions that again congested traffic, raised questions about the quality of urban (and now suburban) living, and further distanced the nation from alternative transit options. The “hidden costs” of postwar automotive dependency in the United States became more apparent in the late 1960s, leading to federal legislation compelling manufacturers and transit professionals to address the long-standing inefficiencies of the car. This most recent phase coincides with a broader reappraisal of life in the city and a growing recognition of the material limits to mass automobility.
Carlos Montezuma was one of the most influential Indians of his day and a prominent leader among the Red Progressives of the late 19th and early 20th centuries. Born to Yavapai parents in central Arizona, he was kidnapped by O’odham (Pima) raiders at a young age, and sold soon after into the Indian slave trade that for centuries had engulfed the US-Mexico borderlands. Educated primarily at public schools in Illinois, Montezuma eventually went on to be the first Native American graduate of the University of Illinois (1884) and one of the first Native American doctors (Chicago Medical College, 1889). Montezuma was a lifelong friend of Richard Henry Pratt, the founder of the Carlisle Indian Industrial School, and he firmly believed in the importance of Indian education. He insisted that educated Indians like himself must serve as examples of what Indians were capable of achieving if given the opportunities. He became deeply involved in the pan-Indian reform movements of the day and was one of the founding members of the Society of American Indians. Montezuma had a rocky relationship with the group, however, because many in the organization found his calls for the immediate abolition of the Indian Bureau and an end to the reservation system difficult to accept. From 1916 to 1922, he published his own journal, Wassaja, in which he relentlessly assailed the Indian Bureau, the reservations, and anyone who stood in the way of Indian “progress.” But Montezuma’s most important work was as an advocate for his own people, the Yavapais of Fort McDowell, Arizona, and other Arizona Indian groups. He spent the final decade of his life working to protect their water, land, and culture, and eventually returned to his Arizona homelands to die, in 1923. Although he was largely forgotten by historians and scholars in the decades after his death, Carlos Montezuma is now correctly remembered as one of the most important figures in Native American history during the Progressive Era.
The Catholic Church has been a presence in the United States since the arrival of French and Spanish missionaries in the 16th and 17th centuries. The Spanish established a number of missions in what is now the western part of the United States; the most important French colony was New Orleans. Although they were a minority in the thirteen British colonies prior to the American Revolution, Catholics found ways to participate in communal forms of worship when no priest was available to celebrate Mass. John Carroll was appointed superior of the Mission of the United States of America in 1785. Four years later, Carroll was elected the first bishop in the United States; his diocese encompassed the entire country. The Catholic population of the United States began to grow during the first half of the 19th century primarily due to Irish and German immigration. Protestant America was often critical of the newcomers, believing one could not be a good Catholic and a good American at the same time. By 1850, Roman Catholicism was the largest denomination in the United States.
The number of Catholics arriving in the United States declined during the Civil War but began to increase after the cessation of hostilities. Catholic immigrants during the late 19th and early 20th centuries were primarily from southern and Eastern Europe, and they were not often welcomed by a church that was dominated by Irish and Irish American leaders. At the same time that the church was expanding its network of parishes, schools, and hospitals to meet the physical and spiritual needs of the new immigrants, other Catholics were determining how their church could speak to issues of social and economic justice. Dorothy Day, Father Charles Coughlin, and Monsignor John A. Ryan are three examples of practicing Catholics who believed that the principles of Catholicism could help to solve problems related to international relations, poverty, nuclear weapons, and the struggle between labor and capital.
In addition to changes resulting from suburbanization, the Second Vatican Council transformed Catholicism in the United States. Catholics experienced other changes as a decrease in the number of men and women entering religious life led to fewer priests and sisters staffing parochial schools and parishes. In the early decades of the 21st century, the church in the United States was trying to recover from the sexual abuse crisis. Visiting America in 2015, Pope Francis reminded Catholics of the important teachings of the church regarding poverty, justice, and climate change. It remains to be seen what impact his papacy will have on the future of Catholicism in the United States.
The central business district, often referred to as the “downtown,” was the economic nucleus of the American city in the 19th and 20th centuries. It stood at the core of urban commercial life, if not always the geographic center of the metropolis. Here was where the greatest number of offices, banks, stores, and service institutions were concentrated—and where land values and building heights reached their peaks. The central business district was also the most easily accessible point in a city, the place where public transit lines intersected and brought together masses of commuters from outlying as well as nearby neighborhoods. In the downtown, laborers, capitalists, shoppers, and tourists mingled together on bustling streets and sidewalks. Not all occupants enjoyed equal influence in the central business district. Still, as historian Jon C. Teaford explained in his classic study of American cities, the downtown was “the one bit of turf common to all,” the space where “the diverse ethnic, economic, and social strains of urban life were bound together, working, spending, speculating, and investing.”
The central business district was not a static place. Boundaries shifted, expanding and contracting as the city grew and the economy evolved. So too did the primary land uses. Initially a multifunctional space where retail, wholesale, manufacturing, and financial institutions crowded together, the central business district became increasingly segmented along commercial lines in the 19th century. By the early 20th century, rising real estate prices and traffic congestion drove most manufacturing and processing operations to the periphery. Remaining behind in the city center were the bulk of the nation’s offices, stores, and service institutions. As suburban growth accelerated in the mid-20th century, many of these businesses also vacated the downtown, following the flow of middle-class, white families. Competition with the suburbs drained the central business district of much of its commercial vitality in the second half of the 20th century. It also inspired a variety of downtown revitalization schemes that tended to reinforce inequalities of race and class.
In September 1962, the National Farm Workers Association (NFWA) held its first convention in Fresno, California, initiating a multiracial movement that would result in the creation of United Farm Workers (UFW) and the first contracts for farm workers in the state of California. Led by Cesar Chavez, the union contributed a number of innovations to the art of social protest, including the most successful consumer boycott in the history of the United States. Chavez welcomed contributions from numerous ethnic and racial groups, men and women, young and old. For a time, the UFW was the realization of Martin Luther King Jr.’s beloved community—people from different backgrounds coming together to create a socially just world. During the 1970s, Chavez struggled to maintain the momentum created by the boycott as the state of California became more involved in adjudicating labor disputes under the California Agricultural Labor Relations Act (ALRA). Although Chavez and the UFW ultimately failed to establish a permanent, national union, their successes and strategies continue to influence movements for farm worker justice today.
By the end of the 19th century, the medical specialties of gynecology and obstetrics established a new trend in women’s healthcare. In the 20th century, more and more American mothers gave birth under the care of a university-trained physician. The transition from laboring and delivering with the assistance of female family, neighbors, and midwives to giving birth under medical supervision is one of the most defining shifts in the history of childbirth. By the 1940s, the majority of American mothers no longer expected to give birth at home, but instead traveled to hospitals, where they sought reassurance from medical experts as well as access to pain-relieving drugs and life-saving technologies. Infant feeding followed a similar trajectory. Traditionally, infant feeding in the West had been synonymous with breastfeeding, although alternatives such as wet nursing and the use of animal milks and broths had existed as well. By the early 20th century, the experiences of women changed in relation to sweeping historical shifts in immigration, urbanization, and industrialization, and so too did their abilities and interests in breastfeeding. Scientific study of infant feeding yielded increasingly safer substitutes for breastfeeding, and by the 1960s fewer than 1 in 5 mothers breastfed. In the 1940s and 1950s, however, mothers began to organize and to resist the medical management of childbirth and infant feeding. The formation of childbirth education groups helped spread information about natural childbirth methods and the first dedicated breastfeeding support organization, La Leche League, formed in 1956. By the 1970s, the trend toward medicalized childbirth and infant feeding that had defined the first half of the century was in significant flux. By the end of the 20th century, efforts to harmonize women’s interests in more “natural” motherhood experiences with the existing medical system led to renewed interest in midwifery, home birth, and birth centers. 
Despite the cultural shift in favor of fewer medical interventions, rates of cesarean sections climbed to new heights by the end of the 1990s. Similarly, although pressures on mothers to breastfeed mounted by the end of the century, the practice itself increasingly relied upon the use of technologies such as the breast pump. By the close of the century, women’s agency in pursuing more natural options proceeded in tension with the technological, social, medical, and political systems that continued to shape their options.
Boys and girls of European and African descent in colonial America initially shared commonalities as unfree laborers, with promises of emancipation for all. However, as labor costs and demands changed, white servitude disappeared and slavery in perpetuity prevailed for the majority of blacks in the South following the American Revolution. Children were aware of differences in their legal status, social positions, life-changing opportunities, and vulnerabilities within an environment where blackness signaled slavery or the absence of liberty, and whiteness garnered license or freedom.
Slavery and freedom existed concomitantly, and relationships among children in North America, including black children, were affected by time and place. Slave societies and societies with slaves determined the nature of interactions between enslaved and emancipated children. To be sure, few, if any, freed or free-born blacks lacked a relative or friend who was, or had once been, enslaved, especially in states where gradual emancipation laws liberated family members born after a specific date and left older relatives in thralldom. As a result, free blacks were never completely aloof from their enslaved contemporaries. And freedom was more meaningful if and when enjoyed by all.
Just as interactions among enslaved and free black children varied, slaveholding children were sometimes benevolent and at other times brutal toward those they claimed as property. And enslaved children did not always assume subservient positions under masters and mistresses in the making. Ultimately, fields of play rather than fields of labor fostered the fairest and most enjoyable moments between slaveholding and enslaved children.
Play days for enslaved girls and boys ended when they were mature enough to work outside their own abodes. As enslaved children entered the workplace, white boys of means, often within slaveholding families, engaged in formal studies, while white girls across classes received less formal education but honed skills associated with domestic arts.
The paths of white and black children diverged as they reached adolescence, but there were instances when they shared facets of literacy, sometimes surreptitiously, and developed genuine friendships that mitigated the harshness of slavery. Even so, the majority of unfree children survived the furies of bondage by inculcating behavior that was acceptable for both a slave and a child.
The Chinese were one of the few immigrant groups who brought with them a deep-rooted medical tradition. Chinese herbal doctors and herb stores appeared in California as soon as the Gold Rush began. Traditional Chinese medicine had a long history and was an important part of Chinese culture, and herbal medical knowledge and therapy were popular among Chinese immigrants. Chinese herbal doctors treated American patients as well; established herbal doctors had more white patients than Chinese ones, especially after the Chinese population declined under the Chinese Exclusion laws. Chinese herbal medicine attracted American patients in the late 19th and early 20th centuries because Western medicine could not cure many diseases and symptoms of that period. The thriving Chinese herbal medicine business upset some doctors of Western medicine. The California State Board of Medical Examiners refused to recognize Chinese herbal doctors as medical doctors and had them arrested for practicing without a license. Many Chinese herbal doctors managed to operate their businesses as merchants selling herbs, and they often defended their profession in court and in newspaper articles. Their profession eventually came to an end after the People's Republic of China was established in 1949 and the United States applied the Trading with the Enemy Act to China in December 1950, cutting off herbal medicine imports.
Carol L. Higham
Comparing Catholic and Protestant missionaries in North America can be a herculean task. It means comparing many religious groups, at least five governments, and hundreds of groups of Indians. But missions to the Indians played important roles in social, cultural, and political changes for Indians, Europeans, and Americans from the very beginning of contact in the 1500s to the present. By comparing Catholic and Protestant missions to the Indians, this article provides a better understanding of the relationship between these movements and their functions in the history of borders and frontiers, including how the missions changed both European and Indian cultures.
John D. Fairfield
The City Beautiful movement arose in the 1890s in response to the accumulating dirt and disorder in industrial cities, which threatened economic efficiency and social peace. City Beautiful advocates believed that better sanitation, improved circulation of traffic, monumental civic centers, parks, parkways, public spaces, civic art, and the reduction of outdoor advertising would make cities throughout the United States more profitable and harmonious. Engaging architects and planners, businessmen and professionals, and social reformers and journalists, the City Beautiful movement expressed a boosterish desire for landscape beauty and civic grandeur, but also raised aspirations for a more humane and functional city. “Mean streets make mean people,” wrote the movement’s publicist and leading theorist, Charles Mulford Robinson, encapsulating the belief in positive environmentalism that drove the movement. Combining the parks and boulevards of landscape architect Frederick Law Olmsted with the neoclassical architecture of Daniel H. Burnham’s White City at Chicago’s World’s Columbian Exposition of 1893, the City Beautiful movement also encouraged a view of the metropolis as a delicate organism that could be improved by bold, comprehensive planning. Two organizations, the American Park and Outdoor Art Association (founded in 1897) and the American League for Civic Improvements (founded in 1900), provided the movement with a national presence. But the movement also depended on the work of civic-minded women and men in nearly 2,500 municipal improvement associations scattered across the nation. Reaching its zenith in Burnham’s remaking of Washington, D.C., and his coauthored Plan of Chicago (1909), the movement slowly declined in favor of the “City Efficient” and a more technocratic city-planning profession.
Aside from a legacy of still-treasured urban spaces and structures, the City Beautiful movement contributed to a range of urban reforms, from civic education and municipal housekeeping to city planning and regionalism.
Nuclear power in the United States has had an uneven history and faces an uncertain future. Promoted in the 1950s as a source of electricity “too cheap to meter,” nuclear power has fallen far short of that goal, although it has carved out a roughly 20 percent share of American electrical output. Two decades after World War II, General Electric and Westinghouse offered electric utilities completed “turnkey” plants at a fixed cost, hoping these “loss leaders” would create a demand for further projects. During the 1970s the industry boomed, but it also brought forth a large-scale protest movement. Since then, partly because of that movement and because of the drama of the 1979 Three Mile Island accident, nuclear power has plateaued, with only one reactor completed since 1995.
Several factors account for the failed promise of nuclear energy. Civilian power has never fully shaken its military ancestry or its connotations of weaponry and warfare. American reactor designs borrowed from nuclear submarines. Concerns about weapons proliferation stymied industry hopes for breeder reactors that would produce plutonium as a byproduct. Federal regulatory agencies dealing with civilian nuclear energy also have military roles. Those connections have provided some advantages to the industry, but they have also generated fears. Not surprisingly, the “anti-nukes” movement of the 1970s and 1980s was closely bound to movements for peace and disarmament.
The industry’s disappointments must also be understood in a wider energy context. Nuclear grew rapidly in the late 1960s and 1970s as domestic petroleum output shrank and environmental objections to coal came to the fore. At the same time, however, slowing economic growth and an emphasis on energy efficiency reduced demand for new power output. In the 21st century, new reactor designs and the perils of fossil-fuel-caused global warming have once again raised hopes for nuclear, but natural gas and renewables now compete favorably against new nuclear projects.
Economic factors have been the main reason that nuclear has stalled in the last forty years. Highly capital intensive, nuclear projects have all too often taken too long to build and cost far more than initially forecast. The lack of standard plant designs, the need for expensive safety and security measures, and the inherent complexity of nuclear technology have all contributed to nuclear power’s inability to make its case on cost persuasively. Nevertheless, nuclear power may survive and even thrive if the nation commits to curtailing fossil fuel use or if, as the Trump administration proposes, it opts for subsidies to keep reactors operating.