
Article

Evan D. McCormick

Since gaining independence in 1823, the states comprising Central America have had a front seat to the rise of the United States as a global superpower. Indeed, more so than anywhere else, the United States has sought to use its power to shape Central America into a system that heeds US interests and abides by principles of liberal democratic capitalism. Relations have been characterized by US power wielded freely by officials and non-state actors alike to override the aspirations of Central American actors in favor of US political and economic objectives: from the days of US filibusters invading Nicaragua in search of territory; to the occupations of the Dollar Diplomacy era, designed to maintain financial and economic stability; to the covert interventions of the Cold War era. For their part, the Central American states have, at various times, sought to resist the brunt of US hegemony, most effectively when coordinating their foreign policies to balance against US power. These efforts—even when not rejected by the United States—have generally been short-lived, hampered by economic dependency and political rivalries. The result is a history of US-Central American relations that wavers between confrontation and cooperation, but is remarkable for the consistency of its main element: US dominance.

Article

The central business district, often referred to as the “downtown,” was the economic nucleus of the American city in the 19th and 20th centuries. It stood at the core of urban commercial life, if not always the geographic center of the metropolis. Here was where the greatest number of offices, banks, stores, and service institutions were concentrated—and where land values and building heights reached their peaks. The central business district was also the most easily accessible point in a city, the place where public transit lines intersected and brought together masses of commuters from outlying as well as nearby neighborhoods. In the downtown, laborers, capitalists, shoppers, and tourists mingled together on bustling streets and sidewalks. Not all occupants enjoyed equal influence in the central business district. Still, as historian Jon C. Teaford explained in his classic study of American cities, the downtown was “the one bit of turf common to all,” the space where “the diverse ethnic, economic, and social strains of urban life were bound together, working, spending, speculating, and investing.” The central business district was not a static place. Boundaries shifted, expanding and contracting as the city grew and the economy evolved. So too did the primary land uses. Initially a multifunctional space where retail, wholesale, manufacturing, and financial institutions crowded together, the central business district became increasingly segmented along commercial lines in the 19th century. By the early 20th century, rising real estate prices and traffic congestion drove most manufacturing and processing operations to the periphery. Remaining behind in the city center were the bulk of the nation’s offices, stores, and service institutions. As suburban growth accelerated in the mid-20th century, many of these businesses also vacated the downtown, following the flow of middle-class, white families. Competition with the suburbs drained the central business district of much of its commercial vitality in the second half of the 20th century. It also inspired a variety of downtown revitalization schemes that tended to reinforce inequalities of race and class.

Article

In September 1962, the National Farm Workers Association (NFWA) held its first convention in Fresno, California, initiating a multiracial movement that would result in the creation of the United Farm Workers (UFW) and the first contracts for farm workers in the state of California. Led by Cesar Chavez, the union contributed a number of innovations to the art of social protest, including the most successful consumer boycott in the history of the United States. Chavez welcomed contributions from numerous ethnic and racial groups, men and women, young and old. For a time, the UFW was the realization of Martin Luther King Jr.’s beloved community—people from different backgrounds coming together to create a socially just world. During the 1970s, Chavez struggled to maintain the momentum created by the boycott as the state of California became more involved in adjudicating labor disputes under the California Agricultural Labor Relations Act (ALRA). Although Chavez and the UFW ultimately failed to establish a permanent, national union, their successes and strategies continue to influence movements for farm worker justice today.

Article

Chemical and biological weapons represent two distinct types of munitions that share some common policy implications. While chemical weapons and biological weapons are different in terms of their development, manufacture, use, and the methods necessary to defend against them, they are commonly united in matters of policy as “weapons of mass destruction,” along with nuclear and radiological weapons. Both chemical and biological weapons have the potential to cause mass casualties, require some technical expertise to produce, and can be employed effectively by both nation states and non-state actors. U.S. policies in the early 20th century were informed by preexisting taboos against poison weapons and the American Expeditionary Forces’ experiences during World War I. The United States promoted restrictions in the use of chemical and biological weapons through World War II, but increased research and development work at the outset of the Cold War. In response to domestic and international pressures during the Vietnam War, the United States drastically curtailed its chemical and biological weapons programs and began supporting international arms control efforts such as the Biological and Toxin Weapons Convention and the Chemical Weapons Convention. U.S. chemical and biological weapons policies significantly influence U.S. policies in the Middle East and the fight against terrorism.

Article

By the end of the 19th century, the medical specialties of gynecology and obstetrics established a new trend in women’s healthcare. In the 20th century, more and more American mothers gave birth under the care of a university-trained physician. The transition from laboring and delivering with the assistance of female family, neighbors, and midwives to giving birth under medical supervision is one of the most defining shifts in the history of childbirth. By the 1940s, the majority of American mothers no longer expected to give birth at home, but instead traveled to hospitals, where they sought reassurance from medical experts as well as access to pain-relieving drugs and life-saving technologies. Infant feeding followed a similar trajectory. Traditionally, infant feeding in the West had been synonymous with breastfeeding, although alternatives such as wet nursing and the use of animal milks and broths had existed as well. By the early 20th century, the experiences of women changed in relation to sweeping historical shifts in immigration, urbanization, and industrialization, and so too did their abilities and interests in breastfeeding. Scientific study of infant feeding yielded increasingly safer substitutes for breastfeeding, and by the 1960s fewer than 1 in 5 mothers breastfed. In the 1940s and 1950s, however, mothers began to organize and to resist the medical management of childbirth and infant feeding. The formation of childbirth education groups helped spread information about natural childbirth methods and the first dedicated breastfeeding support organization, La Leche League, formed in 1956. By the 1970s, the trend toward medicalized childbirth and infant feeding that had defined the first half of the century was in significant flux. By the end of the 20th century, efforts to harmonize women’s interests in more “natural” motherhood experiences with the existing medical system led to renewed interest in midwifery, home birth, and birth centers. Despite the cultural shift in favor of fewer medical interventions, rates of cesarean sections climbed to new heights by the end of the 1990s. Similarly, although pressures on mothers to breastfeed mounted by the end of the century, the practice itself increasingly relied upon the use of technologies such as the breast pump. By the close of the century, women’s agency in pursuing more natural options proceeded in tension with the technological, social, medical, and political systems that continued to shape their options.

Article

Wilma King

Boys and girls of European and African descent in Colonial America shared commonalities initially as unfree laborers, with promises of emancipation for all. However, as labor costs and demands changed, white servitude disappeared and slavery in perpetuity prevailed for the majority of blacks in the South following the American Revolution. Children were aware of differences in their legal status, social positions, life-changing opportunities, and vulnerabilities within an environment where blackness signaled slavery or the absence of liberty, and whiteness garnered license or freedom. Slavery and freedom existed concomitantly, and relationships among children, even black ones, in North America were affected by time and place. Slave societies and societies with slaves determined the nature of interactions among enslaved and emancipated children. To be sure, few, if any, freed or free-born blacks lacked a relative or friend who was, or had once been, enslaved, especially in states where gradual emancipation laws liberated family members born after a specific date and left older relatives in thralldom. As a result, free blacks were never completely aloof from their enslaved contemporaries. And freedom was more meaningful if and when enjoyed by all. Just as interactions among enslaved and free black children varied, slaveholding children were sometimes benevolent and at other times brutal toward those they claimed as property. And enslaved children did not always assume subservient positions under masters and mistresses in the making. Ultimately, fields of play rather than fields of labor fostered the fairest and most enjoyable moments among slaveholding and enslaved children. Play days for enslaved girls and boys ended when they were mature enough to work outside their own abodes. As enslaved children entered the workplace, white boys of means, often within slaveholding families, engaged in formal studies, while white girls across classes received less formal education but honed skills associated with domestic arts. The paths of white and black children diverged as they reached adolescence, but there were instances when they shared facets of literacy, sometimes surreptitiously, and developed genuine friendships that mitigated the harshness of slavery. Even so, the majority of unfree children survived the furies of bondage by internalizing behavior that was acceptable for both a slave and a child.

Article

Ivón Padilla-Rodríguez

Child migration has garnered widespread media coverage in the 21st century, becoming a central topic of national political discourse and immigration policymaking. Contemporary surges of child migrants are part of a much longer history of migration to the United States. In the first half of the 20th century, millions of European and Asian child migrants passed through immigration inspection stations in New York Harbor and San Francisco Bay. Even though some accompanied and unaccompanied European child migrants experienced detention at Ellis Island, most were processed and admitted into the United States fairly quickly in the early 20th century. Few of the European child migrants were deported from Ellis Island. Predominantly accompanied Chinese and Japanese child migrants, however, like Latin American and Caribbean migrants in recent years, were more frequently subjected to family separation, abuse, detention, and deportation at Angel Island. Once inside the United States, both European and Asian children struggled to overcome poverty, labor exploitation, educational inequity, the attitudes of hostile officials, and public health problems. After World War II, Korean refugee “orphans” came to the United States under the Refugee Relief Act of 1953 and the Immigration and Nationality Act. European, Cuban, and Indochinese refugee children were admitted into the United States through a series of ad hoc programs and temporary legislation until the 1980 Refugee Act created a permanent mechanism for the admission of refugee and unaccompanied children. Exclusionary immigration laws, the hardening of US international boundaries, and the United States’ preference for refugees who fled Communist regimes made unlawful entry the only option for thousands of accompanied and unaccompanied Mexican, Central American, and Haitian children in the second half of the 20th century. Black and brown migrant and asylum-seeking children were forced to endure educational deprivation, labor trafficking, mandatory detention, deportation, and deadly abuse by US authorities and employers at US borders and inside the country.

Article

Patrick William Kelly

The relationship between Chile and the United States pivoted on the intertwined questions of how much political and economic influence Americans would exert over Chile and the degree to which Chileans could chart their own path. With its tradition of constitutional government and relative economic development, Chile established itself as a regional power player in Latin America. Unencumbered by the direct US military interventions that marked the history of the Caribbean, Central America, and Mexico, Chile was a leader in movements to promote Pan-Americanism, inter-American solidarity, and anti-imperialism. But the advent of the Cold War in the 1940s, and especially the 1959 Cuban Revolution, brought an increase in bilateral tensions. The United States turned Chile into a “model democracy” for the Alliance for Progress, but frustration over its failures to enact meaningful social and economic reform polarized Chilean society, resulting in the election of Marxist Salvador Allende in 1970. The most contentious period in US-Chilean relations came during the Nixon administration, which worked alongside anti-Allende Chileans to destabilize Allende’s government, overthrown by the Chilean military on September 11, 1973. The Pinochet dictatorship (1973–1990), while anti-Communist, clashed with the United States over Pinochet’s radicalization of the Cold War and the issue of Chilean human rights abuses. The Reagan administration—which came to power on a platform that rejected the Carter administration’s critique of Chile—eventually reversed course and began to support the return of democracy to Chile, which took place in 1990. Since then, Pinochet’s legacy of neoliberal restructuring of the Chilean economy looms large, overshadowed perhaps only by his unexpected role in fomenting a global culture of human rights that has ended the era of impunity for Latin American dictators.

Article

Gregg A. Brazinsky

Throughout the 19th and 20th centuries, America’s relationship with China ran the gamut from friendship and alliance to enmity and competition. Americans have long believed in China’s potential to become an important global actor, primarily in ways that would benefit the United States. The Chinese have at times embraced, at times rejected, and at times adapted to the US agenda. While there have been some consistent themes in this relationship, Sino-American interactions unquestionably broadened in the 20th century. Trade with China grew from its modest beginnings in the 19th and early 20th centuries into a critical part of the global economy by the 21st century. While Americans have often perceived China as a country that offered significant opportunities for mutual benefit, China has also been seen as a threat and rival. During the Cold War, the two competed vigorously for influence in Asia and Africa. Today we see echoes of this same competition as China continues to grow economically while expanding its influence abroad. The history of Sino-American relations illustrates a complex dichotomy of cooperation and competition; this defines the relationship today and has widespread ramifications for global politics.

Article

The Chinese were one of the few immigrant groups who brought with them a deep-rooted medical tradition. Chinese herbal doctors and herb stores appeared in California as soon as the Gold Rush began. Traditional Chinese medicine had a long history and was an important part of Chinese culture, and herbal medical knowledge and therapy were popular among Chinese immigrants. Chinese herbal doctors treated American patients as well. Established herbal doctors had more white patients than Chinese patients, especially after the Chinese population declined as a result of the Chinese Exclusion laws. Chinese herbal medicine attracted American patients in the late 19th and early 20th centuries because Western medicine could not cure many diseases and symptoms during that period. The thriving Chinese herbal medical business upset some doctors of Western medicine. The California State Board of Medical Examiners did not allow Chinese herbal doctors to practice as medical doctors and had them arrested for practicing without a license. Many Chinese herbal doctors managed to operate their medical businesses as merchants selling herbs, and they often defended their profession in court and in newspaper articles. The profession eventually died out after the People’s Republic of China was established in 1949 and the United States applied the Trading with the Enemy Act to China in December 1950, cutting off imports of medicinal herbs.

Article

Comparing Catholic and Protestant missionaries in North America can be a herculean task. It means comparing many religious groups, at least five governments, and hundreds of groups of Indians. But missions to the Indians played important roles in social, cultural, and political changes for Indians, Europeans, and Americans from the very beginning of contact in the 1500s to the present. By comparing Catholic and Protestant missions to the Indians, this article provides a better understanding of the relationship between these movements and their functions in the history of borders and frontiers, including how the missions changed both European and Indian cultures.

Article

The City Beautiful movement arose in the 1890s in response to the accumulating dirt and disorder in industrial cities, which threatened economic efficiency and social peace. City Beautiful advocates believed that better sanitation, improved circulation of traffic, monumental civic centers, parks, parkways, public spaces, civic art, and the reduction of outdoor advertising would make cities throughout the United States more profitable and harmonious. Engaging architects and planners, businessmen and professionals, and social reformers and journalists, the City Beautiful movement expressed a boosterish desire for landscape beauty and civic grandeur, but also raised aspirations for a more humane and functional city. “Mean streets make mean people,” wrote the movement’s publicist and leading theorist, Charles Mulford Robinson, encapsulating the belief in positive environmentalism that drove the movement. Combining the parks and boulevards of landscape architect Frederick Law Olmsted with the neoclassical architecture of Daniel H. Burnham’s White City at Chicago’s World’s Columbian Exposition in 1893, the City Beautiful movement also encouraged a view of the metropolis as a delicate organism that could be improved by bold, comprehensive planning. Two organizations, the American Park and Outdoor Art Association (founded in 1897) and the American League for Civic Improvements (founded in 1900), provided the movement with a national presence. But the movement also depended on the work of civic-minded women and men in nearly 2,500 municipal improvement associations scattered across the nation. Reaching its zenith in Burnham’s remaking of Washington, D.C., and his coauthored Plan of Chicago (1909), the movement slowly declined in favor of the “City Efficient” and a more technocratic city-planning profession. Aside from a legacy of still-treasured urban spaces and structures, the City Beautiful movement contributed to a range of urban reforms, from civic education and municipal housekeeping to city planning and regionalism.

Article

Daniel Pope

Nuclear power in the United States has had an uneven history and faces an uncertain future. Promising in the 1950s electricity “too cheap to meter,” nuclear power has failed to come close to that goal, although it has carved out approximately a 20 percent share of American electrical output. Two decades after World War II, General Electric and Westinghouse offered electric utilities completed “turnkey” plants at a fixed cost, hoping these “loss leaders” would create a demand for further projects. During the 1970s the industry boomed, but it also brought forth a large-scale protest movement. Since then, partly because of that movement and because of the drama of the 1979 Three Mile Island accident, nuclear power has plateaued, with only one reactor completed since 1995. Several factors account for the failed promise of nuclear energy. Civilian power has never fully shaken its military ancestry or its connotations of weaponry and warfare. American reactor designs borrowed from nuclear submarines. Concerns about weapons proliferation stymied industry hopes for breeder reactors that would produce plutonium as a byproduct. Federal regulatory agencies dealing with civilian nuclear energy also have military roles. Those connections have provided some advantages to the industry, but they have also generated fears. Not surprisingly, the “anti-nukes” movement of the 1970s and 1980s was closely bound to movements for peace and disarmament. The industry’s disappointments must also be understood in a wider energy context. Nuclear grew rapidly in the late 1960s and 1970s as domestic petroleum output shrank and environmental objections to coal came to the fore. At the same time, however, slowing economic growth and an emphasis on energy efficiency reduced demand for new power output. In the 21st century, new reactor designs and the perils of fossil-fuel-caused global warming have once again raised hopes for nuclear, but natural gas and renewables now compete favorably against new nuclear projects. Economic factors have been the main reason that nuclear has stalled in the last forty years. Highly capital intensive, nuclear projects have all too often taken too long to build and cost far more than initially forecast. The lack of standard plant designs, the need for expensive safety and security measures, and the inherent complexity of nuclear technology have all contributed to nuclear power’s inability to make its case on cost persuasively. Nevertheless, nuclear power may survive and even thrive if the nation commits to curtailing fossil fuel use or if, as the Trump administration proposes, it opts for subsidies to keep reactors operating.

Article

The 1969 Supreme Court ruling in Tinker v. Des Moines established that students in public elementary and secondary schools do not “shed their constitutional rights to freedom of speech or expression at the schoolhouse gate.” Before Tinker, students often faced punishment from school officials for their role in protests both on and off campus. A rise in civil rights protests and the role of young people in the social movements of the 1960s led to frequent conflicts between students and school administrators. Many black students were especially vocal in contesting racial discrimination at school in the two decades following the 1954 Brown v. Board of Education decision. But before Tinker, students in public elementary and secondary schools were not considered to have any constitutional rights, including the right to free expression. Some of these students brought lawsuits in response to punishments they believed unfairly disciplined them for participating in legitimate protests. The political activism of young people and developments in First Amendment law eventually brought the Constitution into the public school classroom, leading to Tinker and other cases that established students’ rights.

Article

The civil rights movement in the urban South transformed the political, economic, and cultural landscape of post–World War II America. Between 1955 and 1968, African Americans and their white allies relied on nonviolent direct action, political lobbying, litigation, and economic boycotts to dismantle the Jim Crow system. Many, though not all, of the movement’s most decisive political battles occurred in the cities of Montgomery and Birmingham, Alabama; Nashville and Memphis, Tennessee; Greensboro and Durham, North Carolina; and Atlanta, Georgia. In these and other urban centers, civil rights activists launched full-throttle campaigns against white supremacy, economic exploitation, and state-sanctioned violence against African Americans. Their fight for racial justice coincided with monumental changes in the urban South as the upsurge in federal spending in the region created unprecedented levels of economic prosperity in the newly forged “Sunbelt.” A dynamic and multifaceted movement that encompassed a wide range of political organizations and perspectives, the black freedom struggle proved successful in dismantling legal segregation. The passage of the Civil Rights Act of 1964 and the Voting Rights Act of 1965 expanded black southerners’ economic, political, and educational opportunities. And yet, many African Americans continued to struggle as they confronted not just the long-term effects of racial discrimination and exclusion but also the new challenges engendered by deindustrialization and urban renewal as well as entrenched patterns of racial segregation in the public-school system.

Article

America’s Civil War became part of a much larger international crisis as European powers, happy to see the experiment in self-government fail in America’s “Great Republic,” took advantage of the situation to reclaim former colonies in the Caribbean and establish a European monarchy in Mexico. Overseas, in addition to their formal diplomatic appeals to European governments, both sides also experimented with public diplomacy campaigns to influence public opinion. Confederate foreign policy sought to win recognition and aid from Europe by offering free trade in cotton and aligning their cause with that of the aristocratic anti-democratic governing classes of Europe. The Union, instead, appealed to liberal, republican sentiment abroad by depicting the war as a trial of democratic government and embracing emancipation of the slaves. The Union victory led to the withdrawal of European empires from the New World: Spain from Santo Domingo, France from Mexico, Russia from Alaska, and Britain from Canada. The destruction of slavery in the United States, meanwhile, hastened its end in Puerto Rico, Cuba, and Brazil.

Article

Lesley J. Gordon

The regiment was the essential “building block” of Civil War armies. Raised by the states, most volunteer regiments were organized around soldiers’ home residences and reflected those local communities. Each branch of the army—infantry, artillery, and cavalry—formed into regiments with varying numbers of companies and overall strength. There were regular army regiments and units specially designated for African American troops. As the war dragged on, regimental strengths diminished dramatically. The Confederate Army tried to refill older units with conscripts and new recruits, while the Union created new regiments to replace depleted ones and later consolidated smaller ones. Neither side was entirely successful in restoring regiments to full authorized strength. Nonetheless, the regiment was more than a mode of organization—it was the prime source of identity and pride for volunteers and later veterans. While armies, divisions, and brigades were crucial to winning battles, and companies forged tight bonds of loyalty, it was the regiment to which most soldiers claimed a personal allegiance. Famed regiments like the 1st Minnesota Infantry Regiment, the 1st Texas Infantry Regiment, and the 54th Massachusetts Infantry Regiment cited their battle honors and high casualty numbers as proof of their fighting prowess. After the war ended, veterans produced hundreds of regimental histories, recounting their battle service and seeking to claim a place in history. Although many historians dismiss these accounts as worthless for serious scholarly research, regimental histories offer rich firsthand accounts of the conflict. They also offer a vehicle for narrating the war in a form familiar to the soldiers who experienced it.

Article

American cities developed under relatively quiescent climatic conditions. A gradual rise in average global temperatures during the 19th and 20th centuries had a negligible impact on how urban Americans experienced the weather. Much more significant were the dramatic changes in urban form and social organization that mediated the relationship between routine weather fluctuations and the lives of city dwellers. Overcoming weather-related impediments to profit, comfort, and good health contributed to many aspects of urbanization, including population migration to Sunbelt locations, increased reliance on fossil fuels, and comprehensive re-engineering of urban hydrological systems. Other structural shifts such as sprawling development, intensification of the built environment, socioeconomic segregation, and the tight coupling of infrastructural networks were less directly responsive to weather conditions but nonetheless profoundly affected the magnitude and social distribution of weather-related risks. Although fatalities resulting from extreme meteorological events declined in the 20th century, the scale of urban disruption and property damage increased. In addition, social impacts became more concentrated among poorer Americans, including many people of color, as Hurricane Katrina tragically demonstrated in 2005. Throughout the 20th century, cities responded to weather hazards with improved forecasting and systematic planning for relief and recovery rather than alterations in metropolitan design. In recent decades, however, growing awareness and concern about climate change impacts have made volatile weather more central to urban planning.

Article

Clodagh Harrington

The Clinton scandals have settled in the annals of American political history in the context of the era’s recurrent presidential misbehavior. Viewed through a historical lens, the activities, investigation, and impeachment trial of the forty-second president are almost inevitably measured against the weight of Watergate and Iran-Contra. As a result, the actions and consequences of this high-profile moment in the late-20th-century political history of the United States arguably took on a weightier meaning than they might otherwise have. If Watergate tested the U.S. constitutional system to its limits and Iran-Contra was arguably as grave, the Clinton affair was crisis-light by comparison. Originating with an investigation into a failed 1970s Arkansas land deal by Bill Clinton and his wife, the saga developed to include such meandering subplots as Filegate, Travelgate, Troopergate, the death of White House counsel Vince Foster, and, most infamously, the president’s affair with a White House intern. Unlike in the cases of Richard Nixon and Ronald Reagan, even Bill Clinton’s most ardent critics could not find a national security threat among the myriad scandals linked to his name. By the time that Justice Department appointee Robert Fiske was replaced as prosecutor by the infinitely more zealous Kenneth Starr, the case had become synonymous with the culture wars that permeated 1990s American society. As Whitewater and the related tentacles of the investigation failed to inflict any meaningful damage on the president, it was his marital infidelities that came closest to unseating him. Though the Independent Counsel pursued him with vigor, his supporters remained loyal, while his detractors saw political opportunity in his lapses in judgment. Certain key factors made the Clinton scandal particular to its era. First, in an unprecedented development, the personal indiscretion aspect of the story broke via the Internet. In addition, had the Independent Counsel legislation not been renewed, prosecutor Fiske would likely have wrapped up his investigation in a timely fashion with no intention of pursuing an impeachment path. Finally, the relentless cable news cycle and increasingly febrile partisan atmosphere of the decade ensured that the nation remained as focused as it was divided on the topic.

Article

The US working class and the institutional labor movement were shaped by anticommunism. Anticommunism preceded the founding of the Soviet Union and the Cold War, and this early history affected the later experience. It reinforced conservative positions on union issues even in the period before the Cold War, and forged the alliances that influenced the labor movement’s direction, including the campaign to organize the South, the methods and structures of unions, and US labor’s foreign policy positions. While the Communist Party of the USA (CP) was a hierarchical organization straitjacketed by an allegiance to the Soviet Union, the unions it fostered cultivated radical democratic methods, whereas anticommunism often justified opposition to militancy and obstructed progressive policies. In the hottest moments of the postwar development of domestic anticommunism, unions and their members were vilified and purged from the labor movement, forced to take loyalty oaths, and fired for their association with the CP. The Cold War in the working class removed critical perspectives on capitalism, reinforced a moderate and conservative labor officialdom, and led to conformity with the state on foreign policy issues.