81–100 of 191 Results for:

  • 20th Century: Post-1945

Article

Marat Grinberg

There is an intricate, long, and rich history of Jewish presence in Hollywood, from executives to producers to directors to screenwriters to performers. It starts with the Jewish moguls who were at the helm of most major studios in the 1920s and 30s and tried to distance themselves as much as possible from their Jewish heritage and past. This preponderance of Jews prompted an anti-Semitic response in the American entertainment scene that could hardly be ignored. The result was an overt timidity in the representation of Jews and Jewish topics on screen, with some Jewish actors perceived as “too Jewish” for the general taste. The changes in the perception of identity in the 1960s, marked by culture wars and the Civil Rights movement, on the one hand, and the flourishing of American Jewish literature and the pride many American Jews took in Israel’s triumph in the Six-Day War of 1967, on the other, enabled a much more open and unabashed embrace of Jewishness in Hollywood. Consequently, the late 1960s usher in the New Jewish Wave, when the issues of Jewish identity and experience start to dominate the screen, defined by such auteurs as Woody Allen, Mel Brooks, Sidney Lumet, and Paul Mazursky, and such actors as Dustin Hoffman, Barbra Streisand, Richard Dreyfuss, Elliott Gould, George Segal, and Woody Allen throughout the 1970s and 80s. Jewish representation grows further from the 1990s through the 2010s with such directors as the Coen brothers, Darren Aronofsky, David Cronenberg, David Mamet, Jonathan Glazer, Steven Spielberg, and the Safdie brothers.

Article

Emancipation celebrations in the United States have been important and complicated moments of celebration and commemoration. Since the end of the slave trade in 1808 and the enactment of the British Emancipation Act in 1834, people of African descent throughout the Atlantic world have gathered, often in festival form, to remember and to use that memory for more promising futures. In the United States, emancipation celebrations exploded after the Civil War, when each local community celebrated its own experience of emancipation. For many, the commemoration took the form of a somber church service, Watch Night, which recognized the signing of the Emancipation Proclamation on January 1, 1863. Juneteenth, which recognized the end of slavery in Texas on June 19, 1865, became one of the most vibrant and longstanding celebrations. Although many emancipation celebrations disappeared after World War I, Juneteenth remained a celebration in most of Texas through the late 1960s, when it disappeared from all cities in the state. However, because of the Second Great Migration, Texans transplanted to Western cities continued the celebration in their new communities far from Texas. In Texas, Juneteenth was resurrected in 1979, when state representative (and later congressman) Al Edwards successfully sponsored a bill to make Juneteenth a state holiday and campaigned to spread Juneteenth throughout the country. This grassroots movement brought Juneteenth resolutions to forty-six states and street festivals to hundreds of neighborhoods. Juneteenth’s remarkable post-1980 spread has given it great resonance in popular culture as well, even becoming a focus of two major television episodes in 2016 and 2017.

Article

Korean immigration to the United States has been shaped by multiple factors, including militarization, colonialism, and war. While Koreans migrated to the American-occupied islands of Hawai’i in the early 20th century as sugar plantation laborers, Japanese imperial rule (1910–1945) and racially exclusive immigration policy curtailed Korean migration to the United States until the end of World War II. Since then, Korean immigration has been shaped by racialized, gendered, and sexualized conditions related to the Korean War and American military occupation. Although the existing social science literature predominantly frames Korean immigration through the paradigm of migration “waves,” these periodizations are arbitrary to the degree that they center perceived US policy changes or “breaks” within a linear historical timeline. In contrast, emphasizing the continuing role of peninsular instability and militarized division points to the cumulative effects of the Korean War that continue to impact Korean immigration. With the beginning of the American military occupation of Korea in 1945 and the eruption of warfare in 1950, Koreans experienced familial separations and displacements. Following the signing of the Korean armistice in 1953, which halted armed fighting without formally ending the war, the American military remained in the southern half of the peninsula. The presence of the US military in South Korea had immediate repercussions among civilians, as American occupation engendered sexual intimacies between Korean women and US soldiers. Eventually, a multiracial population emerged as children were born to Korean women and American soldiers. Given the racial exclusivity of American immigration policy at the time, the US government established legislative “loopholes” to facilitate the migration of Korean spouses of US soldiers and of multiracial children adopted by American families. Between 1951 and 1964, over 90 percent of the 14,027 Koreans who entered the United States were Korean “war brides” and transnational adoptees. Since 1965, Korean spouses of American servicemen have played key roles in supporting the migration of family members through visa sponsorship. Legal provisions that affected the arrivals of Korean women and children in the United States provided a precedent for US immigration reform after 1950. For instance, the 1952 and 1965 Immigration and Nationality Acts integrated core elements of these emergency orders, including privileging heterosexual relationships within immigration preferences. Simultaneously, while the 1965 Immigration and Nationality Act “opened” the doors of American immigration to millions of people, South Korean military dictatorial rule and the imminent threat of rekindled warfare also influenced Korean emigration. As a result, official US immigration categories do not necessarily capture the complex conditions informing Koreans’ decisions to migrate to the United States. Finally, in light of the national surge of anti-immigrant sentiment that has crystallized since the election of Donald Trump as US president in November 2016, immigration rights advocates have highlighted the need to address the prevalence of undocumented immigrant status among Korean Americans. While definitive statistics do not exist, emerging data suggest that at least 10 percent of the Korean American population is undocumented. Given this significant number, the undocumented status of Korean Americans is a critical site of study that warrants further research.

Article

The United States and the Kingdom of Joseon (Korea) established formal diplomatic relations after signing a “Treaty of Peace, Commerce, Amity, and Navigation” in 1882. Relations between the two states were not close, and the United States closed its legation in 1905 after Japan established a protectorate over Korea in the wake of the Russo-Japanese War. No formal relations existed for the following forty-four years, but American interest in Korea grew following the 1907 Pyongyang Revival and the rapid growth of Christianity there. Activists in the Korean independence movement kept the issue of Korea alive in the United States, especially during World War I and World War II, and pressured the American government to support the re-emergence of an independent Korea. Their activism, as well as a distrust of the Soviet Union, was among the factors that spurred the United States to suggest the joint occupation of the Korean peninsula in 1945, which subsequently led to the creation of the Republic of Korea (ROK) in the American zone and the Democratic People’s Republic of Korea (DPRK) in the Soviet zone. The United States withdrew from the ROK in 1948 only to return in 1950 to thwart the DPRK’s attempt to reunite the peninsula by force during the Korean War. The war ended in stalemate, with an armistice agreement in 1953. In the same year the United States and the ROK signed a military alliance, and American forces have remained on the peninsula ever since. While the United States has enjoyed close political and security relations with the ROK, formal diplomatic relations have never been established between the United States and the DPRK, and the relationship between the two has been marked by increasing tensions over the latter’s nuclear program since the early 1990s.

Article

James I. Matray

On June 25, 1950, North Korea’s invasion of South Korea ignited a conventional war that had origins dating from at least the end of World War II. After becoming president in April 1945, Harry S. Truman abandoned a trusteeship plan for postwar Korea in favor of seeking unilateral U.S. occupation of the peninsula once an atomic attack forced Japan’s prompt surrender. Soviet entry into the Pacific war led to a last-minute agreement dividing Korea at the 38th parallel into zones of occupation. Two Koreas emerged after Soviet-American negotiations failed to agree on a plan to end the division. Kim Il Sung in the north and Syngman Rhee in the south were both determined to reunite Korea, instigating major military clashes at the parallel in the summer of 1949. Moscow and Washington opposed their clients’ invasion plans until April 1950, when Kim persuaded Soviet Premier Joseph Stalin that, with mass support in South Korea, he would achieve a quick victory. At first, Truman hoped that South Korea could defend itself with more military equipment and U.S. air support. Commitment of U.S. ground forces came after General Douglas MacArthur, U.S. occupation commander in Japan, visited the front and advised that the South Koreans could not halt the advance. Overconfident U.S. soldiers would sustain defeat as well, retreating to the Pusan Perimeter, a rectangular area in the southeast corner of the peninsula. On September 15, MacArthur staged a risky amphibious landing at Inchon behind enemy lines that sent Communist forces fleeing back into North Korea. The People’s Republic of China viewed the U.S. offensive for reunification that followed as a threat to its security and prestige. In late November, Chinese “volunteers” attacked en masse. After a chaotic retreat, U.S. forces counterattacked in February 1951 and moved the line of battle just north of the parallel. After two Chinese offensives failed, negotiations to end the war began in July 1951 but stalemated in May 1952 over the issue of repatriation of prisoners of war. Peace came because of Stalin’s death in March 1953, rather than President Dwight D. Eisenhower’s veiled threat to stage nuclear strikes against China. Scholars have disagreed about many issues surrounding the Korean War, but the most important debate continues to center on whether the conflict had international or domestic origins. Initially, historians relied mainly on U.S. government publications to write accounts that ignored events prior to North Korea’s attack, endorsing an orthodox interpretation that assigned blame to the Soviet Union and applauded the U.S. response. Declassification of U.S. government documents and presidential papers during the 1970s led to the publication of studies assigning considerable responsibility to the United States for helping to create a kind of war in Korea before June 1950. Moreover, left revisionist writers labeled the conflict a classic civil war. The release of Chinese and Soviet sources after 1989 established that Stalin and Chinese leader Mao Zedong approved the North Korean invasion, prompting right revisionist scholars to reassert key orthodox arguments. This essay describes how and why recent access to Communist documents has not settled the disagreements among historians about the causes, course, and consequences of the Korean War.

Article

The American labor movement has declined significantly since 1960. Once a powerful part of American life, bringing economic democracy to the nation, organized labor has become a shell of its former self, with numbers far lower than a half-century ago. The 1960s began with a powerful movement divided on race but also deeply influenced by the civil rights movement. Deindustrialization and capital mobility cut into labor’s power after 1965 as factories closed. The rise of public sector unionism in the 1970s briefly gave labor new power, but private sector unions faced enormous internal dissension throughout that decade. The Reagan administration ushered in a new era of warfare against organized labor when the president fired the striking air traffic controllers in 1981. Soon, private sector employers engaged in brutal anti-union campaigns. Reforms within labor in the 1990s sought to renew the movement’s long tradition of organizing, but with mixed success at best. Since the 1980s, attacks on organized labor have continued, especially Republican-led campaigns against public sector union rights beginning in 2011 that culminated in the 2018 Supreme Court ruling declaring mandatory fees for non-union members unconstitutional. Labor’s decline has led to a new era of income inequality but also brought a stronger class-centric politics back into American life as everyday people seek new answers to the tenuousness of their economic lives.

Article

Donna T. Haverty-Stacke

The first Labor Day parade was held on September 5, 1882, in New York City. It, and the annual holiday demonstrations that followed in that decade and the next, resulted from the growth of the modern organized labor movement that took place in the context of the second industrial revolution. These first Labor Day celebrations also became part of the then ongoing ideological and tactical divisions within that movement. By the early 1900s, workers’ desire to enjoy the fruits of their labor by participating in popular leisure pursuits came to characterize the day. But union leaders, who considered such leisure pursuits a distraction from displays of union solidarity, continued to encourage the organization of parades. With the protections afforded to organized labor by the New Deal, and with the gains made during and after World War II (particularly among unionized white, male, industrial laborers), Labor Day parades declined further after 1945 as workers enjoyed access to mass cultural pursuits, increasingly in suburban settings. This decline was indicative of a broader loss of union movement culture that had served to build solidarity within unions, display working-class militancy to employers, and communicate the legitimacy of organized labor to the American public. From time to time since the late 1970s, unions have attempted to reclaim Labor Day as an occasion for concerted demands expressed through displays of workers’ united power, but for most Americans the holiday has become part of a three-day weekend devoted to shopping or leisure that marks the end of the summer season.

Article

If one considers all the links in the food chain—from crop cultivation to harvesting to processing to transportation to provision and service—millions of workers are required to get food from fields and farms to our grocery stores, restaurants, and kitchen tables. One out of every seven workers in the United States performs a job related in some way to food, whether in direct on-farm employment, in stores, in eating and drinking establishments, or in other agriculture-related sectors. According to demographic breakdowns of US food labor, people of color and immigrants (of varying legal and citizenship statuses) hold the majority of low-wage jobs in the US food system. Since the late 19th century, Latinos (people of Latin American descent living in the United States) have played a tremendous role in powering the nation’s food industry. In the Southwest, Mexicans and Mexican Americans have historically worked as farmworkers, street vendors, restaurateurs, and employees in food factories. The Bracero Program (1942–1964) only strengthened the pattern of hiring Latinos as food workers by importing a steady stream of Mexican guest workers into fields, orchards, and vineyards across all regions of the United States. Meanwhile, mid-20th-century Puerto Rican agricultural guest workers served the farms and food processing factories of the Midwest and East Coast. In the late 20th and early 21st centuries, Central American food labor has become more noticeable in restaurants, the meat and seafood industries, and street food vending. It is deeply ironic, then, that the workers who help to nourish us and get our food to us go so unnourished themselves. Across the board, food laborers lack many privileges and basic rights. There is still no federal minimum wage for the almost three million farmworkers who labor in the nation’s fruit orchards, vineyards, and vegetable fields. Farmworkers (who are overwhelmingly Latino and undocumented) earn very low wages and face various health risks from pesticide exposure, extreme weather, a lack of nutritious, affordable food and potable water, substandard and unsanitary housing conditions, workplace abuse, unsafe transportation, and sexual harassment and assault. Other kinds of food workers—such as restaurant workers and street vendors—experience similar economic precarity and physical and social invisibility. While many of these substandard conditions exist because of employer decisions about costs and the treatment of workers, American consumers seeking the lowest prices for food are also caught up in this cycle of exploitation. In efforts to stay competitive and profitable in what they supply to grocery stores, restaurants, and the American public, farmers and food distributors trim costs wherever they can, which often negatively impacts the wages and conditions of those working hardest at the bottom of the national food chain. To push back against these forms of exploitation, food entrepreneurs, worker unions, and other advocates have vocally supported Latinos in the US food industry and tried to address problems ranging from xenophobia to human trafficking.

Article

G. Cristina Mora

The question of how to classify and count Latinxs has perplexed citizens and state officials alike for decades. Although Latinxs in the United States have been counted in every census the nation has conducted, it was not until the 1930s that the issue of race came to the fore, as the politics of who Latinxs were and whether the government should simply classify them as White became contested. These issues were amplified in the 1960s when Chicano and Boricua—Puerto Rican—activists, inspired by the Black civil rights movement, demanded that their communities be counted as distinct from Anglos. Decades of racial terror, community denigration, and colonialism, they contended, had made the Latinx experience distinct from that of Whites. A separate classification, activists argued, would allow them to have data on the state of their communities and make claims on government resources. Having census data on Hispanic/Latino poverty, for example, would allow Latinx advocacy groups to lobby for anti-poverty programs in their communities. Yet the issue of race and Latinxs continued to be thorny as the Census Bureau struggled with how to create a classification broad enough to encompass the immense racial, social, and cultural diversity of Latinxs. As of 2020, the issue remains unresolved as the Bureau continues to officially classify Latinxs as ethnically Hispanic/Latino but racially White, even though the bulk of research shows that about half of Latinxs consistently check the “some other race” box on census forms. More recent Latinx census politics centers on the issue of whether the Census Bureau should include a citizenship question on census forms. Latinx advocacy groups and academics have long argued that such a question would dampen Latinx census participation and affect the usefulness of census data for making claims about the size, growth, and future of the Latinx community. These politics came to a head in the months leading up to the 2020 census count as the Trump administration attempted to overturn decades of protocol and add a citizenship question to the decennial census form.

Article

Laura Isabel Serna

Latinos have constituted part of the United States’ cinematic imagination since the emergence of motion pictures in the late 19th century. Though shifting in their specific contours, representations of Latinos have remained consistently stereotypical; Latinos have primarily appeared on screen as bandits, criminals, nameless maids, or sultry señoritas. These representations have been shaped by broader political and social issues and have influenced the public perception of Latinos in the United States. However, the history of Latinos and film should not be limited to the topic of representation. Latinos have participated in the film industry as actors and creative personnel (including directors and cinematographers) and have responded to representations on screen as members of audiences with a shared sense of identity, whether as mexicanos de afuera in the early 20th century, Hispanics in the 1980s and 1990s, or Latinos in the 21st century. Both participation in production and audience reception have been shaped by the ideas about race that characterize the film industry and its products. Hollywood’s labor hierarchy has been highly stratified according to race, and Hollywood films that represent Latinos in a stereotypical fashion have been protested by Latino audiences. While some Latino/a filmmakers have opted to work outside the confines of the commercial film industry, others have sought to gain entry and reform the industry from the inside. Throughout the course of this long history, Latino representation on screen and on set has been shaped by debates over international relations, immigration, citizenship, and the continuous circulation of people and films between the United States and Latin America.

Article

A. K. Sandoval-Strausz

“Latino urbanism” describes a culturally specific set of spatial forms and practices created by people of Hispanic origin. It encompasses many aspects of those forms and practices, including town planning; domestic, religious, and civic architecture; the adaptation of existing residential, commercial, and other structures; and the everyday use of spaces such as yards, sidewalks, storefronts, streets, and parks. Latino urbanism has developed over both time and space. It is the evolving product of half a millennium of colonization, settlement, international and domestic migration, and globalization. It has spanned a wide geographic range, beginning in the southern half of North America and gradually expanding to much of the hemisphere. There have been many variations on Latino urbanism, but most include certain key features: shared central places where people show their sense of community, a walking culture that encourages face-to-face interaction with neighbors, and a sense that sociability should take place as much in the public realm as in the privacy of the home. More recently, planners and architects have realized that Latino urbanism offers solutions to problems such as sprawl, social isolation, and environmental unsustainability. The term “urbanism” connotes city spaces, and Latino urbanism is most concentrated and most apparent at the center of metropolitan areas. At the same time, it has also been manifested in a wide variety of places and at different scales, from small religious altars in private homes; to Spanish-dominant commercial streetscapes in Latino neighborhoods; and ultimately to settlement patterns that reach from the densely packed centers of cities to the diversifying suburbs that surround them, out to the agricultural hinterlands at their far peripheries—and across borders to big cities and small pueblos elsewhere in the Americas.

Article

Entrepreneurship has been a basic element of Latinx life in the United States since long before the nation’s founding, varying in scale and cutting across race, class, and gender to different degrees. Indigenous forms of commerce pre-dated Spanish contact in the Americas and continued thereafter. Beginning in the 16th century, the raising, trading, and production of cattle and cattle-related products became foundational to Spanish, Mexican, and later American Southwest society and culture. By the 19th century, Latinxs in US metropolitan areas began to establish enterprises in the form of storefronts, warehouses, and factories, as well as smaller ventures such as peddling. At times, they succeeded previous ethnic owners; in other moments, they established new businesses that shaped the everyday life and politics of their respective communities. Whatever the scale of their ventures, Latinx business owners continued to capitalize on the migration of Latinx people to the United States from Latin America and the Caribbean during the 20th century. These entrepreneurs entered business for different reasons, often responding to restricted or constrained labor options, though many sought the flexibility that entrepreneurship offered. Despite an increasing association between Latinx people and entrepreneurship, profits from Latinx ventures produced uneven results during the second half of the 20th century. For some, finance and business ownership have generated immense wealth and political influence. For others at the margins of society, they have remained tools for achieving sustenance amid the variability of a racially stratified labor market. No monolithic account can wholly capture the vastness and complexity of Latinx economic activity. Latinx business and entrepreneurship remains a vital piece of the place-making and politics of the US Latinx population. This article provides an overview of major trends and pivotal moments in its rich history.

Article

Since World War II, the United States has witnessed major changes in lesbian, gay, bisexual, transgender, and queer (LGBTQ) politics. Indeed, because the history of LGBTQ activism is almost entirely concentrated in the postwar years, the LGBTQ movement is typically said to have achieved rapid change in a short period of time. But if popular accounts characterize LGBTQ history as a straightforward narrative of progress, the reality is more complex. Postwar LGBTQ politics has been both diverse and divided, marked by differences of identity and ideology. At the same time, LGBTQ politics has been embedded in the contexts of state-building and the Cold War, the New Left and the New Right, the growth of neoliberalism, and the HIV/AIDS epidemic. As the field of LGBTQ history has grown, scholars have increasingly been able to place analyses of state regulation into conversation with community-based histories. Moving between such outside and inside perspectives helps to reveal how multiple modes of LGBTQ politics have shaped one another and how they have been interwoven with broader social change. Looking from the outside, it is apparent that LGBTQ politics has been catalyzed by exclusions from citizenship; from the inside, we can see that activists have responded to such exclusions in different ways, including both by seeking social inclusion and by rejecting assimilationist terms. Court rulings and the administration of law have run alongside the debates inside activist communities. Competing visions for LGBTQ politics have centered around both leftist and liberal agendas, as well as viewpoints shaped by race, gender, gender expression, and class.

Article

In 1944, President Franklin D. Roosevelt’s State of the Union address set out what he termed an “economic Bill of Rights” that would act as a manifesto of liberal policies after World War II. Politically, however, the United States was a different place from the country that had faced the ravages of the Great Depression of the 1930s and ushered in Roosevelt’s New Deal to transform the relationship between government and the people. Key legacies of the New Deal, such as Social Security, remained and were gradually expanded, but opponents of governmental regulation of the economy launched a bitter campaign after the war to roll back labor union rights and dismantle the New Deal state. Liberal heirs to FDR in the 1950s, represented by figures like two-time presidential candidate Adlai Stevenson, struggled to rework liberalism to tackle the realities of a more prosperous age. The long shadow of the U.S. Cold War with the Soviet Union also set up new challenges for liberal politicians trying to juggle domestic and international priorities in an era of superpower rivalry and American global dominance. The election of John F. Kennedy as president in November 1960 seemed to represent a narrow victory for Cold War liberalism, and his election coincided with the intensification of the struggle for racial equality in the United States that would do much to shape liberal politics in the 1960s. After Kennedy’s assassination in 1963, President Lyndon Johnson launched his “Great Society,” a commitment to eradicate poverty and to provide greater economic security for Americans through policies such as Medicare. But his administration’s deepening involvement in the Vietnam War and its mixed record on alleviating poverty did much to taint the positive connotations of “liberalism” that had dominated politics during the New Deal era.

Article

Benjamin C. Waterhouse

Political lobbying has always played a key role in American governance, but the concept of paid influence peddling has been marked by a persistent tension throughout the country’s history. On the one hand, lobbying represents a democratic process by which citizens maintain open access to government. On the other, the outsized clout of certain groups engenders corruption and perpetuates inequality. The practice of lobbying itself has reflected broader social, political, and economic changes, particularly in the scope of state power and the scale of business organization. During the Gilded Age, associational activity flourished and lobbying became increasingly the province of organized trade associations. By the early 20th century, a wide range of political reforms worked to counter the political influence of corporations. Even after the Great Depression and New Deal recast the administrative and regulatory role of the federal government, business associations remained the primary vehicle through which corporations and their designated lobbyists influenced government policy. By the 1970s, corporate lobbyists had become more effective and better organized, and trade associations spurred a broad-based political mobilization of business. Business lobbying expanded in the latter decades of the 20th century; while the number of companies with a lobbying presence leveled off in the 1980s and 1990s, the number of lobbyists per company increased steadily and corporate lobbyists grew increasingly professionalized. A series of high-profile political scandals involving lobbyists in 2005 and 2006 sparked another effort at regulation. Yet despite popular disapproval of lobbying and distaste for politicians, efforts to substantially curtail the activities of lobbyists and trade associations did not achieve significant success.

Article

Landon R. Y. Storrs

The second Red Scare refers to the fear of communism that permeated American politics, culture, and society from the late 1940s through the 1950s, during the opening phases of the Cold War with the Soviet Union. This episode of political repression lasted longer and was more pervasive than the Red Scare that followed the Bolshevik Revolution and World War I. Popularly known as “McCarthyism” after Senator Joseph McCarthy (R-Wisconsin), who made himself famous in 1950 by claiming that large numbers of Communists had infiltrated the U.S. State Department, the second Red Scare predated and outlasted McCarthy, and its machinery far exceeded the reach of a single maverick politician. Nonetheless, “McCarthyism” became the label for the tactic of undermining political opponents by making unsubstantiated attacks on their loyalty to the United States. The initial infrastructure for waging war on domestic communism was built during the first Red Scare, with the creation of an antiradicalism division within the Federal Bureau of Investigation (FBI) and the emergence of a network of private “patriotic” organizations. With capitalism’s crisis during the Great Depression, the Communist Party grew in numbers and influence, and President Franklin D. Roosevelt’s New Deal program expanded the federal government’s role in providing economic security. The anticommunist network expanded as well, most notably with the 1938 formation of the Special House Committee to Investigate Un-American Activities, which in 1945 became the permanent House Un-American Activities Committee (HUAC). Other key congressional investigation committees were the Senate Internal Security Subcommittee and McCarthy’s Permanent Subcommittee on Investigations. Members of these committees and their staff cooperated with the FBI to identify and pursue alleged subversives. The federal employee loyalty program, formalized in 1947 by President Harry Truman in response to right-wing allegations that his administration harbored Communist spies, soon was imitated by local and state governments as well as private employers. As the Soviets’ development of nuclear capability, a series of espionage cases, and the Korean War enhanced the credibility of anticommunists, the Red Scare metastasized from the arena of government employment into labor unions, higher education, the professions, the media, and party politics at all levels. The second Red Scare did not involve pogroms or gulags, but the fear of unemployment was a powerful tool for stifling criticism of the status quo, whether in economic policy or social relations. Ostensibly seeking to protect democracy by eliminating communism from American life, anticommunist crusaders ironically undermined democracy by suppressing the expression of dissent. Debates over the second Red Scare remain lively because they resonate with ongoing struggles to reconcile Americans’ desires for security and liberty.

Article

On February 19, 1942, President Franklin Delano Roosevelt signed Executive Order 9066, authorizing the incarceration of 120,000 Japanese Americans living primarily on the West Coast of the continental United States. On August 10, 1988, President Ronald Reagan signed legislation authorizing formal apologies and checks for $20,000 to those still alive who had been unjustly imprisoned during World War II. In the interim period, nearly a half century, there were enormous shifts in memories of the events, mainstream accounts, and internal ethnic accountabilities. To be sure, there were significant acts of resistance, from the beginning of mass forced removal to the Supreme Court decisions toward the end of the war. But for a quarter of a century, between 1945 and approximately 1970, there was little to threaten a master narrative that posited Japanese Americans, led by the Japanese American Citizens League (JACL), as a once-embattled ethnic/racial minority that had transcended its victimized past to become America’s treasured model minority. The fact that the Japanese American community began effective mobilization for government apology and reparations in the 1970s only confirmed its emergence as a bona fide part of the American body politic. But where the earlier narrative extolled the memories of Japanese American war heroes and leaders of the JACL, memory making changed dramatically in the 1990s and 2000s. In the years since Reagan’s affirmation that “here we admit a wrong,” Japanese Americans have unleashed a torrent of memorials, museums, and monuments honoring those who fought the injustices and who swore they would resist current or future attempts to scapegoat other groups in the name of national security.

Article

Iliana Yamileth Rodriguez

Mexican American history in the United States spans centuries. In the 16th and 17th centuries, the Spanish Empire colonized North American territories. Though it met with colonial rivalries in the southeast, Spanish control remained strong in the US southwest into the 19th century. The mid-1800s were an era of power struggles over territory and the construction of borders, which greatly impacted ethnic Mexicans living in the US-Mexico borderlands. After the Mexican-American War (1846–1848), the ratification of the Treaty of Guadalupe Hidalgo allowed the United States to take all or parts of California, Arizona, Nevada, Utah, Colorado, and New Mexico. Ethnic Mexicans living in newly incorporated regions in the mid- through late 19th century witnessed the radical restructuring of their lives along legal, economic, political, and cultural lines. The early 20th century witnessed the rise of anti-Mexican sentiment and violence. As ethnic Mexican communities came under attack, Mexican Americans took leadership roles in institutions, labor unions, and community groups to fight for equality. Both tensions and coalition-building efforts between Mexican Americans and Mexican migrants animated the mid-20th century, as did questions about wartime identity, citizenship, and belonging. By the late 20th century, Chicana/o politics took center stage and brought forth a radical politics informed by the Mexican American experience. Finally, the late 20th through early 21st centuries saw further geographic diversification of Mexican American communities outside of the southwest.

Article

Military assistance programs have been crucial instruments of American foreign policy since World War II, valued by policymakers for combating internal subversion in the “free world,” deterring aggression, and protecting overseas interests. The 1958 Draper Committee, consisting of eight members of the Senate Foreign Relations Committee, concluded that economic and military assistance were interchangeable; as the committee put it, without internal security and the “feeling of confidence engendered by adequate military forces, there is little hope for economic progress.” Less explicitly, military assistance was also designed to uphold the U.S. global system of military bases established after World War II, ensure access to raw materials, and help recruit intelligence assets while keeping a light American footprint. Police and military aid was often invited and welcomed by government elites in so-called free world nations because it enhanced domestic security or enabled the swift repression of political opponents. It sometimes coincided with an influx of economic aid, as under the Marshall Plan and the Alliance for Progress. In cases like Vietnam, the programs contributed to stark human rights abuses owing to political circumstances and the prioritization of national security over civil liberties.

Article

The Japanese American Redress Movement refers to the various efforts of Japanese Americans from the 1940s to the 1980s to obtain restitution for their removal and confinement during World War II. These efforts included judicial and legislative campaigns at local, state, and federal levels for recognition of government wrongdoing and compensation for losses, both material and immaterial. The push for redress originated in the late 1940s as the Cold War opened up opportunities for Japanese Americans to demand concessions from the government. During the 1960s and 1970s, Japanese Americans began to connect the struggle for redress with anti-racist and anti-imperialist movements of the time. Despite their growing political divisions, Japanese Americans came together to launch several successful campaigns that laid the groundwork for redress. During the early 1980s, the government increased its involvement in redress by forming a congressional commission to conduct an official review of the World War II incarceration. The commission’s recommendations of monetary payments and an official apology paved the way for the passage of the Civil Liberties Act of 1988 and other redress actions. Beyond its legislative and judicial victories, the redress movement also created a space for collective healing and generated new forms of activism that continue into the present.