1–20 of 54 Results for:

  • 20th Century: Post-1945
  • Political History

Article

Richard N. L. Andrews

Between 1964 and 2017, the United States adopted the concept of environmental policy as a new focus for a broad range of previously disparate policy issues affecting human interactions with the natural environment. These policies ranged from environmental health, pollution, and toxic exposure to management of ecosystems, resources, and use of the public lands, environmental aspects of urbanization, agricultural practices, and energy use, and negotiation of international agreements to address global environmental problems. In doing so, it nationalized many responsibilities that had previously been considered primarily state or local matters. It changed the United States’ approach to federalism by authorizing new powers for the federal government to set national minimum environmental standards and regulatory frameworks with the states mandated to participate in their implementation and compliance. Finally, it explicitly formalized administrative procedures for federal environmental decision-making with stricter requirements for scientific and economic justification rather than merely administrative discretion. In addition, it greatly increased public access to information and opportunities for input, as well as for judicial review, thus allowing citizen advocates for environmental protection and appreciative uses equal legitimacy with commodity producers to voice their preferences for use of public environmental resources. These policies initially reflected widespread public demand and broad bipartisan support. Over several decades, however, they became flashpoints, first, between business interests and environmental advocacy groups and, subsequently, between increasingly ideological and partisan agendas concerning the role of the federal government. Beginning in the 1980s, the long-standing Progressive ideal of the “public interest” was increasingly supplanted by a narrative of “government overreach,” and the 1990s witnessed campaigns to delegitimize the underlying evidence justifying environmental policies by labeling it “junk science” or a “hoax.” From the 1980s forward, the stated priorities of environmental policy vacillated repeatedly between presidential administrations and Congresses supporting continuation and expansion of environmental protection and preservation policies versus those seeking to weaken or even reverse protections in favor of private-property rights and more damaging uses of resources. Yet despite these apparent shifts, the basic environmental laws and policies enacted during the 1970s remained largely in place: political gridlock, in effect, maintained the status quo, with the addition of a very few innovations such as “cap and trade” policies. One reason was that environmental policies retained considerable latent public support: in electoral campaigns, they were often overshadowed by economic and other issues, but they still aroused widespread support in their defense when threatened. Another reason was that decisions by the courts also continued to reaffirm many existing policies and to reject attempts to dismantle them. With the election of Donald Trump in 2016, along with conservative majorities in both houses of Congress, US environmental policy came under the most hostile and wide-ranging attack since its origins. 
More than almost any other issue, the incoming president targeted environmental policy for rhetorical attacks and budget cuts, and sought to eradicate the executive policies of his predecessor, weaken or rescind protective regulations, and undermine the regulatory and even the scientific capacity of the federal environmental agencies. In the early 21st century, it is as yet unclear how much of his agenda will actually be accomplished, or whether, as in past attempts, much of it will ultimately be blocked by Congress, the courts, public backlash, and business and state government interests seeking stable policy expectations rather than disruptive deregulation.

Article

American Indian activism after 1945 was as much a part of the larger, global decolonization movement rooted in centuries of imperialism as it was a direct response to the ethos of civic nationalism and integration that had gained momentum in the United States following World War II. This ethos manifested itself in the disastrous federal policies of termination and relocation, which sought to end federal services to recognized Indian tribes and encourage Native people to leave reservations for cities. In response, tribal leaders from throughout Indian Country formed the National Congress of American Indians (NCAI) in 1944 to litigate and lobby for the collective well-being of Native peoples. The NCAI was the first intertribal organization to embrace the concepts of sovereignty, treaty rights, and cultural preservation—principles that continue to guide Native activists today. As American Indian activism grew increasingly militant in the late 1960s and 1970s, civil disobedience, demonstrations, and takeovers became the preferred tactics of “Red Power” organizations such as the National Indian Youth Council (NIYC), the Indians of All Tribes, and the American Indian Movement (AIM). At the same time, others established more focused efforts that employed less confrontational methods. For example, the Native American Rights Fund (NARF) served as a legal apparatus that represented Native nations, using the courts to protect treaty rights and expand sovereignty; the Council of Energy Resource Tribes (CERT) sought to secure greater returns on the mineral wealth found on tribal lands; and the American Indian Higher Education Consortium (AIHEC) brought Native educators together to work for greater self-determination and culturally rooted curricula in Indian schools. While the more militant of these organizations and efforts have withered, those that have exploited established channels have grown and flourished. Such efforts will no doubt continue into the unforeseeable future so long as the state of Native nations remains uncertain.

Article

American activists who challenged South African apartheid during the Cold War era extended their opposition to racial discrimination in the United States into world politics. US antiapartheid organizations worked in solidarity with forces struggling against the racist regime in South Africa and played a significant role in the global antiapartheid movement. More than four decades of organizing preceded the legislative showdown of 1986, when a bipartisan coalition in Congress overrode President Ronald Reagan’s veto to enact economic sanctions against the apartheid regime in South Africa. Adoption of sanctions by the United States, along with transnational solidarity with the resistance to apartheid by South Africans, helped prompt the apartheid regime to relinquish power and allow the democratic elections that brought Nelson Mandela and the African National Congress to power in 1994. Drawing on the tactics, strategies, and moral authority of the civil rights movement, antiapartheid campaigners mobilized public opinion while increasing African American influence in the formulation of US foreign policy. Long-lasting organizations such as the American Committee on Africa and TransAfrica called for boycotts and divestment while lobbying for economic sanctions. Utilizing tactics such as rallies, demonstrations, and nonviolent civil disobedience actions, antiapartheid activists made their voices heard on college campuses, in corporate boardrooms, in municipal and state governments, and in the halls of Congress. Cultural expressions of criticism and resistance served to reinforce public sentiment against apartheid. Novels, plays, movies, and music provided a way for Americans to connect to the struggles of those suffering under apartheid. By extending the moral logic of the movement for African American civil rights, American antiapartheid activists created a multicultural coalition that brought about institutional and governmental divestment from apartheid, prompted Congress to impose economic sanctions on South Africa, and increased the influence of African Americans regarding issues of race and American foreign policy.

Article

Antimonopoly, meaning opposition to the exclusive or near-exclusive control of an industry or business by one or a very few businesses, played a relatively muted role in the history of the post-1945 era, certainly compared to some earlier periods in American history. However, the subject of antimonopoly is important because it sheds light on changing attitudes toward concentrated power, corporations, and the federal government in the United States after World War II. Paradoxically, as antimonopoly declined as a grass-roots force in American politics, the technical, expert-driven field of antitrust enjoyed a golden age. From the 1940s to the 1960s, antitrust operated on principles that were broadly in line with those that inspired its creation in the late 19th and early 20th century, acknowledging the special contribution small-business owners made to US democratic culture. In these years, antimonopoly remained sufficiently potent as a political force to sustain the careers of national-level politicians such as congressmen Wright Patman and Estes Kefauver and to inform the opinions of Supreme Court justices such as Hugo Black and William O. Douglas. Antimonopoly and consumer politics overlapped in this period. From the mid-1960s onward, Ralph Nader repeatedly tapped antimonopoly ideas in his writings and consumer activism, skillfully exploiting popular anxieties about concentrated economic power. At the same time, as part of the United States’ rise to global hegemony, officials in the federal government’s Antitrust Division exported antitrust overseas, building it into the political, economic, and legal architecture of the postwar world. Beginning in the 1940s, conservative lawyers and economists launched a counterattack against the conception of antitrust elaborated in the progressive era. By making consumer welfare—understood in terms of low prices and market efficiency—the determining factor in antitrust cases, they made a major intellectual and political contribution to the rightward thrust of US politics in the 1970s and 1980s. Robert Bork’s The Antitrust Paradox, published in 1978, popularized and signaled the ascendency of this new approach. In the 1980s and 1990s antimonopoly drifted to the margin of political debate. Fear of big government now loomed larger in US politics than the specter of monopoly or of corporate domination. In the late 20th century, Americans, more often than not, directed their antipathy toward concentrated power in its public, rather than its private, forms. This fundamental shift in the political landscape accounts in large part for the overall decline of antimonopoly—a venerable American political tradition—in the period 1945 to 2000.

Article

American policy toward the Arab-Israeli conflict has reflected dueling impulses at the heart of US-Middle East relations since World War II: growing support for Zionism and Israeli statehood on the one hand, the need for cheap oil resources and strong alliances with Arab states on the other, unfolding alongside the ebb and flow of concerns over Soviet influence in the region during the Cold War. These tensions have tracked with successive Arab–Israeli conflagrations, from the 1948 war through the international conflicts of 1967 and 1973, as well as shifting modes of intervention in Lebanon, and more recently, the Palestinian uprisings in the occupied territories and several wars on the Gaza Strip. US policy has been shaped by diverging priorities in domestic and foreign policy, a halting recognition of the need to tackle Palestinian national aspirations, and a burgeoning peace process which has drawn American diplomats into the position of mediating between the parties. Against the backdrop of regional upheaval, this long history of involvement continues into the 21st century as the unresolved conflict between Israel and the Arab world faces a host of new challenges.

Article

The global political divides of the Cold War propelled the dismantling of Asian exclusion in ways that provided greater, if conditional, integration for Asian Americans, in a central aspect of the reworking of racial inequality in the United States after World War II. The forging of strategic alliances with Asian nations and peoples in that conflict mandated at least token gestures of greater acceptance and equity, in the form of changes to immigration and citizenship laws that had previously barred Asians as “aliens ineligible to citizenship.” During the Cold War, shared politics and economic considerations continued to trump racial difference as the United States sought leadership of the “free” capitalist world and competed with Soviet-led communism for the affiliation and cooperation of emerging, postcolonial Third World nations. U.S. courtship of once-scorned peoples required the end of Jim Crow systems of segregation through the repeal of discriminatory laws, although actual practices and institutions proved far more resistant to change. Politically and ideologically, culture and values came to dominate explanations for categories and inequalities once attributed to differences in biological race. Mainstream media and cultural productions celebrated America’s newfound embrace of its ethnic populations, even as the liberatory aspirations inflamed by World War II set in motion the civil rights movement and increasingly confrontational mobilizations for greater access and equality. These contestations transformed the character of America as a multiracial democracy, with Asian Americans advancing more than any other racial group to become widely perceived as a “model minority” by the 1980s with the popularization of a racial trope first articulated during the 1960s. Asian American gains were attained in part through the diminishing of barriers in immigration, employment, residence, education, and miscegenation, but also because their successes affirmed U.S. claims regarding its multiracial democracy and because reforms of immigration law admitted growing numbers of Asians who had been screened for family connections, refugee status, and especially their capacity to contribute economically. The 1965 Immigration Act cemented these preferences for educated and skilled Asian workers, with employers assuming great powers as routes to immigration and permanent status. The United States became the chief beneficiary of “brain drain” from Asian countries. Geometric rates of Asian American population growth since 1965, disproportionately screened through this economic preference system, have sharply reduced the ranks of Asian Americans linked to the exclusion era and set them apart from Latino, black, and Native Americans who remain much more entrenched in the systems of inequality rooted in the era of sanctioned racial segregation.

Article

The NAACP, established in 1909, was formed as an integrated organization to confront racism in the United States rather than seeing the issue as simply a southern problem. It is the longest-running civil rights organization and continues to operate today. The original name of the organization was the National Negro Committee, but this was changed to the NAACP on May 30, 1910. Organized to promote racial equality and integration, the NAACP pursued this goal via legal cases, political lobbying, and public campaigns. Early campaigns involved lobbying for national anti-lynching legislation, pursuing desegregation through the US Supreme Court in areas such as housing and higher education, and securing voting rights. The NAACP is renowned for the US Supreme Court case of Brown v. Board of Education (1954), which desegregated primary and secondary schools and is seen as a catalyst for the civil rights movement (1955–1968). It also engaged in public education, promoting African American achievements in education and the arts to counteract racial stereotypes. The organization published a monthly journal, The Crisis, and promoted African American art forms and culture as another means to advance equality. NAACP branches were established all across the United States and became a network of information, campaigning, and finance that underpinned activism. Youth groups and university branches mobilized younger members of the community. Women were also invaluable to the NAACP in local, regional, and national decision-making processes and campaigning. The organization sought to integrate African Americans and other minorities into the American social, political, and economic model as codified by the US Constitution.

Article

Ivón Padilla-Rodríguez

Child migration has garnered widespread media coverage in the 21st century, becoming a central topic of national political discourse and immigration policymaking. Contemporary surges of child migrants are part of a much longer history of migration to the United States. In the first half of the 20th century, millions of European and Asian child migrants passed through immigration inspection stations in New York Harbor and San Francisco Bay. Even though some accompanied and unaccompanied European child migrants experienced detention at Ellis Island, most were processed and admitted into the United States fairly quickly in the early 20th century. Few of the European child migrants were deported from Ellis Island. Predominantly accompanied Chinese and Japanese child migrants, however, like Latin American and Caribbean migrants in recent years, were more frequently subjected to family separation, abuse, detention, and deportation at Angel Island. Once inside the United States, both European and Asian children struggled to overcome poverty, labor exploitation, educational inequity, the attitudes of hostile officials, and public health problems. After World War II, Korean refugee “orphans” came to the United States under the Refugee Relief Act of 1953 and the Immigration and Nationality Act. European, Cuban, and Indochinese refugee children were admitted into the United States through a series of ad hoc programs and temporary legislation until the 1980 Refugee Act created a permanent mechanism for the admission of refugee and unaccompanied children. Exclusionary immigration laws, the hardening of US international boundaries, and the United States’ preference for refugees who fled Communist regimes made unlawful entry the only option for thousands of accompanied and unaccompanied Mexican, Central American, and Haitian children in the second half of the 20th century. Black and brown migrant and asylum-seeking children were forced to endure educational deprivation, labor trafficking, mandatory detention, deportation, and deadly abuse by US authorities and employers at US borders and inside the country.

Article

Patrick William Kelly

The relationship between Chile and the United States pivoted on the intertwined questions of how much political and economic influence Americans would exert over Chile and the degree to which Chileans could chart their own path. Given its tradition of constitutional government and relative economic development, Chile established itself as a regional power player in Latin America. Unencumbered by the direct US military interventions that marked the history of the Caribbean, Central America, and Mexico, Chile was a leader in movements to promote Pan-Americanism, inter-American solidarity, and anti-imperialism. But the advent of the Cold War in the 1940s, and especially the 1959 Cuban Revolution, brought an increase in bilateral tensions. The United States turned Chile into a “model democracy” for the Alliance for Progress, but frustration over its failures to enact meaningful social and economic reform polarized Chilean society, resulting in the election of Marxist Salvador Allende in 1970. The most contentious period in US-Chilean relations came during the Nixon administration, which worked alongside anti-Allende Chileans to destabilize Allende’s government, which the Chilean military overthrew on September 11, 1973. The Pinochet dictatorship (1973–1990), while anti-Communist, clashed with the United States over Pinochet’s radicalization of the Cold War and the issue of Chilean human rights abuses. The Reagan administration, which came to power on a platform that rejected the Carter administration’s critique of Chile, eventually reversed course and began to support the return of democracy to Chile, which took place in 1990. Since then, Pinochet’s legacy of neoliberal restructuring of the Chilean economy looms large, overshadowed perhaps only by his unexpected role in fomenting a global culture of human rights that has ended the era of impunity for Latin American dictators.

Article

Daniel Pope

Nuclear power in the United States has had an uneven history and faces an uncertain future. Promising in the 1950s electricity “too cheap to meter,” nuclear power has failed to come close to that goal, although it has carved out approximately a 20 percent share of American electrical output. Two decades after World War II, General Electric and Westinghouse offered electric utilities completed “turnkey” plants at a fixed cost, hoping these “loss leaders” would create a demand for further projects. During the 1970s the industry boomed, but it also brought forth a large-scale protest movement. Since then, partly because of that movement and because of the drama of the 1979 Three Mile Island accident, nuclear power has plateaued, with only one reactor completed since 1995. Several factors account for the failed promise of nuclear energy. Civilian power has never fully shaken its military ancestry or its connotations of weaponry and warfare. American reactor designs borrowed from nuclear submarines. Concerns about weapons proliferation stymied industry hopes for breeder reactors that would produce plutonium as a byproduct. Federal regulatory agencies dealing with civilian nuclear energy also have military roles. Those connections have provided some advantages to the industry, but they have also generated fears. Not surprisingly, the “anti-nukes” movement of the 1970s and 1980s was closely bound to movements for peace and disarmament. The industry’s disappointments must also be understood in a wider energy context. Nuclear grew rapidly in the late 1960s and 1970s as domestic petroleum output shrank and environmental objections to coal came to the fore. At the same time, however, slowing economic growth and an emphasis on energy efficiency reduced demand for new power output. In the 21st century, new reactor designs and the perils of fossil-fuel-caused global warming have once again raised hopes for nuclear, but natural gas and renewables now compete favorably against new nuclear projects. Economic factors have been the main reason that nuclear has stalled in the last forty years. Highly capital intensive, nuclear projects have all too often taken too long to build and cost far more than initially forecast. The lack of standard plant designs, the need for expensive safety and security measures, and the inherent complexity of nuclear technology have all contributed to nuclear power’s inability to make its case on cost persuasively. Nevertheless, nuclear power may survive and even thrive if the nation commits to curtailing fossil fuel use or if, as the Trump administration proposes, it opts for subsidies to keep reactors operating.

Article

The civil rights movement in the urban South transformed the political, economic, and cultural landscape of post–World War II America. Between 1955 and 1968, African Americans and their white allies relied on nonviolent direct action, political lobbying, litigation, and economic boycotts to dismantle the Jim Crow system. Many, though not all, of the movement’s most decisive political battles occurred in the cities of Montgomery and Birmingham, Alabama; Nashville and Memphis, Tennessee; Greensboro and Durham, North Carolina; and Atlanta, Georgia. In these and other urban centers, civil rights activists launched full-throttle campaigns against white supremacy, economic exploitation, and state-sanctioned violence against African Americans. Their fight for racial justice coincided with monumental changes in the urban South as the upsurge in federal spending in the region created unprecedented levels of economic prosperity in the newly forged “Sunbelt.” A dynamic and multifaceted movement that encompassed a wide range of political organizations and perspectives, the black freedom struggle proved successful in dismantling legal segregation. The passage of the Civil Rights Act of 1964 and the Voting Rights Act of 1965 expanded black southerners’ economic, political, and educational opportunities. And yet, many African Americans continued to struggle as they confronted not just the long-term effects of racial discrimination and exclusion but also the new challenges engendered by deindustrialization and urban renewal as well as entrenched patterns of racial segregation in the public-school system.

Article

Clodagh Harrington

The Clinton scandals have settled in the annals of American political history in the context of the era’s recurrent presidential misbehavior. Viewed through a historical lens, the activities, investigation, and impeachment trial of the forty-second president are almost inevitably measured against the weight of Watergate and Iran-Contra. As a result, the actions and consequences of this high-profile moment in the late-20th-century political history of the United States arguably took on a weightier meaning than they might otherwise have. If Watergate tested the U.S. constitutional system to its limits and Iran-Contra was arguably as grave, the Clinton affair was crisis-light by comparison. Originating with an investigation into a failed 1970s Arkansas land deal involving Bill Clinton and his wife, the saga developed to include such meandering subplots as Filegate, Travelgate, Troopergate, the death of White House counsel Vince Foster, and, most infamously, the president’s affair with a White House intern. Unlike in the cases of Richard Nixon and Ronald Reagan, even Bill Clinton’s most ardent critics could not find a national security threat among the myriad scandals linked to his name. By the time that Justice Department appointee Robert Fiske was replaced as prosecutor by the infinitely more zealous Kenneth Starr, the case had become synonymous with the culture wars that permeated 1990s American society. As Whitewater and the related tentacles of the investigation failed to have any meaningful negative impact on the president, it was his marital infidelities that came closest to unseating him. Though the Independent Counsel pursued him with vigor, his supporters remained loyal while his detractors spotted political opportunity in his lapses in judgment. Certain key factors made the Clinton scandal particular to its era. First, in an unprecedented development, the personal indiscretion aspect of the story broke via the Internet. In addition, had the Independent Counsel legislation not been renewed, prosecutor Fiske would likely have wrapped up his investigation in a timely fashion with no intention of pursuing an impeachment path. Finally, the relentless cable news cycle and increasingly febrile partisan atmosphere of the decade ensured that the nation remained as focused as it was divided on the topic.

Article

Michael J. Bustamante

The Cuban Revolution transformed the largest island nation of the Caribbean into a flashpoint of the Cold War. After overthrowing US-backed ruler Fulgencio Batista in early 1959, Fidel Castro established a socialist, anti-imperialist government that defied the island’s history as a dependent and dependable ally of the United States. But the Cuban Revolution is not only significant for its challenge to US interests and foreign policy prerogatives. For Cubans, it fundamentally reordered their lives, inspiring multitudes yet also driving thousands of others to migrate to Miami and other points north. Sixty years later, Fidel Castro may be dead and the Soviet Union may be long gone. Cuban socialism has become more hybrid in economic structure, and in 2014 the Cuban and US governments moved to restore diplomatic ties. But Cuba’s leaders continue to insist that “the Revolution,” far from a terminal political event, is still alive. Today, as the founding generation of Cuban leaders passes from the scene, “the Revolution” faces another important crossroads of uncertainty and reform.

Article

The decolonization of the European overseas empires had its intellectual roots early in the modern era, but its culmination occurred during the Cold War that loomed large in post-1945 international history. This culmination thus coincided with the American rise to superpower status and presented the United States with a dilemma. While the United States was philosophically sympathetic to the aspirations of anticolonial nationalist movements abroad, its vastly greater postwar global security burdens made it averse to the instability that decolonization might bring and that communists might exploit. This fear, and the need to share those burdens with European allies who were themselves still colonial landlords, led Washington to proceed cautiously. The three “waves” of the decolonization process—medium-sized in the late 1940s, large in the half-decade around 1960, and small in the mid-1970s—prompted the American use of a variety of tools and techniques to influence how it unfolded. Prior to independence, this influence was usually channeled through the metropolitan authority then winding down. After independence, Washington continued and often expanded the use of these tools, in most cases on a bilateral basis. In some theaters, such as Korea, Vietnam, and the Congo, certain of these tools, notably covert espionage or overt military operations, allowed Cold War dynamics to envelop, intensify, and ultimately subsume local decolonization struggles. In most theaters, other tools, such as traditional or public diplomacy or economic or technical development aid, kept the Cold War in the background as a local transition unfolded. In all cases, the overriding American imperative was to minimize instability and neutralize actors on the ground who could invite communist gains.

Article

Probably no American president was more thoroughly versed in matters of national security and foreign policy before entering office than Dwight David Eisenhower. As a young military officer, Eisenhower served stateside in World War I and then in Panama and the Philippines in the interwar years. On assignments in Washington and Manila, he worked on war plans, gaining an understanding that national security entailed economic and psychological factors in addition to manpower and materiel. In World War II, he commanded Allied forces in the European Theatre of Operations and honed his skills in coalition building and diplomacy. After the war, he oversaw the German occupation and then became Army Chief of Staff as the nation hastily demobilized. At the onset of the Cold War, Eisenhower embraced President Harry S. Truman’s containment doctrine and participated in the discussions leading to the 1947 National Security Act establishing the Central Intelligence Agency, the National Security Council, and the Department of Defense. After briefly retiring from the military, Eisenhower twice returned to public service at the behest of President Truman to assume the temporary chairmanship of the Joint Chiefs of Staff and then, following the outbreak of the Korean War, to become the first Supreme Allied Commander, Europe, charged with transforming the North Atlantic Treaty Organization into a viable military force. These experiences colored Eisenhower’s foreign policy views, which in turn led him to seek the presidency. He viewed the Cold War as a long-term proposition and worried that Truman’s military buildup would overtax finite American resources. He sought a coherent strategic concept that would be sustainable over the long haul without adversely affecting the free enterprise system and American democratic institutions. He also worried that Republican Party leaders were dangerously insular. As president, his New Look policy pursued a cost-effective strategy of containment by means of increased reliance on nuclear forces over more expensive conventional ones, sustained existing regional alliances and developed new ones, sought an orderly process of decolonization under Western guidance, resorted to covert operations to safeguard vital interests, and employed psychological warfare in the battle with communism for world opinion, particularly in the so-called Third World. His foreign policy laid the basis for what would become the overall American strategy for the duration of the Cold War. The legacy of that policy, however, was decidedly mixed. Eisenhower avoided the disaster of global war, but technological innovations did not produce the fiscal savings that he had envisioned. The NATO alliance expanded and mostly stood firm, but other alliances were more problematic. Decolonization rarely proceeded as smoothly as envisioned and caused conflict with European allies. Covert operations had long-term negative consequences. In Southeast Asia and Cuba, the Eisenhower administration’s policies bequeathed a poisoned chalice for succeeding administrations.

Article

Sam Lebovic

According to the First Amendment of the US Constitution, Congress is barred from abridging the freedom of the press (“Congress shall make no law . . . abridging the freedom of speech, or of the press”). In practice, the history of press freedom is far more complicated than this simple constitutional right suggests. Over time, the meaning of the First Amendment has changed greatly. The Supreme Court largely ignored the First Amendment until the 20th century, leaving the scope of press freedom to state courts and legislatures. Since World War I, jurisprudence has greatly expanded the types of publication protected from government interference. The press now has broad rights to publish criticism of public officials, salacious material, private information, national security secrets, and much else. To understand the shifting history of press freedom, however, it is important to understand not only the expansion of formal constitutional rights but also how those rights have been shaped by such factors as economic transformations in the newspaper industry, the evolution of professional standards in the press, and the broader political and cultural relations between politicians and the press.

Article

The late 20th century saw gender roles transformed as the so-called Second Wave of American feminism that began in the 1960s gained support. By the early 1970s public opinion increasingly favored the movement and politicians in both major political parties supported it. In 1972 Congress overwhelmingly approved the Equal Rights Amendment (ERA) and sent it to the states. Many quickly ratified, prompting women committed to traditional gender roles to organize. However, by 1975 ERA opponents led by veteran Republican activist Phyllis Schlafly, founder of Stop ERA, had slowed the ratification process, although federal support for feminism continued. Congresswoman Bella Abzug (D-NY), inspired by the United Nations’ International Women’s Year (IWY) program, introduced a bill approved by Congress that mandated state and national IWY conferences at which women would produce recommendations to guide the federal government on policy regarding women. Federal funding of these conferences (held in 1977), and the fact that feminists were appointed to organize them, led to an escalation in tensions between feminist and conservative women, and the conferences proved to be profoundly polarizing events. Feminists elected most of the delegates to the culminating IWY event, the National Women’s Conference held in Houston, Texas, and the “National Plan of Action” adopted there endorsed a wide range of feminist goals including the ERA, abortion rights, and gay rights. But the IWY conferences presented conservatives with a golden opportunity to mobilize, and anti-ERA, pro-life, and anti-gay groups banded together as never before. By the end of 1977, these groups, supported by conservative Catholics, Mormons, and evangelical and fundamentalist Protestants, had come together to form a “Pro-Family Movement” that became a powerful force in American politics. By 1980 they had persuaded the Republican Party to drop its support for women’s rights. Afterward, as Democrats continued to support feminist goals and the GOP presented itself as the defender of “family values,” national politics became more deeply polarized and bitterly partisan.

Article

Kathryn Cramer Brownell

Hollywood has always been political. Since its early days, it has intersected with national, state, and local politics. As leaders of a new entertainment industry attempting to gain a footing in a society on whose outskirts it sat, Hollywood’s Jewish executives worked hard to advance the merits of their industry to a Christian political establishment. At the local and state level, film producers faced threats of censorship and potential regulation of the more democratic spaces their theaters provided for immigrant and working-class patrons. As Hollywood gained economic and cultural influence, the political establishment took note, attempting to shape silver screen productions and deploy Hollywood’s publicity innovations for its own purposes. Over the course of the 20th century, industry leaders forged connections with politicians from both parties to promote their economic interests, and politically motivated actors, directors, writers, and producers across the ideological spectrum used their entertainment skills to advance ideas and messages on and off the silver screen. At times this collaboration generated enthusiasm for its ability to bring new citizens into the electoral process. At other times, however, it drew intense criticism, and fears abounded that entertainment would undermine the democratic process with a focus on style over substance. As Hollywood personalities entered the political realm—for personal, professional, and political gain—the industry slowly reshaped American political life, bringing entertainment, glamor, and emotion to the political process and transforming how Americans communicate with their elected officials and, indeed, how they view their political leaders.

Article

In its formulation of foreign policy, the United States takes account of many priorities and factors, including national security concerns, economic interests, and alliance relationships. An additional factor, whose significance has risen and fallen over time, is human rights, or more specifically violations of human rights. The extent to which the United States should consider such abuses or seek to moderate them has been and continues to be the subject of considerable debate.

Article

Post-1945 immigration to the United States differed dramatically from America’s 19th- and earlier 20th-century immigration patterns, most notably in the sharp rise in the number of immigrants from Asia. Beginning in the late 19th century, the U.S. government took steps to bar immigration from Asia. The establishment of the national origins quota system in the 1924 Immigration Act narrowed the entryway for eastern and central Europeans, making western Europe the dominant source of immigrants. These policies shaped the racial and ethnic profile of the American population before 1945. Signs of change began to occur during and after World War II. The recruitment of temporary agricultural workers from Mexico led to an influx of Mexicans, and the repeal of Asian exclusion laws opened the door for Asian immigrants. Responding to complex international politics during the Cold War, the United States also formulated a series of refugee policies, admitting refugees from Europe, the Western Hemisphere, and later Southeast Asia. The movement of people to the United States increased drastically after 1965, when immigration reform ended the national origins quota system. The intricate and intriguing history of U.S. immigration after 1945 thus demonstrates how the United States related to a fast-changing world, its less restrictive immigration policies increasing the fluidity of the American population, with a substantial impact on American identity and domestic policy.