Jennifer M. Spear
On December 20, 1803, residents of New Orleans gathered at the Place d’Armes in the city center to watch as the French flag was lowered and the flag of the United States was raised in its place. Toasts were made to the US president, the French First Consul, and the Spanish king (whose flag had been lowered in a similar ceremony just twenty days earlier), and the celebrations continued throughout the night. The following day, however, began the process of determining just what it meant now that Louisiana was a part of the United States, initiating the first great test for the United States of its ability to expand its borders, incorporating both territories and peoples. The treaty ratifying the transfer, signed in Paris the previous April 30th, promised that “the inhabitants of the ceded territory shall be incorporated in the Union of the United States” where they would experience “the enjoyment of all these rights, advantages and immunities of citizens of the United States.” These inhabitants included thousands of people of French and Spanish descent, several thousand slaves of African descent, and about fifteen hundred free people of at least partial African ancestry; most of these inhabitants spoke French or (far fewer) Spanish and practiced Catholicism. In addition, the territory was home to tens of thousands of indigenous peoples, many of whom still lived on traditional territories and under their own sovereignty. For a few inhabitants of what would become the Territory of Orleans and later the state of Louisiana, incorporation did lead to “the enjoyment of all these rights” and gave some small grain of truth to Thomas Jefferson’s hope that the trans-Mississippi region would undergird the United States as an “empire of liberty,” although even for Europeans of French and Spanish ancestry, the process was neither easy nor uncontested. 
For most, however, incorporation led to the expansion of the United States as an empire of slavery, one built upon the often violent dispossession of native peoples of their lands and the expropriated labor of enslaved peoples of African descent.
Benjamin C. Waterhouse
Political lobbying has always played a key role in American governance, but the concept of paid influence peddling has been marked by a persistent tension throughout the country’s history. On the one hand, lobbying represents a democratic process by which citizens maintain open access to government. On the other, the outsized clout of certain groups engenders corruption and perpetuates inequality. The practice of lobbying itself has reflected broader social, political, and economic changes, particularly in the scope of state power and the scale of business organization. During the Gilded Age, associational activity flourished and lobbying became increasingly the province of organized trade associations. By the early 20th century, a wide range of political reforms worked to counter the political influence of corporations. Even after the Great Depression and New Deal recast the administrative and regulatory role of the federal government, business associations remained the primary vehicle through which corporations and their designated lobbyists influenced government policy. By the 1970s, corporate lobbyists had become more effective and better organized, and trade associations spurred a broad-based political mobilization of business. Business lobbying expanded in the latter decades of the 20th century; while the number of companies with a lobbying presence leveled off in the 1980s and 1990s, the number of lobbyists per company increased steadily and corporate lobbyists grew increasingly professionalized. A series of high-profile political scandals involving lobbyists in 2005 and 2006 sparked another effort at regulation. Yet despite popular disapproval of lobbying and distaste for politicians, efforts to substantially curtail the activities of lobbyists and trade associations did not achieve significant success.
Malinda Maynor Lowery
The Lumbee tribe of North Carolina, including approximately 55,000 enrolled members, is the largest Indian community east of the Mississippi River. Lumbee history serves as a window into the roles that Native people have played in the struggle to implement the founding principles of the United States, not just as “the First Americans,” but as members of their own nations, operating in their own communities’ interests. When we see US history through the perspectives of Native nations, we see that the United States is not only on a quest to expand rights for individuals. Surviving Native nations like the Lumbees, who have their own unique claims on this land and its ruling government, are forcing Americans to confront the ways in which their stories, their defining moments, and their founding principles are flawed and inadequate. We know the forced removals, the massacres, the protests that Native people have lodged against injustice, yet such knowledge is not sufficient to understand American history. Lumbee history provides a way to honor, and complicate, American history by focusing not just on the dispossession and injustice visited upon Native peoples, but on how and why Native survival matters. Native nations are doing the same work as the American nation—reconstituting communities, thriving, and finding a shared identity with which to achieve justice and self-determination.
Since the late 19th century, Lumbee Indians have used segregation, war, and civil rights to maintain a distinct identity in the biracial South. The Lumbees’ survival as a people, a race, and a tribal nation shows that their struggle has revolved around autonomy, or the ability to govern their own affairs. They have sought local, state, and federal recognition to support that autonomy, but doing so has entangled the processes of survival with outsiders’ ideas about what constitutes a legitimate Lumbee identity. Lumbees continue to adapt to the constraints imposed on them by outsiders, strengthening their community ties through the process of adaptation itself. Lumbee people find their cohesion in the relentless fight for self-determination. Always, that struggle has mattered more than winning or losing a single battle.
Mark A. Granquist
Lutherans are one branch of Protestant Christianity and have been in America for almost 400 years. Historically they have immigrated to America from Lutheran countries in Europe, especially Germany and Scandinavia. Immigrants during the eighteenth century founded Lutheran congregations in the middle colonies, while westward expansion and further immigration from Europe centered Lutherans in the American Midwest. Lutherans formed regional and national denominations based on geography, ethnicity, and theological differences. In the twentieth century they continued to grow, and mergers reduced the number of denominations by 1988 to two major bodies: the Evangelical Lutheran Church in America and the Lutheran Church–Missouri Synod. In 2015 there were close to seven million Lutherans in America.
Landon R. Y. Storrs
The second Red Scare refers to the fear of communism that permeated American politics, culture, and society from the late 1940s through the 1950s, during the opening phases of the Cold War with the Soviet Union. This episode of political repression lasted longer and was more pervasive than the Red Scare that followed the Bolshevik Revolution and World War I. Popularly known as “McCarthyism” after Senator Joseph McCarthy (R-Wisconsin), who made himself famous in 1950 by claiming that large numbers of Communists had infiltrated the U.S. State Department, the second Red Scare predated and outlasted McCarthy, and its machinery far exceeded the reach of a single maverick politician. Nonetheless, “McCarthyism” became the label for the tactic of undermining political opponents by making unsubstantiated attacks on their loyalty to the United States.
The initial infrastructure for waging war on domestic communism was built during the first Red Scare, with the creation of an antiradicalism division within the Federal Bureau of Investigation (FBI) and the emergence of a network of private “patriotic” organizations. With capitalism’s crisis during the Great Depression, the Communist Party grew in numbers and influence, and President Franklin D. Roosevelt’s New Deal program expanded the federal government’s role in providing economic security. The anticommunist network expanded as well, most notably with the 1938 formation of the Special House Committee to Investigate Un-American Activities, which in 1945 became the permanent House Un-American Activities Committee (HUAC). Other key congressional investigation committees were the Senate Internal Security Subcommittee and McCarthy’s Permanent Subcommittee on Investigations. Members of these committees and their staff cooperated with the FBI to identify and pursue alleged subversives. The federal employee loyalty program, formalized in 1947 by President Harry Truman in response to right-wing allegations that his administration harbored Communist spies, soon was imitated by local and state governments as well as private employers. As the Soviets’ development of nuclear capability, a series of espionage cases, and the Korean War enhanced the credibility of anticommunists, the Red Scare metastasized from the arena of government employment into labor unions, higher education, the professions, the media, and party politics at all levels. The second Red Scare did not involve pogroms or gulags, but the fear of unemployment was a powerful tool for stifling criticism of the status quo, whether in economic policy or social relations. Ostensibly seeking to protect democracy by eliminating communism from American life, anticommunist crusaders ironically undermined democracy by suppressing the expression of dissent. 
Debates over the second Red Scare remain lively because they resonate with ongoing struggles to reconcile Americans’ desires for security and liberty.
On February 19, 1942, President Franklin Delano Roosevelt signed Executive Order 9066 authorizing the incarceration of 120,000 Japanese Americans, living primarily on the West Coast of the continental United States. On August 10, 1988, President Ronald Reagan signed legislation authorizing formal apologies and checks for $20,000 to those still alive who had been unjustly imprisoned during WWII. In the interim period, nearly a half century, there were enormous shifts in memories of the events, mainstream accounts, and internal ethnic accountabilities. To be sure, there were significant acts of resistance, from the beginning of mass forced removal to the Supreme Court decisions toward the end of the war. But for a quarter of a century, between 1945 and approximately 1970, there was little to threaten a master narrative that posited Japanese Americans, led by the Japanese American Citizens League (JACL), as a once-embattled ethnic/racial minority that had transcended its victimized past to become America’s treasured model minority. The fact that the Japanese American community began effective mobilization for government apology and reparations in the 1970s only confirmed its emergence as a bona fide part of the American body politic. But where the earlier narrative extolled the memories of Japanese American war heroes and leaders of the JACL, memory making changed dramatically in the 1990s and 2000s. In the years since Reagan’s affirmation that “here we admit a wrong,” Japanese Americans have unleashed a torrent of memorials, museums, and monuments honoring those who fought the injustices and who swore they would resist current or future attempts to scapegoat other groups in the name of national security.
Ramón A. Gutiérrez
The history of Mexican immigration to the United States is best characterized as the movement of unskilled, manual laborers pushed northward mostly by poverty and unemployment and pulled into American labor markets with higher wages. Historically, most Mexicans have been economic immigrants seeking to improve their lives. In moments of civil strife, such as the Mexican Revolution (1910–1917) and the Cristero Revolt (1926–1929), many fled to the United States to escape religious and political persecution. Others, chafing under the weight of conservative, patriarchal, tradition-bound, rural agrarian societies, have migrated seeking modern values and greater personal liberties.
Since the last quarter of the 19th century, due to increasing numeric restrictions on the importation of immigrant workers from Europe, Asia, and Africa, American employers have turned to Mexico to recruit cheap, unskilled labor. Before 1942, Mexico minimally regulated emigration. While attentive to the safety and well-being of its émigrés, the Mexican government deemed out-migration a depletion of the country’s human capital. Monetary remittances helped compensate for this loss, contributing perhaps as much as 10 percent of the country’s yearly gross national product, vastly improving national life, particularly when emigrants returned with skills and consumer goods, seeking investment opportunities for their accumulated cash. Since the 1980s, single Mexican women have become a significant component of this migration, representing 40 percent of the total immigrant flow, employed mostly as service workers, domestics, and nannies, and less so in agricultural work. Mexicans also have gained authorized entry into the United States as highly skilled professionals, but their numbers remain relatively small in comparison to unskilled laborers. Beginning in 1942, and particularly in the 1990s, Mexican immigrants have been stigmatized as illegal aliens, subject to deportation as significant security threats to the nation, a rhetoric that intensified after the September 11, 2001 attacks on the United States by al-Qaeda.
Benjamin H. Johnson
When rebels captured the border city of Juárez, Mexico, in May 1911 and forced the abdication of President Porfirio Díaz shortly thereafter, they not only overthrew the western hemisphere’s oldest regime but also inaugurated the first social revolution of the 20th century. Driven by disenchantment with an authoritarian regime that catered to foreign investment, labor exploitation, and landlessness, revolutionaries dislodged Díaz’s regime, crushed an effort to resurrect it, and then spent the rest of the decade fighting one another for control of the nation. This struggle, recognized ever since as foundational for Mexican politics and identity, also had enormous consequences for the ethnic makeup, border policing, and foreign policy of the United States. Over a million Mexicans fled north during the 1910s, perhaps tripling the country’s Mexican-descent population, most visibly in places such as Los Angeles that had become overwhelmingly Anglo-American. US forces occupied Mexican territory twice, nearly bringing the two nations to outright warfare for the first time since the US–Mexican War of 1846–1848. Moreover, revolutionary violence and radicalism transformed the ways that much of the American population and its government perceived their border with Mexico, providing a rationale for a much more highly policed border and for the increasingly brutal treatment of Mexican-descent people in the United States. The Mexican Revolution was a turning point for Mexico, the United States, and their shared border, and for all who crossed it.
The military history of the American Revolution is more than the history of the War of Independence. The Revolution itself had important military causes. The experience of the Seven Years’ War (which started in 1754 in North America) conditioned British attitudes to the colonies after that conflict was over. From 1764, the British Parliament tried to raise taxes in America to pay for a new permanent military garrison. British politicians resisted colonial objections to parliamentary taxation at least partly because they feared that if the Americans established their right not to be taxed by Westminster, Parliament’s right to regulate colonial overseas trade would then be challenged. If the Americans broke out of the system of trade regulation, British ministers, MPs, and peers worried, then the Royal Navy would be seriously weakened.
The War of Independence, which began in 1775, was not the great American triumph that most accounts suggest. The British army faced a difficult task in suppressing a rebellion three thousand miles from Britain itself. French intervention on the American side in 1778 (followed by the Spanish in 1779, and the Dutch in 1780) made the task still more difficult. In the end, the war in America was won by the French as much as by the Americans. But in the wider imperial conflict, affecting the Caribbean, Central America, Europe, West Africa, and South Asia, the British fared much better. Even in its American dimension, the outcome was less clear cut than we usually imagine. The British, the nominal losers, retained great influence in the independent United States, which in economic terms remained in an essentially dependent relationship with the former mother country.
Wayne Wei-siang Hsieh
Despite the absence of a robust and well-articulated conception of strategy, American military and political leaders during the Civil War had an intuitive sense of how military operations should be coordinated with larger political ends. They also shared a general adherence to the straightforward strategic ideas of Antoine-Henri de Jomini, who emphasized the importance of concentrating one’s own military forces in opposition to dispersed opponents. In the case of the Union, however, victory would require not only a more sophisticated conception of strategy that superseded Jomini and coordinated military operations in geographically disconnected fronts but also the practical implementation of such ideas through well-selected subordinate commanders. It would take Ulysses S. Grant until the end of the war to complete all these tasks. In the case of the Confederacy, secessionist leaders faced the challenge of prioritizing different theaters in the face of their material inferiority to the Union. Robert E. Lee chose the plausible strategy of striking directly at Northern public opinion with aggressive operations waged by his own Army of Northern Virginia, but the final failure of the Confederate war effort raises fair questions about whether the Confederacy should have paid more attention to its western theater.
The relationship between the Church of Jesus Christ of Latter-day Saints—commonly called “Mormonism”—and the politics and culture of the United States is both contentious and intertwined. Historians have commonly observed that Mormonism is in many ways quintessentially American, bearing the marks of the Jacksonian period in which it was born. Its rejection of the denominational leadership of its day, its institution of a lay priesthood, and Joseph Smith’s insistence that revelation trumped scholarship and study all marked it as very much of its time and place, an America in which the authority of common people was exalted and traditional authority was suspect. And yet at the same time, Mormonism was suspect almost immediately upon its birth for those things that made it appear distinctly un-American: the divine power of its prophetic leaders, its rejection of the sole authority of the Bible, its clannishness and separatism, and its defiance of 19th-century sexual morality.
The history of Mormonism in America is in many ways a tug of war between these two impulses. At times the Mormons have embraced what makes them American, have proudly claimed elements of national identity, and have claimed that their faith most truly embodies the American creed. At other times, however, either because of hostility from other Americans or because of their own separatism, Mormons have distanced themselves from the national community and sought a separate community and peoplehood. Through the 19th century, because of the practice of polygamy and the theocratic government of the Utah territory, both Mormons and other Americans perceived a gap between their two communities, but that gap closed by the end of the century, when the federal government used force to eliminate those things Americans most objected to about the faith and Mormons began aggressively pursuing assimilation into American life. By the end of the 20th century, however, Mormonism’s cultural conservatism led both Mormons and other Americans to see that gap opening once more.
The Japanese American Redress Movement refers to the various efforts of Japanese Americans from the 1940s to the 1980s to obtain restitution for their removal and confinement during World War II. This included judicial and legislative campaigns at local, state, and federal levels for recognition of government wrongdoing and compensation for losses, both material and immaterial. The push for redress originated in the late 1940s as the Cold War opened up opportunities for Japanese Americans to demand concessions from the government. During the 1960s and 1970s, Japanese Americans began to connect the struggle for redress with anti-racist and anti-imperialist movements of the time. Despite their growing political divisions, Japanese Americans came together to launch several successful campaigns that laid the groundwork for redress. During the early 1980s, the government increased its involvement in redress by forming a congressional commission to conduct an official review of the World War II incarceration. The commission’s recommendations of monetary payments and an official apology paved the way for the passage of the Civil Liberties Act of 1988 and other redress actions. Beyond its legislative and judicial victories, the redress movement also created a space for collective healing and generated new forms of activism that continue into the present.
Housing in America has long stood as a symbol of the nation’s political values and a measure of its economic health. In the 18th century, a farmhouse represented Thomas Jefferson’s ideal of a nation of independent property owners; in the mid-20th century, the suburban house was seen as an emblem of an expanding middle class. Alongside those well-known symbols were a host of other housing forms—tenements, slave quarters, row houses, French apartments, loft condos, and public housing towers—that revealed much about American social order and the material conditions of life for many people.
Since the 19th century, housing markets have been fundamental forces driving the nation’s economy and a major focus of government policies. Home construction has provided jobs for skilled and unskilled laborers. Land speculation, housing development, and the home mortgage industry have generated billions of dollars in investment capital, while ups and downs in housing markets have been considered signals of major changes in the economy. Since the New Deal of the 1930s, the federal government has buttressed the home construction industry and offered economic incentives for home buyers, giving the United States the highest home ownership rate in the world. The housing market crash of 2008 slashed property values and sparked a rapid increase in home foreclosures, especially in places like Southern California and the suburbs of the Northeast, where housing prices had ballooned over the previous two decades. The real estate crisis led to government efforts to prop up the mortgage banking industry and to assist struggling homeowners. The crisis led, as well, to a drop in rates of home ownership, an increase in rental housing, and a growth in homelessness.
Home ownership remains a goal for many Americans and an ideal long associated with the American dream. The owner-occupied home—whether single-family or multifamily dwelling—is typically the largest investment made by an American family. Through much of the 18th and 19th centuries, housing designs varied from region to region. In the mid-20th century, mass production techniques and national building codes tended to standardize design, especially in new suburban housing. In the 18th century, the family home was a site of waged and unwaged work; it was the center of a farm, plantation, or craftsman’s workshop. Two and a half centuries later, a house was a consumer good: its size, location, and decor marked the family’s status and wealth.
The history of Muslims in America dates back to the transatlantic mercantile interactions between Europe, Africa, and the Americas. Upon its arrival, Islam became entrenched in American discourses on race and civilization because literate and noble African Muslims, brought to America as slaves, had problematized popular stereotypes of Muslims and black Africans. Furthermore, these enslaved Muslims had to re-evaluate and reconfigure their beliefs and practices to form new communal relations and to make sense of their lives in America.
At the turn of the 20th century, as Muslim immigrants began arriving in the United States from the Middle East, Eastern Europe, and South Asia, they had to establish themselves in an America in which the white race, Protestantism, and progress were conflated to define a triumphalist American national identity, one that allowed varying levels of inclusion for Muslims based on their ethnic, racial, and national backgrounds.
The enormous bloodshed and destruction experienced during World War I ushered in a crisis of confidence in the ideals of the European Enlightenment, as well as in white, Protestant nationalism. It opened up avenues for alternative expressions of progress, which allowed Muslims, along with other nonwhite, non-Christian communities, to engage in political and social organization. Among these organizations were a number of black religious movements that used Islamic beliefs, rites, and symbols to define a black Muslim national identity.
World War II further shifted America, away from the religious competition that had earlier defined the nation’s identity and toward a “civil religion” of American democratic values and political institutions. Although this inclusive rhetoric was received differently along racial and ethnic lines, there was an overall appeal for greater visibility for Muslims in America. After World War II, increased commercial and diplomatic relations between the United States and Muslim-majority countries put American Muslims in a position, not only to relate Islam and America in their own lives but also to mediate between the varying interests of Muslim-majority countries and the United States.
Following the civil rights legislation of the 1950s and 1960s and the passage of the Immigration Act of 1965, Muslim activists, many of whom had been politicized by anticolonial movements abroad, established new Islamic institutions. Eventually, a window was opened between the US government and American Muslim activists, who found a common enemy in communism following the Soviet occupation of Afghanistan in the 1980s.
Since the late 1960s, the number of Muslims in the United States has grown significantly. Today, Muslims are estimated to constitute a little more than 1 percent of the US population. However, with the fall of the Soviet Union and the rise of the United States as the sole superpower in the world, the United States has come into military conflict with Muslim-majority countries and has been the target of attacks by militant Muslim organizations. This has led to the cultivation of the binaries of “Islam and the West” and of “good” Islam and “bad” Islam, which have contributed to the racialization of American Muslims. It has also interpellated them into a reality external to their history and lived experiences as Muslims and Americans.
The national parks of the United States have been one of the country’s most popular federal initiatives, and popular not only within the nation but across the globe. The first park was Yellowstone, established in 1872, and since then almost sixty national parks have been added, along with hundreds of monuments, protected rivers and seashores, and important historical sites as well as natural preserves. In 1916 the parks were put under the National Park Service, which has managed them primarily as scenic treasures for growing numbers of tourists. Ecologically minded scientists, however, have challenged that stewardship and called for restoration of parks to their natural conditions, defined as their ecological integrity before white Europeans intervened. The most influential voice in the history of park philosophy remains John Muir, the California naturalist and Yosemite enthusiast and himself a proto-ecologist, who saw the parks as sacred places for a modern nation, where reverence for nature and respect for science might coexist and where tourists could be educated in environmental values. As other nations have created their own park systems, similar debates have occurred. While parks may seem like a great modern idea, this idea has always been embedded in cultural and social change—and subject to struggles over what that “idea” should be.
Urban renewal refers to an interlocking set of national and local policies, programs, and projects, implemented in the vast majority of American cities between 1949 and 1973. These typically entailed major redevelopment of existing urban areas with a view to the modernization of housing, highway infrastructure, commercial and business districts, as well as other large-scale constructions. Reformers from the Progressive Era through the Great Society strove to ameliorate the conditions of poverty and inequality in American cities by focusing primarily on physical transformation of the urban built environment. Citing antecedents such as the reconstruction of Second Empire Paris, imported via the City Beautiful movement, and then updated with midcentury modernism, US urban planners envisioned a radical reorganization of city life. In practice, federal programs and local public authorities targeted the eradication of areas deemed slums or blighted—often as much to socially sanitize neighborhoods inhabited by racial minorities and other marginalized groups as to address deteriorating physical conditions. And while federal funding became available for public works projects in declining central cities under the auspices of improving living conditions for the poor—including providing public housing—urban renewal programs consistently destroyed more affordable housing than they created, over more than three decades. By the end of the 1960s, urban residents and policymakers across the political spectrum concluded that such programs were usually doing more harm than good, and most ended during the Nixon administration. Yet large-scale reminders of urban renewal can still be found in most large US communities, whether in the form of mid-20th-century public housing blocks, transportation projects, stadiums, convention centers, university and hospital expansions, or a variety of public-private redevelopment initiatives. 
But perhaps the most fundamental legacies of all were the institutionalization of the comprehensive zoning and master planning process in cities nationwide, on the one hand, and the countervailing mobilization of defensively oriented (NIMBY) neighborhood politics, on the other.
Nicolas G. Rosenthal
An important relationship has existed between Native Americans and cities from pre-Columbian times to the early 21st century. Long before Europeans arrived in the Americas, indigenous peoples developed societies characterized by dense populations, large-scale agriculture, monumental architecture, and complex social hierarchies. Following European and American conquest and colonization, Native Americans played a crucial role in the development of towns and cities throughout North America, often on the site of former indigenous settlements.
Beginning in the early 20th century, Native Americans migrated from reservations to U.S. cities in large numbers and formed new intertribal communities. By 1970, the majority of the Native American population lived in cities, and the number of urban American Indians has grown ever since. Indian Country in the early 21st century continues to be shaped by the complex and evolving ties between Native Americans and cities.
Wendy L. Wall
The New Deal generally refers to a set of domestic policies implemented by the administration of Franklin Delano Roosevelt in response to the crisis of the Great Depression. Propelled by that economic cataclysm, Roosevelt and his New Dealers pushed through legislation that regulated the banking and securities industries, provided relief for the unemployed, aided farmers, electrified rural areas, promoted conservation, built national infrastructure, regulated wages and hours, and bolstered the power of unions. The Tennessee Valley Authority prevented floods and brought electricity and economic progress to seven states in one of the most impoverished parts of the nation. The Works Progress Administration offered jobs to millions of unemployed Americans and launched an unprecedented federal venture into the arena of culture. By providing social insurance to the elderly and unemployed, the Social Security Act laid the foundation for the U.S. welfare state.
The benefits of the New Deal were not equitably distributed. Many New Deal programs—farm subsidies, work relief projects, social insurance, and labor protection programs—discriminated against racial minorities and women, while benefiting white men disproportionately. Nevertheless, women achieved symbolic breakthroughs, and African Americans benefited more from Roosevelt’s policies than they had from any administration since Abraham Lincoln’s. The New Deal did not end the Depression—only World War II did that—but it did spur economic recovery. It also helped to make American capitalism less volatile by extending federal regulation into new areas of the economy.
Although the New Deal most often refers to policies and programs put in place between 1933 and 1938, some scholars have used the term more expansively to encompass later domestic legislation or U.S. actions abroad that seemed animated by the same values and impulses—above all, a desire to make individuals more secure and a belief in institutional solutions to long-standing problems. In order to pass his legislative agenda, Roosevelt drew many Catholic and Jewish immigrants, industrial workers, and African Americans into the Democratic Party. Together with white Southerners, these groups formed what became known as the “New Deal coalition.” This unlikely political alliance endured long after Roosevelt’s death, supporting the Democratic Party and a “liberal” agenda for nearly half a century. When the coalition finally cracked in 1980, historians looked back on this extended epoch as reflecting a “New Deal order.”
In late 19th- and early 20th-century America, a new image of womanhood emerged that began to shape public views and understandings of women’s role in society.
Whether identified by contemporaries as a Gibson Girl, a suffragist, a Progressive reformer, a bohemian feminist, a college girl, a bicyclist, a flapper, a working-class militant, or a Hollywood vamp, these images all came to epitomize the New Woman, an umbrella term for modern understandings of femininity. Referring both to real, flesh-and-blood women and to an abstract idea or visual archetype, the New Woman represented a generation of women who came of age between 1890 and 1920 and challenged gender norms and structures by asserting a new public presence through work, education, entertainment, and politics, while also denoting a distinctly modern appearance that contrasted with Victorian ideals. The New Woman became associated with the rise of feminism and the campaign for women’s suffrage, as well as with the rise of consumerism, mass culture, and freer expressions of sexuality that defined the first decades of the 20th century. Emphasizing youth, mobility, freedom, and modernity, the image of the New Woman varied by age, class, race, ethnicity, and geographical region, offering a spectrum of behaviors and appearances with which different women could identify. At times controversial, the New Woman image provided women with opportunities to negotiate new social roles and to promote ideas of equality and freedom that would later become mainstream.
Luke A. Nichter
Assessments of President Richard Nixon’s foreign policy continue to evolve as scholars tap new possibilities for research. Because of the long wait before national security records are declassified by the National Archives and made available to researchers and the public, the Nixon administration’s engagement with the world has become well documented only in recent decades. As more records are released by the National Archives (including potentially 700 hours of Nixon’s secret White House tapes that remain closed), scholarly understanding of the Nixon presidency is likely to continue changing. Thus far, historians have pointed to four major legacies of Nixon’s foreign policy: tendencies to use American muscle abroad on a more realistic scale, to reorient the focus of American foreign policy toward the Pacific, to reduce the chance that the Cold War could turn hot, and, inadvertently, to contribute to the later rise of Ronald Reagan and the Republican right wing—many of whom had been part of Nixon’s “silent majority.” While earlier works focused primarily on subjects like Vietnam, China, and the Soviet Union, the historiography today is much more diverse; there is now at least one work covering most major aspects of Nixon’s foreign policy.