Law in early America came from many sources. To focus exclusively on the English common law excludes other vital sources including (but not limited to) civil law, canon law, lex mercatoria (the law merchant), and custom. Also, the number of sources increases the farther back in time one goes and the greater the geographic area under consideration.
By the 18th century, common law had come to dominate, but not snuff out, other competing legal traditions, in part due to the numerical, political, military, and linguistic advantages of its users. English colonists were well-acquainted with the common law, but after arriving in the New World, the process of adaptation to new experiences and new surroundings meant that English common law would undergo numerous alterations.
Colonists in early America had to create legal explanations for the dispossession of Native American land and the appropriation of labor by enslaved Native Americans and Africans. Their colonial charters provided that all colonial law must conform to English law, but deviations began to appear in several areas almost from the first moment of colonization. When controversies arose within the colonies, not all disagreements were settled in courts: churches and merchants provided alternative settings to arbitrate disputes. In part, other groups provided mediation because there were so few trained lawyers and judges available in 17th-century colonies. By the 18th century, however, the number of trained practitioners increased, and the sophistication of legal knowledge in the colonies grew. The majority of legal work handled by colonial lawyers concerned contracts and property.
Law and the language of rights became more widely used by early Americans as the English attempted to tighten their control over the colonists in the mid-18th century. Rights and law became firmly linked with the Revolution in the minds of Americans, so much so that law, rights, and the American Revolution continue to form an integral part of American national identity.
The impact of LGBTQ (lesbian, gay, bisexual, transgender, and queer) issues on U.S. foreign relations is an understudied area, and only a handful of historians have addressed these issues in articles and books. Encounters with unexpected and condemnable (to European eyes) sexual behaviors and gender comportment arose from the first European forays into North America. As such, subduing heterodox sexual and gender expression has always been part of the colonizing endeavor in the so-called New World, tied in with the mission of civilizing and Christianizing the indigenous peoples that was so central to the forging of the United States and to its territorial expansion across the continent. These same impulses accompanied the further U.S. accumulation of territory across the Pacific and the Caribbean in the late 19th century, and they persisted even longer and further afield in its citizens’ missionary endeavors across the globe. During the 20th century, as the state’s foreign policy apparatus grew in size and scope, so too did the notions of homosexuality and transgender identity solidify as widely recognizable identity categories in the United States. Thus, it is during the 20th and 21st centuries, with ever greater intensity as the decades progressed, that one finds important influences of homosexuality and gender diversity on U.S. foreign policy: in immigration policies dating back to the late 19th century, in the Lavender Scare that plagued the State Department during the Truman and Eisenhower presidencies, in more contemporary battles between religious conservatives and queer rights activists that have at times been exported to other countries, and in the increasing intersections of LGBTQ rights issues and the War on Terror that has been waged primarily in the Middle East since September 11, 2001.
Emily K. Hobson
Since World War II, the United States has witnessed major changes in lesbian, gay, bisexual, transgender, and queer (LGBTQ) politics. Indeed, because the history of LGBTQ activism is almost entirely concentrated in the postwar years, the LGBTQ movement is typically said to have achieved rapid change in a short period of time. But if popular accounts characterize LGBTQ history as a straightforward narrative of progress, the reality is more complex. Postwar LGBTQ politics has been both diverse and divided, marked by differences of identity and ideology. At the same time, LGBTQ politics has been embedded in the contexts of state-building and the Cold War, the New Left and the New Right, the growth of neoliberalism, and the HIV/AIDS epidemic. As the field of LGBTQ history has grown, scholars have increasingly been able to place analyses of state regulation into conversation with community-based histories. Moving between such outside and inside perspectives helps to reveal how multiple modes of LGBTQ politics have shaped one another and how they have been interwoven with broader social change. Looking from the outside, it is apparent that LGBTQ politics has been catalyzed by exclusions from citizenship; from the inside, we can see that activists have responded to such exclusions in different ways, including both by seeking social inclusion and by rejecting assimilationist terms. Court rulings and the administration of law have run alongside the debates inside activist communities. Competing visions for LGBTQ politics have centered around both leftist and liberal agendas, as well as viewpoints shaped by race, gender, gender expression, and class.
In 1944 President Franklin D. Roosevelt’s State of the Union address set out what he termed an “economic Bill of Rights” that would act as a manifesto of liberal policies after World War II. Politically, however, the United States was a different place than the country that had faced the ravages of the Great Depression of the 1930s and ushered in Roosevelt’s New Deal to transform the relationship between government and the people. Key legacies of the New Deal, such as Social Security, remained and were gradually expanded, but opponents of governmental regulation of the economy launched a bitter campaign after the war to roll back labor union rights and dismantle the New Deal state.
Liberal heirs to FDR in the 1950s, represented by figures like two-time presidential candidate Adlai Stevenson, struggled to rework liberalism to tackle the realities of a more prosperous age. The long shadow of the U.S. Cold War with the Soviet Union also set up new challenges for liberal politicians trying to juggle domestic and international priorities in an era of superpower rivalry and American global dominance. The election of John F. Kennedy as president in November 1960 seemed to represent a narrow victory for Cold War liberalism, and his election coincided with the intensification of the struggle for racial equality in the United States that would do much to shape liberal politics in the 1960s. After his assassination in 1963, President Lyndon Johnson launched his “Great Society,” a commitment to eradicate poverty and to provide greater economic security for Americans through policies such as Medicare. But his administration’s deepening involvement in the Vietnam War and its mixed record on alleviating poverty did much to taint the positive connotations of “liberalism” that had dominated politics during the New Deal era.
Jennifer M. Spear
On December 20, 1803, residents of New Orleans gathered at the Place d’Armes in the city center to watch as the French flag was lowered and the flag of the United States was raised in its place. Toasts were made to the US president, the French First Consul, and the Spanish king (whose flag had been lowered in a similar ceremony just twenty days earlier), and the celebrations continued throughout the night. The following day, however, began the process of determining just what it meant now that Louisiana was a part of the United States, initiating the first great test for the United States of its ability to expand its borders, incorporating both territories and peoples. The treaty ratifying the transfer, signed in Paris the previous April 30th, promised that “the inhabitants of the ceded territory shall be incorporated in the Union of the United States” where they would experience “the enjoyment of all these rights, advantages and immunities of citizens of the United States.” These inhabitants included thousands of people of French and Spanish descent, several thousand slaves of African descent, and about fifteen hundred free people of at least partial African ancestry; most of these inhabitants spoke French or (far fewer) Spanish and practiced Catholicism. In addition, the territory was home to tens of thousands of indigenous peoples, many of whom still lived on traditional territories and under their own sovereignty. For a few inhabitants of what would become the Territory of Orleans and later the state of Louisiana, incorporation did lead to “the enjoyment of all these rights” and gave some small grain of truth to Thomas Jefferson’s hope that the trans-Mississippi region would undergird the United States as an “empire of liberty,” although even for Europeans of French and Spanish ancestry, the process was neither easy nor uncontested. For most, however, incorporation led to the expansion of the United States as an empire of slavery, one built upon the often violent dispossession of native peoples of their lands and the expropriated labor of enslaved peoples of African descent.
Benjamin C. Waterhouse
Political lobbying has always played a key role in American governance, but the concept of paid influence peddling has been marked by a persistent tension throughout the country’s history. On the one hand, lobbying represents a democratic process by which citizens maintain open access to government. On the other, the outsized clout of certain groups engenders corruption and perpetuates inequality. The practice of lobbying itself has reflected broader social, political, and economic changes, particularly in the scope of state power and the scale of business organization. During the Gilded Age, associational activity flourished and lobbying became increasingly the province of organized trade associations. By the early 20th century, a wide range of political reforms worked to counter the political influence of corporations. Even after the Great Depression and New Deal recast the administrative and regulatory role of the federal government, business associations remained the primary vehicle through which corporations and their designated lobbyists influenced government policy. By the 1970s, corporate lobbyists had become more effective and better organized, and trade associations spurred a broad-based political mobilization of business. Business lobbying expanded in the latter decades of the 20th century; while the number of companies with a lobbying presence leveled off in the 1980s and 1990s, the number of lobbyists per company increased steadily and corporate lobbyists grew increasingly professionalized. A series of high-profile political scandals involving lobbyists in 2005 and 2006 sparked another effort at regulation. Yet despite popular disapproval of lobbying and distaste for politicians, efforts to substantially curtail the activities of lobbyists and trade associations did not achieve significant success.
Malinda Maynor Lowery
The Lumbee tribe of North Carolina, including approximately 55,000 enrolled members, is the largest Indian community east of the Mississippi River. Lumbee history serves as a window into the roles that Native people have played in the struggle to implement the founding principles of the United States, not just as “the First Americans,” but as members of their own nations, operating in their own communities’ interests. When we see US history through the perspectives of Native nations, we see that the United States is not only on a quest to expand rights for individuals. Surviving Native nations like the Lumbees, who have their own unique claims on this land and its ruling government, are forcing Americans to confront the ways in which their stories, their defining moments, and their founding principles are flawed and inadequate. We know the forced removals, the massacres, the protests that Native people have lodged against injustice, yet such knowledge is not sufficient to understand American history. Lumbee history provides a way to honor, and complicate, American history by focusing not just on the dispossession and injustice visited upon Native peoples, but on how and why Native survival matters. Native nations are doing the same work as the American nation—reconstituting communities, thriving, and finding a shared identity with which to achieve justice and self-determination.
Since the late 19th century, Lumbee Indians have used segregation, war, and civil rights to maintain a distinct identity in the biracial South. The Lumbees’ survival as a people, a race, and a tribal nation shows that their struggle has revolved around autonomy, or the ability to govern their own affairs. They have sought local, state, and federal recognition to support that autonomy, but doing so has entangled the processes of survival with outsiders’ ideas about what constitutes a legitimate Lumbee identity. Lumbees continue to adapt to the constraints imposed on them by outsiders, strengthening their community ties through the process of adaptation itself. Lumbee people find their cohesion in the relentless fight for self-determination. Always, that struggle has mattered more than winning or losing a single battle.
Mark A. Granquist
Lutherans are one branch of Protestant Christianity and have been in America for almost 400 years. Historically they have immigrated to America from Lutheran countries in Europe, especially Germany and Scandinavia. Immigrants during the eighteenth century founded Lutheran congregations in the middle colonies, while westward expansion and further immigration from Europe centered Lutherans in the American Midwest. Lutherans formed regional and national denominations based on geography, ethnicity, and theological differences. In the twentieth century they continued to grow, and mergers reduced the number of denominations so that by 1988 two major bodies remained: the Evangelical Lutheran Church in America and the Lutheran Church-Missouri Synod. In 2015 there were close to seven million Lutherans in America.
Landon R. Y. Storrs
The second Red Scare refers to the fear of communism that permeated American politics, culture, and society from the late 1940s through the 1950s, during the opening phases of the Cold War with the Soviet Union. This episode of political repression lasted longer and was more pervasive than the Red Scare that followed the Bolshevik Revolution and World War I. Popularly known as “McCarthyism” after Senator Joseph McCarthy (R-Wisconsin), who made himself famous in 1950 by claiming that large numbers of Communists had infiltrated the U.S. State Department, the second Red Scare predated and outlasted McCarthy, and its machinery far exceeded the reach of a single maverick politician. Nonetheless, “McCarthyism” became the label for the tactic of undermining political opponents by making unsubstantiated attacks on their loyalty to the United States.
The initial infrastructure for waging war on domestic communism was built during the first Red Scare, with the creation of an antiradicalism division within the Federal Bureau of Investigation (FBI) and the emergence of a network of private “patriotic” organizations. With capitalism’s crisis during the Great Depression, the Communist Party grew in numbers and influence, and President Franklin D. Roosevelt’s New Deal program expanded the federal government’s role in providing economic security. The anticommunist network expanded as well, most notably with the 1938 formation of the Special House Committee to Investigate Un-American Activities, which in 1945 became the permanent House Un-American Activities Committee (HUAC). Other key congressional investigation committees were the Senate Internal Security Subcommittee and McCarthy’s Permanent Subcommittee on Investigations. Members of these committees and their staff cooperated with the FBI to identify and pursue alleged subversives. The federal employee loyalty program, formalized in 1947 by President Harry Truman in response to right-wing allegations that his administration harbored Communist spies, soon was imitated by local and state governments as well as private employers. As the Soviets’ development of nuclear capability, a series of espionage cases, and the Korean War enhanced the credibility of anticommunists, the Red Scare metastasized from the arena of government employment into labor unions, higher education, the professions, the media, and party politics at all levels. The second Red Scare did not involve pogroms or gulags, but the fear of unemployment was a powerful tool for stifling criticism of the status quo, whether in economic policy or social relations. Ostensibly seeking to protect democracy by eliminating communism from American life, anticommunist crusaders ironically undermined democracy by suppressing the expression of dissent. Debates over the second Red Scare remain lively because they resonate with ongoing struggles to reconcile Americans’ desires for security and liberty.
On February 19, 1942, President Franklin Delano Roosevelt signed Executive Order 9066 authorizing the incarceration of 120,000 Japanese Americans, living primarily on the West Coast of the continental United States. On August 10, 1988, President Ronald Reagan signed legislation authorizing formal apologies and checks for $20,000 to those still alive who had been unjustly imprisoned during WWII. In the interim period, nearly a half century, there were enormous shifts in memories of the events, mainstream accounts, and internal ethnic accountabilities. To be sure, there were significant acts of resistance, from the beginning of mass forced removal to the Supreme Court decisions toward the end of the war. But for a quarter of a century, between 1945 and approximately 1970, there was little to threaten a master narrative that posited Japanese Americans, led by the Japanese American Citizens League (JACL), as a once-embattled ethnic/racial minority that had transcended its victimized past to become America’s treasured model minority. The fact that the Japanese American community began effective mobilization for government apology and reparations in the 1970s only confirmed its emergence as a bona fide part of the American body politic. But where the earlier narrative extolled the memories of Japanese American war heroes and leaders of the JACL, memory making changed dramatically in the 1990s and 2000s. In the years since Reagan’s affirmation that “here we admit a wrong,” Japanese Americans have unleashed a torrent of memorials, museums, and monuments honoring those who fought the injustices and who swore they would resist current or future attempts to scapegoat other groups in the name of national security.
Ramón A. Gutiérrez
The history of Mexican immigration to the United States is best characterized as the movement of unskilled, manual laborers pushed northward mostly by poverty and unemployment and pulled into American labor markets with higher wages. Historically, most Mexicans have been economic immigrants seeking to improve their lives. In moments of civil strife, such as the Mexican Revolution (1910–1917) and the Cristero Revolt (1926–1929), many fled to the United States to escape religious and political persecution. Others, chafing under the weight of conservative, patriarchal, tradition-bound, rural agrarian societies, have migrated seeking modern values and greater personal liberties.
Since the last quarter of the 19th century, due to increasing numerical restrictions on the importation of immigrant workers from Europe, Asia, and Africa, American employers have turned to Mexico to recruit cheap, unskilled labor. Before 1942, Mexico minimally regulated emigration. While attentive to the safety and well-being of its émigrés, the Mexican government deemed out-migration a depletion of the country’s human capital. Monetary remittances helped compensate for this loss, contributing perhaps as much as 10 percent of the country’s yearly gross national product, vastly improving national life, particularly when emigrants returned with skills and consumer goods, seeking investment opportunities for their accumulated cash. Since the 1980s, single Mexican women have become a significant component of this migration, representing 40 percent of the total immigrant flow, employed mostly as service workers, domestics, and nannies, and less so in agricultural work. Mexicans also have gained authorized entry into the United States as highly skilled professionals, but their numbers remain relatively small in comparison to unskilled laborers. Beginning in 1942, and particularly in the 1990s, Mexican immigrants have been stigmatized as illegal aliens, subject to deportation as significant security threats to the nation, a rhetoric that intensified after the September 11, 2001 attacks on the United States by al-Qaeda.
Benjamin H. Johnson
When rebels captured the border city of Juárez, Mexico, in May 1911 and forced the abdication of President Porfirio Díaz shortly thereafter, they not only overthrew the western hemisphere’s oldest regime but also inaugurated the first social revolution of the 20th century. Driven by disenchantment with an authoritarian regime that catered to foreign investment, labor exploitation, and landlessness, revolutionaries dislodged Díaz’s regime, crushed an effort to resurrect it, and then spent the rest of the decade fighting one another for control of the nation. This struggle, recognized ever since as foundational for Mexican politics and identity, also had enormous consequences for the ethnic makeup, border policing, and foreign policy of the United States. Over a million Mexicans fled north during the 1910s, perhaps tripling the country’s Mexican-descent population, most visibly in places such as Los Angeles that had become overwhelmingly Anglo-American. US forces occupied Mexican territory twice, nearly bringing the two nations to outright warfare for the first time since the US–Mexican War of 1846–1848. Moreover, revolutionary violence and radicalism transformed the ways that much of the American population and its government perceived their border with Mexico, providing a rationale for a much more highly policed border and for the increasingly brutal treatment of Mexican-descent people in the United States. The Mexican Revolution was a turning point for Mexico, the United States, and their shared border, and for all who crossed it.
The military history of the American Revolution is more than the history of the War of Independence. The Revolution itself had important military causes. The experience of the Seven Years’ War (which started in 1754 in North America) conditioned British attitudes to the colonies after that conflict was over. From 1764, the British Parliament tried to raise taxes in America to pay for a new permanent military garrison. British politicians resisted colonial objections to parliamentary taxation at least partly because they feared that if the Americans established their right not to be taxed by Westminster, Parliament’s right to regulate colonial overseas trade would then be challenged. If the Americans broke out of the system of trade regulation, British ministers, MPs, and peers worried, then the Royal Navy would be seriously weakened.
The War of Independence, which began in 1775, was not the great American triumph that most accounts suggest. The British army faced a difficult task in suppressing a rebellion three thousand miles from Britain itself. French intervention on the American side in 1778 (followed by the Spanish in 1779, and the Dutch in 1780) made the task still more difficult. In the end, the war in America was won by the French as much as by the Americans. But in the wider imperial conflict, affecting the Caribbean, Central America, Europe, West Africa, and South Asia, the British fared much better. Even in its American dimension, the outcome was less clear cut than we usually imagine. The British, the nominal losers, retained great influence in the independent United States, which in economic terms remained in an essentially dependent relationship with the former mother country.
Wayne Wei-siang Hsieh
Despite the absence of a robust and well-articulated conception of strategy, American military and political leaders during the Civil War had an intuitive sense of how military operations should be coordinated with larger political ends. They also shared a general adherence to the straightforward strategic ideas of Antoine-Henri de Jomini, who emphasized the importance of concentrating one’s own military forces in opposition to dispersed opponents. In the case of the Union, however, victory would require not only a more sophisticated conception of strategy that superseded Jomini and coordinated military operations in geographically disconnected fronts but also the practical implementation of such ideas through well-selected subordinate commanders. It would take Ulysses S. Grant until the end of the war to complete all these tasks. In the case of the Confederacy, secessionist leaders faced the challenge of prioritizing different theaters in the face of their material inferiority to the Union. Robert E. Lee chose the plausible strategy of striking directly at Northern public opinion with aggressive operations waged by his own Army of Northern Virginia, but the final failure of the Confederate war effort raises fair questions about whether the Confederacy should have paid more attention to its western theater.
The relationship between the Church of Jesus Christ of Latter-day Saints—commonly called “Mormonism”—and the politics and culture of the United States is both contentious and intertwined. Historians have commonly observed that Mormonism is in many ways quintessentially American, bearing the marks of the Jacksonian period in which it was born. Its rejection of the denominational leadership of its day, its institution of a lay priesthood, and Joseph Smith’s insistence that revelation trumped scholarship and study all marked it as very much of its time and place, an America in which the authority of common people was exalted and traditional authority was suspect. And yet at the same time, Mormonism was suspect almost immediately upon its birth for those things that made it appear distinctly un-American: the divine power of its prophetic leaders, its rejection of the sole authority of the Bible, its clannishness and separatism, and its defiance of 19th-century sexual morality.
The history of Mormonism in America is in many ways a tug of war between these two impulses. At times the Mormons have embraced what makes them American, have proudly claimed elements of national identity, and have claimed that their faith most truly embodies the American creed. At other times, however, either because of hostility from other Americans or because of their own separatism, Mormons have distanced themselves from the national community and sought a separate community and peoplehood. Through the 19th century, because of the practice of polygamy and the theocratic government of the Utah territory, both Mormons and other Americans perceived a gap between their two communities, but that gap closed by the end of the century, when the federal government used force to eliminate those things Americans most objected to about the faith and Mormons began aggressively pursuing assimilation into American life. By the end of the 20th century, however, Mormonism’s cultural conservatism led both Mormons and other Americans to see that gap opening once more.
The Japanese American Redress Movement refers to the various efforts of Japanese Americans from the 1940s to the 1980s to obtain restitution for their removal and confinement during World War II. This included judicial and legislative campaigns at local, state, and federal levels for recognition of government wrongdoing and compensation for losses, both material and immaterial. The push for redress originated in the late 1940s as the Cold War opened up opportunities for Japanese Americans to demand concessions from the government. During the 1960s and 1970s, Japanese Americans began to connect the struggle for redress with anti-racist and anti-imperialist movements of the time. Despite their growing political divisions, Japanese Americans came together to launch several successful campaigns that laid the groundwork for redress. During the early 1980s, the government increased its involvement in redress by forming a congressional commission to conduct an official review of the World War II incarceration. The commission’s recommendations of monetary payments and an official apology paved the way for the passage of the Civil Liberties Act of 1988 and other redress actions. Beyond its legislative and judicial victories, the redress movement also created a space for collective healing and generated new forms of activism that continue into the present.
Housing in America has long stood as a symbol of the nation’s political values and a measure of its economic health. In the 18th century, a farmhouse represented Thomas Jefferson’s ideal of a nation of independent property owners; in the mid-20th century, the suburban house was seen as an emblem of an expanding middle class. Alongside those well-known symbols were a host of other housing forms—tenements, slave quarters, row houses, French apartments, loft condos, and public housing towers—that revealed much about American social order and the material conditions of life for many people.
Since the 19th century, housing markets have been fundamental forces driving the nation’s economy and a major focus of government policies. Home construction has provided jobs for skilled and unskilled laborers. Land speculation, housing development, and the home mortgage industry have generated billions of dollars in investment capital, while ups and downs in housing markets have been considered signals of major changes in the economy. Since the New Deal of the 1930s, the federal government has buttressed the home construction industry and offered economic incentives for home buyers, giving the United States one of the highest home ownership rates in the world. The housing market crash of 2008 slashed property values and sparked a rapid increase in home foreclosures, especially in places like Southern California and the suburbs of the Northeast, where housing prices had ballooned over the previous two decades. The real estate crisis led to government efforts to prop up the mortgage banking industry and to assist struggling homeowners. The crisis led, as well, to a drop in rates of home ownership, an increase in rental housing, and a growth in homelessness.
Home ownership remains a goal for many Americans and an ideal long associated with the American dream. The owner-occupied home—whether single-family or multifamily dwelling—is typically the largest investment made by an American family. Through much of the 18th and 19th centuries, housing designs varied from region to region. In the mid-20th century, mass production techniques and national building codes tended to standardize design, especially in new suburban housing. In the 18th century, the family home was a site of waged and unwaged work; it was the center of a farm, plantation, or craftsman’s workshop. Two and a half centuries later, a house was a consumer good: its size, location, and decor marked the family’s status and wealth.
The history of Muslims in America dates back to the transatlantic mercantile interactions between Europe, Africa, and the Americas. Upon its arrival, Islam became entrenched in American discourses on race and civilization because literate and noble African Muslims, brought to America as slaves, had problematized popular stereotypes of Muslims and black Africans. Furthermore, these enslaved Muslims had to re-evaluate and reconfigure their beliefs and practices to form new communal relations and to make sense of their lives in America.
At the turn of the 20th century, as Muslim immigrants began arriving in the United States from the Middle East, Eastern Europe, and South Asia, they had to establish themselves in an America in which the white race, Protestantism, and progress were conflated to define a triumphalist American national identity, one that allowed varying levels of inclusion for Muslims based on their ethnic, racial, and national backgrounds.
The enormous bloodshed and destruction experienced during World War I ushered in a crisis of confidence in the ideals of the European Enlightenment, as well as in white, Protestant nationalism. It opened up avenues for alternative expressions of progress, which allowed Muslims, along with other nonwhite, non-Christian communities, to engage in political and social organization. Among these organizations were a number of black religious movements that used Islamic beliefs, rites, and symbols to define a black Muslim national identity.
World War II further shifted America away from the religious competition that had earlier defined the nation’s identity and toward a “civil religion” of American democratic values and political institutions. Although this inclusive rhetoric was received differently along racial and ethnic lines, there was an overall appeal for greater visibility for Muslims in America. After World War II, increased commercial and diplomatic relations between the United States and Muslim-majority countries put American Muslims in a position not only to relate Islam and America in their own lives but also to mediate between the varying interests of Muslim-majority countries and the United States.
Following the civil rights legislation of the 1950s and 1960s and the passage of the Immigration Act of 1965, Muslim activists, many of whom had been politicized by anticolonial movements abroad, established new Islamic institutions. Eventually, a window was opened between the US government and American Muslim activists, who found a common enemy in communism following the Soviet occupation of Afghanistan in the 1980s.
Since the late 1960s, the number of Muslims in the United States has grown significantly. Today, Muslims are estimated to constitute a little more than 1 percent of the US population. However, with the fall of the Soviet Union and the rise of the United States as the sole superpower in the world, the United States has come into military conflict with Muslim-majority countries and has been the target of attacks by militant Muslim organizations. This has led to the cultivation of the binaries of “Islam and the West” and of “good” Islam and “bad” Islam, which have contributed to the racialization of American Muslims. It has also interpellated them into a reality external to their history and lived experiences as Muslims and Americans.
The national parks of the United States have been one of the country’s most popular federal initiatives, and popular not only within the nation but across the globe. The first park was Yellowstone, established in 1872, and since then almost sixty national parks have been added, along with hundreds of monuments, protected rivers and seashores, and important historical sites as well as natural preserves. In 1916 the parks were put under the National Park Service, which has managed them primarily as scenic treasures for growing numbers of tourists. Ecologically minded scientists, however, have challenged that stewardship and called for restoration of parks to their natural conditions, defined as their ecological integrity before white Europeans intervened. The most influential voice in the history of park philosophy remains John Muir, the California naturalist and Yosemite enthusiast and himself a proto-ecologist, who saw the parks as sacred places for a modern nation, where reverence for nature and respect for science might coexist and where tourists could be educated in environmental values. As other nations have created their own park systems, similar debates have occurred. While parks may seem like a great modern idea, this idea has always been embedded in cultural and social change—and subject to struggles over what that “idea” should be.
Urban renewal refers to an interlocking set of national and local policies, programs, and projects, implemented in the vast majority of American cities between 1949 and 1973. These typically entailed major redevelopment of existing urban areas with a view to the modernization of housing, highway infrastructure, commercial and business districts, as well as other large-scale constructions. Reformers from the Progressive Era through the Great Society strove to ameliorate the conditions of poverty and inequality in American cities by focusing primarily on physical transformation of the urban built environment. Citing antecedents such as the reconstruction of Second Empire Paris, imported via the City Beautiful movement, and then updated with midcentury modernism, US urban planners envisioned a radical reorganization of city life. In practice, federal programs and local public authorities targeted the eradication of areas deemed slums or blighted—often as much to socially sanitize neighborhoods inhabited by racial minorities and other marginalized groups as to address deteriorating physical conditions. And while federal funding became available for public works projects in declining central cities under the auspices of improving living conditions for the poor—including providing public housing—urban renewal programs consistently destroyed more affordable housing than they created, over more than three decades. By the end of the 1960s, urban residents and policymakers across the political spectrum concluded that such programs were usually doing more harm than good, and most ended during the Nixon administration. Yet large-scale reminders of urban renewal can still be found in most large US communities, whether in the form of mid-20th-century public housing blocks, transportation projects, stadiums, convention centers, university and hospital expansions, or a variety of public-private redevelopment initiatives. But perhaps the most fundamental legacies of all were the institutionalization of the comprehensive zoning and master planning process in cities nationwide, on the one hand, and the countervailing mobilization of defensively oriented (NIMBY) neighborhood politics, on the other.