United States law recognizes American Indian tribes as distinct political bodies with powers of self-government. Their status as sovereign entities predates the formation of the United States, and they are enumerated in the U.S. Constitution as among the subjects (along with foreign nations and the several states) with whom Congress may engage in formal relations. And yet, despite this long-standing recognition, federal Indian law remains curiously ambivalent, even conflicted, about the legal and political status of Indian tribes within the U.S. constitutional structure. On the one hand, tribes are recognized as sovereign bodies with powers of self-government within their lands. On the other, long-standing precedents of the Supreme Court maintain that Congress possesses plenary power over Indian tribes, with authority to modify or even eliminate their powers of self-government. These two propositions are in tension with one another and are at the root of the challenges faced by political leaders and academics alike in trying to understand and accommodate tribal rights to self-government. The body of laws that makes up the field of federal Indian law includes select provisions of the U.S. Constitution (notably the so-called Indian Commerce Clause), treaties between the United States and various Indian tribes, congressional statutes, executive orders, regulations, and a complex and rich body of court decisions dating back to the nation’s formative years. The noted legal scholar Felix Cohen brought much-needed coherence and order to this legal landscape in the 1940s when he led a team of scholars within the Office of the Solicitor in the Department of the Interior to produce a handbook on federal Indian law. The revised edition of Cohen’s Handbook of Federal Indian Law is still regarded as the seminal treatise in the field. Critically, however, this rich body of law only hints at the real story in federal Indian law.
The laws themselves serve as historical and moral markers in the ongoing clash between indigenous and nonindigenous societies and cultures still seeking to establish systems of peaceful coexistence in shared territories. It is a story about the limits of legal pluralism and the willingness of a dominant society and nation to acknowledge and honor its promises to the first inhabitants and first sovereigns.
N. Bruce Duthu
The history of Muslims in America dates back to the transatlantic mercantile interactions between Europe, Africa, and the Americas. Upon its arrival, Islam became entrenched in American discourses on race and civilization because literate and noble African Muslims, brought to America as slaves, problematized popular stereotypes of Muslims and black Africans. Furthermore, these enslaved Muslims had to re-evaluate and reconfigure their beliefs and practices to form new communal relations and to make sense of their lives in America. At the turn of the 20th century, as Muslim immigrants began arriving in the United States from the Middle East, Eastern Europe, and South Asia, they had to establish themselves in an America in which the white race, Protestantism, and progress were conflated to define a triumphalist American national identity, one that allowed varying levels of inclusion for Muslims based on their ethnic, racial, and national backgrounds. The enormous bloodshed and destruction experienced during World War I ushered in a crisis of confidence in the ideals of the European Enlightenment, as well as in white, Protestant nationalism. It opened up avenues for alternative expressions of progress, which allowed Muslims, along with other nonwhite, non-Christian communities, to engage in political and social organization. Among these organizations were a number of black religious movements that used Islamic beliefs, rites, and symbols to define a black Muslim national identity. World War II further shifted America away from the religious competition that had earlier defined the nation’s identity and toward a “civil religion” of American democratic values and political institutions. Although this inclusive rhetoric was received differently along racial and ethnic lines, there was an overall appeal for greater visibility for Muslims in America.
After World War II, increased commercial and diplomatic relations between the United States and Muslim-majority countries put American Muslims in a position not only to relate Islam and America in their own lives but also to mediate between the varying interests of Muslim-majority countries and the United States. Following the civil rights legislation of the 1950s and 1960s and the passage of the Immigration Act of 1965, Muslim activists, many of whom had been politicized by anticolonial movements abroad, established new Islamic institutions. Eventually, a window was opened between the US government and American Muslim activists, who found a common enemy in communism following the Soviet occupation of Afghanistan in the 1980s. Since the late 1960s, the number of Muslims in the United States has grown significantly. Today, Muslims are estimated to constitute a little more than 1 percent of the US population. However, with the fall of the Soviet Union and the rise of the United States as the world’s sole superpower, the United States has come into military conflict with Muslim-majority countries and has been the target of attacks by militant Muslim organizations. This has led to the cultivation of the binaries of “Islam and the West” and of “good” Islam and “bad” Islam, which have contributed to the racialization of American Muslims. It has also interpellated them into a reality external to their history and lived experiences as Muslims and Americans.
Christopher D. Cantwell
Home to more than half the U.S. population by 1920, cities played an important role in the development of American religion throughout the 20th century. At the same time, the beliefs and practices of religious communities also shaped the contours of America’s urban landscape. Much as in the preceding three centuries, the economic development of America’s cities and the social diversity of urban populations animated this interplay. But the explosive, unregulated expansion that defined urban growth after the Civil War was met with an equally dramatic disinvestment from urban spaces throughout the second half of the 20th century. The domestic and European migrations that previously fueled urban growth also changed throughout the century, shifting from Europe and the rural Midwest to the Deep South, Africa, Asia, and Latin America after World War II. These newcomers not only brought new faiths to America’s cities but also contributed to the innovation of several new, distinctly urban religious movements. Urban development and diversity on one level promoted toleration and cooperation, as religious leaders forged numerous ecumenical and, eventually, interfaith bonds to combat urban problems. But they also led to tension and conflict as religious communities busied themselves with carving out spaces of their own through tight-knit urban enclaves or new suburban locales. Contemporary American cities are some of the most religiously diverse communities in the world. Historians continue to uncover how religious communities not only have lived in but also have shaped the modern city.
Jimmy Carter’s “Crisis of Confidence” speech of July 1979 was a critical juncture in post-1945 U.S. politics, but it also marks an exemplary pivot in post-1945 religion. Five dimensions of faith shaped the president’s sermon. The first concerned the shattered consensus of American religion. When Carter encouraged Americans to recapture a spirit of unity, he spoke in a heartfelt but spent language more suitable to Dwight Eisenhower’s presidency than his own. By 1979, the Protestant-Catholic-Jewish consensus of Eisenhower’s time had fractured into a dynamic pluralism, remaking American religion in profound ways. Carter’s speech revealed a second revolution of post-1945 religion when it decried its polarization and politicization. Carter sought to heal ruptures that were dividing the nation between what observers, two decades later, would label “red” (conservative Republican) and “blue” (liberal Democratic) constituencies. Yet his endeavors failed, as would be evidenced in the religious politics of Ronald Reagan’s era, which followed. Carter championed community values as the answer to his society’s problems, aware of yet a third dawning reality: globalization. The virtues of localism that Carter espoused were in fact implicated in (and complicated by) transnational forces of change that saw immigration, missionary enterprises, and state and non-state actors internationalizing the American religious experience. A fourth illuminating dimension of Carter’s speech was its critique of America’s gospel of wealth. Although this “born-again” southerner was a product of the evangelical South’s revitalized free-market capitalism, he lamented how laissez-faire Christianity had become America’s lingua franca. Finally, Carter wrestled with secularization, revealing a fifth feature of post-1945 America. Even though faith commitments were increasingly cordoned off from formal state functions during this time, the nation’s political discourse acquired a pronounced religiosity.
Carter contributed to this religiosity by framing mundane issues (such as energy) in moral contexts that drew no hard-and-fast boundaries between matters of the soul and governance. Drawn from the political and economic crises of his moment, Carter’s speech thus also reveals the all-enveloping tide of religion in America’s post-1945 age.