1-20 of 21 Results for:

  • Legal History
  • Political History

Article

Benjamin L. Madley

Human beings have inhabited the region known as California for at least 13,000 years, or, as some believe, since time immemorial. By developing technologies, honing skills, and implementing stewardship practices, California Indian communities maximized the bounty of their homelands during the precolonial period. Overall, their population grew to perhaps 310,000 people. Speaking scores of different languages, they organized themselves into at least sixty major tribes. Communities were usually politically autonomous but connected to larger tribal groups by shared languages and cultures, while dense networks of economic exchange also bound tribes together. Newcomers brought devastating change, but California Indians resisted and survived. During the Russo-Hispanic period (1769–1846), the Indigenous population fell to perhaps 150,000 people due to diseases, environmental transformation, and colonial policies. The organized mass violence and other policies of early United States rule (1846–1900) further reduced the population. By 1900, census takers counted only 15,377 California Indian people. Still, California Indians resisted. During the 1900–1953 period, the federal government continued its national Allotment Policy but initiated healthcare, land policy, education, and citizenship reforms for California Indians, even as they continued to resist and their population grew. During the termination era (1953–1968), California Indians faced federal attempts to obliterate them as American Indians. Finally, California Indian people achieved many hard-won victories during the self-determination era (1968–present).

Article

The NAACP, established in 1909, was formed as an integrated organization to confront racism in the United States, rather than treating the issue as simply a southern problem. It is the longest-running civil rights organization in the United States and continues to operate today. Originally named the National Negro Committee, the organization adopted the name NAACP on May 30, 1910. Organized to promote racial equality and integration, the NAACP pursued these goals via legal cases, political lobbying, and public campaigns. Early campaigns involved lobbying for national anti-lynching legislation, litigating before the US Supreme Court for desegregation in areas such as housing and higher education, and pressing for voting rights. The NAACP is renowned for the US Supreme Court case of Brown v. Board of Education (1954), which desegregated primary and secondary schools and is seen as a catalyst for the civil rights movement (1955–1968). It also engaged in public education, promoting African American achievements in education and the arts to counteract racial stereotypes. The organization published a monthly journal, The Crisis, and promoted African American art forms and culture as another means to advance equality. NAACP branches were established all across the United States and became a network of information, campaigning, and finance that underpinned activism. Youth groups and university branches mobilized younger members of the community. Women were also invaluable to the NAACP in local, regional, and national decision-making processes and campaigning. The organization sought to integrate African Americans and other minorities into the American social, political, and economic model as codified by the US Constitution.

Article

Ivón Padilla-Rodríguez

Child migration has garnered widespread media coverage in the 21st century, becoming a central topic of national political discourse and immigration policymaking. Contemporary surges of child migrants are part of a much longer history of migration to the United States. In the first half of the 20th century, millions of European and Asian child migrants passed through immigration inspection stations in New York Harbor and San Francisco Bay. Even though some accompanied and unaccompanied European child migrants experienced detention at Ellis Island, most were processed and admitted into the United States fairly quickly in the early 20th century. Few of the European child migrants were deported from Ellis Island. Predominantly accompanied Chinese and Japanese child migrants, however, like Latin American and Caribbean migrants in recent years, were more frequently subjected to family separation, abuse, detention, and deportation at Angel Island. Once inside the United States, both European and Asian children struggled to overcome poverty, labor exploitation, educational inequity, the attitudes of hostile officials, and public health problems. After World War II, Korean refugee “orphans” came to the United States under the Refugee Relief Act of 1953 and the Immigration and Nationality Act. European, Cuban, and Indochinese refugee children were admitted into the United States through a series of ad hoc programs and temporary legislation until the 1980 Refugee Act created a permanent mechanism for the admission of refugee and unaccompanied children. Exclusionary immigration laws, the hardening of US international boundaries, and the United States’ preference for refugees who fled Communist regimes made unlawful entry the only option for thousands of accompanied and unaccompanied Mexican, Central American, and Haitian children in the second half of the 20th century. Black and brown migrant and asylum-seeking children were forced to endure educational deprivation, labor trafficking, mandatory detention, deportation, and deadly abuse by US authorities and employers at US borders and inside the country.

Article

Clodagh Harrington

The Clinton scandals have settled in the annals of American political history in the context of the era’s recurrent presidential misbehavior. Viewed through a historical lens, the activities, investigation, and impeachment trial of the forty-second president are almost inevitably measured against the weight of Watergate and Iran-Contra. As a result, the actions and consequences of this high-profile moment in the late-20th-century political history of the United States arguably took on a weightier meaning than they might otherwise have. If Watergate tested the U.S. constitutional system to its limits and Iran-Contra was arguably as grave, the Clinton affair was crisis-light by comparison. Originating with an investigation into a failed 1970s Arkansas land deal by Bill Clinton and his wife, the saga developed to include such meandering subplots as Filegate, Travelgate, Troopergate, the death of White House counsel Vince Foster, and, most infamously, the president’s affair with a White House intern. Unlike in the cases of Richard Nixon and Ronald Reagan, even Bill Clinton’s most ardent critics could not find a national security threat among the myriad scandals linked to his name. By the time that Justice Department appointee Robert Fiske was replaced as prosecutor by the infinitely more zealous Kenneth Starr, the case had become synonymous with the culture wars that permeated 1990s American society. As the Whitewater and related tentacles of the investigation failed to have any meaningfully negative impact on the president, it was his marital infidelities that came closest to unseating him. Though the Independent Counsel pursued him with vigor, his supporters remained loyal, while his detractors spotted political opportunity in his lapses in judgment. Certain key factors made the Clinton scandal particular to its era. First, in an unprecedented development, the personal indiscretion aspect of the story broke via the Internet. Second, had the Independent Counsel legislation not been renewed, prosecutor Fiske would likely have wrapped up his investigation in a timely fashion with no intention of pursuing an impeachment path. Finally, the relentless cable news cycle and increasingly febrile partisan atmosphere of the decade ensured that the nation remained as focused as it was divided on the topic.

Article

The Equal Rights Amendment (ERA), designed to enshrine in the Constitution of the United States a guarantee of equal rights to women and men, has had a long and volatile history. When first introduced in Congress in 1923, three years after ratification of the woman suffrage amendment to the US Constitution, the ERA faced fierce opposition from the majority of former suffragists. These progressive women activists opposed the ERA because it threatened hard-won protective labor legislation for wage-earning women. A half century later, however, the amendment enjoyed such broad support that it was passed by the requisite two-thirds of Congress and, in 1972, sent to the states for ratification. Unexpectedly, virulent opposition emerged during the ratification process, not among progressive women this time but among conservatives, whose savvy organizing prevented ratification by a 1982 deadline. Many scholars contend that despite the failure of ratification, equal rights thinking so triumphed in the courts and legislatures by the 1990s that a “de facto ERA” was in place. Some feminists, distrustful of reversible court decisions and repealable legislation, continued to agitate for the ERA; others voiced doubt that the ERA would achieve substantive equality for women. Because support for an ERA noticeably revived in the 2010s, this history remains very much in progress.

Article

N. Bruce Duthu

United States law recognizes American Indian tribes as distinct political bodies with powers of self-government. Their status as sovereign entities predates the formation of the United States, and they are enumerated in the U.S. Constitution as among the subjects (along with foreign nations and the several states) with whom Congress may engage in formal relations. And yet, despite this long-standing recognition, federal Indian law remains curiously ambivalent, even conflicted, about the legal and political status of Indian tribes within the U.S. constitutional structure. On the one hand, tribes are recognized as sovereign bodies with powers of self-government within their lands. On the other, long-standing precedents of the Supreme Court maintain that Congress possesses plenary power over Indian tribes, with authority to modify or even eliminate their powers of self-government. These two propositions are in tension with one another and are at the root of the challenges faced by political leaders and academics alike in trying to understand and accommodate tribal rights to self-government. The body of law that makes up the field of federal Indian law includes select provisions of the U.S. Constitution (notably the so-called Indian Commerce Clause), treaties between the United States and various Indian tribes, congressional statutes, executive orders, regulations, and a complex and rich body of court decisions dating back to the nation’s formative years. The noted legal scholar Felix Cohen brought much-needed coherence and order to this legal landscape in the 1940s when he led a team of scholars within the Office of the Solicitor in the Department of the Interior to produce a handbook on federal Indian law. The revised edition of Cohen’s Handbook of Federal Indian Law is still regarded as the seminal treatise in the field. Critically, however, this rich body of law only hints at the real story in federal Indian law. The laws themselves serve as historical and moral markers in the ongoing clash between indigenous and nonindigenous societies and cultures still seeking to establish systems of peaceful coexistence in shared territories. It is a story about the limits of legal pluralism and the willingness of a dominant society and nation to acknowledge and honor its promises to the first inhabitants and first sovereigns.

Article

Alison L. LaCroix

Federalism refers to the constitutional and political structure of the United States of America, according to which political power is divided among multiple levels of government: the national level of government (also referred to as the “federal” or “general” government) and that of the states. It is a multilayered system of government that reserves some powers to component entities while also establishing an overarching level of government with a specified domain of authority. The structures of federalism are set forth in the Constitution of the United States, although some related ideas and practices predated the founding period and others have developed since. The balance between federal and state power has shifted throughout U.S. history, with assertions of broad national power meeting challenges from supporters of states’ rights and state sovereignty. Federalism is a fundamental value of the American political system, and it has been a controversial political and legal question since the founding period.

Article

Sam Lebovic

According to the First Amendment of the US Constitution, Congress is barred from abridging the freedom of the press (“Congress shall make no law . . . abridging the freedom of speech, or of the press”). In practice, the history of press freedom is far more complicated than this simple constitutional right suggests. Over time, the meaning of the First Amendment has changed greatly. The Supreme Court largely ignored the First Amendment until the 20th century, leaving the scope of press freedom to state courts and legislatures. Since World War I, jurisprudence has greatly expanded the types of publication protected from government interference. The press now has broad rights to publish criticism of public officials, salacious material, private information, national security secrets, and much else. To understand the shifting history of press freedom, however, it is important to understand not only the expansion of formal constitutional rights but also how those rights have been shaped by such factors as economic transformations in the newspaper industry, the evolution of professional standards in the press, and the broader political and cultural relations between politicians and the press.

Article

In its formulation of foreign policy, the United States takes account of many priorities and factors, including national security concerns, economic interests, and alliance relationships. An additional factor with significance that has risen and fallen over time is human rights, or more specifically violations of human rights. The extent to which the United States should consider such abuses or seek to moderate them has been and continues to be the subject of considerable debate.

Article

Laurie Arnold

Indian gaming, also called Native American casino gaming or tribal gaming, is tribal government gaming. It is government gaming built on sovereignty and consequently is a corollary to state gambling such as lotteries rather than a corollary to corporate gaming. While the types of games offered in casinos might differ in format from ancestral indigenous games, gaming itself is a cultural tradition in many tribes, including those that operate casino gambling. Native American casino gaming is a $33.7 billion industry operated by nearly 250 distinct tribes in twenty-nine US states. The Indian Gaming Regulatory Act (IGRA) of 1988 provides the framework for tribal gaming, and the most important cases in Indian gaming remain Seminole Tribe of Florida v. Butterworth, decided by the US Fifth Circuit Court of Appeals, and the US Supreme Court’s decision in California v. Cabazon Band of Mission Indians.

Article

International law is the set of rules, formally agreed by treaty or understood as customary, by which nation-states interact with each other in a form of international society. Across the history of U.S. foreign relations, international law has provided both an animating vision, or ideology, for various American projects of world order, and a practical tool for the advancement of U.S. power and interests. As the American role in the world changed since the late 18th century, so too did the role of international law in U.S. foreign policy. Initially, international law was a source of authority to which the weak American government could appeal on questions of independence, sovereignty, and neutrality. As U.S. power grew in the 19th and early 20th centuries, international law became variously a liberal project for the advancement of peace, a civilizational discourse for justifying violence and dispossession, and a bureaucratic and commercial tool for the expansion of empire. With the advent of formal inter-governmental organizations in the 20th century, the traditional American focus on neutrality faded, to be replaced by an emphasis on collective security. But as the process of decolonization diluted the strength of the United States and its allies in the parliamentary chambers of the world’s international organizations, Washington increasingly advanced its own interpretations of international law, and opted out of a number of international legal regimes. At the same time, Americans increasingly came to see international law as a vehicle to advance the human rights of individuals over the sovereign rights of states.

Article

Juvenile justice is a technical term that refers to the specific area of law and affiliated institutions, most notably the juvenile court, with jurisdiction over the cases of minors who are accused of being miscreants. Although the idea that the law should treat minors differently from adults predates the American Revolution, juvenile justice itself is a Progressive Era invention. Its institutional legitimacy rests on the power and responsibility of the state to act as a parent (parens patriae) on behalf of those who cannot care for themselves. Since the establishment of the world’s first juvenile court in Chicago in 1899, this American idea of creating separate justice systems for juveniles has spread across the nation and much of the world. For more than a century, American states have used their juvenile justice systems to respond to youth crime and delinquency. Since the 1960s, the US Supreme Court has periodically considered whether juvenile courts must provide the same constitutional due process safeguards as adult criminal courts and whether juveniles prosecuted in the criminal justice system can receive the same sentences as adults, such as the death penalty or life without the possibility of parole.

Article

Benjamin C. Waterhouse

Political lobbying has always played a key role in American governance, but the concept of paid influence peddling has been marked by a persistent tension throughout the country’s history. On the one hand, lobbying represents a democratic process by which citizens maintain open access to government. On the other, the outsized clout of certain groups engenders corruption and perpetuates inequality. The practice of lobbying itself has reflected broader social, political, and economic changes, particularly in the scope of state power and the scale of business organization. During the Gilded Age, associational activity flourished and lobbying became increasingly the province of organized trade associations. By the early 20th century, a wide range of political reforms worked to counter the political influence of corporations. Even after the Great Depression and New Deal recast the administrative and regulatory role of the federal government, business associations remained the primary vehicle through which corporations and their designated lobbyists influenced government policy. By the 1970s, corporate lobbyists had become more effective and better organized, and trade associations spurred a broad-based political mobilization of business. Business lobbying expanded in the latter decades of the 20th century; while the number of companies with a lobbying presence leveled off in the 1980s and 1990s, the number of lobbyists per company increased steadily and corporate lobbyists grew increasingly professionalized. A series of high-profile political scandals involving lobbyists in 2005 and 2006 sparked another effort at regulation. Yet despite popular disapproval of lobbying and distaste for politicians, efforts to substantially curtail the activities of lobbyists and trade associations did not achieve significant success.

Article

Historians of colonial British North America have largely relegated piracy to the marginalia of the broad historical narrative from settlement to revolution. However, piracy and unregulated privateering played a pivotal role in the development of every English community along the eastern seaboard from the Carolinas to New England. Although many pirates originated in the British North American colonies and represented a diverse social spectrum, they were not supported and protected in these port communities by some underclass or proto-proletariat but by the highest echelons of colonial society, especially by colonial governors, merchants, and even ministers. Sea marauding in its multiple forms helped shape the economic, legal, political, religious, and cultural worlds of colonial America. The illicit market that brought longed-for bullion, slaves, and luxury goods integrated British North American communities with the Caribbean, West Africa, and the Pacific and Indian Oceans throughout the 17th century. Attempts to curb the support of sea marauding at the turn of the 18th century exposed sometimes violent divisions between local merchant interests and royal officials currying favor back in England, leading to debates over the protection of English liberties across the Atlantic. When the North American colonies finally closed their ports to English pirates during the years following the Treaty of Utrecht (1713), it sparked a brief yet dramatic turn of events in which English marauders preyed upon the shipping belonging to their former “nests.” During the 18th century, colonial communities began to actively support a more regulated form of privateering against agreed-upon enemies that would become a hallmark of patriot maritime warfare during the American Revolution.

Article

The reproductive experiences of women and girls in the 20th-century United States followed historical patterns shaped by the politics of race and class. Laws and policies governing reproduction generally regarded white women as legitimate reproducers and potentially fit mothers and defined women of color as unfit for reproduction and motherhood; regulations provided for rewards and punishments accordingly. In addition, public policy and public rhetoric defined “population control” as the solution to a variety of social and political problems in the United States, including poverty, immigration, the “quality” of the population, environmental degradation, and “overpopulation.” Throughout the century, nonetheless, women, communities of color, and impoverished persons challenged official efforts, at times reducing or even eliminating barriers to reproductive freedom and community survival. Between 1900 and 1930, decades marked by increasing urbanization, industrialization, and immigration, eugenic fears of “race suicide” (concerns that white women were not having enough babies) fueled a reproductive control regime that pressured middle-class white women to reproduce robustly. At the same time, the state enacted anti-immigrant laws, undermined the integrity of Native families, and protected various forms of racial segregation and white supremacy, all of which attacked the reproductive dignity of millions of women. Also in these decades, many African American women escaped the brutal and sexually predatory Jim Crow culture of the South, and middle-class white women gained greater sexual freedom and access to reproductive health care, including contraceptive services. During the Great Depression, the government devised the Aid to Dependent Children program to provide destitute “worthy” white mothers with government aid while often denying such supports to women of color forced to subordinate their motherhood to agricultural and domestic labor. Following World War II, as the Civil Rights movement gathered form, focus, and adherents, and as African American and other women of color claimed their rights to motherhood and social provision, white policymakers railed against “welfare queens” and defined motherhood as a class privilege, suitable only for those who could afford to give their children “advantages.” The state, invoking the “population bomb,” fought to reduce the birth rates of poor women and women of color through sterilization and mandatory contraception, among other strategies. Between 1960 and 1980, white feminists employed the consumerist language of “choice” as part of the campaign for legalized abortion, even as Native, black, Latina, immigrant, and poor women struggled to secure the right to give birth to and raise their children with dignity and safety. The last decades of the 20th century saw severe cuts in social programs designed to aid low-income mothers and their children, cuts to funding for public education and housing, court decisions that dramatically reduced poor women’s access to reproductive health care including abortion, and the emergence of a powerful, often violent, anti-abortion movement. 
In response, in 1994 a group of women of color activists articulated the theory of reproductive justice, splicing together “social justice” and “reproductive rights.” The resulting Reproductive Justice movement, which would become increasingly influential in the 21st century, defined reproductive health, rights, and justice as human rights due to all persons and articulated what each individual requires to achieve these rights: the right not to have children, the right to have children, and the right to the social, economic, and environmental conditions necessary to raise children in healthy, peaceful, and sustainable households and communities.

Article

Adrian Chastain Weimer

Founded in the late 1640s, Quakerism reached America in the 1650s and quickly took root due to the determined work of itinerant missionaries over the next several decades. Quakers, or members of the Society of Friends, faced different legal and social challenges in each colony. Many English men and women viewed Friends with hostility because they refused to bear arms in a colony’s defense or take loyalty oaths. Others were drawn to Quakers’ egalitarian message of universal access to the light of Christ in each human being. After George Fox’s visit to the West Indies and the mainland colonies in 1671–1672, Quaker missionaries followed his lead in trying to include enslaved Africans and Native Americans in their meetings. Itinerant Friends were drawn to colonies with the most severe laws, seeking a public platform from which to display, through suffering, a joyful witness to the truth of the Quaker message. English Quakers then quickly ushered accounts of their sufferings into print. Organized and supported by English Quakers such as Margaret Fell, the Quaker “invasion” of itinerant missionaries put pressure on colonial judicial systems to define the acceptable boundaries for dissent. Nascent communities of Friends from Barbados to New England struggled with the tension between Quaker ideals and the economic and social hierarchies of colonial societies.

Article

Separation of church and state has long been viewed as a cornerstone of American democracy. At the same time, the concept has remained highly controversial in the popular culture and law. Much of the debate over the application and meaning of the phrase focuses on its historical antecedents. This article briefly examines the historical origins of the concept and its subsequent evolution in the nineteenth century.

Article

Christopher W. Schmidt

One of the most significant protest campaigns of the civil rights era, the lunch counter sit-in movement began on February 1, 1960, when four young African American men sat down at the whites-only lunch counter of the Woolworth store in Greensboro, North Carolina. Refused service, the four college students sat quietly until the store closed. They continued their protest on the following days, each day joined by more fellow students. Students in other southern cities learned what was happening and started their own demonstrations, and in just weeks, lunch counter sit-ins were taking place across the South. By the end of the spring, tens of thousands of black college and high school students, joined in some cases by sympathetic white students, had joined the sit-in movement. Several thousand went to jail for their efforts after being arrested on charges of trespass, disorderly conduct, or whatever other laws southern police officers believed they could use against the protesters. The sit-ins arrived at a critical juncture in the modern black freedom struggle. The preceding years had brought major breakthroughs, such as the Supreme Court’s Brown v. Board of Education school desegregation ruling in 1954 and the successful Montgomery bus boycott of 1955–1956, but by 1960, activists were struggling to develop next steps. The sit-in movement energized and transformed the struggle for racial equality, moving the leading edge of the movement from the courtrooms and legislative halls to the streets and putting a new, younger generation of activists on the front lines. It gave birth to the Student Nonviolent Coordinating Committee, one of the most important activist groups of the 1960s. It directed the nation’s attention to the problem of racial discrimination in private businesses that served the public, pressured business owners in scores of southern cities to open their lunch counters to African American customers, and set in motion a chain of events that would culminate in the Civil Rights Act of 1964, which banned racial discrimination in public accommodations across the nation.

Article

From the 1890s to World War I, progressive reformers in the United States called upon their local, state, and federal governments to revitalize American democracy and address the most harmful social consequences of industrialization. The emergence of an increasingly powerful administrative state, which intervened on behalf of the public welfare in the economy and society, generated significant levels of conflict. Some of the opposition came from conservative business interests, who denounced state labor laws and other market regulations as meddlesome interferences with liberty of contract. But the historical record of the Progressive Era also reveals a broad undercurrent of resistance from ordinary Americans, who fought for personal liberty against the growth of police power in such areas as public health administration and the regulation of radical speech. Their struggles in the streets, statehouses, and courtrooms of the United States in the early 20th century shaped the legal culture of the period and revealed the contested meaning of individual liberty in a new social age.

Article

The key pieces of antitrust legislation in the United States—the Sherman Antitrust Act of 1890 and the Clayton Act of 1914—contain broad language that has afforded the courts wide latitude in interpreting and enforcing the law. This article chronicles the judiciary’s shifting interpretations of antitrust law and policy over the past 125 years. It argues that jurists, law enforcement agencies, and private litigants have revised their approaches to antitrust to accommodate economic shocks, technological developments, and predominant economic wisdom. Over time an economic logic that prioritizes lowest consumer prices as a signal of allocative efficiency—known as the consumer welfare standard—has replaced the older political objectives of antitrust, such as protecting independent proprietors or small businesses, or reducing wealth transfers from consumers to producers. However, a new group of progressive activists has again called for revamping antitrust so as to revive enforcement against dominant firms, especially in digital markets, and to refocus attention on the political effects of antitrust law and policy. This shift suggests that antitrust may remain a contested field for scholarly and popular debate.