
Article

Risa L. Goluboff and Adam Sorensen

The crime of vagrancy has deep historical roots in American law and legal culture. Originating in 16th-century England, vagrancy laws came to the New World with the colonists and soon proliferated throughout the British colonies and, later, the United States. Vagrancy laws took myriad forms, generally making it a crime to be poor, idle, dissolute, immoral, drunk, lewd, or suspicious. Vagrancy laws often included prohibitions on loitering—wandering around without any apparent lawful purpose—though some jurisdictions criminalized loitering separately. Taken together, vaguely worded vagrancy, loitering, and suspicious persons laws targeted objectionable “out of place” people rather than any particular conduct. They served as a ubiquitous tool for maintaining hierarchy and order in American society. Their application changed alongside perceived threats to the social fabric, at different times and places targeting the unemployed, labor activists, radical orators, cultural and sexual nonconformists, racial and religious minorities, civil rights protesters, and the poor. By the mid-20th century, vagrancy laws served as the basis for hundreds of thousands of arrests every year. But over the course of just two decades, the crime of vagrancy, virtually unquestioned for four hundred years, unraveled. Profound social upheaval in the 1960s produced a concerted effort against the vagrancy regime, and in 1972, the US Supreme Court invalidated the laws. Local authorities have spent the years since looking for alternatives to the many functions vagrancy laws once served.

Article

Courts and legislatures in colonial America and the early American republic developed and refined a power to compel civilians to assist peace and law enforcement officers in arresting wrongdoers, keeping the peace, and carrying out other law enforcement duties. This power to command civilian cooperation was known as the posse comitatus or “power of the county.” Rooted in early modern English countryside law enforcement, the posse comitatus became an important police institution in 18th- and 19th-century America. The posse comitatus was typically composed of able-bodied white male civilians who were temporarily deputized to aid a sheriff or constable. But if this “power of the county” proved insufficient, law enforcement officers were often authorized to call on the military to serve as the posse comitatus. The posse comitatus proved particularly important in buttressing slavery in the American South. Slaveholders pushed for and especially benefited from laws that required citizens to assist in the recapture of local runaway slaves and of fugitive slaves who crossed into states without slavery. Slave patrols were rooted in the posse comitatus, but the posse comitatus originated as a compulsory and uncompensated institution; slaveholders in the American South later added financial incentives for those who acted in the place of a posse to recapture slaves on the run from their owners. The widespread use of the posse comitatus in southern slave law became part of the national discussion about slavery during the early American republic as national lawmakers contemplated how to deal with the problem of fugitive slaves who fled to free states. This dialogue culminated in the Fugitive Slave Law of 1850, in which the US Congress authorized officials to “summon and call to their aid the bystanders, or posse comitatus” and declared that “all good citizens are hereby commanded to aid and assist in the prompt and efficient execution of this law, whenever their services may be required.” During Reconstruction, the Radical Republican Congress used the posse comitatus to enforce laws that targeted conquered Confederates. After the end of Reconstruction in 1877, Southern states pushed Congress to create what would come to be known as the “Posse Comitatus Act,” which prohibited the use of federal military forces for law enforcement. The history of the posse comitatus in early America is thus best understood as a story about, and an example of, the centralization of government authority and its ramifications.

Article

Sally Hadden

Law in early America came from many sources. To focus exclusively on the English common law excludes other vital sources including (but not limited to) civil law, canon law, lex mercatoria (the law merchant), and custom. Moreover, the number of sources increases the farther back in time one goes and the greater the geographic area under consideration. By the 18th century, common law had come to dominate, but not snuff out, other competing legal traditions, in part due to the numerical, political, military, and linguistic advantages of its users. English colonists were well acquainted with the common law, but after they arrived in the New World, adaptation to new experiences and new surroundings meant that English common law would undergo numerous alterations. Colonists in early America had to create legal explanations for the dispossession of Native American land and the appropriation of labor by enslaved Native Americans and Africans. Their colonial charters provided that all colonial law must conform to English law, but deviations began to appear in several areas almost from the first moment of colonization. When controversies arose within the colonies, not all disagreements were settled in courts: churches and merchants provided alternative settings to arbitrate disputes. In part, other groups provided mediation because there were so few trained lawyers and judges available in the 17th-century colonies. By the 18th century, however, the number of trained practitioners increased, and the sophistication of legal knowledge in the colonies grew. The majority of legal work handled by colonial lawyers concerned contracts and property. Law and the language of rights became more widely used by early Americans as the English attempted to tighten their control over the colonists in the mid-18th century. Rights and law became firmly linked with the Revolution in the minds of Americans, so much so that law, rights, and the American Revolution continue to form an integral part of American national identity.

Article

David Schuyler

The creation and evolution of urban parks is in some ways a familiar story, especially given the attention that Frederick Law Olmsted’s work has commanded since the early 1970s. Following the success of Central Park, cities across the United States began building parks to meet the recreational needs of residents, and during the second half of the 19th century, Olmsted and his partners designed major parks or park systems in thirty cities. Yet, even that story is incomplete. To be sure, Olmsted believed that every city should have a large rural park as an alternative to the density of building and crowding of the modern metropolis, a place to provide for an “unbending of the faculties,” a process of recuperation from the stresses and strains of urban life. But, even in the mid-1860s he sought to create alternative spaces for other types of recreation. Olmsted and his partner Calvert Vaux successfully persuaded the Prospect Park commission, in Brooklyn, New York, to acquire land for a parade ground south of the park as a place for military musters and athletics; moreover, in 1868 they prepared a plan for a park system in Buffalo, New York, that consisted of three parks, linked by parkways, that served different functions and provided for different forms of recreation. As the decades progressed, Olmsted became a champion of parks designed for active recreation; gymnasiums for women as well as men, especially in working-class areas of cities; and playgrounds for small children. He did so in part to relieve pressure on the large landscape parks to accommodate uses he believed would be inappropriate, but also because he recognized the legitimate demands for new forms of recreation. In later years, other park designers and administrators would similarly add facilities for active recreation, though sometimes in ways that compromised what Olmsted considered the primary purpose of a public park. Urban parks are, in important ways, a microcosm of the nation’s cities. Battles over location, financing, political patronage, and use have been a constant. Through it all, parks have evolved to meet the changing recreational needs of residents. And, as dominant a figure as Olmsted has been, this is a story that antedates his professional career and that includes the many voices that have shaped public parks in U.S. cities in the 20th century.

Article

Antimonopoly, meaning opposition to the exclusive or near-exclusive control of an industry or business by one or a very few businesses, played a relatively muted role in the history of the post-1945 era, certainly compared to some earlier periods in American history. However, the subject of antimonopoly is important because it sheds light on changing attitudes toward concentrated power, corporations, and the federal government in the United States after World War II. Paradoxically, as antimonopoly declined as a grass-roots force in American politics, the technical, expert-driven field of antitrust enjoyed a golden age. From the 1940s to the 1960s, antitrust operated on principles that were broadly in line with those that inspired its creation in the late 19th and early 20th centuries, acknowledging the special contribution small-business owners made to US democratic culture. In these years, antimonopoly remained sufficiently potent as a political force to sustain the careers of national-level politicians such as congressmen Wright Patman and Estes Kefauver and to inform the opinions of Supreme Court justices such as Hugo Black and William O. Douglas. Antimonopoly and consumer politics overlapped in this period. From the mid-1960s onward, Ralph Nader repeatedly tapped antimonopoly ideas in his writings and consumer activism, skillfully exploiting popular anxieties about concentrated economic power. At the same time, as part of the United States’ rise to global hegemony, officials in the federal government’s Antitrust Division exported antitrust overseas, building it into the political, economic, and legal architecture of the postwar world. Beginning in the 1940s, conservative lawyers and economists launched a counterattack against the conception of antitrust elaborated in the Progressive Era. By making consumer welfare—understood in terms of low prices and market efficiency—the determining factor in antitrust cases, they made a major intellectual and political contribution to the rightward thrust of US politics in the 1970s and 1980s. Robert Bork’s The Antitrust Paradox, published in 1978, popularized and signaled the ascendancy of this new approach. In the 1980s and 1990s, antimonopoly drifted to the margins of political debate. Fear of big government now loomed larger in US politics than the specter of monopoly or of corporate domination. In the late 20th century, Americans, more often than not, directed their antipathy toward concentrated power in its public, rather than its private, forms. This fundamental shift in the political landscape accounts in large part for the overall decline of antimonopoly—a venerable American political tradition—in the period 1945 to 2000.

Article

Beginning in the 1630s, colonial assemblies in English America and later the new United States used legislation and constitutions to enslave Africans and deny free blacks civil rights, including free movement, freedom of marriage, freedom of occupation and, of course, citizenship and the vote. Across the next two centuries, blacks and a minority of whites critiqued the oppressive racialist system. Blacks employed varied tactics to challenge their enslavement, from running away to inciting revolts. Others used fiery rhetoric and printed tracts. In the 1760s, when whites began to search for political and philosophical arguments to challenge what they perceived as political oppression from London, they labeled their experience as “slavery.” The colonists also developed compelling arguments that gave some of them the insight that enslaving Africans was as wrong as what they called British oppression. The Massachusetts lawyer James Otis wiped the mirror clean in The Rights of the British Colonies Asserted and Proved, stating “The colonists, black and white . . . are free-born British subjects . . . entitled to all the essential civil rights.” The Declaration of Independence polished the stained mirror by asserting, “We hold these truths to be self-evident, that all men are created equal, that they are endowed by their Creator with certain unalienable Rights.” However, the Constitution of the United States negated these gains by offering federal protection for slavery; it was a covenant with death, as abolitionist William Lloyd Garrison later asserted. After the Revolution, many states passed black laws to deprive blacks of the same rights as whites. Blacks commonly could not vote, testify in court against a white, or serve on juries. States barred black children from public schools. The Civil War offered the promise of equality with whites, but when the war ended, many southern states immediately passed black codes to deny blacks the gains won in emancipation.

Article

Timothy S. Huebner

The Supreme Court of the United States stands at the head of the nation’s judicial system. Created in Article III of the Constitution of 1787 but overshadowed by the other branches of government during the first few decades of its history, the Court came into its own as a co-equal branch in the early 19th century. Its exercise of judicial review—the power that it claimed to determine the constitutionality of legislative acts—gave the Court a unique status as the final arbiter of the nation’s constitutional conflicts. From the slavery question during the antebellum era to abortion and gay rights in more recent times, the Court has decided cases brought to it by individual litigants, and in doing so has shaped American constitutional and legal development. Although the Court is composed of unelected justices who serve “during good behavior,” its rise in stature has not gone uncontested. Throughout the nation’s history, Congress, the president, and organized interest groups have all attempted to influence the Court’s jurisdiction, composition, and decision making. The Court’s prominence reflects Americans’ historically paradoxical attitudes toward the judiciary: they have often been suspicious of the power of unelected judges at the same time that they have relied on independent judicial institutions to resolve their deepest disputes.

Article

Foreign relations under the US Constitution starts with the paradox, also seen in domestic matters, of relatively scant text providing guidance for the exercise of vast power. Founding understandings, structural inference, and ongoing constitutional custom and precedent have filled in much, though hardly all, of the framework over the course of two hundred years. As a result, two basic questions frame the relationship between the Constitution and US foreign policy: (1) which parts of the US government, alone or in combination, properly exercise authority in the making of foreign policy; and (2) once made, what status the nation’s international legal obligations hold in the US domestic legal system. The making of American foreign policy is framed by the Constitution’s commitment to separation of powers. Congress, the president, and the courts are all allocated discrete yet significant foreign affairs authority. Determining the exact borders and overlaps in areas such as the use of military force, emergency measures, and treaty termination continues to generate controversy. The status of international law in the US legal system in the first instance turns on whether the resulting obligations derive from agreements or custom. The United States enters into international agreements in three ways: treaties, congressional-executive agreements, and sole executive agreements. Complex doctrine deals with the domestic applicability of treaties in particular. US courts primarily apply customary international law in two basic ways. They can exercise a version of their common lawmaking authority to fashion rules of decision based on international custom. They also apply customary international law when it is incorporated into domestic law by statute.

Article

The Equal Rights Amendment (ERA), designed to enshrine in the Constitution of the United States a guarantee of equal rights to women and men, has had a long and volatile history. When first introduced in Congress in 1923, three years after ratification of the woman suffrage amendment to the US Constitution, the ERA faced fierce opposition from the majority of former suffragists. These progressive women activists opposed the ERA because it threatened hard-won protective labor legislation for wage-earning women. A half century later, however, the amendment enjoyed such broad support that it was passed by the requisite two-thirds of Congress and, in 1972, sent to the states for ratification. Unexpectedly, virulent opposition emerged during the ratification process, not among progressive women this time but among conservatives, whose savvy organizing prevented ratification by the 1982 deadline. Many scholars contend that despite the failure of ratification, equal rights thinking so triumphed in the courts and legislatures by the 1990s that a “de facto ERA” was in place. Some feminists, distrustful of reversible court decisions and repealable legislation, continued to agitate for the ERA; others voiced doubt that the ERA would achieve substantive equality for women. Because support for an ERA revived noticeably in the 2010s, this history remains very much in progress.

Article

Richard N. L. Andrews

Between 1964 and 2017, the United States adopted the concept of environmental policy as a new focus for a broad range of previously disparate policy issues affecting human interactions with the natural environment. These policies ranged from environmental health, pollution, and toxic exposure to management of ecosystems, resources, and use of the public lands, environmental aspects of urbanization, agricultural practices, and energy use, and negotiation of international agreements to address global environmental problems. In doing so, it nationalized many responsibilities that had previously been considered primarily state or local matters. It changed the United States’ approach to federalism by authorizing new powers for the federal government to set national minimum environmental standards and regulatory frameworks with the states mandated to participate in their implementation and compliance. Finally, it explicitly formalized administrative procedures for federal environmental decision-making with stricter requirements for scientific and economic justification rather than merely administrative discretion. In addition, it greatly increased public access to information and opportunities for input, as well as for judicial review, thus allowing citizen advocates for environmental protection and appreciative uses equal legitimacy with commodity producers to voice their preferences for use of public environmental resources. These policies initially reflected widespread public demand and broad bipartisan support. Over several decades, however, they became flashpoints, first, between business interests and environmental advocacy groups and, subsequently, between increasingly ideological and partisan agendas concerning the role of the federal government. Beginning in the 1980s, the long-standing Progressive ideal of the “public interest” was increasingly supplanted by a narrative of “government overreach,” and the 1990s witnessed campaigns to delegitimize the underlying evidence justifying environmental policies by labeling it “junk science” or a “hoax.” From the 1980s forward, the stated priorities of environmental policy vacillated repeatedly between presidential administrations and Congresses supporting continuation and expansion of environmental protection and preservation policies versus those seeking to weaken or even reverse protections in favor of private-property rights and more damaging uses of resources. Yet despite these apparent shifts, the basic environmental laws and policies enacted during the 1970s remained largely in place: political gridlock, in effect, maintained the status quo, with the addition of a very few innovations such as “cap and trade” policies. One reason was that environmental policies retained considerable latent public support: in electoral campaigns, they were often overshadowed by economic and other issues, but they still aroused widespread support in their defense when threatened. Another reason was that decisions by the courts also continued to reaffirm many existing policies and to reject attempts to dismantle them. With the election of Donald Trump in 2016, along with conservative majorities in both houses of Congress, US environmental policy came under the most hostile and wide-ranging attack since its origins. More than almost any other issue, the incoming president targeted environmental policy for rhetorical attacks and budget cuts, and sought to eradicate the executive policies of his predecessor, weaken or rescind protective regulations, and undermine the regulatory and even the scientific capacity of the federal environmental agencies. In the early 21st century, it is as yet unclear how much of his agenda will actually be accomplished, or whether, as in past attempts, much of it will ultimately be blocked by Congress, the courts, public backlash, and business and state government interests seeking stable policy expectations rather than disruptive deregulation.

Article

In the United States, the history of sexual assault in the first half of the 20th century involves multiple contradictions between the ordinary, almost invisible accounts of women of all colors who were raped by fathers, husbands, neighbors, boarders, bosses, hired hands, and other known individuals and the sensational myths that involved rapacious black men, sly white slavers, libertine elites, and virginal white female victims. Much of the debate about sexual assault revolved around the “unwritten law” that justified “honorable” white men avenging the “defilement” of their women. In both North and South, white people defended lynching and the murder of presumed rapists as “honor killings.” In courtrooms, defense attorneys linked the unwritten law to insanity pleas, arguing that after hearing women tell about their assault, husbands and fathers experienced an irresistible compulsion to avenge the rape of their women. Over time, however, notorious court cases from New York and San Francisco to Indianapolis, Honolulu, and Scottsboro, Alabama, shifted the discourse away from the unwritten law and extralegal “justice” to a more complicated script that demonized unreliable women and absolved imperfect men. National coverage of these cases, made possible by wire services and the Hearst newspaper empire, spurred heated debates concerning the proper roles of men and women. Blockbuster movies like The Birth of a Nation and Gone with the Wind and Book of the Month Club selections such as John Steinbeck’s Of Mice and Men and Richard Wright’s Native Son joined the sensationalized media coverage of high-profile court cases to create new national stereotypes about sexual violence and its causes and culprits. During the 1930s, journalists, novelists, playwrights, and moviemakers increasingly emphasized the culpability of women who, according to this narrative, made themselves vulnerable to assault by stepping outside of their appropriate sphere and tempting men into harming them.

Article

Best known as Abraham Lincoln’s secretary of state during the Civil War, William Henry Seward conducted full careers as a statesman, politician, and visionary of America’s future, both before and after that traumatic conflict. His greatest legacy, however, lay in his wartime service as secretary of state, leading the diplomatic effort to prevent European intervention in the conflict. His success in that effort marked the margin between the salvation and the destruction of the Union. Beyond his role as diplomat, Seward’s signature qualities of energy, optimism, ambition, and opportunism enabled him to assume a role in the Lincoln administration extending well beyond his diplomatic duties as secretary of state. Those same qualities secured a close working relationship with the president as Seward overcame a rocky first few weeks in office to become Lincoln’s confidant and sounding board. Seward’s career in politics stretched from the 1830s until 1869. Throughout that time, he maintained a vision of a United States of America built on opportunity and free labor, powered by government’s active role in internal improvement and education. He foresaw a nation fated to expand across the continent and overseas, with expansion occurring peacefully as a result of American industrial and economic strength and its model of government. During his second term as secretary of state, under the Johnson administration, Seward attempted a series of territorial acquisitions in the Caribbean, the Pacific, and on the North American continent. The state of the postwar nation and its fractious politics precluded success in most of these attempts, but Seward was successful in negotiating and securing congressional ratification of the purchase of Alaska in 1867. In addition, Seward pursued a series of policies establishing paths followed later by US diplomats, including the open door in China and the acquisition of Hawaii and US naval bases in the Caribbean.

Article

A fear of foreignness shaped the immigration and foreign policies of the United States up to the end of World War II. US leaders perceived nonwhite peoples of Latin America, Asia, and Europe as racially inferior, and feared that contact with them, even annexation of their territories, would invite their foreign mores, customs, and ideologies into US society. This belief in nonwhite peoples’ foreignness also influenced US immigration policy, as Washington codified laws that prohibited the immigration of nonwhite peoples to the United States, even as immigration was deemed a net gain for a US economy that was rapidly industrializing from the late 19th century to the first half of the 20th century. Ironically, this fear of foreignness fostered an aggressive US foreign policy for many of the years under study, as US leaders feared that European intervention into Latin America, for example, would undermine the United States’ regional hegemony. The fear of foreignness that seemed to oblige the United States to shore up its national security interests vis-à-vis European empires also demanded US intervention into the internal affairs of nonwhite nations. For US leaders, fear of foreignness was a two-sided coin: European aggression was encouraged by the internal instability of nonwhite nations, and nonwhite nations were unstable—and hence easy pickings for Europe’s empires—because their citizens were racially inferior. To forestall both of these simultaneous foreign threats, the United States increasingly embedded itself into the political and economic affairs of foreign nations. The irony of opportunity (territorial acquisitions, and immigrants who fed US labor markets) and fear (European encroachment, and the perceived racial inferiority of nonwhite peoples) lay at the root of the immigration and foreign policies of the United States up to 1945.

Article

Historians of colonial British North America have largely relegated piracy to the marginalia of the broad historical narrative from settlement to revolution. However, piracy and unregulated privateering played a pivotal role in the development of every English community along the eastern seaboard from the Carolinas to New England. Although many pirates originated in the British North American colonies and represented a diverse social spectrum, they were supported and protected in these port communities not by some underclass or proto-proletariat but by the highest echelons of colonial society, especially by colonial governors, merchants, and even ministers. Sea marauding in its multiple forms helped shape the economic, legal, political, religious, and cultural worlds of colonial America. The illicit market that brought longed-for bullion, slaves, and luxury goods integrated British North American communities with the Caribbean, West Africa, and the Pacific and Indian Oceans throughout the 17th century. Attempts to curb the support of sea marauding at the turn of the 18th century exposed sometimes violent divisions between local merchant interests and royal officials currying favor back in England, leading to debates over the protection of English liberties across the Atlantic. When the North American colonies finally closed their ports to English pirates during the years following the Treaty of Utrecht (1713), it sparked a brief yet dramatic turn of events in which English marauders preyed upon the shipping belonging to their former “nests.” During the 18th century, colonial communities began to actively support a more regulated form of privateering against agreed-upon enemies that would become a hallmark of patriot maritime warfare during the American Revolution.

Article

International law is the set of rules, formally agreed by treaty or understood as customary, by which nation-states interact with each other in a form of international society. Across the history of U.S. foreign relations, international law has provided both an animating vision, or ideology, for various American projects of world order, and a practical tool for the advancement of U.S. power and interests. As the American role in the world changed from the late 18th century onward, so too did the role of international law in U.S. foreign policy. Initially, international law was a source of authority to which the weak American government could appeal on questions of independence, sovereignty, and neutrality. As U.S. power grew in the 19th and early 20th centuries, international law became variously a liberal project for the advancement of peace, a civilizational discourse for justifying violence and dispossession, and a bureaucratic and commercial tool for the expansion of empire. With the advent of formal intergovernmental organizations in the 20th century, the traditional American focus on neutrality faded, to be replaced by an emphasis on collective security. But as the process of decolonization diluted the strength of the United States and its allies in the parliamentary chambers of the world’s international organizations, Washington increasingly advanced its own interpretations of international law and opted out of a number of international legal regimes. At the same time, Americans increasingly came to perceive international law as a vehicle to advance the human rights of individuals over the sovereign rights of states.

Article

Sherman’s March, more accurately known as the Georgia and Carolinas Campaigns, cut a swath across three states in 1864–1865. It was one of the most significant campaigns of the Civil War, making Confederate civilians “howl” as farms and plantations were stripped of everything edible and all their valuables. Outbuildings, and occasionally homes, were burned; railroads were destroyed; and enslaved workers were emancipated. Long after the war ended, Sherman’s March continued to shape Americans’ memories as one of the most symbolically powerful aspects of the Civil War. Sherman’s March began with the better-known March to the Sea, which started in Atlanta on November 15, 1864, and concluded in Savannah on December 22 of the same year. Sherman’s men then proceeded through South Carolina and North Carolina in February, March, and April of 1865. The study of this military campaign illuminates the relationships between Sherman’s soldiers and Southern white civilians, especially women, and African Americans. Sherman’s men were often uncomfortable with their role as an army of liberation, and African Americans, in particular, found the March to be a double-edged sword.

Article

Allison Brownell Tirres

Latino Americans have intersected with the law in complicated ways throughout American history. Latinos themselves are a diverse and heterogeneous racial, ethnic, and cultural group, with members hailing from all parts of the Spanish-speaking world and representing all variations on the spectrum of race. Each group has a unique origin story, but all have been shaped by law and legal process. Legal historians and legal scholars explore the role of law in incorporating Latino groups into American society, the effects of law on Latino communities, and the struggles of Latino lawyers, activists, and ordinary people against legal discrimination and for equality. The civil rights story of Latinos bears a strong resemblance to that of African Americans: in each case, members of the group have been subjected to de jure and de facto discrimination and social subordination. But the Latino civil rights story has unique valences, particularly in the areas of language discrimination and immigration law and policy. Latino legal history demonstrates the complex ways that Latinos interact with the color line in American law and politics.

Article

The foreign relations of the Jacksonian age reflected Andrew Jackson’s own sense of the American “nation” as long victimized by non-white enemies and weak politicians. His goal as president from 1829 to 1837 was to restore white Americans’ “sovereignty,” to empower them against other nations both within and beyond US territory. Three priorities emerged from this conviction. First, Jackson was determined to deport the roughly 50,000 Creeks, Cherokees, Choctaws, Chickasaws, and Seminoles living in southern states and territories. He saw them as hostile nations that threatened American safety and checked American prosperity. Far from a domestic issue, Indian Removal was an imperial project that set the stage for later expansion over continental and oceanic frontiers. Second, and somewhat paradoxically, Jackson sought better relations with Great Britain. These were necessary because the British Empire was both the main threat to US expansion and the biggest market for slave-grown exports from former Indian lands. Anglo-American détente changed investment patterns and economic development throughout the Western Hemisphere, encouraging American leaders to appease London even when patriotic passions argued otherwise. Third, Jackson wanted to open markets and secure property rights around the globe, by treaty if possible but by force when necessary. He called for a larger navy, pressed countries from France to Mexico for outstanding debts, and embraced retaliatory strikes on “savages” and “pirates” as far away as Sumatra. Indeed, the Jacksonian age brought a new American presence in the Pacific. By the mid-1840s the United States was the dominant power in the Hawaiian Islands and a growing force in China. The Mexican War that followed made the Union a two-ocean colossus—and pushed its regional tensions to the breaking point.

Article

Working women and their issues played a central role in the women’s movement in the decades following World War II. Feminists lobbied, litigated, and engaged in direct action for workplace fairness. Working women, especially those in unions, joined feminist organizations and established their own organizations as well. There were fault lines within the women’s movement over the issues, strategies, and level of commitment to the causes of working women. In the first two decades after 1945, the unionists and liberal reformers who constituted the so-called Women’s Bureau Coalition (named after the U.S. Women’s Bureau) opposed the mostly affluent and conservative members of the National Woman’s Party for their support of the Equal Rights Amendment, supporting instead protective laws and policies that treated women differently from men in the workplace. With the arrival of second-wave feminism in the 1960s and 1970s, “labor feminists” clashed with the middle-class professional women at the helm of newly formed feminist organizations. As support for gender equality transformed employment practices, some labor feminists sought to retain (or extend to men) selected protective measures introduced in the early 20th century to shield women workers from the worst aspects of wage labor. In the face of harsh economic conditions in the 1970s, labor feminists again opposed other feminists for their efforts to modify the union practice of “last hired, first fired” as a way of retaining affirmative-action hiring gains. In recent decades, feminists have focused on equity measures such as comparable worth and pregnancy leave as means of addressing the unique challenges women face. In addition, they have expanded their concern to lesbian and transgender workers and, increasingly, to the needs of immigrant workers, who make up an increasing percentage of the working population.

Article

The City Beautiful movement arose in the 1890s in response to the accumulating dirt and disorder in industrial cities, which threatened economic efficiency and social peace. City Beautiful advocates believed that better sanitation, improved circulation of traffic, monumental civic centers, parks, parkways, public spaces, civic art, and the reduction of outdoor advertising would make cities throughout the United States more profitable and harmonious. Engaging architects and planners, businessmen and professionals, and social reformers and journalists, the City Beautiful movement expressed a boosterish desire for landscape beauty and civic grandeur, but it also raised aspirations for a more humane and functional city. “Mean streets make mean people,” wrote the movement’s publicist and leading theorist, Charles Mulford Robinson, encapsulating the belief in positive environmentalism that drove the movement. Combining the parks and boulevards of landscape architect Frederick Law Olmsted with the neoclassical architecture of Daniel H. Burnham’s White City at Chicago’s World’s Columbian Exposition of 1893, the City Beautiful movement also encouraged a view of the metropolis as a delicate organism that could be improved by bold, comprehensive planning. Two organizations, the American Park and Outdoor Art Association (founded in 1897) and the American League for Civic Improvements (founded in 1900), provided the movement with a national presence. But the movement also depended on the work of civic-minded women and men in nearly 2,500 municipal improvement associations scattered across the nation. Reaching its zenith in Burnham’s remaking of Washington, D.C., and his coauthored Plan of Chicago (1909), the movement slowly declined in favor of the “City Efficient” and a more technocratic city-planning profession. Aside from a legacy of still-treasured urban spaces and structures, the City Beautiful movement contributed to a range of urban reforms, from civic education and municipal housekeeping to city planning and regionalism.