41-60 of 205 Results for: 20th Century: Post-1945

Article

Death and Dying in the Working Class  

Michael K. Rosenow

In the broader field of thanatology, scholars investigate rituals of dying, attitudes toward death, evolving trajectories of life expectancy, and more. Applying a lens of social class means studying similar themes but focusing on the men, women, and children who worked for wages in the United States. Working people were more likely to die from workplace accidents, occupational diseases, or episodes of work-related violence. In most periods of American history, it was more dangerous to be a wage worker than it was to be a soldier. The battleground was not just the shop floor but also the terrain of labor relations. American labor history has been filled with violent encounters between workers asserting their views of economic justice and employers defending their private property rights. These clashes frequently turned deadly. Labor unions and working-class communities extended an ethos of mutualism and solidarity from the union halls and picket lines to memorial services and gravesites. They lauded martyrs to movements for human dignity and erected monuments to honor the fallen. Aspects of ethnicity, race, and gender added layers of meaning that intersected with and refracted through individuals’ economic positions. Workers’ encounters with death, and the ways they made sense of loss and sacrifice, in some respects overlapped with those of Americans from other social classes in terms of religious custom, ritual practice, and material consumption. Their experiences were not entirely unique, but they diverged in significant ways.

Article

Decolonization and US Foreign Relations  

Jason C. Parker

The decolonization of the European overseas empires had its intellectual roots early in the modern era, but its culmination occurred during the Cold War that loomed large in post-1945 international history. This culmination thus coincided with the American rise to superpower status and presented the United States with a dilemma. While the United States was philosophically sympathetic to the aspirations of anticolonial nationalist movements abroad, its vastly greater postwar global security burdens made it averse to the instability that decolonization might bring and that communists might exploit. This fear, and the need to share those burdens with European allies who were themselves still colonial landlords, led Washington to proceed cautiously. The three “waves” of the decolonization process—medium-sized in the late 1940s, large in the half-decade around 1960, and small in the mid-1970s—prompted the American use of a variety of tools and techniques to influence how the process unfolded. Prior to independence, this influence was usually channeled through the metropolitan authority then winding down. After independence, Washington continued and often expanded the use of these tools, in most cases on a bilateral basis. In some theaters, such as Korea, Vietnam, and the Congo, the use of certain of these tools, notably covert espionage or overt military operations, meant that Cold War dynamics enveloped, intensified, and repossessed local decolonization struggles. In most theaters, other tools, such as traditional or public diplomacy or economic or technical development aid, kept the Cold War in the background as a local transition unfolded. In all cases, the overriding American imperative was to minimize instability and neutralize actors on the ground who could invite communist gains.

Article

The Department Store  

Traci Parker

Department stores were the epicenter of American consumption and modernity from the late 19th century through the 20th century. Between 1846 and 1860, store merchants and commercial impresarios remade dry goods stores and small apparel shops into department stores—downtown emporiums that departmentalized their vast inventories and offered copious services and amenities. Their ascendance corresponded with increased urbanization, immigration, industrialization, and the mass production of machine-made wares. Urbanization and industrialization also helped to birth a new White middle class whose members were eager to spend their money on material comforts and leisure activities. And department stores provided them with a place where they could do so. Stores sold shoppers an astounding array of high-quality, stylish merchandise including clothing, furniture, radios, sporting equipment, musical instruments, luggage, silverware, china, and books. They also provided an array of services and amenities, including public telephones, postal services, shopping assistance, free delivery, telephone-order and mail-order departments, barber shops, hair salons, hospitals and dental offices, radio departments, shoe-shining stands, wedding gift registries and wedding secretary services, tearooms, and restaurants. Stores enthroned consumption as the route to democracy and citizenship, inviting everybody—regardless of race, gender, age, and class—to enter, browse, and purchase material goods. They were major employers of white-collar workers and functioned as a new public space for women as workers and consumers. The 20th century brought rapid and significant changes and challenges. Department stores weathered economic crises; two world wars; new and intense competition from neighborhood, chain, and discount stores; and labor and civil rights protests that threatened to damage their image and displace them as the nation’s top retailers. They experienced cutbacks, consolidated services, and declining sales during the Great Depression, played an essential role in the war effort, and contended with the Office of Price Administration’s enforcement of the Emergency Price Control Act during the Second World War. In the postwar era, they opened branch locations in the suburban neighborhoods where their preferred clientele—the White middle class—now resided, and they shaped the development and proliferation of shopping centers, hastening the decline of downtown shopping. The last three decades of the 20th century witnessed a wave of department store closures, mergers, and acquisitions because of changing consumer behaviors, shifts in the retail landscape, and evolving market dynamics. Department stores would continue to suffer into the 21st century as online retailing exploded.

Article

The Draft in U.S. History  

Megan Threlkeld

The issue of compulsory military service has been contested in the United States since before its founding. In a nation characterized by both liberalism and republicanism, there is an inherent tension between the idea that individuals should be able to determine their own destiny and the idea that all citizens have a duty to serve their country. Prior to the 20th century, conscription occurred mainly on the level of local militias, first in the British colonies and later in individual states. It was during the Civil War that the first federal drafts were instituted, both in the Union and the Confederacy. In the North, the draft was unpopular and largely ineffective. Congress revived national conscription when the United States entered World War I and established the Selective Service System to oversee the process. That draft ended when U.S. belligerency ended in 1918. The first peacetime draft was implemented in 1940; with the exception of one year, it remained in effect until 1973. Its most controversial days came during the Vietnam War, when thousands of people across the country demonstrated against it and, in some cases, outright refused to be inducted. The draft stopped with the end of the war, but in 1980, Congress reinstated compulsory Selective Service registration. More than two decades into the 21st century, male citizens and immigrant noncitizens are still required to register within thirty days of their eighteenth birthday. The very idea of “selective service” is ambiguous. It is selective because not everyone is conscripted, but it is compulsory because one can be prosecuted for failing to register or to comply with orders of draft boards. Especially during the Cold War, one of the system’s main functions was not to procure soldiers but to identify and exempt from service those men best suited for other endeavors framed as national service: higher education, careers in science and engineering, and even supporting families. That fact, combined with the decentralized nature of the Selective Service System itself, left the process vulnerable to the prejudices of local draft boards and meant that those most likely to be drafted were poor and nonwhite.

Article

Dwight D. Eisenhower and American Foreign Relations  

Richard V. Damms

Probably no American president was more thoroughly versed in matters of national security and foreign policy before entering office than Dwight David Eisenhower. As a young military officer, Eisenhower served stateside in World War I and then in Panama and the Philippines in the interwar years. On assignments in Washington and Manila, he worked on war plans, gaining an understanding that national security entailed economic and psychological factors in addition to manpower and materiel. In World War II, he commanded Allied forces in the European Theatre of Operations and honed his skills in coalition building and diplomacy. After the war, he oversaw the German occupation and then became Army Chief of Staff as the nation hastily demobilized. At the onset of the Cold War, Eisenhower embraced President Harry S. Truman’s containment doctrine and participated in the discussions leading to the 1947 National Security Act establishing the Central Intelligence Agency, the National Security Council, and the Department of Defense. After briefly retiring from the military, Eisenhower twice returned to public service at the behest of President Truman to assume the temporary chairmanship of the Joint Chiefs of Staff and then, following the outbreak of the Korean War, to become the first Supreme Allied Commander, Europe, charged with transforming the North Atlantic Treaty Organization into a viable military force. These experiences colored Eisenhower’s foreign policy views, which in turn led him to seek the presidency. He viewed the Cold War as a long-term proposition and worried that Truman’s military buildup would overtax finite American resources. He sought a coherent strategic concept that would be sustainable over the long haul without adversely affecting the free enterprise system and American democratic institutions. He also worried that Republican Party leaders were dangerously insular. As president, his New Look policy pursued a cost-effective strategy of containment by means of increased reliance on nuclear forces over more expensive conventional ones, sustained existing regional alliances and developed new ones, sought an orderly process of decolonization under Western guidance, resorted to covert operations to safeguard vital interests, and employed psychological warfare in the battle with communism for world opinion, particularly in the so-called Third World. His foreign policy laid the basis for what would become the overall American strategy for the duration of the Cold War. The legacy of that policy, however, was decidedly mixed. Eisenhower avoided the disaster of global war, but technological innovations did not produce the fiscal savings that he had envisioned. The NATO alliance expanded and mostly stood firm, but other alliances were more problematic. Decolonization rarely proceeded as smoothly as envisioned and caused conflict with European allies. Covert operations had long-term negative consequences. In Southeast Asia and Cuba, the Eisenhower administration’s policies bequeathed a poisoned chalice for succeeding administrations.

Article

The Economy Since 1970  

Judge Glock

Despite almost three decades of strong and stable growth after World War II, the US economy, like the economies of many developed nations, faced new headwinds and challenges after 1970. Although the United States eventually overcame many of them, and its economy continues to be one of the most dynamic in the world, it could not recover its mid-century economic miracle of rapid and broad-based growth. There are three major ways the US economy changed in this period. First, the US economy endured and eventually conquered the problem of high inflation, even as it instituted new policies that prioritized price stability over the so-called “Keynesian” goal of full employment. Although these new policies led to over two decades of moderate inflation and stable growth, the 2008 financial crisis challenged the post-Keynesian consensus and led to new demands for government intervention in downturns. Second, the government’s overall influence on the economy increased dramatically. Although the government deregulated several sectors in the 1970s and 1980s, such as transportation and banking, it also created new types of social and environmental regulation that were more pervasive. And although it occasionally cut spending, on the whole government spending increased substantially in this period, until it reached about 35 percent of the economy. Third, the US economy became more open to the world, and it imported more manufactured goods, even as it became more based on “intangible” products and on services rather than on manufacturing. These shifts created new economic winners and losers. Some institutions that thrived in the older economy, such as unions, which once comprised over a third of the workforce, became shadows of their former selves. The new service economy also created more gains for highly educated workers and for investors in quickly growing businesses, while blue-collar workers’ wages stagnated, at least in relative terms. Most of the trends that affected the US economy in this period were long-standing and continued over decades. Major national and international crises in this period, from the end of the Cold War, to the first Gulf War in 1991, to the September 11 attacks of 2001, seemed to have only a mild or transient impact on the economy. Two events of lasting importance were, first, the United States leaving the gold standard in 1971, which led to high inflation in the short term and more stable monetary policy over the long term; and second, the 2008 financial crisis, which seemed to permanently decrease American economic output even as it increased political battles about the involvement of government in the economy. The US economy at the beginning of the third decade of the 21st century was richer than it had ever been, and it remained in many respects the envy of the world. But widening income gaps meant that many Americans felt left behind in this new economy, leading some to worry that the stability and predictability of the old economy had been lost.

Article

Elementary and Secondary Education Policy, Post-1945  

Lawrence J. McAndrews

Americans almost universally agree on the importance of education to the success of individuals and the strength of the nation. Yet they have long differed over the proper mission of government in overseeing their schools. Before 1945, these debates largely occurred at the local and state levels. Since 1945, as education has become an increasingly national and international concern, the federal government has played a larger role in the nation’s schools. As Americans gradually have come to accept a greater federal presence in elementary and secondary schools, however, members of Congress and presidents from both major parties have continued to argue over the scope and substance of the federal role. From 1945 to 1965, these arguments centered on the quest for equity between rich and poor public school pupils and between public and nonpublic school students. From 1965 to 1989, national lawmakers devoted much of their attention to the goal of excellence in public education. From 1989 to the present, they have quarreled over how best to attain equity and excellence at the same time.

Article

Employers’ Associations and Open Shops in the United States  

Chad Pearson

Employers began organizing with one another to reduce the power of organized labor in the late 19th and early 20th centuries. Irritated by strikes, boycotts, and unions’ desire to achieve exclusive bargaining rights, employers demanded the right to establish open shops, workplaces that promoted individualism over collectivism. Rather than recognize closed or union shops, employers demanded the right to hire and fire whomever they wanted, irrespective of union status. They established an open-shop movement, which was led by local, national, and trade-based employers. Some formed more inclusive “citizens’ associations,” which included clergymen, lawyers, judges, academics, and employers. Throughout the 20th century’s first three decades, this movement succeeded in busting unions, breaking strikes, and blacklisting labor activists. It united large numbers of employers and was mostly successful. The movement faced its biggest challenges in the 1930s, when a liberal political climate legitimized unions and collective bargaining. But employers never stopped organizing and fighting, and they continued to undermine the labor movement in the following decades by invoking the phrase “right-to-work,” insisting that individual laborers must enjoy freedom from so-called union bosses and compulsory unionism. Numerous states, responding to pressure from organized employers, began passing “right-to-work” laws, which made union organizing more difficult because workers were not obligated to join unions or pay their “fair share” of dues to them. The multi-decade employer-led anti-union movement succeeded in fighting organized labor at the point of production, in politics, and in public relations.

Article

Environmental and Conservation Movements in Metropolitan America  

Robert R. Gioielli

By the late 19th century, American cities like Chicago and New York were marvels of the industrializing world. The shock urbanization of the previous quarter century, however, brought on a host of environmental problems. Skies were acrid with coal smoke, and streams ran fetid with raw sewage. Disease outbreaks were common, while parks and green space were rare. From the 1890s until the end of the 20th century, particular groups of urban residents responded to these hazards with a series of activist movements to reform public and private policies and practices. Those environmental burdens were never felt equally, with the working class, the poor, immigrants, and minorities bearing an overwhelming share of the city’s toxic load. By the 1930s, many of the Progressive-era reform efforts were finally bearing fruit. Air pollution was regulated, access to clean water improved, and even America’s smallest cities built robust networks of urban parks. But despite this invigoration of the public sphere, after World War II the solution to the challenges of a dense modern city was, for many, a private choice: suburbanization. Rather than continue to work to reform and reimagine the city, they chose to leave it, retreating to the verdant (and pollution-free) greenfields at the city’s edge. These moves, encouraged and subsidized by local and federal policies, provided healthier environments for the mostly white, middle-class suburbanites, but created a new set of environmental problems for the poor, working-class, and minority residents they left behind. Drained of resources and capital, cities struggled to maintain aging infrastructure and regulate remaining industry, and then exacerbated these problems with destructive urban renewal and highway construction projects. These remaining urban residents responded with a dynamic series of activist movements that emerged out of the social and community activism of the 1960s and presaged the contemporary environmental justice movement.

Article

The Environment in the Atomic Age  

Rachel Rothschild

The development of nuclear technology had a profound influence on the global environment following the Second World War, with ramifications for scientific research, the modern environmental movement, and conceptualizations of pollution more broadly. Government sponsorship of studies on nuclear fallout and waste dramatically reconfigured the field of ecology, leading to the widespread adoption of the ecosystem concept and new understandings of food webs as well as biogeochemical cycles. These scientific endeavors of the atomic age came to play a key role in the formation of environmental research to address a variety of pollution problems in industrialized countries. Concern about invisible radiation served as a foundation for new ways of thinking about chemical risks for activists like Rachel Carson and Barry Commoner as well as many scientists, government officials, and the broader public. Their reservations were not unwarranted, as nuclear weapons and waste resulted in radioactive contamination of the environment around nuclear-testing sites and especially fuel-production facilities. Scholars date the start of the “Anthropocene” period, during which human activity began to have substantial effects on the environment, variously from the beginning of human farming roughly 8,000 years ago to the emergence of industrialism in the 19th century. But all agree that the advent of nuclear weapons and power has dramatically changed the potential for environmental alterations. Our ongoing attempts to harness the benefits of the atomic age while lessening its negative impacts will need to confront the substantial environmental and public-health issues that have plagued nuclear technology since its inception.

Article

Financial Crises in American History  

Christoph Nitschke and Mark Rose

U.S. history is full of frequent and often devastating financial crises. They have coincided with business cycle downturns, but they have been rooted in the political design of markets. Financial crises have also drawn from changes in the underpinning cultures, knowledge systems, and ideologies of marketplace transactions. The United States’ political and economic development spawned, guided, and modified general factors in crisis causation. Broadly viewed, the reasons for financial crises have been recurrent in their form but historically specific in their configuration: causation has always revolved around relatively sudden reversals of investor perceptions of commercial growth, stock market gains, monetary availability, currency stability, and political predictability. The United States’ 19th-century financial crises, which happened in rapid succession, are best described as disturbances tied to market making, nation building, and empire creation. Ongoing changes in America’s financial system aided rapid national growth through the efficient distribution of credit to a spatially and organizationally changing economy. But complex political processes—whether Western expansion, the development of incorporation laws, or the nation’s foreign relations—also underlay the easy availability of credit. The relationship between systemic instability and ideas and ideals of economic growth, politically enacted, was then mirrored in the 20th century. Following the “Golden Age” of crash-free capitalism in the two decades after the Second World War, the recurrence of financial crises in American history coincided with the dominance of the market in statecraft. Banking and other crises were a product of political economy. The Global Financial Crisis of 2007–2008 not only once again changed the regulatory environment in an attempt to correct past mistakes, but also considerably broadened the discursive situation of financial crises as academic topics.

Article

Food and Agriculture in the 20th and 21st Centuries  

Gabriella M. Petrick

American food in the twentieth and twenty-first centuries is characterized by abundance. Unlike the hardscrabble existence of many earlier Americans, the “Golden Age of Agriculture” brought the bounty produced in fields across the United States to both consumers and producers. While the “Golden Age” technically ended as World War I began, larger quantities of relatively inexpensive food became the norm for most Americans as more fresh foods, rather than staple crops, made their way to urban centers and rising real wages made it easier to purchase these comestibles. The application of science and technology to food production, from the field to the kitchen cabinet, or, even more crucially, the refrigerator by the mid-1930s, reflects the changing demographics and affluence of American society as much as it does the inventiveness of scientists and entrepreneurs. Perhaps the single most important symbol of overabundance in the United States is the postwar Green Revolution. The vast increase in agricultural production based on improved agronomics provoked both praise and criticism, as exemplified by Time magazine’s critique of Rachel Carson’s Silent Spring in September 1962 or, more recently, the politics of genetically modified foods. Echoing what occurred at the turn of the twentieth century, food production, politics, and policy at the turn of the twenty-first century have become a proxy for larger ideological agendas and the fractured nature of class in the United States. Battles over the following issues speak to which Americans have access to affordable, nutritious food: organic versus conventional farming, antibiotic use in meat production, dissemination of food stamps, contraction of farm subsidies, the rapid growth of “dollar stores,” alternative diets (organic, vegetarian, vegan, paleo, etc.), and, perhaps most ubiquitous of all, the “obesity epidemic.” These arguments carry moral and ethical values, as each side deems some foods and diets virtuous and others corrupting. While Americans have long held a variety of food ideologies that meld health, politics, and morality, exemplified by Sylvester Graham and John Harvey Kellogg in the nineteenth and early twentieth centuries, among others, newer constructions of these ideologies reflect concerns over the environment, rural Americans, climate change, self-determination, and the role of government in individual lives. In other words, food can be used as a lens to understand larger issues in American society while at the same time allowing historians to explore the intimate details of everyday life.

Article

Food in 20th-Century American Cities  

Adam Shprintzen

Changing foodways, the consumption and production of food, access to food, and debates over food shaped the nature of American cities in the 20th century. As American cities transformed from centers of industrialization at the start of the century to post-industrial societies at its end, food cultures in urban America shifted in response to the ever-changing urban environment. Cities remained centers of food culture, diversity, and food reform despite these shifts. Growing populations and waves of immigration changed the nature of food cultures throughout the United States in the 20th century. These changes were significant, all contributing to an evolving sense of American food culture. For urban denizens, however, food choice and availability were dictated and shaped by a variety of powerful social factors, including class, race, ethnicity, gender, and laboring status. While cities possessed an abundance of food and a variety of places to consume it, fresh food often remained difficult for the urban poor to obtain as the 20th century ended. As markets expanded from 1900 to 1950, regional geography became a less important factor in determining what types of foods were available. In the second half of the 20th century, even global geography became less important to food choices. Citrus fruit from the West Coast was readily available in northeastern markets near the start of the century, and off-season fruits and vegetables from South America filled shelves in grocery stores by the end of the 20th century. Urban Americans became further disconnected from their food sources, but this dislocation spurred counter-movements that embraced ideas of local, seasonal foods and a rethinking of the city’s relationship with its food sources.

Article

Foreign Economic Aid  

Jeffrey F. Taffet

In the first half of the 20th century, and more actively in the post–World War II period, the United States government used economic aid programs to advance its foreign policy interests. US policymakers generally believed that support for economic development in poorer countries would help create global stability, which would limit military threats and strengthen the global capitalist system. Aid was offered on a country-by-country basis to guide political development; its implementation reflected views about how humanity had advanced in richer countries and how it could and should similarly advance in poorer regions. Humanitarianism did play a role in driving US aid spending, but it was consistently secondary to political considerations. Overall, while funding varied over time, amounts spent were always substantial. Between 1946 and 2015, the United States offered almost $757 billion in economic assistance to countries around the world—$1.6 trillion in inflation-adjusted 2015 dollars. Assessing the impact of this spending is difficult; there has long been disagreement among scholars and politicians about how much economic growth, if any, resulted from aid spending and similar disputes about its utility in advancing US interests. Nevertheless, for most political leaders, even without solid evidence of successes, aid often seemed to be the best option for constructively engaging poorer countries and trying to create the kind of world in which the United States could be secure and prosperous.

Article

Forests and Logging in the United States  

Erik Loomis

Humans have put American forests to a wide variety of uses from the pre-Columbian period to the present. Native Americans heavily shaped forests to serve their needs, helping to create fire ecologies in many forests. English settlers harvested these forests for trade, to clear land, and for domestic purposes. The arrival of the Industrial Revolution in the early 19th century rapidly expanded the rate of logging. By the Civil War, many areas of the Northeast were logged out. Post–Civil War forests in the Great Lakes states, the South, and then the Pacific Northwest fell with increasing speed to feed the insatiable demands of the American economy, facilitated by rapid technological innovation that allowed for growing cuts. By the late 19th century, growing concerns about the future of American timber supplies spurred the conservation movement, personified by forester Gifford Pinchot and the creation of the U.S. Forest Service with Pinchot as its head in 1905. After World War II, the Forest Service worked closely with the timber industry to cut wide swaths of the nation’s last virgin forests. These gargantuan harvests led to the growth of the environmental movement. Beginning in the 1970s, environmentalists began to use legal means to halt logging in the ancient forests, and the listing of the northern spotted owl under the Endangered Species Act was the final blow to most logging on Forest Service lands in the Northwest. Yet not only does the timber industry remain a major employer in forested parts of the nation today, but alternative forest economies have also developed around more sustainable industries such as tourism.

Article

Free Civil Legal Assistance in the United States, 1863–1980  

Felice Batlan

Legal aid organizations were first created by a variety of private groups during the Civil War to provide legal advice in civil cases to the poor. The growing need for legal aid was deeply connected to industrialization, urbanization, and immigration. A variety of groups created legal aid organizations in response to labor unrest, the increasing number of women in the workforce, the founding of women’s clubs, and the slow and incomplete professionalization of the legal bar. In fact, before women could practice law, or were accepted into the legal profession, a variety of middle-class women’s groups using lay lawyers provided legal aid to poor women. Yet this rich story of women’s work was later suppressed by leaders of the bar attempting to claim credit for legal aid, assert a monopoly over the practice of law, and professionalize legal assistance. Across time, the largest number of claims brought to legal aid providers involved workers trying to collect wages, domestic relations cases, and landlord-tenant issues. Until the 1960s, legal aid organizations were largely financed through private donations and philanthropic organizations. After the 1960s, the federal government provided funding to support legal aid, creating significant controversy among lawyers, legal aid providers, and activists as to what types of cases legal aid organizations could take, what services could be provided, and who was eligible. Unlike in many other countries or in criminal cases, in the United States there is no constitutional right to free counsel in civil cases. This leaves many poor and working-class people without legal advice or access to justice. Organizations providing free civil legal services to the poor are ubiquitous across the United States. They are so much a part of the modern legal landscape that it is surprising how little historical scholarship exists on such organizations. Yet the history of organized legal aid, which began during the Civil War, is a rich story that brings into view a unique range of historical actors, including women’s organizations, lawyers, social workers, community organizations, the state and federal governments, and the millions of poor clients who over the last century and a half have sought legal assistance. This history of the development of legal aid is also very much a story about gender, race, professionalization, the development of the welfare state, and ultimately the welfare state’s slow dismantlement. In other words, the history of legal aid provides a window into the larger history of the United States while producing its own series of historical tensions, ironies, and contradictions. Although this narrative demonstrates change over time and various ruptures with the past, there are also important continuities in the history of free legal aid. Deceptively simple questions have plagued legal aid for almost a century and have also driven much of the historical scholarship on legal aid. These include: who should provide legal aid services, who should receive free legal aid, what types of cases should legal aid organizations handle, who should fund legal aid, and who benefits from legal aid.

Article

Freedom of the Press  

Sam Lebovic

According to the First Amendment of the US Constitution, Congress is barred from abridging the freedom of the press (“Congress shall make no law . . . abridging the freedom of speech, or of the press”). In practice, the history of press freedom is far more complicated than this simple constitutional right suggests. Over time, the meaning of the First Amendment has changed greatly. The Supreme Court largely ignored the First Amendment until the 20th century, leaving the scope of press freedom to state courts and legislatures. Since World War I, jurisprudence has greatly expanded the types of publication protected from government interference. The press now has broad rights to publish criticism of public officials, salacious material, private information, national security secrets, and much else. To understand the shifting history of press freedom, however, it is important to understand not only the expansion of formal constitutional rights but also how those rights have been shaped by such factors as economic transformations in the newspaper industry, the evolution of professional standards in the press, and the broader political and cultural relations between politicians and the press.

Article

Gambling in the Northern City: 1800 to 2000  

Matthew Vaz

While American gambling has a historical association with the lawlessness of the frontier and with the wasteful leisure practices of Southern planters, it was in large cities that American gambling first flourished as a form of mass leisure and as a commercial enterprise of significant scale. In the urban areas of the Mid-Atlantic, the Northeast, and the upper Midwest, for the better part of two centuries the gambling economy was deeply intertwined with municipal politics and governance, the practices of betting were a prominent feature of social life, and controversies over the presence of gambling, both legal and illegal, were at the center of public debate. In New York and Chicago in particular, but also in Cleveland, Pittsburgh, Detroit, Baltimore, and Philadelphia, gambling channeled money to municipal police forces and sustained machine politics. In the eyes of reformers, gambling corrupted governance and corroded social and economic interactions. Big-city gambling has changed over time, often in a manner reflecting important historical processes and transformations in economics, politics, and demographics. Yet irrespective of such change, from the onset of Northern urbanization during the 19th century through much of the 20th century, gambling held steady as a central feature of city life and politics. From the poolrooms where recently arrived Irish New Yorkers bet on horseracing after the Civil War, to the corner stores where black and Puerto Rican New Yorkers bet on the numbers game in the 1960s, the gambling activity that covered the urban landscape produced argument and controversy, particularly with respect to drawing the line between crime and leisure and over the question of where and to what ends the money of the gambling public should be directed.

Article

The Gay South  

Jerry Watkins

Regional variation, race, gender presentation, and class differences mean that there are many “Gay Souths.” Same-sex desire has been a feature of the human experience since the beginning, but the meanings, expressions, and ability to organize one’s life around desire have shifted profoundly since the invention of sexuality in the mid-19th century. World War II represented a key transition in gay history, as it gave many people a language for their desires. During the Cold War, government officials conflated sex, race, and gender transgression with subversion, and state committees punished accordingly. These forces profoundly shaped gay social life, and rather than following a straight line from closet to liberation, gays in the South have meandered. Movement rather than stasis, circulation rather than congregation, and the local rather than the stranger, as well as creative uses of space and place, mean that the gay South is distinct from the rest of the country, though not wholly unique.

Article

Gender in US Foreign Relations  

Heather Stur

Throughout US history, Americans have used ideas about gender to understand power, international relations, military behavior, and the conduct of war. Since Joan Wallach Scott called on scholars in 1986 to consider gender a “useful category of analysis,” historians have looked beyond traditional diplomatic and military sources and approaches to examine cultural sources, the media, and other evidence to try to understand the ideas that Americans have relied on to make sense of US involvement in the world. From casting weak nations as female to assuming that all soldiers are heterosexual males, Americans have deployed mainstream assumptions about men’s and women’s proper behavior to justify US diplomatic and military interventions in the world. State Department pamphlets describing newly independent countries in the 1950s and 1960s featured gendered imagery, such as the picture of a young Vietnamese woman on a bicycle that was meant to symbolize South Vietnam, a young nation in need of American guidance. Language in news reports and government cables, as well as film representations of international affairs and war, expressed gendered dichotomies such as protector and protected, home front and battlefront, strong and weak leadership, and stable and rogue states. These and other episodes illustrate how thoroughly gender has shaped important dimensions of the character and making of US foreign policy and of historians’ examinations of diplomatic and military history.