Article
Lobbying and Business Associations
Benjamin C. Waterhouse
Political lobbying has always played a key role in American governance, but the concept of paid influence peddling has been marked by a persistent tension throughout the country’s history. On the one hand, lobbying represents a democratic process by which citizens maintain open access to government. On the other, the outsized clout of certain groups engenders corruption and perpetuates inequality. The practice of lobbying itself has reflected broader social, political, and economic changes, particularly in the scope of state power and the scale of business organization. During the Gilded Age, associational activity flourished and lobbying became increasingly the province of organized trade associations. By the early 20th century, a wide range of political reforms worked to counter the political influence of corporations. Even after the Great Depression and New Deal recast the administrative and regulatory role of the federal government, business associations remained the primary vehicle through which corporations and their designated lobbyists influenced government policy. By the 1970s, corporate lobbyists had become more effective and better organized, and trade associations spurred a broad-based political mobilization of business. Business lobbying expanded in the latter decades of the 20th century; while the number of companies with a lobbying presence leveled off in the 1980s and 1990s, the number of lobbyists per company increased steadily and corporate lobbyists grew increasingly professionalized. A series of high-profile political scandals involving lobbyists in 2005 and 2006 sparked another effort at regulation. Yet despite popular disapproval of lobbying and distaste for politicians, efforts to substantially curtail the activities of lobbyists and trade associations did not achieve significant success.
Article
The Memorial Day Massacre and American Labor
Ahmed White
On the afternoon of May 30, 1937, the Chicago Police killed or mortally wounded ten men who were among a large group of unionists attempting to picket a mill operated by the Republic Steel Corporation. Scores of demonstrators were injured, some critically, in this shocking episode. The “Memorial Day Massacre” occurred during the Little Steel Strike, a sprawling and protracted conflict that arose out of the Committee for Industrial Organization’s (CIO) attempt to overcome the strident resistance of a coalition of powerful companies and to organize the basic steel industry. The strike evolved into a contest to decide how much the Second New Deal and its legislative centerpiece, the Wagner Act, would alter the landscape of American labor relations. This was evident in Chicago, where the unionists’ efforts to engage in mass picketing at Republic’s plant were an attempt to wrest from the Wagner Act’s ambiguous terms an effective right to strike, and where the violence of the police, who were doing Republic’s bidding, was intended to prevent this. Ultimately, the use of violence against the unionists not only defeated this bid to engage in mass picketing but served, along with similar clashes elsewhere during the strike, to justify government intervention that ended the walkout and secured the companies’ victory. Later, the strike and the massacre were invoked to justify political and legal changes that further limited the right to strike and that endorsed much of what the police, the steel companies, and their allies had done during the conflict. While the CIO did eventually organize steel, this success was primarily the result of the war and not the strike or the labor law. And although the National Labor Relations Board prosecuted the steel companies for violating the Wagner Act, this litigation took years and ended with Republic facing only modest penalties.
Article
Mexican Americans in the United States
Iliana Yamileth Rodriguez
Mexican American history in the United States spans centuries. In the 16th and 17th centuries, the Spanish Empire colonized North American territories. Though met with colonial rivalries in the southeast, Spanish control remained strong in the US southwest into the 19th century. The mid-1800s were an era of power struggles over territory and the construction of borders, which greatly impacted ethnic Mexicans living in the US-Mexico borderlands. After the Mexican-American War (1846–1848), the Treaty of Guadalupe Hidalgo allowed the United States to take all or parts of California, Arizona, Nevada, Utah, Colorado, and New Mexico. Ethnic Mexicans living in newly incorporated regions in the mid- through late 19th century witnessed the radical restructuring of their lives along legal, economic, political, and cultural lines.
The early 20th century witnessed the rise of anti-Mexican sentiment and violence. As ethnic Mexican communities came under attack, Mexican Americans took leadership roles in institutions, labor unions, and community groups to fight for equality. Both tensions and coalition-building efforts between Mexican Americans and Mexican migrants animated the mid-20th century, as did questions about wartime identity, citizenship, and belonging. By the late 20th century, Chicana/o politics took center stage and brought forth a radical politics informed by the Mexican American experience. Finally, the late 20th through early 21st centuries saw further geographic diversification of Mexican American communities outside of the southwest.
Article
The Mexican Revolution
Benjamin H. Johnson
When rebels captured the border city of Juárez, Mexico, in May 1911 and forced the abdication of President Porfirio Díaz shortly thereafter, they not only overthrew the western hemisphere’s oldest regime but also inaugurated the first social revolution of the 20th century. Driven by disenchantment with an authoritarian regime that catered to foreign investment and fostered labor exploitation and landlessness, revolutionaries dislodged Díaz’s regime, crushed an effort to resurrect it, and then spent the rest of the decade fighting one another for control of the nation. This struggle, recognized ever since as foundational for Mexican politics and identity, also had enormous consequences for the ethnic makeup, border policing, and foreign policy of the United States. Over a million Mexicans fled north during the 1910s, perhaps tripling the country’s Mexican-descent population, most visibly in places such as Los Angeles that had become overwhelmingly Anglo-American. US forces occupied Mexican territory twice, nearly bringing the two nations to outright warfare for the first time since the US–Mexican War of 1846–1848. Moreover, revolutionary violence and radicalism transformed the ways that much of the American population and its government perceived their border with Mexico, providing a rationale for a much more highly policed border and for the increasingly brutal treatment of Mexican-descent people in the United States. The Mexican Revolution was a turning point for Mexico, the United States, and their shared border, and for all who crossed it.
Article
The Multinational Corporation
Paula De la Cruz-Fernandez
A multinational corporation is a multiple unit business enterprise, vertically managed, that operates in various countries, called host economies. Operations beyond national borders are controlled and managed from one location or headquarters, called the home economy. The units or business activities such as manufacturing, distribution, and marketing are, in the modern multinational as opposed to other forms of international business, all structured under a single organization. The location of the headquarters of the multinational corporation, where the business is registered, defines the “nationality” of the company. At the beginning of the 20th century, the United Kingdom held ownership of over half of the world’s foreign direct investment (FDI), defined not as acquisition but as a managed, controlled investment that an organization makes beyond its national borders. Over the course of the 20th century, however, the United States rose to first place: in 2002, 22 percent of the world’s FDI came from the United States, which was also home to ten of the fifty largest corporations in the world.
The US-based, large, modern corporation, operated by salaried managers with branches and operations in many nations, emerged in the mid-19th century and has since been a key player and driver in both economic and cultural globalization. The development of corporate capitalism in the United States is closely tied to the growth of US-driven business abroad and has unique features that set the US multinational model apart from other business organizations operating internationally, such as the family multinational businesses that are more common in Europe and Latin America. The range and diversity of US-headquartered multinationals changed over time as well, and different countries and cultures made the nature of managing business overseas more complex. Asia became a major presence in the last third of the 20th century as regulation and deindustrialization grew in Europe. Global expansion also meant that societies around the world were connecting transnationally through new channels. Consumers and producers globally are also part of the history of multinational corporations—cultural values, socially constructed perceptions of gender and race, different understandings of work, and the everyday lives and experiences of peoples worldwide are integral to the operations and forms of multinationals.
Article
Municipal Housing in America
Margaret Garb
Housing in America has long stood as a symbol of the nation’s political values and a measure of its economic health. In the 18th century, a farmhouse represented Thomas Jefferson’s ideal of a nation of independent property owners; in the mid-20th century, the suburban house was seen as an emblem of an expanding middle class. Alongside those well-known symbols were a host of other housing forms—tenements, slave quarters, row houses, French apartments, loft condos, and public housing towers—that revealed much about American social order and the material conditions of life for many people.
Since the 19th century, housing markets have been fundamental forces driving the nation’s economy and a major focus of government policies. Home construction has provided jobs for skilled and unskilled laborers. Land speculation, housing development, and the home mortgage industry have generated billions of dollars in investment capital, while ups and downs in housing markets have been considered signals of major changes in the economy. Since the New Deal of the 1930s, the federal government has buttressed the home construction industry and offered economic incentives for home buyers, giving the United States one of the highest home ownership rates in the world. The housing market crash of 2008 slashed property values and sparked a rapid increase in home foreclosures, especially in places like Southern California and the suburbs of the Northeast, where housing prices had ballooned over the previous two decades. The real estate crisis led to government efforts to prop up the mortgage banking industry and to assist struggling homeowners. The crisis led, as well, to a drop in rates of home ownership, an increase in rental housing, and a growth in homelessness.
Home ownership remains a goal for many Americans and an ideal long associated with the American dream. The owner-occupied home—whether single-family or multifamily dwelling—is typically the largest investment made by an American family. Through much of the 18th and 19th centuries, housing designs varied from region to region. In the mid-20th century, mass production techniques and national building codes tended to standardize design, especially in new suburban housing. In the 18th century, the family home was a site of waged and unwaged work; it was the center of a farm, plantation, or craftsman’s workshop. Two and a half centuries later, a house was a consumer good: its size, location, and decor marked the family’s status and wealth.
Article
Museums and the Educational Mission from the Progressive Era to World War II
Jessie Swigger
In May 1906, museum workers from across the country gathered in New York City at the American Museum of Natural History for the first annual meeting of the American Association of Museums (AAM). Over the course of two days, AAM members elected officers, ratified a constitution, and shared ideas about how best to collect, store, and display objects and specimens. The meeting culminated with a resolution to create a formal partnership with the National Education Association (NEA).
AAM members’ interest in linking their work with the NEA signified that by the early 20th century, most museum leaders agreed that educating the public was a priority. This commitment to education shaped exhibition and collecting practices and the services that museums provided, and it expanded the power of museum visitors and audiences. While administrators, curators, and exhibit preparers often agreed on the collective goal of educating the public, their approaches varied. How museum education was defined and assessed depended on the type of museum in which one was employed, and it changed over time in response to broader social, cultural, and political forces. By 1945, however, museums of all types had formalized and institutionalized their practices in ways that placed education at the core of their purpose and actions.
Article
The National Parks
Donald Worster
The national parks of the United States have been one of the country’s most popular federal initiatives, and popular not only within the nation but across the globe. The first park was Yellowstone, established in 1872, and since then almost sixty national parks have been added, along with hundreds of monuments, protected rivers and seashores, and important historical sites as well as natural preserves. In 1916 the parks were put under the National Park Service, which has managed them primarily as scenic treasures for growing numbers of tourists. Ecologically minded scientists, however, have challenged that stewardship and called for restoration of parks to their natural conditions, defined as their ecological integrity before white Europeans intervened. The most influential voice in the history of park philosophy remains John Muir, the California naturalist and Yosemite enthusiast and himself a proto-ecologist, who saw the parks as sacred places for a modern nation, where reverence for nature and respect for science might coexist and where tourists could be educated in environmental values. As other nations have created their own park systems, similar debates have occurred. While parks may seem like a great modern idea, this idea has always been embedded in cultural and social change—and subject to struggles over what that “idea” should be.
Article
Native People and American Film and TV
Liza Black
Native people have appeared as characters in film and television in America from their inceptions. Throughout the 20th century, Native actors, writers, directors, and producers worked in the film and television industry. In terms of characterization, Native employment sits uncomfortably beside racist depictions of Native people. From the 1950s to the present, revisionist westerns have emerged, giving the viewer a moral tale in which Native people are depicted with sympathy and white Americans are seen as aggressors. Today, a small but important group of Native actors in film and television work in limiting roles but turn in outstanding performances. Native directors, writers, and documentarians from the 1990s to the early 21st century have created critical interventions into media representations, telling stories from Indigenous viewpoints and bringing Native voices to the fore. The 2021 television show Rutherford Falls stands out as an example of Native writers gaining entry into the television studio system. Several Native film festivals have also been established in the early 21st century, and their number continues to grow.
Article
Neutrality/Nonalignment and the United States
Robert Rakove
For almost a century and a half, successive American governments adopted a general policy of neutrality on the world stage, eschewing involvement in European conflicts and, after the Quasi War with France, alliances with European powers. Neutrality, enshrined as a core principle of American foreign relations by the outgoing President George Washington in 1796, remained such for more than a century.
Finally, in the 20th century, the United States emerged as a world power and a belligerent in the two world wars and the Cold War. This article explores the modern conflict between traditional American attitudes toward neutrality and the global agenda embraced by successive U.S. governments, beginning with entry into the First World War. With the United States immersed in these titanic struggles, the traditional U.S. support for neutrality eroded considerably. During the First World War, the United States showed some sympathy for the predicaments of the remaining neutral powers. In the Second World War it applied considerable pressure to those states still trading with Germany. During the Cold War, the United States was sometimes impatient with the choices of states to remain uncommitted in the global struggle, while at times it showed understanding for neutrality and pursued constructive relations with neutral states. The wide varieties of neutrality in each of these conflicts complicated the choices of U.S. policy makers. Americans remained torn between memory of their own long history of neutrality and a capacity to understand its potential value, on one hand, and a predilection to approach conflicts as moral struggles, on the other.
Article
The New Deal
Wendy L. Wall
The New Deal generally refers to a set of domestic policies implemented by the administration of Franklin Delano Roosevelt in response to the crisis of the Great Depression. Propelled by that economic cataclysm, Roosevelt and his New Dealers pushed through legislation that regulated the banking and securities industries, provided relief for the unemployed, aided farmers, electrified rural areas, promoted conservation, built national infrastructure, regulated wages and hours, and bolstered the power of unions. The Tennessee Valley Authority prevented floods and brought electricity and economic progress to seven states in one of the most impoverished parts of the nation. The Works Progress Administration offered jobs to millions of unemployed Americans and launched an unprecedented federal venture into the arena of culture. By providing social insurance to the elderly and unemployed, the Social Security Act laid the foundation for the U.S. welfare state.
The benefits of the New Deal were not equitably distributed. Many New Deal programs—farm subsidies, work relief projects, social insurance, and labor protection programs—discriminated against racial minorities and women, while benefiting white men disproportionately. Nevertheless, women achieved symbolic breakthroughs, and African Americans benefited more from Roosevelt’s policies than they had from those of any administration since Abraham Lincoln’s. The New Deal did not end the Depression—only World War II did that—but it did spur economic recovery. It also helped to make American capitalism less volatile by extending federal regulation into new areas of the economy.
Although the New Deal most often refers to policies and programs put in place between 1933 and 1938, some scholars have used the term more expansively to encompass later domestic legislation or U.S. actions abroad that seemed animated by the same values and impulses—above all, a desire to make individuals more secure and a belief in institutional solutions to long-standing problems. In order to pass his legislative agenda, Roosevelt drew many Catholic and Jewish immigrants, industrial workers, and African Americans into the Democratic Party. Together with white Southerners, these groups formed what became known as the “New Deal coalition.” This unlikely political alliance endured long after Roosevelt’s death, supporting the Democratic Party and a “liberal” agenda for nearly half a century. When the coalition finally cracked in 1980, historians looked back on this extended epoch as reflecting a “New Deal order.”
Article
The New Deal and the Arts
Sharon Musher
During the Great Depression, artists and intellectuals—like others who were down-and-out—turned to the federal government to demand work and a livable wage. In a brief flowering of public art, the New Deal funded thousands of needy and meritorious artists to decorate, document, entertain, and teach the nation. Working through Federal Project Number One under the auspices of the Works Progress Administration (which included the Federal Theatre Project, Federal Art Project, Federal Music Project, and Federal Writers’ Project), as well as through the Treasury’s Section of Painting and Sculpture (renamed the Section of Fine Arts) and Roy Stryker’s Historical Section (which operated under the Resettlement Administration, the Farm Security Administration, and then the Office of War Information), the artists produced hundreds of thousands of works of art to entertain millions of Americans.
The arts projects democratized which artists received public support, which citizens created and experienced original works of art, and which creative styles and artistic subjects gained recognition. They drew attention to previously neglected publics, including formerly enslaved people, Native Americans, migrant workers, and the working class. But art administrators also limited artists’ autonomy. They rejected nudity and overt politics, maintained racial segregation, and upheld racial and gendered discrimination. Political realignment, budget cuts, decentralization, congressional hearings, and loyalty oaths further constrained artists. In 1939, Congress terminated the Theatre Project and reorganized the other art projects. Congress defunded most of the remaining art projects in 1943, almost two years after the United States entered World War II. Despite a relatively short life and enduring controversies, New Deal art remains an important example of how robust public patronage can stimulate the arts and society.
Article
New Women in Early 20th-Century America
Einav Rabinovitch-Fox
In late 19th- and early 20th-century America, a new image of womanhood emerged that began to shape public views and understandings of women’s role in society.
Whether identified by contemporaries as a Gibson Girl, a suffragist, a Progressive reformer, a bohemian feminist, a college girl, a bicyclist, a flapper, a working-class militant, or a Hollywood vamp, these figures all came to epitomize the New Woman, an umbrella term for modern understandings of femininity. Referring both to real, flesh-and-blood women and to an abstract idea or visual archetype, the New Woman represented a generation of women who came of age between 1890 and 1920 and challenged gender norms and structures by asserting a new public presence through work, education, entertainment, and politics, while also denoting a distinctly modern appearance that contrasted with Victorian ideals. The New Woman became associated with the rise of feminism and the campaign for women’s suffrage, as well as with the rise of consumerism, mass culture, and freer expressions of sexuality that defined the first decades of the 20th century. Emphasizing youth, mobility, freedom, and modernity, the image of the New Woman varied by age, class, race, ethnicity, and geographical region, offering a spectrum of behaviors and appearances with which different women could identify. At times controversial, the New Woman image provided women with opportunities to negotiate new social roles and to promote ideas of equality and freedom that would later become mainstream.
Article
Philadelphia
Timothy J. Lombardo
Although the city was officially established by English Quaker William Penn in 1682, Philadelphia’s history began when indigenous peoples first settled the area near the confluence of the Delaware and Schuylkill Rivers. Since European colonization, Philadelphia has grown from a major colonial-era port to an industrial manufacturing center to a postindustrial metropolis. For more than three centuries, Philadelphia’s history has been shaped by immigration, migration, industrialization, deindustrialization, ethnic and racial conflict, political partisanship, and periods of economic restructuring. The city’s long history offers a window into urban development in the United States.
Article
Phoenix
Elizabeth Tandy Shermer
Phoenix, the capital of the state of Arizona, exemplifies the ways Sun Belt cities dramatically grew after World War II. Phoenix was best described as a small trading town in 1912, when Arizona became the last territory to achieve statehood in the continental United States. Although Phoenix was a capital city located in an area with little rainfall and high summer temperatures, its economy depended heavily on the sale of cotton and copper as well as on tourists attracted to the Salt River Valley’s warm winters. But members of the local Chamber of Commerce, like many small-town boosters across the US South and West, wanted to attract manufacturers by the 1930s, when the Great Depression upended the agricultural, mining, and tourism markets. The Chamber’s White male leaders (including future Senator Barry Goldwater) succeeded during World War II. They lobbied for wartime investment that transformed Phoenix into one of the many boom towns that dotted the South and West. That success fueled postwar efforts to attract industry by building a favorable “business climate.” Local leaders, business executives, and industry experts used that seemingly benign phrase to describe cities that guaranteed investors low taxes, weak unions, few government regulations, and other policies that maximized profits and undermined 1930s reforms. Phoenix stood out in what reporters called the “Second War between the States” for industry. General Electric, Motorola, and Sperry Rand had all opened branch plants by 1960, when Phoenix was already one of the largest US cities. It also stood out in 1969, when Republican strategist Kevin Phillips drew attention to the “Sun Belt phenomenon” that seemed to be the metropolitan core of a new conservative politics dedicated to free enterprise and poised to spread across the rapidly deindustrializing Northeast and Midwest. But growth undermined the Chamber’s power. By the 1970s, citizens questioned putting business first, and investors began shifting manufacturing overseas, which left residents to deal with the environmental, fiscal, and political damage the business climate ideal had wrought.
Article
Postbellum Banking
Robert Wright
Between passage of the National Banking Acts near the end of the US Civil War and the outbreak of the Great War and implementation of the Federal Reserve System in 1914, a large, vibrant financial system based on the gold standard and composed of markets and intermediaries supported the rapid growth and development of the American economy. Markets included over-the-counter markets and formal exchanges for financial securities, including bills of exchange (foreign currencies), cash (short-term debt), debt (corporate and government bonds), and equities (ownership shares in corporations), initial issuance of which increasingly fell to investment banks. Intermediaries included various types of insurers (marine, fire, and life, plus myriad specialists like accident and wind insurers) and true depository institutions, which included trust companies, mutual and stock savings banks, and state- and federally chartered commercial banks. Nominal depository institutions also operated, and included building and loan associations and, eventually, credit unions and Morris Plan and other industrial banks. Non-depository lenders included finance and mortgage companies, provident loan societies, pawn brokers, and sundry other small loan brokers. Each type of “bank,” broadly construed, catered to customers differentiated by their credit characteristics, gender, race/ethnicity, country of birth, religion, and/or socioeconomic class, had distinctive balance sheets and loan application and other operating procedures, and reacted differently to the three major postbellum financial crises in 1873, 1893, and 1907.
Article
Poverty in the Modern American City
Ella Howard
American cities expanded during the late 19th century, as industrial growth was fueled by the arrival of millions of immigrants and migrants. Poverty rates escalated, overwhelming existing networks of private charities. Progressive reformers created relief organizations and raised public awareness of urban poverty. The devastating effects of the Great Depression inspired greater focus on poverty from state and federal agencies. The Social Security Act, the greatest legacy of the New Deal, would provide a safety net for millions of Americans. During the postwar era of general prosperity, federal housing policies often reinforced and deepened racial and socioeconomic inequality and segregation. The 1960s War on Poverty created vital aid programs that expanded access to food, housing, and health care. These programs also prompted a rising tide of conservative backlash against perceived excesses. Fueled by such critical sentiments, the Reagan administration implemented dramatic cuts to assistance programs. Later, the Clinton administration further reformed welfare by tying aid to labor requirements. Throughout the 20th century, the urban homeless struggled to survive in hostile environments. Skid row areas housed the homeless for decades, providing shelter, food, and social interaction within districts that were rarely visited by the middle and upper classes. The loss of such spaces to urban renewal and gentrification in many cities left many of the homeless unsheltered and dislocated.
Article
Progressives and Progressivism in an Era of Reform
Maureen A. Flanagan
The decades from the 1890s into the 1920s produced reform movements in the United States that resulted in significant changes to the country’s social, political, cultural, and economic institutions. The impulse for reform emanated from a pervasive sense that the country’s democratic promise was failing. Political corruption seemed endemic at all levels of government. An unregulated capitalist industrial economy exploited workers and threatened to create a serious class divide, especially as the legal system protected the rights of business over labor. Mass urbanization was shifting the country from a rural, agricultural society to an urban, industrial one characterized by poverty, disease, crime, and cultural clash. Rapid technological advancements brought new, and often frightening, changes into daily life that left many people feeling that they had little control over their lives. Movements for socialism, woman suffrage, and rights for African Americans, immigrants, and workers belied the rhetoric of the United States as a just and equal democratic society for all its members.
Responding to the challenges presented by these problems, and fearful that without substantial change the country might experience class upheaval, groups of Americans proposed undertaking significant reforms. Underlying all proposed reforms was a desire to bring more justice and equality into a society that seemed increasingly to lack these ideals. Yet there was no agreement among these groups about the exact threat that confronted the nation, the means to resolve problems, or how to implement reforms. Despite this lack of agreement, all so-called Progressive reformers were modernizers. They sought to make the country’s democratic promise a reality by confronting its flaws and seeking solutions. All Progressivisms were seeking a via media, a middle way between reliance on older ideas of 19th-century liberal capitalism and more radical proposals to reform society through either social democracy or socialism. Despite differences among Progressives, the types of Progressivisms put forth, and the successes and failures of Progressivism, this reform era raised into national discourse debates over the nature and meaning of democracy, how and for whom a democratic society should work, and what it meant to be a forward-looking society. It also led to the implementation of an activist state.
Article
Public Space in North American Cities
Jessica Ellen Sewell
From 1800 to 2000, cities grew enormously, and saw an expansion of public spaces to serve the varied needs of a diverse population living in ever more cramped and urban circumstances. While a wide range of commercial semipublic spaces became common in the late 19th century, parks and streets were the best examples of truly public spaces with full freedom of access. Changes in the design and management of streets, sidewalks, squares, parks, and plazas during this period reflect changing ideas about the purpose of public space and how it should be used.
Streets shifted from being used for a wide range of activities, including vending, playing games, and storing goods, to becoming increasingly specialized spaces of movement, designed and managed by the early 20th century for automobile traffic. Sidewalks, which in the early 19th century were paid for and liberally used by adjacent businesses, were similarly specialized as spaces of pedestrian movement. However, the tradition of using streets and sidewalks as a space of public celebration and public speech remained strong throughout the period. During parades and protests, streets and sidewalks were temporarily remade as spaces of the performance of the public, and the daily activities of circulation and commerce were set aside.
In 1800, the main open public spaces in cities were public squares or commons, often used for militia training and public celebration. In the second half of the 19th century, these were augmented by large picturesque parks. Designed as an antidote to urbanity, these parks served the public as a place for leisure, redefining public space as a polite leisure amenity, rather than a place for people to congregate as a public. The addition of playgrounds, recreational spaces, and public plazas in the 20th century served both the physical and mental health of the public. In the late 20th century, responding to neoliberal ideas and urban fiscal crises, the ownership and management of public parks and plazas was increasingly privatized, further challenging public accessibility.
Article
Race Films
Alyssa Lopez
In the early 1910s, Black Americans turned to motion pictures in order to resist the incessant racism they experienced through popular culture and in their everyday lives. Entrepreneurs, educators, and uplift-minded individuals believed that this modern medium could be used as a significant means to demonstrate Black humanity and dignity while, perhaps, making money in the burgeoning industry. The resultant race films ranged in content from fictionalized comedies and dramas to local exhibitions of business meetings and Black institutions. Racial uplift was a central tenet of the race film industry and was reflected most clearly in the intra-racial debate over positive versus negative images of Black life. Inside theaters, Black spectators also developed ways to mitigate racism on screen when race films were not the evening’s entertainment. The race film industry encouraged Black institution-building in the form of a critical Black film criticism tradition, Black-owned theaters, and the hiring of Black employees. Race films and the industry that made their success possible constituted a community affair that involved filmmakers, businessmen, leaders, journalists, and the moviegoing public.