1-20 of 24 Results

  • Keywords: urban

Article

The Immigration Act of 1924 was in large part the result of a deep political and cultural divide in America between heavily immigrant cities and far less diverse small towns and rural areas. The 1924 legislation, together with growing residential segregation, midcentury federal urban policy, and postwar suburbanization, undermined scores of ethnic enclaves in American cities between 1925 and the 1960s. The deportation of Mexicans and their American children during the Great Depression, the incarceration of West Coast Japanese Americans during World War II, and the wartime and postwar shift of so many jobs to suburban and Sunbelt areas also reshaped many US cities in these years. The Immigration Act of 1965, which enabled the immigration of large numbers of people from Asia, Latin America, and, eventually, Africa, helped to revitalize many depressed urban areas and inner-ring suburbs. In cities and suburbs across the country, the response to the new immigration since 1965 has ranged from welcoming to hostile. The national debate over immigration in the early 21st century reflects both familiar and newer cultural, linguistic, religious, racial, and regional rifts. However, urban areas with a history of immigrant incorporation remain the most politically supportive of immigrants, just as they were a century ago.

Article

The process of urban deindustrialization has been long and uneven. Even the terms “deindustrial” and “postindustrial” are contested; most cities continue to host manufacturing on some scale. After World War II, however, cities that depended on manufacturing for their lifeblood increasingly diversified their economies in the face of larger global, political, and demographic transformations. Manufacturing centers in the New England, Mid-Atlantic, and Midwestern United States were soon identified as belonging to “the American Rust Belt.” Steel manufacturers, automakers, and other industrial behemoths that were once mainstays of city life closed their doors as factories and workers followed economic and social incentives to leave urban cores for the suburbs, the South, or foreign countries. Remaining industrial production became increasingly automated, resulting in significant declines in the number of factory jobs. Metropolitan officials faced with declining populations and tax bases responded by adapting their assets—in terms of workforce, location, or culture—to new economies, including warehousing and distribution, finance, health care, tourism, leisure industries like casinos, and privatized enterprises such as prisons. Faced with declining federal funding for renewal, they focused on leveraging private investment for redevelopment. Deindustrializing cities marketed themselves as destinations with convention centers, stadiums, and festival marketplaces, seeking to lure visitors and a “creative class” of new residents. While some postindustrial cities became success stories of reinvention, others struggled. They entertained options to “rightsize” by shrinking their municipal footprints, adapted vacant lots for urban agriculture, or attracted voyeurs to gaze at their industrial ruins. Whether industrial cities faced a slow transformation or the shock of multiple factory closures within a few years, the impact of these economic shifts and urban planning interventions both amplified old inequalities and created new ones.

Article

The transformation of postindustrial American life in the late 20th and early 21st centuries produced several economically robust metropolitan centers that stand as new models of urban and economic life, featuring well-educated populations engaged in professional work in education, medical care, design and legal services, and artistic and cultural production. By the early 21st century, these cities dominated the nation’s consciousness economically and culturally, standing in for the most dynamic and progressive sectors of the economy and driven by concentrations of technical and creative talent. The origins of these academic and knowledge centers are rooted in the political economy, including investments shaped by federal policy and philanthropic ambition. Education and health care communities were, and remain, frequently prosperous but also rife with racial, economic, and social inequality, and riddled with the resulting political tensions over development. These information communities incubated and directed the proceeds of the new economy, but they also constrained who could access this new mode of wealth in the knowledge economy.

Article

American cities expanded during the late 19th century, as industrial growth was fueled by the arrival of millions of immigrants and migrants. Poverty rates escalated, overwhelming existing networks of private charities. Progressive reformers created relief organizations and raised public awareness of urban poverty. The devastating effects of the Great Depression inspired greater focus on poverty from state and federal agencies. The Social Security Act, the greatest legacy of the New Deal, would provide a safety net for millions of Americans. During the postwar era of general prosperity, federal housing policies often reinforced and deepened racial and socioeconomic inequality and segregation. The 1960s War on Poverty created vital aid programs that expanded access to food, housing, and health care. These programs also prompted a rising tide of conservative backlash against perceived excesses. Fueled by such critical sentiments, the Reagan administration implemented dramatic cuts to assistance programs. Later, the Clinton administration further reformed welfare by tying aid to labor requirements. Throughout the 20th century, the urban homeless struggled to survive in hostile environments. Skid row areas housed the homeless for decades, providing shelter, food, and social interaction within districts that were rarely visited by the middle and upper classes. The loss of such spaces to urban renewal and gentrification in many cities left many of the homeless unsheltered and dislocated.

Article

Urban politics provides a means to understand the major political and economic trends and transformations of the last seventy years in American cities. The growth of the federal government; the emergence of new powerful identity- and neighborhood-based social movements; and large-scale economic restructuring have characterized American cities since 1945. The postwar era witnessed the expansion of scope and scale of the federal government, which had a direct impact on urban space and governance, particularly as urban renewal fundamentally reshaped the urban landscape and power configurations. Urban renewal and liberal governance, nevertheless, spawned new and often violent tensions and powerful opposition movements among old and new residents. These movements engendered a generation of city politicians who assumed power in the 1970s. Yet all of these figures were forced to grapple with the larger forces of capital flight, privatization, the war on drugs, mass incarceration, immigration, and gentrification. This confluence of factors meant that as many American cities and their political representatives became demographically more diverse by the 1980s and 1990s, they also became increasingly separated by neighborhood boundaries and divided by the forces of class and economic inequality.

Article

The tall building—the most popular and conspicuous emblem of the modern American city—stands as an index of economic activity, civic aspirations, and urban development. Enmeshed in the history of American business practices and the maturation of corporate capitalism, the skyscraper is also a cultural icon that performs genuine symbolic functions. Viewed individually or arrayed in a “skyline,” tall buildings invite a focus on their spectacular or superlative aspects. Their patrons have searched for architectural symbols that would project a positive public image, yet the height and massing of skyscrapers were determined as much by prosaic financial calculations as by symbolic pretense. Historically, the production of tall buildings was linked to the broader flux of economic cycles, access to capital, land values, and regulatory frameworks that curbed the self-interests of individual builders in favor of public goods such as light and air. The tall building looms large for urban geographers seeking to chart the shifting terrain of the business district and for social historians of the city who examine the skyscraper’s gendered spaces and labor relations. If tall buildings provide one index of the urban and regional economy, they are also economic activities in and of themselves and thus linked to the growth of professions required to plan, finance, design, construct, market, and manage these mammoth collective objects—and all have vied for control over the ultimate result. Practitioners have debated the tall building’s external expression as the design challenge of the façade became more acute with the advent of the curtain wall attached to a steel frame, eventually dematerializing entirely into sheets of reflective glass. The tall building also reflects prevailing paradigms in urban design, from the retail arcades of 19th-century skyscrapers to the blank plazas of postwar corporate modernism.

Article

A. K. Sandoval-Strausz

“Latino urbanism” describes a culturally specific set of spatial forms and practices created by people of Hispanic origin. It includes many different aspects of those forms and practices, including town planning; domestic, religious, and civic architecture; the adaptation of existing residential, commercial, and other structures; and the everyday use of spaces such as yards, sidewalks, storefronts, streets, and parks. Latino urbanism has developed over both time and space. It is the evolving product of half a millennium of colonization, settlement, international and domestic migration, and globalization. It has spanned a wide geographic range, beginning in the southern half of North America and gradually expanding to much of the hemisphere. There have been many variations on Latino urbanism, but most include certain key features: shared central places where people show their sense of community, a walking culture that encourages face-to-face interaction with neighbors, and a sense that sociability should take place as much in the public realm as in the privacy of the home. More recently, planners and architects have realized that Latino urbanism offers solutions to problems such as sprawl, social isolation, and environmental unsustainability. The term “urbanism” connotes city spaces, and Latino urbanism is most concentrated and most apparent at the center of metropolitan areas. At the same time, it has also been manifested in a wide variety of places and at different scales, from small religious altars in private homes; to Spanish-dominant commercial streetscapes in Latino neighborhoods; and ultimately to settlement patterns that reach from the densely packed centers of cities to the diversifying suburbs that surround them, out to the agricultural hinterlands at their far peripheries—and across borders to big cities and small pueblos elsewhere in the Americas.

Article

During the 1890s, the word segregation became the preferred term for the practice of coercing different groups of people, especially those designated by race, to live in separate and unequal urban residential neighborhoods. In the southern states of the United States, segregationists imported the word—originally used in the British colonies of Asia—to describe Jim Crow laws, and, in 1910, whites in Baltimore passed a “segregation ordinance” mandating separate black and white urban neighborhoods. Copy-cat legislation sprang up in cities across the South and the Midwest. But in 1917, a multiracial team of lawyers from the fledgling National Association for the Advancement of Colored People (NAACP) mounted a successful legal challenge to these ordinances in the U.S. Supreme Court—even as urban segregation laws were adopted in other places in the world, most notably in South Africa. The collapse of the movement for legislated racial segregation in the United States occurred just as African Americans began migrating in large numbers into cities in all regions of the United States, resulting in waves of anti-black mob violence. Segregationists were forced to rely on nonstatutory or formally nonracial techniques. In Chicago, an alliance of urban reformers and real estate professionals invented alternatives to explicitly racist segregation laws. The practices they promoted nationwide created one of the most successful forms of urban racial segregation in world history, rivaling and finally outliving South African apartheid. Understanding how this system came into being and how it persists today requires understanding both how the Chicago segregationists were connected to counterparts elsewhere in the world and how they adapted practices of city-splitting to suit the peculiarities of racial politics in the United States.

Article

American cities developed under relatively quiescent climatic conditions. A gradual rise in average global temperatures during the 19th and 20th centuries had a negligible impact on how urban Americans experienced the weather. Much more significant were the dramatic changes in urban form and social organization that mediated the relationship between routine weather fluctuations and the lives of city dwellers. Overcoming weather-related impediments to profit, comfort, and good health contributed to many aspects of urbanization, including population migration to Sunbelt locations, increased reliance on fossil fuels, and comprehensive re-engineering of urban hydrological systems. Other structural shifts such as sprawling development, intensification of the built environment, socioeconomic segregation, and the tight coupling of infrastructural networks were less directly responsive to weather conditions but nonetheless profoundly affected the magnitude and social distribution of weather-related risks. Although fatalities resulting from extreme meteorological events declined in the 20th century, the scale of urban disruption and property damage increased. In addition, social impacts became more concentrated among poorer Americans, including many people of color, as Hurricane Katrina tragically demonstrated in 2005. Through the 20th century, cities responded to weather hazards through improved forecasting and systematic planning for relief and recovery rather than alterations in metropolitan design. In recent decades, however, growing awareness and concern about climate change impacts have made volatile weather more central to urban planning.

Article

The City Beautiful movement arose in the 1890s in response to the accumulating dirt and disorder in industrial cities, which threatened economic efficiency and social peace. City Beautiful advocates believed that better sanitation, improved circulation of traffic, monumental civic centers, parks, parkways, public spaces, civic art, and the reduction of outdoor advertising would make cities throughout the United States more profitable and harmonious. Engaging architects and planners, businessmen and professionals, and social reformers and journalists, the City Beautiful movement expressed a boosterish desire for landscape beauty and civic grandeur, but also raised aspirations for a more humane and functional city. “Mean streets make mean people,” wrote the movement’s publicist and leading theorist, Charles Mulford Robinson, encapsulating the belief in positive environmentalism that drove the movement. Combining the parks and boulevards of landscape architect Frederick Law Olmsted with the neoclassical architecture of Daniel H. Burnham’s White City at Chicago’s World’s Columbian Exposition in 1893, the City Beautiful movement also encouraged a view of the metropolis as a delicate organism that could be improved by bold, comprehensive planning. Two organizations, the American Park and Outdoor Art Association (founded in 1897) and the American League for Civic Improvements (founded in 1900), provided the movement with a national presence. But the movement also depended on the work of civic-minded women and men in nearly 2,500 municipal improvement associations scattered across the nation. Reaching its zenith in Burnham’s remaking of Washington, D.C., and his coauthored Plan of Chicago (1909), the movement slowly declined in favor of the “City Efficient” and a more technocratic city-planning profession. Aside from a legacy of still-treasured urban spaces and structures, the City Beautiful movement contributed to a range of urban reforms, from civic education and municipal housekeeping to city planning and regionalism.

Article

Nicolas G. Rosenthal

An important relationship has existed between Native Americans and cities from pre-Columbian times to the early 21st century. Long before Europeans arrived in the Americas, indigenous peoples developed societies characterized by dense populations, large-scale agriculture, monumental architecture, and complex social hierarchies. Following European and American conquest and colonization, Native Americans played a crucial role in the development of towns and cities throughout North America, often on the site of former indigenous settlements. Beginning in the early 20th century, Native Americans began migrating from reservations to U.S. cities in large numbers and formed new intertribal communities. By 1970, the majority of the Native American population lived in cities and the numbers of urban American Indians have been growing ever since. Indian Country in the early 21st century continues to be influenced by the complex and evolving ties between Native Americans and cities.

Article

Mass transit has been part of the urban scene in the United States since the early 19th century. Regular steam ferry service began in New York City in the early 1810s and horse-drawn omnibuses plied city streets starting in the late 1820s. Expanding networks of horse railways emerged by the mid-19th century. The electric streetcar became the dominant mass transit vehicle a half century later. During this era, mass transit had a significant impact on American urban development. Mass transit’s importance in the lives of most Americans started to decline with the growth of automobile ownership in the 1920s, except for a temporary rise in transit ridership during World War II. In the 1960s, congressional subsidies began to reinvigorate mass transit and heavy-rail systems opened in several cities, followed by light rail systems in several others in the next decades. Today concerns about environmental sustainability and urban revitalization have stimulated renewed interest in the benefits of mass transit.

Article

As places of dense habitation, cities have always required coordination and planning. City planning has involved the design and construction of large-scale infrastructure projects to provide basic necessities such as a water supply and drainage. By the 1850s, immigration and industrialization were fueling the rise of big cities, creating immense, collective problems of epidemics, slums, pollution, gridlock, and crime. From the 1850s to the 1900s, both local governments and utility companies responded to this explosive physical and demographic growth by constructing a “networked city” of modern technologies such as gaslight, telephones, and electricity. Building the urban environment also became a wellspring of innovation in science, medicine, and administration. In 1909–1910, a revolutionary idea—comprehensive city planning—opened a new era of professionalization and institutionalization in the planning departments of city halls and universities. Over the next thirty-five years, however, wars and depression limited their influence. The period from 1945 to 1965, in contrast, represents the golden age of formal planning. During this unprecedented period of peace and prosperity, academically trained experts played central roles in the modernization of the inner cities and the sprawl of the suburbs. But the planners’ clean-sweep approach to urban renewal and the massive destruction caused by highway construction provoked a revolt of the grassroots. Beginning in the Watts district of Los Angeles in 1965, mass uprisings escalated over the next three years into a national crisis of social disorder, racial and ethnic inequality, and environmental injustice. The postwar consensus of theory and practice was shattered, replaced by a fragmented profession ranging from defenders of top-down systems of computer-generated simulations to proponents of advocacy planning from the bottom up. Since the late 1980s, the ascendancy of public-private partnerships in building the urban environment has favored the planners promoting systems approaches, who promise a future of high-tech “smart cities” under their complete control.

Article

Between 1820 and 1924, nearly thirty-six million immigrants entered the United States. Prior to the Civil War, the vast majority of immigrants were northern and western Europeans, though the West Coast received Chinese immigration from the late 1840s onward. In mid-century, the United States received an unprecedented influx of Irish and German immigrants, who included a large number of Catholics and the poor. At the turn of the 20th century, the major senders of immigrants shifted to southern and eastern Europe, and Asians and Mexicans made up a growing portion of newcomers. Throughout the long 19th century, urban settlement remained a popular option for immigrants, and they contributed to the social, cultural, political, economic, and physical growth of the cities they resided in. Foreign-born workers also provided much-needed labor for America’s industrial development. At the same time, intense nativism emerged in cities in opposition to the presence of foreigners, who appeared to be unfit for American society, threats to Americans’ jobs, or sources of urban problems such as poverty. Anti-immigrant sentiment resulted in the introduction of state and federal laws for preventing the immigration of undesirable foreigners, such as the poor, southern and eastern Europeans, and Asians. Cities constituted an integral part of the 19th-century American immigration experience.

Article

Jessica Ellen Sewell

From 1800 to 2000, cities grew enormously and saw an expansion of public spaces to serve the varied needs of a diverse population living in ever more cramped and urban circumstances. While a wide range of commercial semipublic spaces became common in the late 19th century, parks and streets were the best examples of truly public spaces with full freedom of access. Changes in the design and management of streets, sidewalks, squares, parks, and plazas during this period reflect changing ideas about the purpose of public space and how it should be used. Streets shifted from being used for a wide range of activities, including vending, playing games, and storing goods, to becoming increasingly specialized spaces of movement, designed and managed by the early 20th century for automobile traffic. Sidewalks, which in the early 19th century were paid for and liberally used by adjacent businesses, were similarly specialized as spaces of pedestrian movement. However, the tradition of using streets and sidewalks as a space of public celebration and public speech remained strong throughout the period. During parades and protests, streets and sidewalks were temporarily remade as spaces of the performance of the public, and the daily activities of circulation and commerce were set aside. In 1800, the main open public spaces in cities were public squares or commons, often used for militia training and public celebration. In the second half of the 19th century, these were augmented by large picturesque parks. Designed as an antidote to urbanity, these parks served the public as a place for leisure, redefining public space as a polite leisure amenity, rather than a place for people to congregate as a public. The addition of playgrounds, recreational spaces, and public plazas in the 20th century served both the physical and mental health of the public. In the late 20th century, responding to neoliberal ideas and urban fiscal crises, the ownership and management of public parks and plazas was increasingly privatized, further challenging public accessibility.

Article

Fires have plagued American cities for centuries. During the 18th century, the Great Fire of Boston (1760), the First Great Fire of New York City (1776), the First Great New Orleans Fire (1788), and the Great Fire of Savannah (1796) each destroyed hundreds of buildings and challenged municipal authorities to improve safety in an increasingly risky environment. Beginning in the 19th century, with increasing commerce, rapid urbanization, and the rise of industrial capitalism, fires became more frequent and destructive. Several initiatives sought to reduce the risk of fire: volunteer fire companies emerged in all major cities, fire insurance developed to help economic recovery, and municipal infrastructure like fire hydrants became ubiquitous to combat blazes. Despite significant efforts to curb this growing urban problem, fire dangers increased in the late 19th century as cities became epicenters of industry and their populations boomed. The “great” fires of the late 19th century, like those that took place in Chicago (1871), Boston (1872), Seattle (1889), Baltimore (1904), and San Francisco (1906), fundamentally altered cities. The fires not only destroyed buildings and took lives, but they also unearthed deep-rooted social tensions. Rebuilding in the aftermath of fire further exacerbated inequalities and divided cities. While fire loss tapered off after 1920, other issues surrounding urban fires heated up. The funneling of resources to suburbs in the postwar white-flight period left inner cities ill-equipped to handle serious conflagrations. In the last few decades, suburban sprawl has created exurban fire regimes, where wildfires collide with cities. Extreme weather events, dependence on fossil fuels, deregulation of risky industries, and a lack of safe and affordable housing have put American metropolitan areas on a path to experience another period of “great” fires like those of the late 19th and early 20th centuries.

Article

Katherine R. Jewell

The term “Sunbelt” connotes a region defined by its environment. “Belt” suggests the broad swath of states stretching from the Atlantic coast across Texas and Oklahoma and the Southwest to southern California. “Sun” suggests its temperate—even hot—climate. Yet in contrast to the industrial northeastern and midwestern Rust Belt, or perhaps “Frost Belt,” the term’s emergence at the end of the 1960s evoked an optimistic, opportunistic brand. Free from snowy winters, cooled by air conditioning, and beckoning with Florida’s sandy beaches and California’s surf, the Sunbelt states did indeed draw more Americans in the 1950s and 1960s than the deindustrializing centers of the North and East. But the term “Sunbelt” also captures an emerging political culture that defies regional boundaries. The term originated less from an environmental diagnosis than from a diagnosis of this political climate, associated with the new patterns of migration in the mid-20th century. It defined a new regional identity that was political, economic, demographic, and social, as well as environmental, and that found expression in policy. The Sunbelt received federal money in an unprecedented manner, particularly because of rising Cold War defense spending on research and military bases, and its urban centers grew in patterns unlike those of the old Northeast and Midwest, thanks to the policy innovations wrought by local boosters, business leaders, and politicians, which defined the politics associated with the region after the 1970s. Yet from the term’s origin, scholars have debated whether the Sunbelt’s emergence reflects a new regional identity or something else.

Article

Many Asian American neighborhoods faced displacement after World War II because of urban renewal or redevelopment under the 1949 Housing Act. In the name of blight removal and slum clearance, this act allowed local elites to procure federal money to seize land designated as blighted, clear it of its structures, and sell this land to private developers—in the process displacing thousands of residents, small businesses, and community institutions. San Francisco’s Fillmore District, a multiracial neighborhood that housed the city’s largest Japanese American and African American communities, experienced this postwar redevelopment. Like many Asian American neighborhoods that shared space with other communities of color, the Fillmore formed at the intersection of class inequality and racism, and it was this intersection of structural factors that led to substandard urban conditions. Rather than recognize the root causes of urban decline, San Francisco urban and regional elites argued that the Fillmore was among the city’s most blighted neighborhoods and advocated for the neighborhood’s destruction in the name of the public good. They also targeted the Fillmore because their postwar plans for remaking the city’s political economy envisioned the Fillmore as (1) a space to house white-collar workers in the postwar economy and (2) an Asian-themed space for tourism that connected the city symbolically and economically to Japan, an important U.S. postwar ally. For over four decades, these elite-directed plans for the Fillmore displaced more than 20,000 residents in two phases, severely damaging the community. The Fillmore’s redevelopment, then, provides a window into other cases of redevelopment and aids further investigations of the connection between Asian Americans and urban crisis. It also sheds light on the deeper history of displacement in the Asian American experience and contextualizes contemporary gentrification in Asian American neighborhoods.

Article

During the American Civil War, Union and Confederate commanders made the capture and destruction of enemy cities a central feature of their military campaigns. They did so for two reasons. First, most mid-19th-century cities had factories, foundries, and warehouses within their borders, churning out and storing war materiel; military officials believed that if they interrupted or incapacitated the enemy’s ability to arm and clothe itself, the war would end. Second, it was believed that the widespread destruction of property—especially in major or capital cities—would also damage civilians’ morale, undermining their political convictions and decreasing their support for the war effort. Both Union and Confederate armies bombarded and burned cities with these goals in mind. Sometimes they fought battles on city streets, but more often Union troops initiated long-term sieges in order to capture Confederate cities and demoralize their inhabitants. Soldiers on both sides were motivated by vengeance when they set fire to city businesses and homes; these acts were controversial, as was defensive burning—the deliberate destruction of one’s own urban center in order to keep its war materiel out of the hands of the enemy. Urban destruction, particularly long-term sieges, took a psychological toll on (mostly southern) city residents. Many were wounded, lost property, or were forced to become refugees. Because of this, the destruction of cities during the American Civil War provoked widespread discussions about the nature of “civilized warfare” and the role that civilians played in military strategy. Both soldiers and civilians tried to make sense of the destruction of cities in writing, and also in illustrations and photographs; images in particular shaped both northern and southern memories of the war and its costs.

Article

The civil rights movement in the urban South transformed the political, economic, and cultural landscape of post–World War II America. Between 1955 and 1968, African Americans and their white allies relied on nonviolent direct action, political lobbying, litigation, and economic boycotts to dismantle the Jim Crow system. Many, though not all, of the movement’s most decisive political battles occurred in the cities of Montgomery and Birmingham, Alabama; Nashville and Memphis, Tennessee; Greensboro and Durham, North Carolina; and Atlanta, Georgia. In these and other urban centers, civil rights activists launched full-throttle campaigns against white supremacy, economic exploitation, and state-sanctioned violence against African Americans. Their fight for racial justice coincided with monumental changes in the urban South as the upsurge in federal spending in the region created unprecedented levels of economic prosperity in the newly forged “Sunbelt.” A dynamic and multifaceted movement that encompassed a wide range of political organizations and perspectives, the black freedom struggle proved successful in dismantling legal segregation. The passage of the Civil Rights Act of 1964 and the Voting Rights Act of 1965 expanded black southerners’ economic, political, and educational opportunities. And yet, many African Americans continued to struggle as they confronted not just the long-term effects of racial discrimination and exclusion but also the new challenges engendered by deindustrialization and urban renewal as well as entrenched patterns of racial segregation in the public-school system.