41-50 of 67 Results for: Economic History

Article

The Central Business District in American Cities  

Emily Remus

The central business district, often referred to as the “downtown,” was the economic nucleus of the American city in the 19th and 20th centuries. It stood at the core of urban commercial life, if not always the geographic center of the metropolis. Here was where the greatest number of offices, banks, stores, and service institutions were concentrated—and where land values and building heights reached their peaks. The central business district was also the most easily accessible point in a city, the place where public transit lines intersected and brought together masses of commuters from outlying as well as nearby neighborhoods. In the downtown, laborers, capitalists, shoppers, and tourists mingled together on bustling streets and sidewalks. Not all occupants enjoyed equal influence in the central business district. Still, as historian Jon C. Teaford explained in his classic study of American cities, the downtown was “the one bit of turf common to all,” the space where “the diverse ethnic, economic, and social strains of urban life were bound together, working, spending, speculating, and investing.” The central business district was not a static place. Boundaries shifted, expanding and contracting as the city grew and the economy evolved. So too did the primary land uses. Initially a multifunctional space where retail, wholesale, manufacturing, and financial institutions crowded together, the central business district became increasingly segmented along commercial lines in the 19th century. By the early 20th century, rising real estate prices and traffic congestion drove most manufacturing and processing operations to the periphery. Remaining behind in the city center were the bulk of the nation’s offices, stores, and service institutions. As suburban growth accelerated in the mid-20th century, many of these businesses also vacated the downtown, following the flow of middle-class, white families. Competition with the suburbs drained the central business district of much of its commercial vitality in the second half of the 20th century. It also inspired a variety of downtown revitalization schemes that tended to reinforce inequalities of race and class.

Article

Public Housing in Urban America  

D. Bradford Hunt

Public housing emerged during the New Deal as a progressive effort to end the scourge of dilapidated housing in American cities. Reformers argued that the private market had failed to provide decent, safe, and affordable housing, and they convinced Congress to provide deep subsidies to local housing authorities to build and manage modern, low-cost housing projects for the working poor. Well-intentioned but ultimately misguided policy decisions encouraged large-scale developments, concentrated poverty and youth, and starved public housing of needed resources. Further, the antipathy of private interests to public competition and the visceral resistance of white Americans to racial integration saddled public housing with many enemies and few friends. While residents often formed tight communities and fought for improvements, stigmatization and neglect undermined the success of many projects; a sizable fraction became disgraceful and tangible symbols of systemic racism toward the nation’s African American poor. Federal policy had few answers and retreated in the 1960s, eventually making a neoliberal turn to embrace public-private partnerships for delivering affordable housing. Housing vouchers and tax credits effectively displaced the federal public housing program. In the 1990s, the Clinton administration encouraged the demolition and rebuilding of troubled projects using vernacular “New Urbanist” designs to house “mixed-income” populations. Policy problems, political weakness, and an ideology of homeownership in the United States meant that a robust, public-centered program of housing for use rather than profit could not be sustained.

Article

Skyscrapers and Tall Buildings  

Elihu Rubin

The tall building—the most popular and conspicuous emblem of the modern American city—stands as an index of economic activity, civic aspirations, and urban development. Enmeshed in the history of American business practices and the maturation of corporate capitalism, the skyscraper is also a cultural icon that performs genuine symbolic functions. Whether viewed individually or arrayed in a “skyline,” tall buildings invite a focus on their spectacular or superlative aspects. Their patrons have searched for architectural symbols that would project a positive public image, yet the height and massing of skyscrapers were determined as much by prosaic financial calculations as by symbolic pretense. Historically, the production of tall buildings was linked to the broader flux of economic cycles, access to capital, land values, and regulatory frameworks that curbed the self-interests of individual builders in favor of public goods such as light and air. The tall building looms large for urban geographers seeking to chart the shifting terrain of the business district and for social historians of the city who examine the skyscraper’s gendered spaces and labor relations. If tall buildings provide one index of the urban and regional economy, they are also economic activities in and of themselves, linked to the growth of the professions required to plan, finance, design, construct, market, and manage these mammoth collective objects—all of which have vied for control over the ultimate result. Practitioners have debated the tall building’s external expression as the design challenge of the façade became more acute with the advent of the curtain wall attached to a steel frame, eventually dematerializing entirely into sheets of reflective glass. The tall building also reflects prevailing paradigms in urban design, from the retail arcades of 19th-century skyscrapers to the blank plazas of postwar corporate modernism.

Article

Service Economies and the American Postindustrial City, 1950–Present  

Patrick Vitale

In the seventy years since the end of World War II (1939–1945), postindustrialization—the exodus of manufacturing and growth of finance and services—has radically transformed the economy of North American cities. Metropolitan areas are increasingly home to transnational firms that administer dispersed production networks that span the world. A few major global centers host large banks that coordinate flows of finance capital necessary not only for production, but also increasingly for education, infrastructure, municipal government, housing, and nearly every other aspect of life. In cities of the global north, fewer workers produce goods and more produce information, entertainment, and experiences. Women have steadily entered the paid workforce, where they often do the feminized work of caring for children and the ill, cleaning homes, and preparing meals. Like the Gilded Age city, the postindustrial city creates immense social divisions, injustices, and inequalities: penthouses worth millions and rampant homelessness, fifty-dollar burgers and an epidemic of food insecurity, and unparalleled wealth and long-standing structural unemployment all exist side by side. The key features of the postindustrial service economy are the increased concentration of wealth, the development of a privileged and celebrated workforce of professionals, and an economic system reliant on hyperexploited service workers whose availability is conditioned by race, immigration status, and gender.

Article

Autoworkers and Their Unions  

Daniel Clark

Since the introduction of “Fordism” in the early 1910s, which emphasized technological improvements and maximizing productive efficiency, US autoworkers have struggled with repetitive, exhausting, often dangerous jobs. Yet beginning with Ford’s Five Dollar Day, introduced in 1914, auto jobs have also provided higher pay than most other wage work, attracting hundreds of thousands of people, especially to Detroit, Michigan, through the 1920s, and again from World War II until the mid-1950s. Successful unionization campaigns by the United Auto Workers (UAW) in the 1930s and early 1940s resulted in contracts that guaranteed particular wage increases, reduced the power of foremen, and created a process for resolving workplace conflicts. In the late 1940s and early 1950s, UAW president Walter Reuther negotiated generous medical benefits and pensions for autoworkers. The volatility of the auto industry, however, often brought layoffs that undermined economic security. By the 1950s, overproduction and automation contributed heavily to instability for autoworkers. The UAW officially supported racial and gender equality, but realities in auto plants and the makeup of union leadership often belied those principles. Beginning in the 1970s, US autoworkers faced disruptions caused by high oil prices, foreign competition, and outsourcing to Mexico. Contract concessions at unionized plants began in the late 1970s and continued into the 2000s. By the end of the 20th century, many American autoworkers did not belong to the UAW because they were employed by foreign automakers, which built factories in the United States and successfully opposed unionization. For good reason, autoworkers who survived the industry’s turbulence and were able to retire with guaranteed pensions and medical care look back fondly on all that they gained from working in the industry under UAW contracts. Countless others left auto work permanently, and often reluctantly, amid periodic mass layoffs and the continuous loss of jobs to automation.

Article

Foreign Economic Aid  

Jeffrey F. Taffet

In the first half of the 20th century, and more actively in the post–World War II period, the United States government used economic aid programs to advance its foreign policy interests. US policymakers generally believed that support for economic development in poorer countries would help create global stability, which would limit military threats and strengthen the global capitalist system. Aid was offered on a country-by-country basis to guide political development; its implementation reflected views about how humanity had advanced in richer countries and how it could and should similarly advance in poorer regions. Humanitarianism did play a role in driving US aid spending, but it was consistently secondary to political considerations. Overall, while funding varied over time, amounts spent were always substantial. Between 1946 and 2015, the United States offered almost $757 billion in economic assistance to countries around the world—$1.6 trillion in inflation-adjusted 2015 dollars. Assessing the impact of this spending is difficult; there has long been disagreement among scholars and politicians about how much economic growth, if any, resulted from aid spending and similar disputes about its utility in advancing US interests. Nevertheless, for most political leaders, even without solid evidence of successes, aid often seemed to be the best option for constructively engaging poorer countries and trying to create the kind of world in which the United States could be secure and prosperous.

Article

Globalization and the American City  

B. Alex Beasley

American cities have been transnational in nature since the first urban spaces emerged during the colonial period. Yet the specific shape of the relationship between American cities and the rest of the world has changed dramatically in the intervening years. In the mid-20th century, the increasing integration of the global economy within the American economy began to reshape US cities. In the Northeast and Midwest, the robust manufacturing firms and factories that had once sustained residents—and tax bases—left, first for the South and West, and then for cities and towns outside the United States, as capital grew more mobile and businesses sought lower wages and tax incentives elsewhere. That same global capital, combined with federal subsidies, created boomtowns in the once-rural South and West. Nationwide, city boosters began to pursue alternatives to heavy industry, once understood to be the undisputed guarantor of a healthy urban economy. Increasingly, US cities organized themselves around the service economy, both in high-end, white-collar sectors like finance, consulting, and education, and in low-end pink-collar and no-collar sectors like food service, hospitality, and health care. A new legal infrastructure related to immigration made US cities more racially, ethnically, and linguistically diverse than ever before. At the same time, some US cities were agents of economic globalization themselves. Dubbed “global cities” by celebrants and critics of the new economy alike, these cities achieved power and prestige in the late 20th century not only because they had survived the ruptures of globalization but because they helped to determine its shape. By the end of the 20th century, cities not routinely listed among the “global city” elite jockeyed to claim “world-class” status, investing in high-end art, entertainment, technology, education, and health care amenities to attract and retain the high-income white-collar workers understood to be the last hope for cities hollowed out by deindustrialization and global competition. Today, the extreme differences between “global cities” and the rest of US cities, and the extreme socioeconomic stratification seen in cities of all stripes, are a key concern of urbanists.

Article

Railroads in US History  

Albert Churella

Since the early 1800s railroads have served as a critical element of the transportation infrastructure in the United States and have generated profound changes in technology, finance, business-government relations, and labor policy. By the 1850s railroads, at least in the northern states, had evolved into the nation’s first big businesses, replete with managerial hierarchies that in many respects resembled the structure of the US Army. After the Civil War ended, the railroad network grew rapidly, with lines extending into the Midwest and ultimately, with the completion of the first transcontinental railroad in 1869, to the Pacific Coast. The last third of the 19th century was characterized by increased militancy among railroad workers, as well as by the growing danger that railroading posed to employees and passengers. Intense competition among railroad companies led to rate wars and discriminatory pricing. The presence of rebates and long-haul/short-haul price differentials led to the federal regulation of the railroads in 1887. The Progressive Era generated additional regulation that reduced profitability and discouraged additional investment in the railroads. As a result, the carriers were often unprepared for the traffic demands associated with World War I, leading to government operation of the railroads between 1917 and 1920. Highway competition during the 1920s and the economic crises of the 1930s provided further challenges for the railroads. The nation’s railroads performed well during World War II but declined steadily in the years that followed. High labor costs, excessive regulatory oversight, and the loss of freight and passenger traffic to cars, trucks, and airplanes ensured that by the 1960s many once-profitable companies were on the verge of bankruptcy. A wave of mergers failed to halt the downward slide. The bankruptcy of Penn Central in 1970 increased public awareness of the dire circumstances and led to calls for regulatory reform. The 1980 Staggers Act abolished most of the restrictions on operations and pricing, thus revitalizing the railroads.

Article

DDT and Pesticides  

Frederick Rowe Davis

The history of DDT and pesticides in America is overshadowed by four broad myths. The first myth suggests that DDT was the first insecticide deployed widely by American farmers. The second indicates that DDT was the most toxic pesticide to wildlife and humans alike. The third myth assumes that Rachel Carson’s Silent Spring (1962) was an exposé of the problems of DDT rather than a broad indictment of American dependency on chemical insecticides. The fourth and final myth reassures Americans that the ban on DDT late in 1972 resolved the pesticide paradox in America. Over the course of the 20th century, agricultural chemists developed insecticides from plants with insecticidal properties (“botanical” insecticides) and from a range of chemicals including heavy metals such as lead and arsenic, chlorinated hydrocarbons like DDT, and organophosphates like parathion. All of the synthetic insecticides carried profound unintended consequences for landscapes and wildlife alike. More recently, chemists have returned to nature and developed chemical analogs of the botanical insecticides, first with the synthetic pyrethroids and now with the neonicotinoids. Despite their recent introduction, the neonicotinoids have become widely used in agriculture, and there are suspicions that these chemicals contribute to declines in bees and grassland birds.

Article

The Car and the City  

David Blanke

The relationship between the car and the city remains complex and involves numerous private and public forces, innovations in technology, global economic fluctuations, and shifting cultural attitudes that only rarely consider the efficiency of the automobile as a long-term solution to urban transit. The advantages of privacy, speed, ease of access, and personal enjoyment that first led many to embrace the automobile were soon taken up and accentuated by transit planners, who saw the car as the surest means to realize the long-held ideals of urban beautification, efficiency, and accessible suburbanization. The remarkable gains in productivity provided by industrial capitalism brought these dreams within reach, and individual car ownership became the norm for most American families by the middle of the 20th century. Ironically, the success in creating such a “car country” produced the conditions that again congested traffic, raised questions about the quality of urban (and now suburban) living, and further distanced the nation from alternative transit options. The “hidden costs” of postwar automotive dependency in the United States became more apparent in the late 1960s, leading to federal legislation compelling manufacturers and transit professionals to address the long-standing inefficiencies of the car. This most recent phase coincides with a broader reappraisal of life in the city and a growing recognition of the material limits to mass automobility.