1-20 of 67 Results for: Economic History

Article

American Food, Cooking, and Nutrition, 1900–1945  

Helen Zoe Veit

The first half of the 20th century saw extraordinary changes in the ways Americans produced, procured, cooked, and ate food. Exploding food production easily outstripped population growth in this era as intensive plant and animal breeding, the booming use of synthetic fertilizers and pesticides, and technological advances in farm equipment all resulted in dramatically greater yields on American farms. At the same time, a rapidly growing transportation network of refrigerated ships, railroads, and trucks hugely expanded the reach of different food crops and increased the variety of foods consumers across the country could buy, even as food imports from other countries soared. Meanwhile, new technologies, such as mechanical refrigeration, reliable industrial canning, and, by the end of the era, frozen foods, subtly encouraged Americans to eat less locally and seasonally than ever before. Yet as American food became more abundant and more affordable, diminishing want and suffering, it also contributed to new problems, especially rising body weights and mounting rates of cardiac disease. American taste preferences themselves changed throughout the era as more people came to expect stronger flavors, grew accustomed to the taste of industrially processed foods, and sampled so-called “foreign” foods, which played an enormous role in defining 20th-century American cuisine. Food marketing exploded, and food companies invested ever greater sums in print and radio advertising and eye-catching packaging. At home, a range of appliances made cooking easier, and modern grocery stores and increasing car ownership made it possible for Americans to shop for food less frequently. Home economics provided Americans, especially girls and women, with newly scientific and managerial approaches to cooking and home management, and Americans as a whole increasingly approached food through the lens of science. Virtually all areas related to food saw fundamental shifts in the first half of the 20th century, from agriculture to industrial processing, from nutrition science to weight-loss culture, from marketing to transportation, and from kitchen technology to cuisine. Not everything about food changed in this era, but the rapid pace of change probably magnified the sense of transformation for the many Americans who experienced it.

Article

Antebellum U.S. Labor Markets  

Joshua L. Rosenbloom

The United States economy underwent major transformations between American independence and the Civil War: rapid population growth, the development of manufacturing, the onset of modern economic growth, increasing urbanization, the rapid spread of settlement into the trans-Appalachian west, and the rise of European immigration. These decades were also characterized by increasing sectional conflict between free and slave states that culminated in 1861 in Southern secession from the Union and a bloody and destructive Civil War. Labor markets were central to each of these developments, directing the reallocation of labor between sectors and regions, channeling a growing population into productive employment, and shaping the growing North–South division within the country. Put differently, labor markets influenced the pace and character of economic development in the antebellum United States. On the one hand, the responsiveness of labor markets to economic shocks helped promote economic growth; on the other, imperfections in labor market responses to these shocks significantly affected the character and development of the national economy.

Article

Anti-capitalist Thought and Utopian Alternatives in America  

Howard Brick

Utopia—the term derived from Thomas More’s 1516 volume by that name—always suggested a place that was both non-existent, a product of the imagination usually depicted fictionally as far distant in time or space, and better than the real and familiar world. In modern times, it has served as a mode of anti-capitalist critique and also, despite its supposed “unreality,” as a disposition joined to actual social movements for dramatic reform. Utopian alternatives to American capitalism, both in the sense of literary works projecting visions of ideal social relations and in real efforts to establish viable communitarian settlements, have long been a significant part of the nation’s cultural and political history. In the 1840s, American followers of the French “utopian socialist” Charles Fourier established dozens of communities based at least in part on Fourier’s principles, and those principles filtered down to the world’s most influential modern utopian novel, Edward Bellamy’s Looking Backward of 1888. Utopian community-building and the writing of anti-capitalist utopian texts surged and declined in successive waves from the 19th to the 21st century, and while the recent surges have never equaled the impact of Fourierism or Bellamy, the appeal of the utopian imagination has again surfaced, since the Great Recession of 2008 provoked new doubts about the viability or justice of capitalist economic and social relations.

Article

Antimonopoly in American Politics, 1945–2000  

Daniel Scroop

Antimonopoly, meaning opposition to the exclusive or near-exclusive control of an industry or business by one or a very few firms, played a relatively muted role in the history of the post-1945 era, certainly compared to some earlier periods in American history. However, the subject of antimonopoly is important because it sheds light on changing attitudes toward concentrated power, corporations, and the federal government in the United States after World War II. Paradoxically, as antimonopoly declined as a grass-roots force in American politics, the technical, expert-driven field of antitrust enjoyed a golden age. From the 1940s to the 1960s, antitrust operated on principles that were broadly in line with those that inspired its creation in the late 19th and early 20th centuries, acknowledging the special contribution small-business owners made to US democratic culture. In these years, antimonopoly remained sufficiently potent as a political force to sustain the careers of national-level politicians such as congressmen Wright Patman and Estes Kefauver and to inform the opinions of Supreme Court justices such as Hugo Black and William O. Douglas. Antimonopoly and consumer politics overlapped in this period. From the mid-1960s onward, Ralph Nader repeatedly tapped antimonopoly ideas in his writings and consumer activism, skillfully exploiting popular anxieties about concentrated economic power. At the same time, as part of the United States’ rise to global hegemony, officials in the federal government’s Antitrust Division exported antitrust overseas, building it into the political, economic, and legal architecture of the postwar world. Beginning in the 1940s, conservative lawyers and economists launched a counterattack against the conception of antitrust elaborated in the progressive era. By making consumer welfare—understood in terms of low prices and market efficiency—the determining factor in antitrust cases, they made a major intellectual and political contribution to the rightward thrust of US politics in the 1970s and 1980s. Robert Bork’s The Antitrust Paradox, published in 1978, popularized this new approach and signaled its ascendancy. In the 1980s and 1990s antimonopoly drifted to the margin of political debate. Fear of big government now loomed larger in US politics than the specter of monopoly or of corporate domination. In the late 20th century, Americans, more often than not, directed their antipathy toward concentrated power in its public, rather than its private, forms. This fundamental shift in the political landscape accounts in large part for the overall decline of antimonopoly—a venerable American political tradition—in the period 1945 to 2000.

Article

Autoworkers and Their Unions  

Daniel Clark

Since the introduction of “Fordism” in the early 1910s, which emphasized technological improvements and maximizing productive efficiency, US autoworkers have struggled with repetitive, exhausting, often dangerous jobs. Yet beginning with Ford’s Five Dollar Day, introduced in 1914, auto jobs have also provided higher pay than most other wage work, attracting hundreds of thousands of people, especially to Detroit, Michigan, through the 1920s, and again from World War II until the mid-1950s. Successful unionization campaigns by the United Auto Workers (UAW) in the 1930s and early 1940s resulted in contracts that guaranteed particular wage increases, reduced the power of foremen, and created a process for resolving workplace conflicts. In the late 1940s and early 1950s UAW president Walter Reuther negotiated generous medical benefits and pensions for autoworkers. The volatility of the auto industry, however, often brought layoffs that undermined economic security. By the 1950s overproduction and automation contributed heavily to instability for autoworkers. The UAW officially supported racial and gender equality, but realities in auto plants and the makeup of union leadership often belied those principles. Beginning in the 1970s US autoworkers faced disruptions caused by high oil prices, foreign competition, and outsourcing to Mexico. Contract concessions at unionized plants began in the late 1970s and continued into the 2000s. By the end of the 20th century, many American autoworkers did not belong to the UAW because they were employed by foreign automakers, who built factories in the United States and successfully opposed unionization. For good reason, autoworkers who survived the industry’s turbulence and were able to retire with guaranteed pensions and medical care look back fondly on all that they gained from working in the industry under UAW contracts. Countless others left auto work permanently and often reluctantly in periodic massive layoffs and the continuous loss of jobs from automation.

Article

Baltimore  

David Schley

Baltimore, Maryland, rose to prominence in the late 18th century as a hub for the Atlantic wheat trade. A slave city in a slave state, Baltimore was home to the largest free Black community in antebellum America. Nineteenth-century Baltimore saw trend-setting experiments in railroading as well as frequent episodes of collective violence that left the city with the nickname “Mobtown”; one such riot, in 1861, led to the first bloodshed of the Civil War. After the war, Baltimore’s African American community waged organized campaigns to realize civil rights. Residential segregation—both de jure and de facto—posed a particular challenge. Initiatives in Baltimore such as a short-lived segregation ordinance and racial covenants in property deeds helped establish associations between race and property values that shaped federal housing policy during the New Deal. The African American population grew during World War II and strained against the limited housing available, prompting protests, often effective, against segregation. Nonetheless, suburbanization, deindustrialization, and redlining have left the city with challenging legacies to confront.

Article

Banking and Finance from the Revolution to the Civil War  

Sharon Ann Murphy

In creating a new nation, the United States also had to create a financial system from scratch. During the period from the Revolution to the Civil War, the country experimented with numerous options. Although the Constitution deliberately banned the issuance of paper money by either Congress or the states, states indirectly reclaimed this power by incorporating state-chartered banks with the ability to print banknotes. These provided Americans with a medium of exchange to facilitate trade and an expansionary money supply to meet the economic needs of a growing nation. The federal government likewise entered into the world of money and finance with the incorporation of the First and Second Banks of the United States. Not only did critics challenge the constitutionality of these banks, but contemporaries likewise debated whether any banking institutions promoted the economic welfare of the nation or if they instead introduced unnecessary instability into the economy. These debates became particularly heated during moments of crisis. Periods of war, including the Revolutionary War, the War of 1812, and the Civil War, highlighted the necessity of a robust financial system to support the military effort, while periods of economic panic such as the Panic of 1819, the Panics of 1837 and 1839, and the Panic of 1857 drew attention to the weaknesses inherent in this decentralized, largely unregulated system. Whereas Andrew Jackson succeeded in destroying the Second Bank of the United States during the Bank War, state-chartered commercial banks, savings banks, and investment banks still multiplied rapidly throughout the period. Numerous states introduced regulations intended to control the worst excesses of these banks, but the most comprehensive legislation occurred with the federal government’s Civil War-era Banking Acts, which created the first uniform currency for the nation.

Article

Bethlehem, Pennsylvania  

Chloe E. Taft

Bethlehem, Pennsylvania, a city of seventy-five thousand people in the Lehigh Valley, was founded on the traditional homelands of the Lenape in 1741 as a Moravian religious settlement. The Moravian community on the North Side of the Lehigh River was closed to outsiders and was characterized by orderly stone buildings and a communitarian economy. The settlement opened and expanded on the South Side of the river as an industrial epicenter beginning in the mid-19th century and was ultimately home to the headquarters of the Bethlehem Steel Corporation. By the late 1930s, the city’s 1,800-acre steel plant was ramping up to peak production with employment of more than thirty thousand. As Bethlehem Steel entered a long, slow decline after 1950, culminating in the plant’s closure in 1998, Bethlehem evolved into an archetype of a postindustrial city, drawing on its long history of heritage tourism and an increasingly diversified economy in healthcare, education, and distribution, among other sectors. The city’s population has roots in multiple waves of migration—the Germanic Moravians in the 18th century, throngs of European immigrants who arrived in the late 19th and early 20th centuries, and a Latino/a population that grew after World War II to represent an increasingly large share of residents. The city’s landscape, culture, and economy are imbued with a multifaceted history that is both deeply local and reflective of the city’s position since its founding as an important node in regional and global networks.

Article

Business in the Civil War: Trade, Markets, and Industry  

Mark R. Wilson

The Civil War disrupted domestic and international trade. Union strategy included a considerable focus on economic warfare, especially in the form of a naval blockade of Southern ports. Because the war lasted four years, its outcome was affected deeply by the success or failure of Union and Confederate economic mobilizations of capital, industry, and labor. On the home fronts, many businesses, large and small, confronted new challenges to their normal operations and supply chains. Generally speaking, businesses in the South suffered more than their Northern counterparts. Forced to deal with the consequences of the blockade, high inflation, and Union advances, many Southern farmers, merchants, and manufacturers struggled to keep afloat, especially after 1862. In the North, there was more wartime prosperity, thanks to a smoother economic mobilization and the Union’s ability to continue internal and international trade. But there was no single uniform experience: at the levels of specific industries and individual firms, the impact of the war varied widely. Clearly, the single biggest economic change—and political and social change, as well—was the end of slavery. Beyond that, the Civil War’s effects on long-run economic and industrial development were more complex and uncertain. The conflict’s heavy costs in blood and treasure harmed the North as well as the South, but many industries came out of the war in a strong enough position to allow the United States to continue on its path to becoming the world’s largest and most prosperous national economy by the end of the 19th century.

Article

Business Social Responsibility  

Gavin Benke

“Corporate social responsibility” is a term that first began to circulate widely in the late 1960s and early 1970s. Though it may seem to be a straightforward concept, the phrase can imply a range of activities, from minority hiring initiatives and environmentally sound operations, to funding local nonprofits and cultural institutions. The idea appeared to have developed amid increasing demands made of corporations by a number of different groups, such as the consumer movement. However, American business managers engaged in many of these practices well before that phrase was coined. As far back as the early 19th century, merchants and business owners envisioned a larger societal role. However, broader political, social, and economic developments, from the rise of Gilded Age corporations to the onset of the Cold War, significantly influenced understandings of business social responsibility. Likewise, different managers and corporations have had different motives for embracing social responsibility initiatives. Some embraced social responsibility rhetoric as a public relations tool. Others saw the concept as a way to prevent government regulation. Still others undertook social responsibility efforts because they fit well with their own socially progressive ethos. Though the terms and understandings of a business’s social responsibilities have shifted over time, the basic idea has been a perennial feature of commercial life in the United States.

Article

The Car and the City  

David Blanke

The relationship between the car and the city remains complex and involves numerous private and public forces, innovations in technology, global economic fluctuations, and shifting cultural attitudes that only rarely consider the efficiency of the automobile as a long-term solution to urban transit. The advantages of privacy, speed, ease of access, and personal enjoyment that led many to first embrace the automobile were soon shared and accentuated by transit planners as the surest means to realize the long-held ideals of urban beautification, efficiency, and accessible suburbanization. The remarkable gains in productivity provided by industrial capitalism brought these dreams within reach and individual car ownership became the norm for most American families by the middle of the 20th century. Ironically, the success in creating such a “car country” produced the conditions that again congested traffic, raised questions about the quality of urban (and now suburban) living, and further distanced the nation from alternative transit options. The “hidden costs” of postwar automotive dependency in the United States became more apparent in the late 1960s, leading to federal legislation compelling manufacturers and transit professionals to address the long-standing inefficiencies of the car. This most recent phase coincides with a broader reappraisal of life in the city and a growing recognition of the material limits to mass automobility.

Article

The Central Business District in American Cities  

Emily Remus

The central business district, often referred to as the “downtown,” was the economic nucleus of the American city in the 19th and 20th centuries. It stood at the core of urban commercial life, if not always the geographic center of the metropolis. Here was where the greatest number of offices, banks, stores, and service institutions were concentrated—and where land values and building heights reached their peaks. The central business district was also the most easily accessible point in a city, the place where public transit lines intersected and brought together masses of commuters from outlying as well as nearby neighborhoods. In the downtown, laborers, capitalists, shoppers, and tourists mingled together on bustling streets and sidewalks. Not all occupants enjoyed equal influence in the central business district. Still, as historian Jon C. Teaford explained in his classic study of American cities, the downtown was “the one bit of turf common to all,” the space where “the diverse ethnic, economic, and social strains of urban life were bound together, working, spending, speculating, and investing.” The central business district was not a static place. Boundaries shifted, expanding and contracting as the city grew and the economy evolved. So too did the primary land uses. Initially a multifunctional space where retail, wholesale, manufacturing, and financial institutions crowded together, the central business district became increasingly segmented along commercial lines in the 19th century. By the early 20th century, rising real estate prices and traffic congestion drove most manufacturing and processing operations to the periphery. Remaining behind in the city center were the bulk of the nation’s offices, stores, and service institutions. As suburban growth accelerated in the mid-20th century, many of these businesses also vacated the downtown, following the flow of middle-class, white families. Competition with the suburbs drained the central business district of much of its commercial vitality in the second half of the 20th century. It also inspired a variety of downtown revitalization schemes that tended to reinforce inequalities of race and class.

Article

Civilian Nuclear Power  

Daniel Pope

Nuclear power in the United States has had an uneven history and faces an uncertain future. Promising in the 1950s electricity “too cheap to meter,” nuclear power has failed to come close to that goal, although it has carved out approximately a 20 percent share of American electrical output. Two decades after World War II, General Electric and Westinghouse offered electric utilities completed “turnkey” plants at a fixed cost, hoping these “loss leaders” would create a demand for further projects. During the 1970s the industry boomed, but it also brought forth a large-scale protest movement. Since then, partly because of that movement and because of the drama of the 1979 Three Mile Island accident, nuclear power has plateaued, with only one reactor completed since 1995. Several factors account for the failed promise of nuclear energy. Civilian power has never fully shaken its military ancestry or its connotations of weaponry and warfare. American reactor designs borrowed from nuclear submarines. Concerns about weapons proliferation stymied industry hopes for breeder reactors that would produce plutonium as a byproduct. Federal regulatory agencies dealing with civilian nuclear energy also have military roles. Those connections have provided some advantages to the industry, but they have also generated fears. Not surprisingly, the “anti-nukes” movement of the 1970s and 1980s was closely bound to movements for peace and disarmament. The industry’s disappointments must also be understood in a wider energy context. Nuclear grew rapidly in the late 1960s and 1970s as domestic petroleum output shrank and environmental objections to coal came to the fore. At the same time, however, slowing economic growth and an emphasis on energy efficiency reduced demand for new power output. In the 21st century, new reactor designs and the perils of fossil-fuel-caused global warming have once again raised hopes for nuclear, but natural gas and renewables now compete favorably against new nuclear projects. Economic factors have been the main reason that nuclear has stalled in the last forty years. Highly capital intensive, nuclear projects have all too often taken too long to build and cost far more than initially forecast. The lack of standard plant designs, the need for expensive safety and security measures, and the inherent complexity of nuclear technology have all contributed to nuclear power’s inability to make its case on cost persuasively. Nevertheless, nuclear power may survive and even thrive if the nation commits to curtailing fossil fuel use or if, as the Trump administration proposes, it opts for subsidies to keep reactors operating.

Article

DDT and Pesticides  

Frederick Rowe Davis

The history of DDT and pesticides in America is overshadowed by four broad myths. The first myth suggests that DDT was the first insecticide deployed widely by American farmers. The second indicates that DDT was the most toxic pesticide to wildlife and humans alike. The third myth assumes that Rachel Carson’s Silent Spring (1962) was an exposé of the problems of DDT rather than a broad indictment of American dependency on chemical insecticides. The fourth and final myth reassures Americans that the ban on DDT late in 1972 resolved the pesticide paradox in America. Over the course of the 20th century, agricultural chemists have developed insecticides from plants with insecticidal properties (“botanical” insecticides) and a range of chemicals including heavy metals such as lead and arsenic, chlorinated hydrocarbons like DDT, and organophosphates like parathion. All of the synthetic insecticides carried profound unintended consequences for landscapes and wildlife alike. More recently, chemists have returned to nature and developed chemical analogs of the botanical insecticides, first with the synthetic pyrethroids and now with the neonicotinoids. Despite their recent introduction, neonics have become widely used in agriculture, and there are suspicions that these chemicals contribute to declines in bees and grassland birds.

Article

Death and Dying in the Working Class  

Michael K. Rosenow

In the broader field of thanatology, scholars investigate rituals of dying, attitudes toward death, evolving trajectories of life expectancy, and more. Applying a lens of social class means studying similar themes but focusing on the men, women, and children who worked for wages in the United States. Working people were more likely to die from workplace accidents, occupational diseases, or episodes of work-related violence. In most periods of American history, it was more dangerous to be a wage worker than it was to be a soldier. Battlegrounds were not just the shop floor but also the terrain of labor relations. American labor history has been filled with violent encounters between workers asserting their views of economic justice and employers defending their private property rights. These clashes frequently turned deadly. Labor unions and working-class communities extended an ethos of mutualism and solidarity from the union halls and picket lines to memorial services and gravesites. They lauded martyrs to movements for human dignity and erected monuments to honor the fallen. Aspects of ethnicity, race, and gender added layers of meaning that intersected with and refracted through individuals’ economic positions. Workers’ encounters with death and the way they made sense of loss and sacrifice in some ways overlapped with Americans from other social classes in terms of religious custom, ritual practice, and material consumption. Their experiences were not entirely unique but diverged in significant ways.

Article

Deindustrialization and the Postindustrial City, 1950–Present  

Chloe E. Taft

The process of urban deindustrialization has been long and uneven. Even the terms “deindustrial” and “postindustrial” are contested; most cities continue to host manufacturing on some scale. After World War II, however, cities that depended on manufacturing for their lifeblood increasingly diversified their economies in the face of larger global, political, and demographic transformations. Manufacturing centers in New England, the Mid-Atlantic, and the Midwest were soon identified as belonging to “the American Rust Belt.” Steel manufacturers, automakers, and other industrial behemoths that were once mainstays of city life closed their doors as factories and workers followed economic and social incentives to leave urban cores for the suburbs, the South, or foreign countries. Remaining industrial production became increasingly automated, resulting in significant declines in the number of factory jobs. Metropolitan officials faced with declining populations and tax bases responded by adapting their assets—in terms of workforce, location, or culture—to new economies, including warehousing and distribution, finance, health care, tourism, leisure industries like casinos, and privatized enterprises such as prisons. Faced with declining federal funding for renewal, they focused on leveraging private investment for redevelopment. Deindustrializing cities marketed themselves as destinations with convention centers, stadiums, and festival marketplaces, seeking to lure visitors and a “creative class” of new residents. While some postindustrial cities became success stories of reinvention, others struggled. They entertained options to “rightsize” by shrinking their municipal footprints, adapted vacant lots for urban agriculture, or attracted voyeurs to gaze at their industrial ruins. Whether industrial cities faced a slow transformation or the shock of multiple factory closures within a few years, the impact of these economic shifts and urban planning interventions both amplified old inequalities and created new ones.

Article

The Department Store  

Traci Parker

Department stores were the epicenter of American consumption and modernity from the late 19th century through the 20th. Between 1846 and 1860 store merchants and commercial impresarios remade dry goods stores and small apparel shops into department stores—downtown emporiums that departmentalized their vast inventories and offered copious services and amenities. Their ascendance corresponded with increased urbanization, immigration, industrialization, and the mass production of machine-made wares. Urbanization and industrialization also helped to birth a new White middle class, whose members were eager to spend their money on material comforts and leisure activities. And department stores provided them with a place where they could do so. Stores sold shoppers an astounding array of high-quality, stylish merchandise, including clothing, furniture, radios, sporting equipment, musical instruments, luggage, silverware, china, and books. They also provided an array of services and amenities, including public telephones, postal services, shopping assistance, free delivery, telephone-order and mail-order departments, barber shops, hair salons, hospitals and dental offices, radio departments, shoe-shining stands, wedding gift registries and wedding secretary services, tearooms, and restaurants. Stores enthroned consumption as the route to democracy and citizenship, inviting everybody—regardless of race, gender, age, or class—to enter, browse, and purchase material goods. They were major employers of white-collar workers and functioned as a new public space for women as workers and consumers. The 20th century brought rapid and significant changes and challenges. Department stores weathered economic crises; two world wars; new and intense competition from neighborhood, chain, and discount stores; and labor and civil rights protests that threatened to damage their image and displace them as the nation’s top retailers. They experienced cutbacks, consolidated services, and declining sales during the Great Depression, played an essential role in the war effort, and contended with the Office of Price Administration and the Emergency Price Control Act during the Second World War. In the postwar era, they opened branch locations in suburban neighborhoods where their preferred clientele—the White middle class—now resided, and shaped the development and proliferation of shopping centers. They hastened the decline of downtown shopping as a result. The last three decades of the 20th century witnessed a wave of department store closures, mergers, and acquisitions because of changing consumer behaviors, shifts in the retail landscape, and evolving market dynamics. Department stores would continue to suffer into the 21st century as online retailing exploded.

Article

Detroit  

Ryan S. Pettengill

From its earliest origins through the 21st century, Detroit was a capitalist venture that was tied to the global economy. Throughout the pre-Columbian period, Detroit served as a meeting point where a diverse confederation of Native Americans came together to conduct business and diplomacy. Later, the city became a contested territorial holding that the Western imperial powers of France, Spain, Great Britain, and the United States fought over, as it represented a critical gateway that opened up trade to the central and western regions of North America. Between 1835 and 1929, capitalists built wharfs, railroad lines, factories, warehouses, and other forms of industrial infrastructure, attracting throngs of working-class job seekers and causing Detroit’s population to boom from approximately 1,100 in 1819 to more than one million in 1930. The population peaked at nearly two million in 1950 and, by 2020, it had declined to approximately 700,000. Detroit’s history might be thought of in three distinct periods: a pre-Columbian period where the region consisted of a preindustrial space that was occupied by Anishinaabeg peoples, later to be claimed by European colonists; a long industrial era in which businessmen, such as Henry Ford, centralized production within the city; and a slow period of economic decline as the city struggled to adapt to different trends in a global economy. As Detroit entered the 21st century, the city faced a declining population, rising budget deficits, and a crumbling infrastructure. Still, as several multinational corporations based their operations out of Detroit, the city remained a capitalist venture fundamentally tied to the global economy.

Article

Economic Thought and Practice in America  

Christopher W. Calvo

The conspicuous timing of the publication of Adam Smith’s The Wealth of Nations and America’s Declaration of Independence, separated by only a few months in 1776, has attracted a great deal of historical attention. America’s revolution was in large part motivated by the desire to break free from British mercantilism and engage the principles, both material and ideological, found in Smith’s work. From 1776 to the present day, the preponderance of capitalism in American economic history and the influence of The Wealth of Nations in American intellectual culture have contributed to the conventional wisdom that America and Smith enjoy a special relationship. After all, no nation has consistently pursued the tenets of Smithian-inspired capitalism, namely free and competitive markets, a commitment to private property, and the pursuit of self-interest and profit, more than the United States. The shadow of Smith’s The Wealth of Nations looms large over America. But a closer look at American economic thought and practice demonstrates that Smith’s authority was not as dominant as the popular history assumes. Although most Americans accepted Smith’s work as the foundational text in political economy and extracted from it the cardinal principles of intellectual capitalism, its core values were twisted, turned, and fused together in contorted, sometimes contradictory fashions. American economic thought also reflects the widespread belief that the nation would trace an exceptional course, distinct from the Old World, and therefore necessitating a political economy suited to American traditions and expectations. Hybrid capitalist ideologies, although rooted in Smithian-inspired liberalism, developed within a dynamic domestic discourse that embraced ideological diversity and competing paradigms, exactly the kind expected from a new nation trying to understand its economic past, establish its present, and project its future. Likewise, American policymakers crafted legislation that brought the national economy both closer to and further from the Smithian ideal. Hybrid intellectual capitalism—a compounded ideological approach that antebellum American economic thinkers deployed to help rationalize the nation’s economic development—imitated the nation’s emergent hybrid material capitalism. Labor, commodity, and capital markets assumed amalgamated forms, combining, for instance, slave and free labor, private and public enterprises, and open and protected markets. Americans constructed different types of capitalism, reflecting a preference for mixtures of practical thought and policy that rarely conformed to strict ideological models. Historians of American economic thought and practice study capitalism as an evolutionary, dynamic institution with manifestations in traditional, expected corners, but historians also find capitalism demonstrated in unorthodox ways and practiced in obscure corners of market society that blended capitalist with non-capitalist experiences. In the 21st century, the benefits of incorporating conventional economic analysis with political, social, and cultural narratives are widely recognized. This has helped broaden scholars’ understanding of what exactly constitutes capitalism. In doing so, it has put the malleability of American economic thought and practice on full display, improving scholars’ appreciation for what remains the most significant material development in world history.

Article

The Economy of Colonial British America  

Aaron Slater

Identifying and analyzing a unified system called the “economy of colonial British America” presents a number of challenges. The regions that came to constitute Britain’s North American empire developed according to a variety of factors, including climate and environment, relations with Native peoples, international competition and conflict, internal English/British politics, and the social system and cultural outlook of the various groups that settled each colony. Nevertheless, while there was great diversity in the socioeconomic organization across colonial British America, a few generalizations can be made. First, each region initially focused economic activity on some form of export-oriented production that tied it to the metropole. New England specialized in timber, fish, and shipping services; the Middle Colonies in furs, grains, and foodstuffs; the Chesapeake in tobacco; the South in rice, indigo, and hides; and the West Indies in sugar. Second, the maturation of the export-driven economy in each colony eventually spurred the development of an internal economy directed toward providing the ancillary goods and services necessary to promote the export trade. Third, despite variations within and across colonies, colonial British America underwent more rapid economic expansion over the course of the 17th and 18th centuries than did its European counterparts, to the point that, on the eve of the American Revolution, white settlers in British America enjoyed one of the highest living standards in the world. A final commonality that all the regions shared was that this robust economic growth spurred an almost insatiable demand for land and labor. With the exception of the West Indies, where the Spanish had largely exterminated the Native inhabitants by the time the English arrived, frontier warfare was ubiquitous across British America, as land-hungry settlers invaded Indian territory and expropriated their lands. The labor problem, while also ubiquitous, showed much greater regional variation. The New England and Middle Colonies largely supplied their labor needs through a combination of family immigration, natural increase, and the importation of bound European workers known as indentured servants. The Chesapeake, Carolina, and West Indian colonies, on the other hand, developed “slave societies,” where captive peoples of African descent were imported in huge numbers and forced to serve as enslaved laborers on colonial plantations. Despite these differences, it should be emphasized that, by the outbreak of the American Revolution, the institution of slavery had, to a greater or lesser extent, insinuated itself into the economy of every British American colony. The expropriation of land from Indians and labor from enslaved Africans thus shaped the economic history of all the colonies of British America.