11-20 of 66 Results for: Economic History

Article

Guest Workers in U.S. History  

David Griffith

Guest workers have been part of the economic and cultural landscapes of the United States since the founding of republics across the Americas, evolving from indentured servants to colonial subjects and, later, to foreign nationals imported under a variety of intergovernmental agreements and U.S. visas. Guest worker programs became institutionalized with the Bracero Program with Mexico, which ran from 1942 to 1964, and with the British West Indies Temporary Alien Labor Program, which began in 1943. Both of these programs were established under the Emergency Farm Labor Supply Program to address real and perceived labor shortages in agriculture during World War II. Both were structurally similar to programs used to import colonial subjects, primarily Puerto Ricans, for U.S. agriculture. Although the U.S. Departments of Labor and Agriculture oversaw the operation of the programs during the war, control over guest workers’ labor and the conditions of their employment increasingly became the responsibility of their employers and employer associations following the war. Nevertheless, U.S. government support for guest worker programs has been steady, if uneven, since the 1940s, and most new legislation addressing immigration reform has included some sort of guest worker provision. Under the Immigration Reform and Control Act of 1986, for example, H-2A and H-2B visas were created to import workers, primarily from Latin America and the Caribbean, for low-wage seasonal work in agricultural (H-2A) and non-agricultural (H-2B) employment. In the Immigration Act of 1990, H-1B visas were created to import guest workers, primarily from India and China, for work in computer programming, higher education, and other skilled occupations. Although an unknown portion of the guest worker labor force resists the terms of their employment and slips into the shadow economy as undocumented immigrants, the number of legal guest workers in the United States has continued to increase into the 21st century.

Article

Native American Captivity and Slavery in North America, 1492–1848  

Ann Little

The capture, adoption, and/or enslavement of enemies in North American warfare long predated the European invasion of the 16th century. In every region and among nearly every nation of Native North America, captive-taking continued after the arrival of the Spanish, English, and French and accelerated in the 18th century as a result of the opportunities and pressures that colonialism brought to bear on Indigenous peoples. Although the famous narratives of Indian captivity were written by people of European descent, the majority of people who were taken and adopted or enslaved by Native Americans were themselves Native American women, girls, and boys. One scholar estimates that perhaps as many as 2.5 to 5 million Indigenous slaves were owned by Europeans in the Western Hemisphere from 1492 to 1900; this estimate excludes the millions more who were retained within other Indigenous communities. Within these Native American communities, captives served a variety of purposes along a continuum: depending on their age and sex, they might be adopted fully into a new kinship network, or they might be ritually executed. Most captive adults seem to have endured fates in between these dramatic poles: they might be marked as “adopted slaves” and set to the most tedious and repetitive work; they might be traded or given as gifts for profit or diplomacy; they might be subjected to coerced sex; or they might marry a captor and have children who were full kin members of their new community. Most would probably experience more than one of these fates. In the early 21st century, important scholarship on Native American captivity has emphasized its similarities to African slavery and how the African slave trade influenced Native American captive raiding, trading, and enslavement in the colonial era and in the early United States. But there were two important, possibly interrelated differences between these two slaveries. First, unlike the adult male African captives who were preferred by Europeans for enslavement in North America, most captives taken by other Native Americans were women and children. Second, this Indigenous slavery was not heritable, although the captives themselves were frequently marked or even mutilated to signify their status as outsiders, or not-kin, in a world defined by kinship ties. Although the differences of intersecting European and Indigenous cultures, chronology, and context made for widely disparate experiences in Indian captivity and slavery over four centuries, one constant across time and space is that captive-taking seems to have been intended to grow the captors’ populations as well as to deprive their enemies of productive and reproductive labor. The appropriation of girls’ and women’s sexuality and reproductive power became the means by which female captives might suffer intensely as well as possibly improve their standing and their children’s futures.

Article

Death and Dying in the Working Class  

Michael K. Rosenow

In the broader field of thanatology, scholars investigate rituals of dying, attitudes toward death, evolving trajectories of life expectancy, and more. Applying a lens of social class means studying similar themes but focusing on the men, women, and children who worked for wages in the United States. Working people were more likely to die from workplace accidents, occupational diseases, or episodes of work-related violence. In most periods of American history, it was more dangerous to be a wage worker than it was to be a soldier. The battleground was not just the shop floor but also the terrain of labor relations. American labor history has been filled with violent encounters between workers asserting their views of economic justice and employers defending their private property rights. These clashes frequently turned deadly. Labor unions and working-class communities extended an ethos of mutualism and solidarity from the union halls and picket lines to memorial services and gravesites. They lauded martyrs to movements for human dignity and erected monuments to honor the fallen. Aspects of ethnicity, race, and gender added layers of meaning that intersected with and refracted through individuals’ economic positions. Workers’ encounters with death and the way they made sense of loss and sacrifice in some ways overlapped with those of Americans from other social classes in terms of religious custom, ritual practice, and material consumption. Their experiences were not entirely unique but diverged in significant ways.

Article

The Economy of Colonial British America  

Aaron Slater

Identifying and analyzing a unified system called the “economy of colonial British America” presents a number of challenges. The regions that came to constitute Britain’s North American empire developed according to a variety of factors, including climate and environment, relations with Native peoples, international competition and conflict, internal English/British politics, and the social system and cultural outlook of the various groups that settled each colony. Nevertheless, while there was great diversity in socioeconomic organization across colonial British America, a few generalizations can be made. First, each region initially focused economic activity on some form of export-oriented production that tied it to the metropole. New England specialized in timber, fish, and shipping services; the Middle Colonies in furs, grains, and foodstuffs; the Chesapeake in tobacco; the South in rice, indigo, and hides; and the West Indies in sugar. Second, the maturation of the export-driven economy in each colony eventually spurred the development of an internal economy directed toward providing the ancillary goods and services necessary to promote the export trade. Third, despite variations within and across colonies, colonial British America underwent more rapid economic expansion over the course of the 17th and 18th centuries than did its European counterparts, to the point that, on the eve of the American Revolution, white settlers in British America enjoyed one of the highest living standards in the world. A final commonality was that this robust economic growth spurred an almost insatiable demand for land and labor. With the exception of the West Indies, where the Spanish had largely exterminated the Native inhabitants by the time the English arrived, frontier warfare was ubiquitous across British America, as land-hungry settlers invaded Indian territory and expropriated Native lands. The labor problem, while also ubiquitous, showed much greater regional variation. New England and the Middle Colonies largely supplied their labor needs through a combination of family immigration, natural increase, and the importation of bound European workers known as indentured servants. The Chesapeake, Carolina, and West Indian colonies, on the other hand, developed “slave societies,” where captive peoples of African descent were imported in huge numbers and forced to serve as enslaved laborers on colonial plantations. Despite these differences, it should be emphasized that, by the outbreak of the American Revolution, the institution of slavery had, to a greater or lesser extent, insinuated itself into the economy of every British American colony. The expropriation of land from Indians and labor from enslaved Africans thus shaped the economic history of all the colonies of British America.

Article

Railroad Workers and Organized Labor  

Paul Michel Taillon

Railroad workers occupy a singular place in United States history. Working in the nation’s first “big businesses,” they numbered in the hundreds of thousands, came from a wide range of ethnic and racial groups, included both men and women, and performed a broad array of often esoteric tasks. As workers in an industry that shaped the nation’s financial, technological, and political-economic development, railroaders drove the leading edge of industrialization in the 19th century and played a central role in the nation’s economy for much of the 20th. With the legends of “steel-driving” John Henry and “Cannonball” Casey Jones, railroad workers entered the national folklore as Americans pondered the benefits and costs of progress in an industrial age. Those tales highlighted the glamor and rewards, the risks and disparities, and the gender-exclusive and racially hierarchical nature of railroad work. They also offer insight into the character of railroad unionism, which, from its beginnings in the 1860s, was oriented toward craft-based, male-only, white-supremacist forms of organization. Those unions remained fragmented, but they also became some of the most powerful in the US labor movement, leveraging their members’ strategic location in a central infrastructural industry, especially those who operated the trains. That strategic location also ensured that any form of collective organization—and therefore potential disruption of the national economy—would lead to significant state intervention. Thus, the epic railroad labor conflict of the late 19th century generated the first federal labor relations laws in US history, which in turn set important precedents for 20th-century national labor relations policy. At the same time, the industry nurtured the first national all-Black, civil-rights-oriented unions, which played crucial roles in the 20th-century African American freedom struggle. By the mid-20th century, however, with technological change and the railroads entering a period of decline, the numbers of railroad workers diminished, and with them their once-powerful unions.

Article

The History of Route 66  

Stephen Mandrgoc and David Dunaway

From its creation in 1926 to its formal decommissioning in 1985, US Highway 66, or Route 66, came to occupy a special place in the American imagination. For a half-century and more, it symbolized American individualism, travel, and the freedom of the open road amid the transformative rise of America’s automobile culture. Route 66 was an essential connection between the Midwest and the West for American commercial, military, and civilian transportation. It chained together small towns and cities across the nation as America’s “Main Street.” Following the path of older trails and railroads, Route 66 hosted travelers in many different eras: the adventurous motorist in his Ford Model A in the 1920s, the Arkies and Okies desperate for a new start in California in the 1930s, trucks carrying wartime soldiers and supplies in the 1940s, and postwar tourists and travelers from the 1950s onward. By its nature, it brought together the diverse cultures of different regions, introducing Americans to the “others” who were their regional neighbors and exposing travelers to new arts, music, foods, and traditions. As America’s most famous road, it became firmly embedded in pop culture through songs, books, television, and advertisements for its attractions. Travel on Highway 66 steadily declined with the development of controlled-access interstate highways in the 1960s and 1970s. The towns and cities it connected and the many businesses and attractions dependent on its traffic and tourism protested the removal of the highway designation by the US Transportation Department in 1985, but their efforts failed. Nonetheless, revivalists who treasured the old road worked to preserve the road sections and attractions that remained, founded a wide variety of organizations, and donated to museums and libraries to preserve Route 66 ephemera. In the early 21st century, Route 66 is an international icon of America, traveled by fans from all over the world.

Article

The Late-19th-Century Economy  

Sean Adams

The United States underwent massive economic change in the four decades following the end of the American Civil War in 1865. A vibrant industrial economy catapulted the nation to world leadership in mining and manufacturing; the agricultural sector overcame organizational and technological challenges to increase productivity; and innovations in financial, accounting, and marketing methods laid the foundation for a powerful economy that would dominate the globe in the 20th century. The emergence of this economy, however, did not come without challenges. Workers in both the industrial and agricultural sectors offered an alternative path for the American economy in the form of labor strikes and populist reforms; their attempts to disrupt the growing concentration of wealth and power played out both at the polls and on the factory floor. Movements that sought to regulate the growth of large industrial firms and railroads failed to produce much meaningful policy, even as they raised major critiques of the emerging economic order. In the end, a form of industrial capitalism emerged that used large corporate structures, relatively weak unions, and limited government interventions to build a dynamic but unbalanced economic order in the United States.

Article

Financial Crises in American History  

Christoph Nitschke and Mark Rose

U.S. history is full of frequent and often devastating financial crises. They have coincided with business cycle downturns, but they have been rooted in the political design of markets. Financial crises have also drawn on changes in the underpinning cultures, knowledge systems, and ideologies of marketplace transactions. The United States’ political and economic development spawned, guided, and modified general factors in crisis causation. Broadly viewed, the reasons for financial crises have been recurrent in their form but historically specific in their configuration: causation has always revolved around relatively sudden reversals of investor perceptions of commercial growth, stock market gains, monetary availability, currency stability, and political predictability. The United States’ 19th-century financial crises, which happened in rapid succession, are best described as disturbances tied to market making, nation building, and empire creation. Ongoing changes in America’s financial system aided rapid national growth through the efficient distribution of credit to a spatially and organizationally changing economy. But complex political processes—whether Western expansion, the development of incorporation laws, or the nation’s foreign relations—also underlay the easy availability of credit. The relationship between systemic instability and politically enacted ideas and ideals of economic growth was then mirrored in the 20th century. Following the “Golden Age” of crash-free capitalism in the two decades after the Second World War, the recurrence of financial crises in American history coincided with the dominance of the market in statecraft. Banking and other crises were a product of political economy. The Global Financial Crisis of 2007–2008 not only once again changed the regulatory environment in an attempt to correct past mistakes but also considerably broadened the scholarly discussion of financial crises.

Article

The Information Economy  

Jamie L. Pietruska

The term “information economy” first came into widespread usage during the 1960s and 1970s to identify a major transformation in the postwar American economy in which manufacturing had been eclipsed by the production and management of information. However, the information economy first identified in the mid-20th century was one of many information economies that have been central to American industrialization, business, and capitalism for over two centuries. The emergence of information economies can be understood in two ways: as a continuous process in which information itself became a commodity, as well as an uneven and contested—not inevitable—process in which economic life became dependent on various forms of information. The production, circulation, and commodification of information have historically been essential to the growth of American capitalism and to creating and perpetuating—and at times resisting—structural racial, gender, and class inequities in the American economy and society. Yet information economies, while uneven and contested, also became more bureaucratized, quantified, and commodified from the 18th century to the 21st century. The history of information economies in the United States is also characterized by the importance of systems, networks, and infrastructures that link people, information, capital, commodities, markets, bureaucracies, technologies, ideas, expertise, laws, and ideologies. The materiality of information economies is historically inextricable from the production of knowledge about the economy, and the concepts of “information” and “economy” are themselves historical constructs that change over time. The history of information economies is not a teleological story of progress in which increasing bureaucratic rationality, efficiency, predictability, and profit inevitably led to the 21st-century age of Big Data. Nor is it the story of a single, coherent, uniform information economy. The creation of multiple information economies—at different scales in different regions—was a contingent, contested, often inequitable process that did not automatically democratize access to objective information.

Article

United States Financial History  

Christy Ford Chapin

The history of US finance—spanning from the republic’s founding through the 2007–2008 financial crisis—exhibits two primary themes. The first theme is that Americans have frequently expressed suspicion of financiers and bankers. This abiding distrust has generated ferocious political debates through which voters either have opposed government policies that empower financial interests or have advocated proposals to steer financial institutions toward serving the public. A second, related theme that emerges from this history is that government policy—both state and federal—has shaped and reshaped financial markets. This feature follows the pattern of American capitalism, which, rather than appearing as laissez-faire market competition, materializes as interactions between government and private enterprise that structure each economic sector in a distinctive manner. International comparison illustrates this premise. Because state and federal policies produced a highly splintered commercial banking sector that discouraged the development of large, consolidated banks, American big business has frequently had to rely on securities financing. This shareholder model creates a different corporate form than a commercial-bank model. In Germany, for example, large banks often provide firms with financing as well as business consulting and management strategy services. In this commercial-bank model, German business executives cede some autonomy to bankers but also have more ability to engage in long-term planning than do American executives, who tend to cater to short-term stock market demands. Under the banner of the public–private financial system, two subthemes appear: fragmented institutional arrangements and welfare programming. Because of government policy, the United States, compared to other Western nations, has an unusually fragmented financial system. Adding to this complexity, some of the nation’s financial institutions can be either state or federally chartered; meanwhile, the commercial banking sector has traditionally hosted thousands of banks, ranging from urban money-center institutions to small unit banks. Space constraints exclude examination of numerous additional organizations, such as venture capital firms, hedge funds, securities brokers, mutual funds, real estate investment trusts, and mortgage brokers. The US regulatory framework reflects this fragmentation, as a bevy of federal and state agencies supervise the financial sector. Since policymakers passed deregulatory measures during the 1980s and 1990s, the sector has moved toward consolidation and universal banking, which permits a large assortment of financial services to coexist under one institutional umbrella. Nevertheless, the US financial sector continues to be more fragmented than those of other industrialized countries. The public–private financial system has also delivered many government benefits, revealing that the American welfare state is perhaps more robust than scholars often claim. Welfare programming through financial policy tends to be “hidden,” frequently because significant portions of benefits provision reside “off the books,” either as government-sponsored enterprises that are nominally private or as government guarantees in place of direct spending. Yet these programs have heavily affected both their beneficiaries and the nation’s economy. The government, for example, has directed significant resources toward the construction and maintenance of a massive farm credit system.
Moreover, policymakers established mortgage insurance and residential financing programs, creating an economy and consumer culture that revolve around home ownership. While both agricultural and mortgage programs have helped low-income beneficiaries, they have dispensed more aid to middle-class and corporate recipients. These programs, along with the institutional configuration of the banking and credit system, demonstrate just how important US financial policy has been to the nation’s unfolding history.