31-40 of 66 Results for: Economic History

Article

Banking and Finance from the Revolution to the Civil War  

Sharon Ann Murphy

In creating a new nation, the United States also had to create a financial system from scratch. During the period from the Revolution to the Civil War, the country experimented with numerous options. Although the Constitution deliberately banned the issuance of paper money by either Congress or the states, states indirectly reclaimed this power by incorporating state-chartered banks with the ability to print banknotes. These provided Americans with a medium of exchange to facilitate trade and an expansionary money supply to meet the economic needs of a growing nation. The federal government likewise entered into the world of money and finance with the incorporation of the First and Second Banks of the United States. Not only did critics challenge the constitutionality of these banks, but contemporaries likewise debated whether any banking institutions promoted the economic welfare of the nation or if they instead introduced unnecessary instability into the economy. These debates became particularly heated during moments of crisis. Periods of war, including the Revolutionary War, the War of 1812, and the Civil War, highlighted the necessity of a robust financial system to support the military effort, while periods of economic panic such as the Panic of 1819, the Panics of 1837 and 1839, and the Panic of 1857 drew attention to the weaknesses inherent in this decentralized, largely unregulated system. Whereas Andrew Jackson succeeded in destroying the Second Bank of the United States during the Bank War, state-chartered commercial banks, savings banks, and investment banks still multiplied rapidly throughout the period. Numerous states introduced regulations intended to control the worst excesses of these banks, but the most comprehensive legislation occurred with the federal government’s Civil War-era Banking Acts, which created the first uniform currency for the nation.

Article

Anti-capitalist Thought and Utopian Alternatives in America  

Howard Brick

Utopia—the term derived from Thomas More’s 1516 volume by that name—always suggested a place that was both non-existent, a product of the imagination usually depicted fictionally as far distant in time or space, and better than the real and familiar world. In modern times, it has served as a mode of anti-capitalist critique and also, despite its supposed “unreality,” as a disposition joined to actual social movements for dramatic reform. Utopian alternatives to American capitalism, both in the sense of literary works projecting visions of ideal social relations and in real efforts to establish viable communitarian settlements, have long been a significant part of the nation’s cultural and political history. In the 1840s, American followers of the French “utopian socialist” Charles Fourier established dozens of communities based at least in part on Fourier’s principles, and those principles filtered down to the world’s most influential modern utopian novel, Edward Bellamy’s Looking Backward of 1888. Utopian community-building and the writing of anti-capitalist utopian texts surged and declined in successive waves from the 19th to the 21st century, and while the recent surges have never equaled the impact of Fourierism or Bellamy, the appeal of the utopian imagination has surfaced again since the Great Recession of 2008 provoked new doubts about the viability or justice of capitalist economic and social relations.

Article

Labor and US Foreign Relations  

Elizabeth McKillen

American workers have often been characterized by the press, scholars, and policy-makers as apathetic and ill-informed about foreign policy issues. To highlight this point, scholars have frequently used an anecdote about a blue-collar worker who responded to an interviewer’s questions regarding international issues in the 1940s by exclaiming “Foreign Affairs! That’s for people who don’t have to work for a living.” Yet missing from many such appraisals is a consideration of the long history of efforts by both informal groups of workers and labor unions to articulate and defend the perceived international interests of American workers. During the early years of the American Republic, groups of workers used crowd actions, boycotts, and protests to make their views on important foreign policy issues known. In the late 19th century, emerging national labor unions experimented with interest group lobbying as well as forms of collective action championed by the international labor movement to promote working-class foreign policy interests. Many 20th- and 21st-century US labor groups shared a belief that government leaders failed to adequately understand the international concerns and perspectives of workers. Yet such groups often pursued different types of foreign policy influence. Some dominant labor organizations, such as the American Federation of Labor (AFL) and Congress of Industrial Organizations (CIO), participated in federal bureaucracies, advisory councils, and diplomatic missions and programs designed to encourage collaboration among business, state, and labor leaders in formulating and promoting US foreign policy. Yet other labor groups, as well as dissidents within the AFL and CIO, argued that these power-sharing arrangements compromised labor’s independence and led some trade union leaders to support policies that actually hurt both American and foreign workers. Particularly important in fueling internal opposition to AFL-CIO foreign policies were immigrant workers and those with specific ethno-racial concerns. Some dissenting groups and activists participated in traditional forms of interest group lobbying in order to promote an independent international agenda for labor; others committed themselves to the foreign policy programs of socialist, labor, or communist parties. Still others, such as the Industrial Workers of the World, advocated strike and international economic actions by workers to influence US foreign policy or to oppose US business activities abroad.

Article

Women in Early American Economy  

Jane T. Merritt

From the planter societies and subsistence settlements of the 17th century to the global markets of the late 18th century, white, black, and Indian women participated extensively in the early American economy. As the colonial world gave way to an independent nation and household economies yielded to cross-Atlantic commercial networks, women played an important role as consumers and producers. Was there, however, a growing gendered divide in the American economy by the turn of the 19th century? Were there more restrictions on women’s business activities, property ownership, work lives, consumer demands, or productive skills? Possibly, we ask the wrong questions when exploring women’s history. By posing questions that compare the past with present conditions, we miss the more nuanced and shifting patterns that made up the variety of women’s lives. Whether rural or urban, rich or poor, free or enslaved, women’s legal and marital status dictated some basic parameters of how they operated within the early American economy. But despite these boundaries, or perhaps because of them, women created new strategies to meet the economic needs of households, families, and themselves. As entrepreneurs they brought in lodgers or operated small businesses that generated extra income. As producers they finagled the materials necessary to create items for home use and to sell at market. As consumers, women, whether free or enslaved, demanded goods from merchants and negotiated prices that fit their budgets. As laborers, these same women translated myriad skills into wages or exchanged labor for goods. In all these capacities, women calculated, accumulated, and survived in the early American economy.

Article

Civilian Nuclear Power  

Daniel Pope

Nuclear power in the United States has had an uneven history and faces an uncertain future. Promising in the 1950s electricity “too cheap to meter,” nuclear power has failed to come close to that goal, although it has carved out approximately a 20 percent share of American electrical output. Two decades after World War II, General Electric and Westinghouse offered electric utilities completed “turnkey” plants at a fixed cost, hoping these “loss leaders” would create a demand for further projects. During the 1970s the industry boomed, but it also brought forth a large-scale protest movement. Since then, partly because of that movement and because of the drama of the 1979 Three Mile Island accident, nuclear power has plateaued, with only one reactor completed since 1995. Several factors account for the failed promise of nuclear energy. Civilian power has never fully shaken its military ancestry or its connotations of weaponry and warfare. American reactor designs borrowed from nuclear submarines. Concerns about weapons proliferation stymied industry hopes for breeder reactors that would produce plutonium as a byproduct. Federal regulatory agencies dealing with civilian nuclear energy also have military roles. Those connections have provided some advantages to the industry, but they have also generated fears. Not surprisingly, the “anti-nukes” movement of the 1970s and 1980s was closely bound to movements for peace and disarmament. The industry’s disappointments must also be understood in a wider energy context. Nuclear grew rapidly in the late 1960s and 1970s as domestic petroleum output shrank and environmental objections to coal came to the fore. At the same time, however, slowing economic growth and an emphasis on energy efficiency reduced demand for new power output. In the 21st century, new reactor designs and the perils of fossil-fuel-caused global warming have once again raised hopes for nuclear, but natural gas and renewables now compete favorably against new nuclear projects. Economic factors have been the main reason that nuclear has stalled in the last forty years. Highly capital intensive, nuclear projects have all too often taken too long to build and cost far more than initially forecast. The lack of standard plant designs, the need for expensive safety and security measures, and the inherent complexity of nuclear technology have all contributed to nuclear power’s inability to make its case on cost persuasively. Nevertheless, nuclear power may survive and even thrive if the nation commits to curtailing fossil fuel use or if, as the Trump administration proposes, it opts for subsidies to keep reactors operating.

Article

The History of Credit in America  

Rowena Olegario

The United States is a nation built on credit, both public and private. This article focuses on private credit: that is, credit extended to businesses and consumers by private entities such as banks, other businesses, and retail stores. Business credit involves short-term lending for items such as inventories, payroll, and the like; and long-term lending for the building of factories, offices, and other physical plant. Trade credit, bank loans, bonds, and commercial paper are all forms of business credit. Consumer credit is extended to individuals or households to fund purchases ranging from basic necessities to homes. Informal store credits, installment sales, personal loans from banks and other institutions, credit cards, home mortgages, and student loans are forms of consumer credit. Until the 20th century, the federal government remained mostly uninvolved in the private credit markets. Then, after World War I and especially during the Great Depression, the government deliberately expanded the credit available for certain targeted groups, such as farmers and home buyers. After World War II the government helped to expand lending even further, this time to small businesses and students. Mostly the government accomplished its goal not through lending directly but by insuring the loans made by private entities, thereby encouraging them to make more loans. In the case of home mortgages and student loans, the government took the lead in creating a national market for securitized debt—debt that is turned into securities, such as bonds, and offered to investors—through the establishment of government-sponsored enterprises, nicknamed Fannie Mae (1938), Ginnie Mae (1968), Freddie Mac (1970), and Sallie Mae (1972). Innovations such as these by businesses and government made credit increasingly available to ordinary people, whose attitudes toward borrowing changed accordingly.

Article

OPEC, International Oil, and the United States  

Gregory Brew

After World War II, the United States backed multinational private oil companies known as the “Seven Sisters”—five American companies (including Standard Oil of New Jersey and Texaco), one British (British Petroleum), and one Anglo-Dutch (Shell)—in their efforts to control Middle East oil and feed rising demand for oil products in the West. In 1960 oil-producing states in Latin America and the Middle East formed the Organization of the Petroleum Exporting Countries (OPEC) to protest what they regarded as the inequitable dominance of the private oil companies. Between 1969 and 1973 changing geopolitical and economic conditions shifted the balance of power from the Seven Sisters to OPEC. Following the first “oil shock” of 1973–1974, OPEC assumed control over the production and price of oil, ending the rule of the companies and humbling the United States, which suddenly found itself dependent upon OPEC for its energy security. Yet this dependence was complicated by a close relationship between the United States and major oil producers such as Saudi Arabia, which continued to adopt pro-US strategic positions even as they squeezed out the companies. Following the Iranian Revolution (1978–1979), the Iran–Iraq War (1980–1988), and the First Iraq War (1990–1991), the antagonism that colored US relations with OPEC evolved into a more comfortable, if wary, recognition of the new normal, where OPEC supplied the United States with crude oil while acknowledging the United States’ role in maintaining the security of the international energy system.

Article

The United States Department of Agriculture, 1900–1945  

Anne Effland

President Abraham Lincoln signed the law that established the Department of Agriculture in 1862, and in 1889 President Grover Cleveland signed the law that raised the Department to Cabinet status. Thus, by 1900 the US Department of Agriculture had been established for nearly four decades, had been a Cabinet-level department for one, and was recognized as a rising star among agricultural science institutions. Over the first half of the next century, the USDA would grow beyond its scientific research roots to assume a role in supporting rural and farm life more broadly, with a presence that reached across the nation. The Department acquired regulatory responsibilities in plant and animal health and food safety and quality, added research in farm management and agricultural economics, provided extension services to reach farms and rural communities in all regions, and created conservation and forestry programs to protect natural resources and prevent soil erosion and flooding across the geographical diversity of rural America. The Department gained additional responsibility for delivering credit, price supports, supply management, and rural rehabilitation programs during the severe economic depression that disrupted the agricultural economy and rural life from 1920 to 1940, while building efficient systems for encouraging production and facilitating distribution of food during the crises of World War I and World War II that bounded those decades. In the process, the Department became a pioneer in developing the regulatory state as well as in piloting programs and bureaucratic systems that empowered cooperative leadership at the federal, state, and local levels and democratic participation in implementing programs in local communities.

Article

Lobbying and Business Associations  

Benjamin C. Waterhouse

Political lobbying has always played a key role in American governance, but the concept of paid influence peddling has been marked by a persistent tension throughout the country’s history. On the one hand, lobbying represents a democratic process by which citizens maintain open access to government. On the other, the outsized clout of certain groups engenders corruption and perpetuates inequality. The practice of lobbying itself has reflected broader social, political, and economic changes, particularly in the scope of state power and the scale of business organization. During the Gilded Age, associational activity flourished and lobbying became increasingly the province of organized trade associations. By the early 20th century, a wide range of political reforms worked to counter the political influence of corporations. Even after the Great Depression and New Deal recast the administrative and regulatory role of the federal government, business associations remained the primary vehicle through which corporations and their designated lobbyists influenced government policy. By the 1970s, corporate lobbyists had become more effective and better organized, and trade associations spurred a broad-based political mobilization of business. Business lobbying expanded in the latter decades of the 20th century; while the number of companies with a lobbying presence leveled off in the 1980s and 1990s, the number of lobbyists per company increased steadily and corporate lobbyists grew increasingly professionalized. A series of high-profile political scandals involving lobbyists in 2005 and 2006 sparked another effort at regulation. Yet despite popular disapproval of lobbying and distaste for politicians, efforts to substantially curtail the activities of lobbyists and trade associations did not achieve significant success.

Article

The Central Business District in American Cities  

Emily Remus

The central business district, often referred to as the “downtown,” was the economic nucleus of the American city in the 19th and 20th centuries. It stood at the core of urban commercial life, if not always the geographic center of the metropolis. Here was where the greatest number of offices, banks, stores, and service institutions were concentrated—and where land values and building heights reached their peaks. The central business district was also the most easily accessible point in a city, the place where public transit lines intersected and brought together masses of commuters from outlying as well as nearby neighborhoods. In the downtown, laborers, capitalists, shoppers, and tourists mingled together on bustling streets and sidewalks. Not all occupants enjoyed equal influence in the central business district. Still, as historian Jon C. Teaford explained in his classic study of American cities, the downtown was “the one bit of turf common to all,” the space where “the diverse ethnic, economic, and social strains of urban life were bound together, working, spending, speculating, and investing.” The central business district was not a static place. Boundaries shifted, expanding and contracting as the city grew and the economy evolved. So too did the primary land uses. Initially a multifunctional space where retail, wholesale, manufacturing, and financial institutions crowded together, the central business district became increasingly segmented along commercial lines in the 19th century. By the early 20th century, rising real estate prices and traffic congestion drove most manufacturing and processing operations to the periphery. Remaining behind in the city center were the bulk of the nation’s offices, stores, and service institutions. As suburban growth accelerated in the mid-20th century, many of these businesses also vacated the downtown, following the flow of middle-class, white families. Competition with the suburbs drained the central business district of much of its commercial vitality in the second half of the 20th century. It also inspired a variety of downtown revitalization schemes that tended to reinforce inequalities of race and class.