Article
Congo/Zaire-US Relations
Lise Namikas
At the dawn of the 20th century, the region that would become the Democratic Republic of Congo fell under the brutal colonial rule of Belgium’s King Leopold II. Except for a brief moment when anti-imperialists decried the crimes of plantation slavery, the United States paid little attention to Congo before 1960. But after winning its independence from Belgium in June 1960, Congo suddenly became engulfed in a crisis of decolonization and the Cold War, a time when the United States and the Soviet Union competed for resources and influence. The confrontation in Congo was kept limited by a United Nations (UN) peacekeeping force, which ended the secession of the province of Katanga in 1964. At the same time, the Central Intelligence Agency (CIA) intervened to help create a pro-Western government and eliminate the Congo’s first prime minister, Patrice Lumumba. Ironically, the result would be a growing reliance on the dictatorship of Joseph Mobutu throughout the 1980s. In 1997 a rebellion succeeded in toppling Mobutu from power. Since 2001, President Joseph Kabila has ruled Congo. The United States has supported long-term social and economic growth but has kept its distance while watching Kabila fight internal opponents and insurgents in the east. A UN peacekeeping force returned to Congo and helped limit unrest. Although his two full terms ended in 2016, Kabila was slow to call elections amid rising turmoil.
Article
The Constitution of the United States and Foreign Relations
Martin S. Flaherty
Foreign relations under the US Constitution start with the paradox, also seen in domestic matters, of relatively scant text providing guidance for the exercise of vast power. Founding understandings, structural inference, and ongoing constitutional custom and precedent have filled in much, though hardly all, of the framework over the course of two hundred years. As a result, two basic questions frame the relationship between the Constitution and US foreign policy: (1) which parts of the US government, alone or in combination, properly exercise authority in the making of foreign policy; and (2) once made, what is the status of the nation’s international legal obligations in the US domestic legal system.
The making of American foreign policy is framed by the Constitution’s commitment to separation of powers. Congress, the president, and the courts are all allocated discrete yet significant foreign affairs authority. Determining the exact borders and overlaps in areas such as the use of military force, emergency measures, and treaty termination continues to generate controversy. The status of international law in the US legal system in the first instance turns on whether resulting obligations derive from agreements or custom. The United States enters into international agreements in three ways: treaties, congressional-executive agreements, and sole executive agreements. Complex doctrine deals with the domestic applicability of treaties in particular. US courts primarily apply customary international law in two basic ways. They can exercise a version of their common lawmaking authority to fashion rules of decision based on international custom. They also apply customary international law when incorporated into domestic law by statute.
Article
Contagious Disease and Public Health in the American City
Daniel Wilson
Contagious diseases have long posed a public health challenge for cities, going back to the ancient world. Diseases traveled over trade routes from one city to another. Cities were also crowded and often dirty, ideal conditions for the transmission of infectious disease. The Europeans who settled North America quickly established cities, especially seaports, and contagious diseases soon followed. By the late 17th century, ports like Boston, New York, and Philadelphia experienced occasional epidemics, especially smallpox and yellow fever, usually introduced from incoming ships. Public health officials tried to prevent contagious diseases from entering the ports, most often by establishing a quarantine. These quarantines were occasionally effective, but more often the disease escaped into the cities. By the 18th century, city officials recognized an association between dirty cities and epidemic diseases. The appearance of a contagious disease usually occasioned a concerted effort to clean streets and remove garbage. By the early 19th century, these efforts gave rise to sanitary reform to prevent infectious diseases. Sanitary reform went beyond cleaning streets and removing garbage to ensuring clean water supplies and effective sewage removal. By the end of the century, sanitary reform had done much to clean the cities and reduce the incidence of contagious disease. In the 20th century, two new tools were added to the public health arsenal: vaccination and antibiotics. First used against smallpox, vaccination was extended to numerous other infectious viral diseases, substantially reducing their incidence. Finally, the development of antibiotics against bacterial infections in the mid-20th century enabled physicians to cure infected individuals. Contagious disease remains a problem—witness AIDS—and public health authorities still rely on quarantine, sanitary reform, vaccination, and antibiotics to keep urban populations healthy.
Article
The Context and Consequences of the 1986 Immigration Reform and Control Act (IRCA)
Frank D. Bean and Thoa V. Khuu
The United States often views itself as a nation of immigrants. This may in part be why, since the early 20th century, the country has seldom adopted major changes in its immigration policy. Until 1986, only the 1924 National Origins Quota Act, its dismantlement in the 1952 McCarran-Walter Act, and the 1965 Immigration and Nationality Act, also known as the Hart-Celler Act, involved far-reaching reforms. Another large shift occurred with the passage of the 1986 Immigration Reform and Control Act (IRCA) and its sequel, the 1990 Immigration Act. No major immigration legislation has yet won congressional approval in the 21st century. IRCA emerged from, and followed in considerable measure, the recommendations of the Select Commission on Immigration and Refugee Policy (1979–1981). That body sought to reconcile two competing political constituencies, one favoring the restriction of immigration, or at least of unauthorized immigration, and the other an expansion of family-based and work-related migration. The IRCA legislation contained something for each side: the passage of employer sanctions, or serious penalties on employers for hiring unauthorized workers, for the restriction side; and the provision of a legalization program, which outlined a pathway for certain unauthorized entrants to obtain green cards and eventually citizenship, for the reform side. The complete legislative package also included other provisions: criteria allowing the admission of agricultural workers, a measure providing financial assistance to states for the costs they would incur from migrants legalizing, a requirement that states develop ways to verify that migrants were eligible for welfare benefits, and substantial boosts in funding for border enforcement activities. In the years after the enactment of IRCA, research has revealed that the two major compromise provisions, together with the agricultural workers provision, generated mixed results. Employer sanctions failed to curtail unauthorized migration much, in all likelihood because of minimal funding for enforcement, while legalization and the agricultural measures resulted in widespread enrollment, with almost all of the unauthorized migrants who qualified coming forward to take advantage of the opportunity to become U.S. lawful permanent residents (LPRs). But when the agricultural workers provisions allowing entry of temporary workers are juxtaposed with the relatively unenforceable employer-sanctions provisions, IRCA entailed contradictory elements that created frustration for some observers. In sociocultural, political, and historical terms, scholars and others can interpret IRCA’s legalization as reflecting the inclusive, pluralistic, and expansionist tendencies characteristic of much of 18th-century U.S. immigration. But some of IRCA’s other elements led to contradictory effects, with restriction efforts being offset by the allowances for more temporary workers. This helped to spawn subsequent political pressures in favor of new restrictive or exclusive immigration controls that created serious hazards for immigrants.
Article
Contraband Camps and the African American Refugee Experience during the Civil War
Chandra Manning
In May 1861, three enslaved men who were determined not to be separated from their families ran to Fort Monroe, Virginia. Their flight led to the phenomenon of Civil War contraband camps. Contraband camps were refugee camps to which between four hundred thousand and five hundred thousand enslaved men, women, and children in the Union-occupied portions of the Confederacy fled to escape their owners by getting themselves to the Union Army. Army personnel had not envisioned overseeing a massive network of refugee camps. Responding to the interplay between the actions of the former slaves who fled to the camps, Republican legislation and policy, military orders, and real conditions on the ground, the army improvised. In the contraband camps, former slaves endured overcrowding, food and clothing shortages, poor sanitary conditions, and constant danger. They also gained the protection of the Union Army and access to the power of the US government as new, though unsteady, allies in the pursuit of their key interests, including education, employment, and the reconstitution of family, kin, and social life. The camps brought together actors who had previously had little to no contact with each other, exposed everyone involved to massive structural forces that were much larger than the human ability to control them, and led to unexpected outcomes. They produced a refugee crisis on US soil, affected the course and outcome of the Civil War, influenced the progress of wartime emancipation, and altered the relationship between the individual and the national government. Contraband camps were simultaneously humanitarian crises and incubators for a new relationship between African Americans and the US government.
Article
Counterinsurgency in United States Army History, 1860 to 1975
Robert J. Thompson III
American history is replete with instances of counterinsurgency. This is unsurprising, given that the United States has always participated in empire building and has therefore needed to pacify resistance to expansion. For much of its existence, the U.S. has relied on its Army to pacify insurgents. While the U.S. Army used traditional military formations and technology to battle peer enemies, the same strategy did not succeed against opponents who relied on speed and surprise. Indeed, in several instances, insurgents sought to fight the U.S. Army on terms that rendered superior manpower and technology irrelevant. By adopting counterinsurgency as a strategy, the U.S. Army attempted to identify and neutralize insurgents and the infrastructure that supported them. Discussions of counterinsurgency involve complex terms, so readers are provided with simplified yet accurate definitions and explanations; understanding these terms also provides continuity between conflicts. While certain counterinsurgency measures worked during the American Civil War, the Indian Wars, and the war in the Philippines, the concept failed during the Vietnam War. The complexities of counterinsurgency require readers to familiarize themselves with its history, relevant scholarship, and terminology—in particular, counterinsurgency, pacification, and infrastructure.
Article
Credit Reporting and the History of Commercial Surveillance in America
Josh Lauer
The first credit reporting organizations emerged in the United States during the 19th century to address problems of risk and uncertainty in an expanding market economy. Early credit reporting agencies assisted merchant lenders by collecting and centralizing information about the business activities and reputations of unknown borrowers throughout the country. These agencies quickly evolved into commercial surveillance networks, amassing huge archives of personal information about American citizens and developing credit rating systems to rank them. Shortly after the Civil War, separate credit reporting organizations devoted to monitoring consumers, rather than businesspeople, also began to emerge to assist credit-granting retailers. By the early 20th century, hundreds of local credit bureaus dissected the personal affairs of American consumers, laying the foundation of a national consumer credit surveillance infrastructure.
The history of American credit reporting reveals fundamental links between the development of modern capitalism and contemporary surveillance society. These connections became increasingly apparent during the late 20th century as technological advances in computing and networked communication fueled the growth of new information industries, raising concerns about privacy and discrimination. These connections and concerns, however, are not new. They can be traced to 19th-century credit reporting organizations, which turned personal information into a commodity and converted individual biographies into impersonal financial profiles and risk metrics. As these disembodied identities and metrics became authoritative representations of one’s reputation and worth, they exerted real effects on one’s economic life chances and social legitimacy. While drawing attention to capitalism’s historical twin, surveillance, the history of credit reporting illuminates the origins of surveillance-based business models that became ascendant during the 21st century.
Article
The Creek Confederacy
Andrew Frank
The Creek Confederacy was a loose coalition of ethnically and linguistically diverse Native American towns that slowly coalesced as a political entity in the 18th and early 19th centuries. Its towns existed in Georgia, Alabama, and northern Florida, and for most of its preremoval history, these towns operated as autonomous entities. Several Creek leaders tried to consolidate power and create a more centralized polity, but these attempts at nation building largely failed. Instead, a fragile and informal confederacy connected the towns together for various cultural rituals as well as for purposes of diplomacy and trade. Disputes over centralization, as well as a host of other connected issues, ultimately led to the Creek War of 1813–1814. In the 1830s, the United States forced most members of the Creek Confederacy to vacate their eastern lands and relocate their nation to Indian Territory. Today, their western descendants are known as the Muskogee (Creek) Nation. Those who remained in the east include members of the federally recognized Seminole Tribe of Florida and the Poarch Band of Creek Indians who live in Alabama.
Article
The Cuban Revolution
Michael J. Bustamante
The Cuban Revolution transformed the largest island nation of the Caribbean into a flashpoint of the Cold War. After overthrowing US-backed ruler Fulgencio Batista in early 1959, Fidel Castro established a socialist, anti-imperialist government that defied the island’s history as a dependent and dependable ally of the United States. But the Cuban Revolution is not only significant for its challenge to US interests and foreign policy prerogatives. For Cubans, it fundamentally reordered their lives, inspiring multitudes yet also driving thousands of others to migrate to Miami and other points north.
Sixty years later, Fidel Castro may be dead and the Soviet Union may be long gone. Cuban socialism has become more hybrid in economic structure, and in 2014 the Cuban and US governments moved to restore diplomatic ties. But Cuba’s leaders continue to insist that “the Revolution,” far from a terminal political event, is still alive. Today, as the founding generation of Cuban leaders passes from the scene, “the Revolution” faces another important crossroads of uncertainty and reform.
Article
Cultural Heritage in the United States
Alicia Ebbitt McGill
A complex concept with a range of meanings and definitions, cultural heritage, often referred to simply as heritage, is characterized by the myriad ways individuals, groups, institutions, and political entities value and engage with manifestations of culture and history. Such manifestations encompass both tangible and intangible forms of the past, including cultural objects, landscapes, historic sites, memories, daily practices, and historical narratives. Heritage is tied to personal and group identity and can bring people together or be used to marginalize groups. People engage with heritage through a range of behaviors, including visits to culturally significant places, traditions, education programs, scholarly research, government policies, preservation, and tourism.
Heritage is culturally constructed and dynamic. Critical heritage scholarship since the late 20th century highlights ways societal values, political structures, and power dynamics shape how people define, engage with, utilize, and manage cultural heritage across the globe. Though much critical heritage scholarship emphasizes that dominant Western value systems have long influenced heritage management, it also draws attention to the diverse ways humans connect with the past and the cultural practices communities and individuals employ to resist hegemonic heritage ideology and processes. Heritage scholarship is interdisciplinary, drawing on methods and theories from fields such as archeology, anthropology, history, public history, architecture, historic preservation, museum studies, and geography to examine how people interact with “the past” in the present.
Article
Daily Life in the Jim Crow South, 1900–1945
Jennifer Ritterhouse
Distinctive patterns of daily life defined the Jim Crow South. Contrary to many observers’ emphasis on de jure segregation—meaning racial separation demanded by law—neither law nor the physical separation of blacks and whites was at the center of the early 20th-century South’s social system. Instead, separation, whether by law or custom, was one of multiple tools whites used to subordinate and exclude blacks and to maintain notions of white racial purity. In turn, these notions themselves varied over time and across jurisdictions, at least in their details, as elites tried repeatedly to establish who was “white,” who was “black,” and how the legal fictions they created would apply to Native Americans and others who fit neither category.
Within this complex multiracial world of the South, whites’ fundamental commitment to keeping blacks “in their place” manifested most routinely in day-to-day social dramas, often described in terms of racial “etiquette.” The black “place” in question was socially but not always physically distant from whites, and the increasing number of separate, racially marked spaces and actual Jim Crow laws was a development over time that became most pronounced in urban areas. It was a development that reveals blacks’ determination to resist racial oppression and whites’ perceived need to shore up a supposedly natural order that had, in fact, always been enforced by violence as well as political and economic power. Black resistance took many forms, from individual, covert acts of defiance to organized political movements. Whether in response to African Americans’ continued efforts to vote or their early 20th-century boycotts of segregated streetcars or World War I-era patterns of migration that threatened to deplete the agricultural labor force, whites found ways to counter blacks’ demands for equal citizenship and economic opportunity whenever and wherever they appeared.
In the rural South, where the majority of black Southerners remained economically dependent on white landowners, a “culture of personalism” characterized daily life within a paternalistic model of white supremacy that was markedly different from urban—and largely national, not merely southern—racial patterns. Thus, distinctions between rural and urban areas and issues of age and gender are critical to understanding the Jim Crow South. Although schools were rigorously segregated, preadolescent children could be allowed greater interracial intimacy in less official settings. Puberty became a break point after which close contact, especially between black males and white females, was prohibited. All told, Jim Crow was an inconsistent and uneven system of racial distinction and separation whose great reach shaped the South’s landscape and the lives of all Southerners, including those who were neither black nor white.
Article
Dallas
Patricia Evridge Hill
From its origins in the 1840s, Dallas developed quickly into a prosperous market town. After acquiring two railroads in the 1870s, the city became the commercial and financial center of North Central Texas. Early urban development featured competition and cooperation between the city’s business leadership, women’s groups, and coalitions formed by Populists, socialists, and organized labor. Notably, the city’s African Americans were marginalized economically and excluded from civic affairs. By the end of the 1930s, city building became more exclusive even for the white population. Threatened by disputes over Progressive Era social reforms and city planning, the revival of the Ku Klux Klan, and attempts to organize industrial workers, a new generation of business leaders used its control of local media, at-large elections, and repression to dominate civic affairs until the 1970s.
Article
Dayton, Ohio
Janet Bednarek
In 1796, twelve white settlers traveled north from the Ohio River into what became known as the Miami Valley. There, they established a small settlement on the banks of the Great Miami River, not far from where the Mad River, the Stillwater River, and Wolf Creek empty into the Great Miami. They named their new town after Jonathan Dayton, a Revolutionary War veteran, investor in Ohio land, and the youngest signer of the US Constitution. Though the settlement grew slowly at first, once connected by canal (1829), telegraph (1847), and railroad (1851), the city began to flourish. By the time of the US Civil War, Dayton had emerged as a manufacturing city with an increasingly diverse population. Between the 1870s and 1920s, Dayton became known for products ranging from paper to potato chips, cash registers, bicycles, and refrigerators. During this time, several individuals from Dayton rose to international prominence, including poet Paul Laurence Dunbar and the inventors of the airplane, Wilbur and Orville Wright. This period also witnessed the most important event in Dayton’s history, the 1913 flood. In its aftermath, Dayton became the largest city in the United States to adopt the city-manager form of government. After World War II, Dayton, like many cities in the so-called “Rust Belt,” suffered from deindustrialization and racial tensions. Dayton’s population peaked in the 1960s and, thereafter, the city lost population in every decade through the 2020s. Deep and lasting patterns of racial segregation divided Dayton’s population between an African American West Side and the largely white East and North Dayton. Local leaders embraced urban renewal and highway construction as potential answers to the city’s challenges with, at best, mixed results. As economic and population losses continued into the 21st century, the local economy shifted from manufacturing to “eds and meds.”
Article
DDT and Pesticides
Frederick Rowe Davis
The history of DDT and pesticides in America is overshadowed by four broad myths. The first myth suggests that DDT was the first insecticide deployed widely by American farmers. The second indicates that DDT was the most toxic pesticide to wildlife and humans alike. The third myth assumes that Rachel Carson’s Silent Spring (1962) was an exposé of the problems of DDT rather than a broad indictment of American dependency on chemical insecticides. The fourth and final myth reassures Americans that the ban on DDT late in 1972 resolved the pesticide paradox in America. Over the course of the 20th century, agricultural chemists developed insecticides from plants with phytotoxic properties (“botanical” insecticides) and from a range of chemicals including heavy metals such as lead and arsenic, chlorinated hydrocarbons like DDT, and organophosphates like parathion. All of the synthetic insecticides carried profound unintended consequences for landscapes and wildlife alike. More recently, chemists have returned to nature and developed chemical analogs of the botanical insecticides, first with the synthetic pyrethroids and now with the neonicotinoids. Despite their recent introduction, neonicotinoids have become widely used in agriculture, and there are suspicions that these chemicals contribute to declines in bees and grassland birds.
Article
Death and Dying in the Working Class
Michael K. Rosenow
In the broader field of thanatology, scholars investigate rituals of dying, attitudes toward death, evolving trajectories of life expectancy, and more. Applying a lens of social class means studying similar themes but focusing on the men, women, and children who worked for wages in the United States. Working people were more likely to die from workplace accidents, occupational diseases, or episodes of work-related violence. In most periods of American history, it was more dangerous to be a wage worker than it was to be a soldier. Battlegrounds were not just the shop floor but also the terrain of labor relations. American labor history has been filled with violent encounters between workers asserting their views of economic justice and employers defending their private property rights. These clashes frequently turned deadly. Labor unions and working-class communities extended an ethos of mutualism and solidarity from the union halls and picket lines to memorial services and gravesites. They lauded martyrs to movements for human dignity and erected monuments to honor the fallen. Aspects of ethnicity, race, and gender added layers of meaning that intersected with and refracted through individuals’ economic positions. Workers’ encounters with death and the way they made sense of loss and sacrifice in some ways overlapped with Americans from other social classes in terms of religious custom, ritual practice, and material consumption. Their experiences were not entirely unique but diverged in significant ways.
Article
Death in Colonial North America: Cross-Cultural Encounters
Erik R. Seeman
Death is universal yet is experienced in culturally specific ways. Because of this, when individuals in colonial North America encountered others from different cultural backgrounds, they were curious about how unfamiliar mortuary practices resembled and differed from their own. This curiosity spawned communication across cultural boundaries. The resulting knowledge sometimes facilitated peaceful relations between groups, while at other times it helped one group dominate another.
Colonial North Americans endured disastrously high mortality rates caused by disease, warfare, and labor exploitation. At the same time, death was central to the religions of all residents: Indians, Africans, and Europeans. Deathways thus offer an unmatched way to understand the colonial encounter from the participants’ perspectives.
Article
Decolonization and US Foreign Relations
Jason C. Parker
The decolonization of the European overseas empires had its intellectual roots early in the modern era, but its culmination occurred during the Cold War that loomed large in post-1945 international history. This culmination thus coincided with the American rise to superpower status and presented the United States with a dilemma. While philosophically sympathetic to the aspirations of anticolonial nationalist movements abroad, the United States’ vastly greater postwar global security burdens made it averse to the instability that decolonization might bring and that communists might exploit. This fear, and the need to share those burdens with European allies who were themselves still colonial landlords, led Washington to proceed cautiously. The three “waves” of the decolonization process—medium-sized in the late 1940s, large in the half-decade around 1960, and small in the mid-1970s—prompted the American use of a variety of tools and techniques to influence how it unfolded.
Prior to independence, this influence was usually channeled through the metropolitan authority then winding down. After independence, Washington continued and often expanded the use of these tools, in most cases on a bilateral basis. In some theaters, such as Korea, Vietnam, and the Congo, the use of certain of these tools, notably covert espionage or overt military operations, meant that Cold War dynamics enveloped, intensified, and overtook local decolonization struggles. In most theaters, other tools, such as traditional and public diplomacy or economic and technical development aid, kept the Cold War in the background as local transitions unfolded. In all cases, the overriding American imperative was to minimize instability and neutralize actors on the ground who could invite communist gains.
Article
Deindustrialization and the Postindustrial City, 1950–Present
Chloe E. Taft
The process of urban deindustrialization has been long and uneven. Even the terms “deindustrial” and “postindustrial” are contested; most cities continue to host manufacturing on some scale. After World War II, however, cities that depended on manufacturing for their lifeblood increasingly diversified their economies in the face of larger global, political, and demographic transformations. Manufacturing centers in New England, the Mid-Atlantic, and the Midwest were soon identified as belonging to “the American Rust Belt.” Steel manufacturers, automakers, and other industrial behemoths that were once mainstays of city life closed their doors as factories and workers followed economic and social incentives to leave urban cores for the suburbs, the South, or foreign countries. Remaining industrial production became increasingly automated, resulting in significant declines in the number of factory jobs. Metropolitan officials, faced with declining populations and tax bases, responded by adapting their assets—in terms of workforce, location, or culture—to new economies, including warehousing and distribution, finance, health care, tourism, leisure industries like casinos, and privatized enterprises such as prisons. Faced with declining federal funding for renewal, they focused on leveraging private investment for redevelopment. Deindustrializing cities marketed themselves as destinations with convention centers, stadiums, and festival marketplaces, seeking to lure visitors and a “creative class” of new residents. While some postindustrial cities became success stories of reinvention, others struggled. They entertained options to “rightsize” by shrinking their municipal footprints, adapted vacant lots for urban agriculture, or attracted voyeurs to gaze at their industrial ruins. Whether industrial cities faced a slow transformation or the shock of multiple factory closures within a few years, the impact of these economic shifts and urban planning interventions both amplified old inequalities and created new ones.
Article
The Democratic Party and US Foreign Relations
Jeffrey Bloodworth
Will Rogers understood the confounding nature of the Democratic Party. In noting that “Democrats never agree on anything, that’s why they’re Democrats,” the Oklahoma humorist highlighted a consistent theme in the party’s more than 200-year history: division. The political party of the underdog and ethnic, racial, and social minorities has always lacked the cultural cohesion that the Federalists, Whigs, and Republicans possessed. As a result, the main currents of Democratic Party foreign policy elude simple categorization. Muddying any efforts at classification are the dramatically disparate eras in which Democrats conducted foreign policy over two centuries.
Like other major American political parties, the Democrats’ foreign policy was animated by a messianic theme balanced against the national and constituent interests. Thinking themselves a “chosen people,” the Revolutionary generation thought their experiment foreshadowed a new global order with universal appeal. As representatives of God’s new Israel, the Founders made their new nation’s messianic relationship to the international system essential to its identity. Shunning established foreign policy practices, they founded a style of American diplomacy that combined idealism with pragmatism. Democrats, along with most every other major political party, have followed the Founders’ example but in a manner particular to the party’s history, constituents, and circumstance.
The foreign policy connective tissue of the Democratic Party has been its particular expression of the Founders’ messianic mission, interpreted through its ever-evolving cast of disparate constituent groups. In pursuit of this mission, 19th-century Democratic foreign policy favored territorial and commercial expansion to safeguard the republican experiment. In the 20th and 21st centuries, Democrats globalized these sentiments and sought a world conducive to democracy’s survival. But consistency is scarcely the hallmark of Democratic foreign policy. Driven by its disparate constituent groups and domestic politics, the party has employed diverse foreign policy strategies across an array of historical circumstances. The sum total of Democratic foreign policy is, at times, a contradictory amalgam of diverse constituencies responding to the issues of the moment in a combination of self-interest and democratic idealism.
Article
The Department Store
Traci Parker
Department stores were the epicenter of American consumption and modernity from the late 19th century through the 20th century. Between 1846 and 1860, store merchants and commercial impresarios remade dry goods stores and small apparel shops into department stores—downtown emporiums that departmentalized their vast inventories and offered copious services and amenities. Their ascendance corresponded with increased urbanization, immigration, industrialization, and the mass production of machine-made wares. Urbanization and industrialization also helped to birth a new White middle class who were eager to spend their money on material comforts and leisure activities. And department stores provided them with a place where they could do so. Stores sold shoppers an astounding array of high-quality, stylish merchandise, including clothing, furniture, radios, sporting equipment, musical instruments, luggage, silverware, china, and books. They also provided an array of services and amenities, including public telephones, postal services, shopping assistance, free delivery, telephone-order and mail-order departments, barber shops, hair salons, hospitals and dental offices, radio departments, shoe-shining stands, wedding gift registries and wedding secretary services, tearooms, and restaurants.
Stores enthroned consumption as the route to democracy and citizenship, inviting everybody—regardless of race, gender, age, and class—to enter, browse, and purchase material goods. They were major employers of white-collar workers and functioned as a new public space for women as workers and consumers.
The 20th century brought rapid and significant changes and challenges. Department stores weathered economic crises; two world wars; new and intense competition from neighborhood, chain, and discount stores; and labor and civil rights protests that threatened to damage their image and displace them as the nation’s top retailers. They experienced cutbacks, consolidated services, and declining sales during the Great Depression, played an essential role in the war effort, and contended with the Office of Price Administration’s Emergency Price Control Act during the Second World War. In the postwar era, they opened branch locations in suburban neighborhoods where their preferred clientele—the White middle class—now resided and shaped the development and proliferation of shopping centers. They hastened the decline of downtown shopping as a result. The last three decades of the 20th century witnessed a wave of department store closures, mergers, and acquisitions because of changing consumer behaviors, shifts in the retail landscape, and evolving market dynamics. Department stores would continue to suffer into the 21st century as online retailing exploded.