
Article

The American War for Independence lasted eight years. It was one of the longest and bloodiest wars in America’s history, and yet it was not such a protracted conflict merely because the might of the British armed forces was brought to bear on the hapless colonials. The many divisions among Americans themselves over whether to fight, what to fight for, and who would do the fighting often had tragic and violent consequences. The Revolutionary War was by any measure the first American civil war. Yet national narratives of the Revolution and even much of the scholarship on the era focus more on simple stories of a contest between the Patriots and the British. Loyalists and other opponents of the Patriots are routinely left out of these narratives, or given short shrift. So, too, are the tens of thousands of ordinary colonists—perhaps a majority of the population—who were disaffected or alienated from both sides or who tried to tack between the two main antagonists to make the best of a bad situation. Historians now estimate that as many as three-fifths of the colonial population were neither active Loyalists nor Patriots. When we take the war seriously and begin to think about narratives that capture the experience of the many, rather than the few, an illuminating picture emerges. The remarkably wide scope of the activities of the disaffected during the war—ranging from nonpayment of taxes to draft dodging and even to armed resistance to protect their neutrality—has to be integrated with older stories of militant Patriots and timid Loyalists. Only then can we understand the profound consequences of disaffection—particularly in creating divisions within the states, increasing levels of violence, prolonging the war, and changing the nature of the political settlements in each state. Indeed, the very divisions among diverse Americans that made the War for Independence so long, bitter, and bloody also explain much of the Revolutionary energy of the period.
Though not as seamless as traditional narratives of the Revolution would suggest, this more complicated story also helps to explain the many problems the new states, and eventually the new nation, would face. In making this argument, we may finally suggest ways to overcome what John Shy long ago noted as the tendency of scholars to separate the “destructive” War for Independence from the “constructive” political Revolution.

Article

American history is replete with instances of counterinsurgency. This is unsurprising, considering that the United States has always engaged in empire building and has thus needed to pacify resistance to expansion. For much of its existence, the U.S. has relied on its Army to pacify insurgents. While the U.S. Army used traditional military formations and technology to battle peer enemies, the same strategy did not succeed against opponents who relied on speed and surprise. Indeed, in several instances, insurgents sought to fight the U.S. Army on terms that rendered superior manpower and technology irrelevant. By adopting counterinsurgency as a strategy, the U.S. Army attempted to identify and neutralize insurgents and the infrastructure that supported them. Discussions of counterinsurgency involve complex terms, so readers are provided with simplified yet accurate definitions and explanations. Understanding these terms also provides continuity between conflicts. While certain counterinsurgency measures worked during the American Civil War, the Indian Wars, and in the Philippines, the concept failed during the Vietnam War. The complexities of counterinsurgency require readers to familiarize themselves with its history, relevant scholarship, and terminology—in particular, counterinsurgency, pacification, and infrastructure.

Article

Thomas A. Reinstein

The United States has a rich history of intelligence in the conduct of foreign relations. Since the Revolutionary War, intelligence has been most relevant to U.S. foreign policy in two ways. Intelligence analysis helps to inform policy. Intelligence agencies also have carried out covert action—secret operations—to influence political, military, or economic conditions in foreign states. The American intelligence community has developed over a long period, and major changes to that community have often occurred because of contingent events rather than long-range planning. Throughout their history, American intelligence agencies have used intelligence gained from both human and technological sources to great effect. Often, U.S. intelligence agencies have been forced to rely on technological means of intelligence gathering for lack of human sources. Recent advances in cyberwarfare have made technology even more important to the American intelligence community. At the same time, the relationship between intelligence and national-security–related policymaking has often been dysfunctional. Indeed, though some American policymakers have used intelligence avidly, many others have used it haphazardly or not at all. Bureaucratic fights also have crippled the American intelligence community. Several high-profile intelligence failures tend to dominate the recent history of intelligence and U.S. foreign relations. Some of these failures were due to lack of intelligence or poor analytic tradecraft. Others came because policymakers failed to use the intelligence they had. In some cases, policymakers have also pressured intelligence officers to change their findings to better suit those policymakers’ goals. And presidents have often preferred to use covert action to carry out their preferred policies without paying attention to intelligence analysis. The result has been constant debate about the appropriate role of intelligence in U.S. foreign relations.

Article

Megan Threlkeld

The issue of compulsory military service has been contested in the United States since before its founding. In a nation characterized by both liberalism and republicanism, there is an inherent tension between the idea that individuals should be able to determine their own destiny and the idea that all citizens have a duty to serve their country. Prior to the 20th century, conscription occurred mainly on the level of local militias, first in the British colonies and later in individual states. It was during the Civil War that the first federal drafts were instituted, both in the Union and the Confederacy. In the North, the draft was unpopular and largely ineffective. Congress revived national conscription when the United States entered World War I and established the Selective Service System to oversee the process. That draft ended when U.S. belligerency ended in 1918. The first peacetime draft was implemented in 1940; with the exception of one year, it remained in effect until 1973. Its most controversial days came during the Vietnam War, when thousands of people across the country demonstrated against it and, in some cases, outright refused to be inducted. The draft stopped with the end of the war, but in 1980, Congress reinstated compulsory Selective Service registration. More than two decades into the 21st century, male citizens and immigrant noncitizens are still required to register within thirty days of their eighteenth birthday. The very idea of “selective service” is ambiguous. It is selective because not everyone is conscripted, but it is compulsory because one can be prosecuted for failing to register or to comply with orders of draft boards. Especially during the Cold War, one of the system’s main functions was not to procure soldiers but to identify and exempt from service those men best suited for other endeavors framed as national service: higher education, careers in science and engineering, and even supporting families. 
That fact, combined with the decentralized nature of the Selective Service System itself, left the process vulnerable to the prejudices of local draft boards and meant that those most likely to be drafted were poor and nonwhite.

Article

Canada has sometimes been called the United States’ attic: a useful feature, but one easily forgotten. Of all countries, it has historically resembled the United States the most closely in terms of culture, geography, economy, society, politics, ideology and, especially, history. A shared culture—literary, social, legal, and political—is a crucial factor in Canadian-American relations. Geography is at least as important. Canada provides the United States with strategic insulation to the north and enhances geographic isolation to the east and west. North-south economic links are inevitable and very large. Canada has been a major recipient of American investment, and for most of the time since 1920 it has been the United States’ principal trading partner. Prosperous and self-sufficient, it has seldom required American aid. There have been no overtly hostile official encounters since the end of the War of 1812, partly because many Americans tended to believe that Canadians would eventually join the republic; when that did not occur, the United States accepted an independent but friendly Canada as a permanent, useful, and desirable neighbor—North America’s attic. What the attic provided was insulation grounded in a common belief in the rule of law, both domestic and international; liberal democracy; a federal constitution; liberal capitalism; and liberal international trade regimes. In turn, the United States, with its large population, huge economy, and military power, insulates Canada from hostile external forces. An attack on Canada from outside the continent is hard to imagine without a simultaneous attack on the United States. Successive American and Canadian governments have reaffirmed the political status quo while favoring mutually beneficial economic and military linkages, bilateral and multilateral. Relations have traditionally been grounded in a negotiating style that is evidence-based, proceeding issue by issue.
A sober diplomatic and political context sometimes frames irritations and exclamations, but even these have usually been defined and limited by familiarity. For example, there has always been anti-Americanism in Canada. Most often it consists of sentiments derived from the United States itself, channeled by cultural similarities. No American idea, good or bad, from liberalism to populism, fails to find an echo in Canada. How loud or how soft the echo makes the difference.

Article

Sophie Cooper

Irish and American histories are intertwined as a result of migration, mercantile and economic connections, and diplomatic pressures from governments and nonstate actors. The two fledgling nations were brought together by their shared histories of British colonialism, but America’s growth as an imperial power complicated any natural allegiances that were invoked across the centuries. Since the beginnings of that relationship in 1607, with the arrival of Irish migrants in America (both voluntary and forced) and the building of a transatlantic linen trade, the meaning of “Irish” has fluctuated in America, mirroring changes in both migrant patterns and international politics. The 19th century saw Ireland enter into Anglo-American diplomacy on both sides of the Atlantic, while the 20th century saw Ireland emerge from Britain’s shadow with the establishment of separate diplomatic connections between the United States and Ireland. American recognition of the newly independent Irish Free State was vital for Irish politicians on the world stage; however, the Free State’s increasingly isolationist policies from the 1930s to the 1950s alienated its American allies. The final decade of the century brought America and Ireland (including both Northern Ireland and the Republic of Ireland) closer than ever before. Throughout their histories, the Irish diasporas—both Protestant and Catholic—in America have played vital roles as pressure groups and fundraisers. The history of American–Irish relations therefore brings together governmental and nonstate organizations and unites political, diplomatic, social, cultural, and economic histories that are still relevant today.

Article

Nicole Etcheson and Cortney Cantrell

During the Civil War, the entire North constituted the homefront, an area largely removed from the din and horror of combat. With a few exceptions, such as raids and battles like Gettysburg, civilians in the North experienced the war indirectly. The people on the homefront mobilized for war, sent their menfolk off to fight, supplied the soldiers and the army, coped without their breadwinners, and suffered the loss or maiming of men they loved. All the while, the homefront was crucially important to the course of the war. The mobilization of northern resources—not just men, but the manufacture of the arms and supplies needed to fight a war—enabled the North to conduct what some have called a total war, one on which the Union expended money and manpower at unprecedented levels. Confederate strategists hoped to break the will of the northern homefront to secure southern independence. Despite the hardships endured in the North, this strategy failed. On the homefront, women struggled to provide for their families as well as to serve soldiers and the army by sending care packages and doing war work. Family letters reveal the impact of the war on children who lost their fathers either temporarily or permanently. Communities rallied to aid soldiers’ families but were riven by dissension over issues such as conscription and emancipation. Immigrants and African Americans sought a new place in U.S. society by exploiting the opportunities the war offered to prove their worth. Service in the Union army certainly advanced the status of some groups, but it was not the only means to that end. Nuns who nursed the wounded improved the reputation of the Catholic Church, and northern African Americans used the increasingly emancipationist war goals to improve their legal status in the North. The Civil War altered race relations most radically, but change came to everyone on the northern homefront.

Article

America’s Civil War became part of a much larger international crisis as European powers, happy to see the experiment in self-government fail in America’s “Great Republic,” took advantage of the situation to reclaim former colonies in the Caribbean and establish a European monarchy in Mexico. Overseas, in addition to their formal diplomatic appeals to European governments, both sides also experimented with public diplomacy campaigns to influence public opinion. Confederate foreign policy sought to win recognition and aid from Europe by offering free trade in cotton and aligning the Southern cause with that of the aristocratic, anti-democratic governing classes of Europe. The Union, by contrast, appealed to liberal, republican sentiment abroad by depicting the war as a trial of democratic government and embracing emancipation of the slaves. The Union victory led to the withdrawal of European empires from the New World: Spain from Santo Domingo, France from Mexico, Russia from Alaska, and Britain from Canada. The destruction of slavery in the United States, in turn, hastened its end in Puerto Rico, Cuba, and Brazil.

Article

Holly Pinheiro

The United States Colored Troops (USCT) were a collection of racially segregated Black US Army units, mandated by the US War Department, that served during the Civil War and the Reconstruction era. Their collective military service is widely known for playing critical roles in ending slavery, protecting freedpeople, defeating the Confederate military, enforcing multiple US government policies, and reframing gender ideology while making explicit demands for more racially inclusive conceptions of citizenship. Black men from a wide range of backgrounds and ages made up the 179,000 individuals who served in a USCT regiment. Some soldiers were formerly enslaved men from Confederate states, while others, who were freeborn, came from free states and even from abroad (including Canada). USCT regiments were never exclusively male domains. Numerous Black women supported the US war effort, inside and outside military spaces, in many ways. For example, Susie King Taylor served as a laundress and nurse in the Thirty-Third United States Colored Infantry. Thus, Black women are important figures in understanding Black Civil War–era military service. Ultimately, USCT regiments and their supporters fought for racial and social justice, during and long after USCT soldiering ended. Their service also provided avenues for prominent abolitionists, including Frederick Douglass, William Still, and Mary Ann Shadd Cary, who used Black military service to demand an end to slavery and racial discrimination. Meanwhile, various Black communities (especially Black women) lobbied to protect their civil rights while attempting to support USCT soldiers’ training. Additionally, the families of USCT soldiers appealed to the Bureau of Pensions (a branch of the US government) to remember their collective wartime sacrifices through Civil War pensions.
Their collective actions highlight that the history of USCT regiments requires an understanding of Black families and communities whose lived experiences remain relevant today.

Article

The history of the African American military experience in World War II tends to revolve around two central questions: How did World War II and American racism shape the black experience in the American military? And how did black GIs reshape the parameters of their wartime experiences? From the mid-1920s through the Great Depression years of the 1930s, military planners evaluated the performance of black soldiers in World War I while trying to ascertain their role in future wars. However, quite often their discussions about African American servicemen in the military establishment were deeply moored in the traditions, customs, and practices of American racism, racist stereotypes, and innuendo. Simultaneously, African American leaders and their allies waged a relentless battle to secure the future presence of the uniformed men and women who would serve in the nation’s military. Through their exercise of voting rights, threats of protest demonstrations, litigation, and White House lobbying from 1939 through 1942, civil rights advocates and their affiliates managed to obtain some minor concessions from the military establishment. But the military’s stubborn adherence to a policy barring black and white soldiers from serving in the same units continued through the rest of the war. Between 1943 and 1945, black GIs faced white officer hostility, civilian antagonism, and military police brutality while undergoing military training throughout the country. Similarly, African American servicewomen faced systemic racism and sexism in the military during the period. Throughout various stages of the American war effort, black civil rights groups, the press, and their allies mounted the opening salvoes in the battle to protect and defend the well-being of black soldiers in uniform. While serving on the battlefields of World War II, African American GIs became foot soldiers in the wider struggle against tyranny abroad.
After returning home in 1945, black World War II-era activists such as Daisy Lampkin and Ruby Hurley, and ex-servicemen and women, laid the groundwork for the Civil Rights Movement.

Article

The Japanese American Redress Movement refers to the various efforts of Japanese Americans from the 1940s to the 1980s to obtain restitution for their removal and confinement during World War II. This included judicial and legislative campaigns at local, state, and federal levels for recognition of government wrongdoing and compensation for losses, both material and immaterial. The push for redress originated in the late 1940s as the Cold War opened up opportunities for Japanese Americans to demand concessions from the government. During the 1960s and 1970s, Japanese Americans began to connect the struggle for redress with anti-racist and anti-imperialist movements of the time. Despite their growing political divisions, Japanese Americans came together to launch several successful campaigns that laid the groundwork for redress. During the early 1980s, the government increased its involvement in redress by forming a congressional commission to conduct an official review of the World War II incarceration. The commission’s recommendations of monetary payments and an official apology paved the way for the passage of the Civil Liberties Act of 1988 and other redress actions. Beyond its legislative and judicial victories, the redress movement also created a space for collective healing and generated new forms of activism that continue into the present.

Article

Sherman’s March, more accurately known as the Georgia and Carolinas Campaigns, cut a swath across three states in 1864–1865. It was one of the most significant campaigns of the Civil War, making Confederate civilians “howl” as farms and plantations were stripped of everything edible and all their valuables. Outbuildings, and occasionally homes, were burned; railroads were destroyed; and enslaved workers were emancipated. Long after the war ended, Sherman’s March continued to shape Americans’ memories as one of the most symbolically powerful aspects of the Civil War. Sherman’s March began with the better-known March to the Sea, which started in Atlanta on November 15, 1864, and concluded in Savannah on December 22 of the same year. Sherman’s men then proceeded through South Carolina and North Carolina in February, March, and April of 1865. The study of this military campaign illuminates the relationships between Sherman’s soldiers, Southern white civilians (especially women), and African Americans. Sherman’s men were often uncomfortable with their role as an army of liberation, and African Americans, in particular, found the March to be a double-edged sword.

Article

In 1944 President Franklin D. Roosevelt’s State of the Union address set out what he termed an “economic Bill of Rights” that would act as a manifesto of liberal policies after World War II. Politically, however, the United States was a different place than the country that had faced the ravages of the Great Depression of the 1930s and ushered in Roosevelt’s New Deal to transform the relationship between government and the people. Key legacies of the New Deal, such as Social Security, remained and were gradually expanded, but opponents of governmental regulation of the economy launched a bitter campaign after the war to roll back labor union rights and dismantle the New Deal state. Liberal heirs to FDR in the 1950s, represented by figures like two-time presidential candidate Adlai Stevenson, struggled to rework liberalism to tackle the realities of a more prosperous age. The long shadow of the U.S. Cold War with the Soviet Union also set up new challenges for liberal politicians trying to juggle domestic and international priorities in an era of superpower rivalry and American global dominance. The election of John F. Kennedy as president in November 1960 seemed to represent a narrow victory for Cold War liberalism, and his election coincided with the intensification of the struggle for racial equality in the United States that would do much to shape liberal politics in the 1960s. After his assassination in 1963, President Lyndon Johnson launched his “Great Society,” a commitment to eradicate poverty and to provide greater economic security for Americans through policies such as Medicare. But his administration’s deepening involvement in the Vietnam War and its mixed record on alleviating poverty did much to taint the positive connotations of “liberalism” that had dominated politics during the New Deal era.

Article

Long regarded as a violent outburst significant mainly for California history, the 1871 Los Angeles anti-Chinese massacre raises themes central to America’s Civil War Reconstruction era of 1865 to 1877, namely, the resort to threats and violence to preserve traditionally conceived social and political authority and power. Although the Los Angeles events occurred far from the American South, the anti-Chinese massacre paralleled the anti-black violence that rose in the South during Reconstruction. While the immediate causes of the violence in the post–Civil War South and California were far different, they shared one key characteristic: both employed racial disciplining to preserve traditional social orders that old elites saw as threatened by changing times and circumstances.

Article

Emancipation celebrations in the United States have been important and complicated moments of celebration and commemoration. Since the end of the slave trade in 1808 and the enactment of the British Emancipation Act in 1834, people of African descent throughout the Atlantic world have gathered, often in festival form, to remember and to use that memory for more promising futures. In the United States, emancipation celebrations exploded after the Civil War, when each local community celebrated its own experience of emancipation. For many, the commemoration took the form of a somber church service, Watch Night, which recognized the signing of the Emancipation Proclamation on January 1, 1863. Juneteenth, which recognized the end of slavery in Texas on June 19, 1865, became one of the most vibrant and longstanding celebrations. Although many emancipation celebrations disappeared after World War I, Juneteenth remained a celebration in most of Texas through the late 1960s, when it disappeared from all cities in the state. However, because of the Second Great Migration, Texans transplanted in Western cities continued the celebration in their new communities far from Texas. In Texas, Juneteenth was resurrected in 1979 when state representative, later Congressman, Al Edwards successfully sponsored a bill to make Juneteenth a state holiday and campaigned to spread Juneteenth throughout the country. This grassroots movement brought Juneteenth resolutions to forty-six states and street festivals to hundreds of neighborhoods. Juneteenth’s remarkable post-1980 spread has given it great resonance in popular culture as well, even becoming a focus of two major television episodes in 2016 and 2017.

Article

The patterns of urban slavery in colonial North American and pre-Civil War US cities reveal the ways in which individual men and women, as well as businesses, institutions, and governmental bodies, employed slave labor and readily adapted the system of slavery to their economic needs and desires. Colonial cities east and west of the Mississippi River, founded initially as military forts, trading posts, and maritime ports, relied on African and Native American slave labor from their beginnings. The importance of slave labor increased in Anglo-American East Coast urban settings in the 18th century as the number of enslaved Africans increased in these colonies, particularly in response to the growth of the tobacco, wheat, and rice industries in the southern colonies. The focus on African slavery led most Anglo-American colonies to outlaw the enslavement of Native Americans, and urban slavery on the East Coast became associated almost solely with people of African descent. In addition, these cities became central nodes in the circum-Atlantic transportation and sale of enslaved people, slave-produced goods, and provisions for slave colonies whose economies centered on plantation goods. West of the Mississippi, urban enslavement of Native Americans, Mexicans, and even a few Europeans continued through the 19th century. As the thirteen British colonies transitioned to the United States during and after the Revolutionary War, three different directions emerged regarding the status of slavery, each of which would affect the standing of slavery and of people of African descent in cities. The gradual emancipation of enslaved people in states north of Delaware led to the creation of the so-called free states, with large numbers of free blacks moving into cities to take full advantage of freedom and the possibility of creating family and community.
Although antebellum northern cities were located within areas where legalized slavery ended, these cities retained economic and political ties to southern slavery. At the same time, the radical antislavery movement developed in Philadelphia, Boston, and New York. Thus, Northern cities were the site of political conflicts between pro- and antislavery forces. In the Chesapeake, as the tobacco economy declined, slave owners manumitted enslaved blacks for whom they did not have enough work, creating large groups of free blacks in cities. But these states began to participate heavily in the domestic slave trade, with important businesses located in cities. And in the Deep South, the recommitment to slavery following the Louisiana Purchase and the emergence of the cotton economy led to the creation of a string of wealthy port cities critical to the transportation of slaves and goods. These cities were situated in local economic geographies that connected rural plantations to urban settings and in national and international economies of exchange of raw and finished goods that fueled industries throughout the Atlantic world. The vast majority of enslaved people employed in the antebellum South worked on rural farms, but slave labor was a key part of the labor force in southern cities. Only after the Civil War did slavery and cities become separate in the minds of Americans, as postwar whites north and south created a mythical South in which romanticized antebellum cotton plantations became the primary symbol of American slavery, regardless of the long history of slavery that preceded their existence.

Article

Jimmy Carter’s “Crisis of Confidence” speech of July 1979 was a critical juncture in post-1945 U.S. politics, but it also marks an exemplary pivot in post-1945 religion. Five dimensions of faith shaped the president’s sermon. The first concerned the shattered consensus of American religion. When Carter encouraged Americans to recapture a spirit of unity, he spoke in a heartfelt but spent language more suitable to Dwight Eisenhower’s presidency than his own. By 1979, the Protestant-Catholic-Jewish consensus of Eisenhower’s time had fractured into a dynamic pluralism, remaking American religion in profound ways. Carter’s speech revealed a second revolution of post-1945 religion when it decried religion’s polarization and politicization. Carter sought to heal ruptures that were dividing the nation between what observers, two decades hence, would label “red” (conservative Republican) and “blue” (liberal Democratic) constituencies. Yet his endeavors failed, as would be evidenced in the religious politics of Ronald Reagan’s era, which followed. Carter championed community values as the answer to his society’s problems, aware of yet a third dawning reality: globalization. The virtues of localism that Carter espoused were in fact implicated in (and complicated by) transnational forces of change that saw immigration, missionary enterprises, and state and non-state actors internationalizing the American religious experience. A fourth illuminating dimension of Carter’s speech was its critique of America’s gospel of wealth. Although this “born-again” southerner was a product of the evangelical South’s revitalized free-market capitalism, he lamented how laissez-faire Christianity had become America’s lingua franca. Finally, Carter wrestled with secularization, revealing a fifth feature of post-1945 America. Even though faith commitments were increasingly cordoned off from formal state functions during this time, the nation’s political discourse acquired a pronounced religiosity.
Carter contributed by framing mundane issues (such as energy) in moral contexts that drew no hard-and-fast boundaries between matters of the soul and governance. Drawn from the political and economic crises of his moment, Carter’s speech thus also reveals the all-enveloping tide of religion in America’s post-1945 age.