Article
Black Soldiers in World War II America
Robert F. Jefferson
The history of the African American military experience in World War II tends to revolve around two central questions: How did World War II and American racism shape the black experience in the American military? And how did black GIs reshape the parameters of their wartime experiences? From the mid-1920s through the Great Depression years of the 1930s, military planners evaluated the performance of black soldiers in World War I while debating their place in future wars. Quite often, however, their discussions about African American servicemen in the military establishment were deeply moored in the traditions, customs, and practices of American racism, racist stereotype, and innuendo. Simultaneously, African American leaders and their allies waged a relentless battle to secure a place for black men and women in the nation’s military. Through the exercise of voting rights, threats of protest demonstrations, litigation, and White House lobbying from 1939 through 1942, civil rights advocates and their affiliates managed to win minor concessions from the military establishment. But the military’s stubborn adherence to a policy barring black and white soldiers from serving in the same units persisted through the rest of the war.
Between 1943 and 1945, black GIs faced white officer hostility, civilian antagonism, and military police brutality while undergoing military training throughout the country. Similarly, African American servicewomen faced systemic racism and sexism in the military during the period. At various stages of the American war effort, black civil rights groups, the press, and their allies mounted the opening salvos in the battle to protect and defend the well-being of black soldiers in uniform. On the battlefields of World War II, African American GIs became foot soldiers in the wider struggle against tyranny abroad. After returning home in 1945, black World War II-era activists such as Daisy Lampkin and Ruby Hurley, along with ex-servicemen and -women, laid the groundwork for the Civil Rights Movement.
Article
The 1950s
Jennifer Delton
The 1950s have typically been seen as a complacent, conservative time between the end of World War II and the radical 1960s, an era in which anticommunism and the Cold War subverted reform and undermined civil liberties. But the era can also be seen as a very liberal time, in which meeting the Communist threat led to Keynesian economic policies, the expansion of New Deal programs, and advances in civil rights. Politically, it was “the Eisenhower Era,” dominated by a moderate Republican president, a high level of bipartisan cooperation, and a foreign policy committed to containing communism. Culturally, it was an era of middle-class conformity, which nevertheless also gave us abstract expressionism, rock and roll, Beat poetry, and a grassroots challenge to Jim Crow.
Article
Suburbanization in the United States after 1945
Becky Nicolaides and Andrew Wiese
Mass migration to suburban areas was a defining feature of American life after 1945. Before World War II, just 13% of Americans lived in suburbs. By 2010, however, suburbia was home to more than half of the U.S. population. The nation’s economy, politics, and society suburbanized in important ways. Suburbia shaped habits of car dependency and commuting, patterns of spending and saving, and experiences with issues as diverse as race and taxes, energy and nature, privacy and community. The owner-occupied, single-family home, surrounded by a yard and set in a neighborhood outside the urban core, came to define everyday experience for most American households, and in the world of popular culture and the imagination, suburbia was the setting for the American dream. The nation’s suburbs were an equally critical economic landscape, home to vital high-tech industries, retailing, “logistics,” and office employment. In addition, American politics rested on a suburban majority, and over several decades suburbia incubated political movements across the partisan spectrum, from grassroots conservatism to centrist meritocratic individualism, environmentalism, feminism, and social justice. In short, suburbia was a key setting for postwar American life.
Even as suburbia grew in magnitude and influence, it also grew more diverse, coming to reflect a much broader cross section of America itself. This encompassing shift unfolded across two key chronological stages in suburban history since 1945: the expansive, racialized, mass suburbanization of the postwar years (1945–1970) and an era of intensive social diversification and metropolitan complexity (since 1970). In the first period, suburbia witnessed the expansion of segregated white privilege, bolstered by government policies and exclusionary practices and reinforced by grassroots political movements. By the second period, suburbia came to house a broader cross section of Americans, who brought with them a wide range of outlooks, lifeways, values, and politics. Suburbia became home to large numbers of immigrants, ethnic groups, African Americans, the poor, the elderly, and diverse family types. In the face of stubborn exclusionism by affluent suburbs, inequality persisted across metropolitan areas and manifested anew in proliferating poorer, distressed suburbs. Reform efforts sought to alleviate metro-wide inequality and promote sustainable development through coordinated regional approaches. In recent years, the twin discourses of suburban crisis and suburban rejuvenation have captured the continued complexity of America’s suburbs.
Article
The Tuskegee Syphilis Study
Susan M. Reverby
Between 1932 and 1972, the US Public Health Service (PHS) ran the Tuskegee Study of Untreated Syphilis in the Negro Male in Macon County, Alabama, to learn more about the effects of untreated syphilis on African Americans and to see whether the standard heavy metal treatments advocated at the time were efficacious in the disease’s late latent stage. Syphilis is a sexually transmitted infection that can also be passed from mother to child at birth. It is contagious in its first two stages, but usually not in its third, late latent stage. Syphilis can be, although it is not always, fatal, and it usually causes serious cardiovascular or neurological damage. To study the disease, the PHS recruited 624 African American men: 439 who had been diagnosed with the late latent stage of the disease and 185 without the disease who were to act as controls in the experiment. However, the men were not told they were participating in a medical experiment, nor were they asked to give their consent to be used as subjects of medical research. Instead, the PHS led the men to believe that they were being treated for their syphilis through the provision of aspirin, iron tonics, vitamins, and diagnostic spinal taps, all labeled a “special treatment” for what the men knew by the colloquial term “bad blood.” Indeed, even when penicillin became widely available as a cure for syphilis in the early 1950s, the researchers continued the study and tried, though not always successfully, to keep the men from treatment.
Although a number of health professionals raised objections to the study over the years, and thirteen articles on it were published in various medical journals, it continued unobstructed until 1972, when a journalist exposed the full implications of the study and a national uproar ensued. The widespread media coverage resulted in a successful lawsuit, federally funded health care for the remaining men and their syphilis-positive wives and children, Congressional hearings, a federal report, and changes to the legislation governing informed consent for medical research. The government officially closed the study in 1972. In 1996, a Legacy Committee requested a formal apology from the federal government, which took place at the White House on May 16, 1997.
Rumors have surrounded the study since its public exposure, especially the beliefs that, in order to conduct the research, the government gave healthy men syphilis rather than recruiting men who already had the disease, and that all of the men in the study were left untreated decade after decade. In its public life, the study often serves as a metaphor for mistrust of medical care and government research, memorialized in popular culture through music, plays, poems, and films.