
Article

West Virginia Mine Wars  

Lou Martin

In the early 20th century, West Virginia coal miners and mine operators fought a series of bloody battles that raged for two decades and prompted national debates over workers’ rights. Miners in the southern part of the state lived in towns wholly owned by coal companies and attempted to join the United Mine Workers of America (UMWA) to negotiate better working conditions but most importantly to restore their civil liberties. Mine operators saw unionization as a threat to their businesses and rights and hired armed guards to patrol towns and prevent workers from organizing. The operators’ allies in local and state government used their authority to help break strikes by sending troops to strike districts, declaring martial law, and jailing union organizers in the name of law and order. Observers around the country were shocked at the levels of violence as well as the conditions that fueled the battles. The Mine Wars include the Paint Creek–Cabin Creek Strike of 1912–1913, the so-called 1920 Matewan Massacre, the 1920 Three Days Battle, and the 1921 Battle of Blair Mountain. In this struggle over unionism, the coal operators prevailed, and West Virginia miners continued to work in nonunion mines and live in company towns through the 1920s.

Article

William McKinley Jr.  

Aroop Mukharji

Born in 1843 as the seventh of nine children to a Methodist family in Niles, Ohio, William McKinley was not destined for political greatness. Much like his politics, his rise was steady and incremental, his ambition as patient as it was large. After four years serving the Union in the Civil War, McKinley returned to Ohio to start a local law practice. Following a short stint as a county prosecutor, he married, started a family, and then met with his life’s greatest tragedy: the deaths of both of his young daughters within two years of each other. Amid this immense personal turmoil, McKinley ran for Congress. He served seven terms until a Democratic challenger unseated him, enabled by gerrymandered district lines. Within a matter of months, McKinley turned around to win Ohio’s governorship twice, before becoming the nation’s twenty-fifth president in 1897. However faded he has become in historical memory, at the time of his assassination in 1901, just six months into his second presidential term, McKinley was a towering figure in US politics. He led the United States in three wars spanning two continents and was only the third US president in almost seven decades to win two consecutive terms. In foreign policy, where he left his greatest mark, McKinley changed the trajectory of US history by consolidating US control over the Caribbean, defeating a European power in war, and irreversibly expanding the US military to sustain an empire that stretched 7,000 miles into the Pacific Ocean. The costs were significant: hundreds of thousands of Filipinos dead, millions colonized under American rule, and new strategic commitments too distant to reasonably protect. It is therefore one of the greatest ironies of US presidential history that so much about McKinley’s life remains shrouded in mystery or, worse, forgotten.

Article

Wilsonianism  

Trygve Throntveit

An ungainly word, it has proven tenacious. Since the early Cold War, “Wilsonianism” has been employed by historians and analysts of US foreign policy to denote two historically related but ideologically and operationally distinct approaches to world politics. One is the foreign policy of the term’s eponym, President Woodrow Wilson, during and after World War I—in particular his efforts to engage the United States and other powerful nations in the cooperative maintenance of order and peace through a League of Nations. The other is the tendency of later administrations and political elites to deem an assertive, interventionist, and frequently unilateralist foreign policy necessary to advance national interests and preserve domestic institutions. Both versions of Wilsonianism have exerted massive impacts on US and international politics and culture. Yet both remain difficult to assess or even define. As historical phenomena they are frequently conflated; as philosophical labels they are ideologically freighted. Perhaps the only consensus is that the term implies the US government’s active rather than passive role in the international order. It is nevertheless important to distinguish Wilson’s “Wilsonianism” from certain doctrines and practices later attributed to him or traced to his influence. The major reasons are two. First, misconceptions surrounding the aims and outcomes of Wilson’s international policies continue to distort historical interpretation in multiple fields, including American political, cultural, and diplomatic history and the history of international relations. Second, these distortions encourage the conflation of Wilsonian internationalism with subsequent yet distinct developments in American foreign policy. The confused result promotes ideological over historical readings of the nation’s past, which in turn constrain critical and creative thinking about its present and future as a world power.

Article

Women and Alcohol in the United States during the 20th Century  

Meg D. O'Sullivan

Women in the United States have drunk, made, bought, sold, and organized both against and for the consumption of alcohol throughout the nation’s history. During the second half of the 20th century, however, women became increasingly visible as social drinkers and alcoholics. Specifically, the 1970s and 1980s marked women’s relationship to alcohol in interesting ways that both echoed moments from the past and ushered in new realities. Throughout these decades, women emerged as: (1) alcoholics who sought recovery in Alcoholics Anonymous or a lesser-known all-women’s sobriety program; (2) anti-alcohol activists who drew authority from their status as mothers; (3) potential criminals who harmed their progeny via fetal alcohol syndrome; and (4) recovery memoirists who claimed their addictions in unprecedented ways.

Article

Women and Sexual Assault in the United States, 1900–1940  

Mara Keire

In the United States, the history of sexual assault in the first half of the 20th century involves multiple contradictions between the ordinary, almost invisible accounts of women of all colors who were raped by fathers, husbands, neighbors, boarders, bosses, hired hands, and other known individuals and the sensational myths that involved rapacious black men, sly white slavers, libertine elites, and virginal white female victims. Much of the debate about sexual assault revolved around the “unwritten law” that justified “honorable” white men avenging the “defilement” of their women. Both North and South, white people defended lynching and the murder of presumed rapists as “honor killings.” In courtrooms, defense attorneys linked the unwritten law to insanity pleas, arguing that after hearing women tell about their assault, husbands and fathers experienced an irresistible compulsion to avenge the rape of their women. Over time, however, notorious court cases from New York and San Francisco to Indianapolis, Honolulu, and Scottsboro, Alabama, shifted the discourse away from the unwritten law and extralegal “justice” to a more complicated script that demonized unreliable women and absolved imperfect men. National coverage of these cases, made possible by wire services and the Hearst newspaper empire, spurred heated debates concerning the proper roles of men and women. Blockbuster movies like The Birth of a Nation and Gone with the Wind and Book of the Month Club selections such as John Steinbeck’s Of Mice and Men and Richard Wright’s Native Son joined the sensationalized media coverage of high-profile court cases to create new national stereotypes about sexual violence and its causes and culprits. During the 1930s, journalists, novelists, playwrights, and moviemakers increasingly emphasized the culpability of women who, according to this narrative, made themselves vulnerable to assault by stepping outside of their appropriate sphere and tempting men into harming them.

Article

Women, Gender, and Red Scares in the Modern United States  

Erica J. Ryan

The first Red Scare, after World War I, and the Red Scare that followed World War II, both impacted American women in remarkably similar ways. Many women found their lives hemmed in by antifeminism and the conservative gender ideology that underwrote anticommunist national identity in 1919, and then again in the late 1940s. This cultural nationalism tied traditional gender norms to the defense of American values and ideals, positioning the family as a bulwark against communism while making women’s performance of gender roles symbolic of national health or sickness. Within this gendered nationalism, the first Red Scare offered opportunities for conservative women to join the antiradical cause as protectors of the home. These same antiradicals maligned radical and progressive women for their feminism and their social activism. The second Red Scare played out in similar fashion. Anticommunism provided a safe platform for conservative women to engage in political activism in defense of the family, and in turn, they participated in broader efforts that attacked and weakened civil rights claims and the social justice efforts of women on the left. In each Red Scare the symbols and rhetoric of anticommunism prioritized women’s relationship to the family, positioning them either as bastions of American virtue or as fundamental threats to the social and political order. Gender proved critical to the construction of patriotism and national identity.

Article

Women, Gender, and World War II  

Melissa A. McEuen

The Second World War changed the United States for women, and women in turn transformed their nation. Over three hundred fifty thousand women volunteered for military service, while twenty times as many stepped into civilian jobs, including positions previously closed to them. More than seven million women who had not been wage earners before the war joined eleven million women already in the American work force. Between 1941 and 1945, an untold number moved away from their hometowns to take advantage of wartime opportunities, but many more remained in place, organizing home front initiatives to conserve resources, to build morale, to raise funds, and to fill jobs left by men who entered military service. The U.S. government, together with the nation’s private sector, instructed women on many fronts and carefully scrutinized their responses to the wartime emergency. The foremost message to women—that their activities and sacrifices would be needed only “for the duration” of the war—was both a promise and an order, suggesting that the war and the opportunities it created would end simultaneously. Social mores were tested by the demands of war, allowing women to benefit from the shifts and make alterations of their own. Yet dominant gender norms provided ways to maintain social order amidst fast-paced change, and when some women challenged these norms, they faced harsh criticism. Race, class, sexuality, age, religion, education, and region of birth, among other factors, combined to limit opportunities for some women while expanding them for others. However temporary and unprecedented the wartime crisis, American women would find that their individual and collective experiences from 1941 to 1945 prevented them from stepping back into a prewar social and economic structure. 
By stretching and reshaping gender norms and roles, World War II and the women who lived it laid solid foundations for the various civil rights movements that would sweep the United States and grip the American imagination in the second half of the 20th century.

Article

Women, Militarized Domesticity, and Transnationality in the U.S. Occupation of Okinawa  

Mire Koikari

After World War II, Okinawa was placed under U.S. military rule and administratively separated from mainland Japan. This occupation lasted from 1945 to 1972, and in these decades Okinawa became the “Keystone of the Pacific,” a leading strategic site in U.S. military expansionism in Asia and the Pacific. U.S. rule during this Cold War period was characterized by violence and coercion, resulting in an especially staggering scale of sexual violence against Okinawan women by U.S. military personnel. At the same time, the occupation also facilitated numerous cultural encounters between the occupiers and the occupied, leading to a flourishing cross-cultural grassroots exchange. A movement to establish American-style domestic science (i.e., home economics) in the occupied territory became a particularly important feature of this exchange, one that mobilized an assortment of women—home economists, military wives, club women, university students, homemakers—from the United States, Okinawa, and mainland Japan. The postwar domestic science movement turned Okinawa into a vibrant theater of Cold War cultural performance where women of diverse backgrounds collaborated to promote modern homemaking and build friendship across racial and national divides. As these women took their commitment to domesticity and multiculturalism into the larger terrain of the Pacific, they articulated the complex intertwining that occurred among women, domesticity, the military, and empire.

Article

Working-Class Anti-Unionism  

Dana M. Caldemeyer

Unlike the anti-unionism that runs through the ranks of employers, working-class anti-unionism describes workers who oppose or work against unionization. Anti-union actions can be seen throughout the United States from the early industrial age forward and range from refusing to join the union or follow union orders to actively fighting the union, as in strikebreaking. Workers’ reasons for acting against the union, however, are varied and complex, including the economic gains that come from remaining outside the union, moral opposition to unionism, and spite against the union. The variations among workers’ reasons for rejecting the union, then, provide insight into how workers define their place in society as well as their relationship with the union.

Article

Zoning in 20th-Century American Cities  

Christopher Silver

Zoning is a legal tool employed by local governments to regulate land development. It determines the use, intensity, and form of development in localities through enforcement of the zoning ordinance, which consists of a text and an accompanying map that divides the locality into zones. Zoning is an exercise of the police powers by local governments, typically authorized through state statutes. Components of what became part of the zoning process emerged piecemeal in U.S. cities during the 19th century in response to development activities deemed injurious to the health, safety, and welfare of the community. American zoning was influenced by and drew upon models already in place in German cities early in the 20th century. Following the First National Conference on Planning and Congestion, held in Washington, DC, in 1909, the zoning movement spread throughout the United States. The first attempt to apply a version of the German zoning model to a U.S. city came in New York City in 1916. In the landmark U.S. Supreme Court case Village of Euclid v. Ambler Realty Co. (1926), zoning was upheld as a constitutional exercise of the police power, a precedent-setting decision that defined the parameters of land-use regulation for the remainder of the 20th century. Zoning was explicitly intended to sanction regulation of real property use to serve the public interest, but frequently it was used to facilitate social and economic segregation. This was most often accomplished by controlling the size and type of housing, by determining where high-density housing (for lower-income residents) could be placed in relation to commercial and industrial uses, and in some cases through explicit use of racial zoning categories. The U.S. Supreme Court ruled in Buchanan v. Warley (1917) that a racial zoning plan of the city of Louisville, Kentucky, violated the due process clause of the 14th Amendment. The decision, however, did not directly address the discriminatory aspects of the law.
As a result, efforts to fashion legally acceptable racial zoning schemes persisted late into the 1920s. These were succeeded by the use of restrictive covenants to prohibit black (and other minority) occupancy in certain white neighborhoods, until such covenants were declared judicially unenforceable in the late 1940s. More widespread was the use of highly differentiated residential zoning schemes and real estate steering that embedded racial and ethnic segregation into the residential fabric of American communities. The Standard State Zoning Enabling Act (SSZEA) of 1924, disseminated by the U.S. Department of Commerce, created a relatively uniform zoning process in U.S. cities, although there were definite differences in the complexity and scope of zoning schemes depending on cities’ size and functions. Localities followed the basic form prescribed by the SSZEA largely to minimize the chance that courts would strike down their zoning ordinances. Nonetheless, from the 1920s through the 1970s, thousands of court cases tested aspects of zoning; only a few reached the federal courts, and zoning advocates typically prevailed. In the 1950s and 1960s, critics of zoning charged that the fragmented city was an unintended consequence. This critique responded to concerns that zoning created artificial separations among the various types of development in cities and that these separations undermined urban vitality. Zoning nevertheless remained a cornerstone of U.S. urban and suburban land regulation, and new techniques such as planned unit developments, overlay zones, and form-based codes introduced needed flexibility to reintegrate urban functions previously separated by conventional zoning approaches.