
Printed from Oxford Research Encyclopedias, American History.

Date: 14 April 2021

The United States and World War I

  • Ross A. Kennedy, Department of History, Illinois State University

Summary

World War I profoundly affected the United States. It led to an expansion of America’s permanent military establishment, a foreign policy focused on reforming world politics, and American preeminence in international finance. In domestic affairs, America’s involvement in the war exacerbated class, racial, and ethnic conflict. It also heightened both the ethos of voluntarism in progressive ideology and the progressive desire to step up state intervention in the economy and society. These dual impulses had a coercive thrust that sometimes advanced progressive goals of a more equal, democratic society and sometimes repressed any perceived threat to a unified war effort. Ultimately the combination of progressive and repressive coercion undermined support for the Democratic Party, shifting the nation’s politics in a conservative direction as it entered the 1920s.
