Technology and environmental history are both relatively young disciplines among Americanists, and during their early years they developed as distinctly different and even antithetical fields, at least in topical terms. Historians of technology initially focused on human-made and presumably “unnatural” technologies, whereas environmental historians focused on nonhuman and presumably “natural” environments. However, in more recent decades, both disciplines have moved beyond this oppositional framing. Historians of technology increasingly came to view anthropogenic artifacts such as cities, domesticated animals, and machines as extensions of the natural world rather than its antithesis. Even the British and American Industrial Revolutions constituted not a distancing of humans from nature, as some scholars have suggested, but rather a deepening entanglement with the material environment. At the same time, many environmental historians were moving beyond the field’s initial emphasis on the ideal of an American and often Western “wilderness” to embrace a concept of the environment as including humans and productive work. Nonetheless, many environmental historians continued to emphasize the independent agency of the nonhuman environment of organisms and things. This insistence that not everything could be reduced to human culture remained the field’s most distinctive feature. Since the turn of the millennium, the two fields have increasingly come together in a variety of synthetic approaches, including Actor Network Theory, envirotechnical analysis, and neomaterialist theory. As the influence of the cultural turn has waned, the environmental historians’ emphasis on the independent agency of the nonhuman has come to the fore, gaining wider influence as it is applied to the dynamic “nature” or “wildness” that some scholars argue exists within both the technological and the natural environment.
The foundational distinctions between the history of technology and environmental history may now be giving way to more materially rooted attempts to understand how a dynamic hybrid environment helps to create human history in all of its dimensions—cultural, social, and biological.
Timothy James LeCain
Michael A. Krysko
Technology is ubiquitous in the history of US foreign relations. Throughout US history, technology has played an essential role in how a wide array of Americans have traveled to and from, learned about, understood, recorded and conveyed information about, and attempted to influence, benefit from, and exert power over other lands and peoples. The challenge for the historian is not to find where technology intersects with the history of US foreign relations, but to focus on technology without falling prey to deterministic assumptions about the inevitability of the global power and influence, or lack thereof, that the United States has exerted through the technology it has wielded. “Foreign relations” and “technology” are, in fact, two terms with extraordinarily broad connotations. “Foreign relations” is not synonymous with “diplomacy,” but encompasses all aspects and arenas of American engagement with the world. “Technology” is itself “an unusually slippery term,” notes prominent technology historian David Nye, and can refer to simple tools, more complex machines, and even more complicated and expansive systems on which the functionality of many other innovations depends. Furthermore, processes of technological innovation, proliferation, and patterns of use are shaped by a dizzying array of influences embedded within the larger surrounding context, including but by no means limited to politics, economics, law, culture, international exchange, and the environment. While some of the variables that have shaped how the United States has deployed its technological capacities were indeed distinctly American, others arose outside the United States and lay beyond any American ability to control.
A technology-focused rendering of US foreign relations and global ascendancy is not, therefore, a narrative of uninterrupted progress and achievement, but an accounting of both successes and failures that illuminate how surrounding contexts and decisions have variably shaped, encouraged, and limited the technology and power Americans have wielded.
The transformation of post-industrial American life in the late 20th and early 21st centuries produced several economically robust metropolitan centers that stand as new models of urban and economic life, featuring well-educated populations engaged in professional practices in education, medical care, design and legal services, and artistic and cultural production. By the early 21st century, these cities dominated the nation’s consciousness economically and culturally, standing in for the most dynamic and progressive sectors of the economy and driven by concentrations of technical skill and creative spark. The origins of these academic and knowledge centers are rooted in the political economy, including investments shaped by federal policy and philanthropic ambition. Education and health care communities were and remain frequently economically robust but also rife with racial, economic, and social inequality, and riddled with resulting political tensions over development. These information communities fundamentally incubated and directed the proceeds of the new economy, but also constrained who gained access to its wealth.
Christopher P. Loss
Until World War II, American universities were widely regarded as good but not great centers of research and learning. This changed completely in the press of wartime, when the federal government pumped billions into military research, anchored by the development of the atomic bomb and radar, and into the education of returning veterans under the GI Bill of 1944. This abandonment of decentralized federal–academic relations marked the single most important development in the history of the modern American university. While it is true that the government had helped to coordinate and fund the university system prior to the war, most notably the country’s network of public land-grant colleges and universities, government involvement after the war became much more hands-on, eventually leading to direct financial support to, and legislative interventions on behalf of, core institutional activities, not only at the public land grants but across the nation’s mix of private institutions as well. However, the reliance on public subsidies and on legislative and judicial interventions of one kind or another ended up being a double-edged sword: state action made possible the expansion in research and in student access that became the hallmarks of the post-1945 American university, but it also created a rising tide of expectations for continued support that has proven challenging in fiscally stringent times and in the face of ongoing political fights over the government’s proper role in supporting the sector.