1-10 of 27 Results for: History of Science and Technology

Article

San Jose and “Silicon Valley”  

Glenna Matthews

Since 1988, San Jose has billed itself as the Capital of Silicon Valley. Thirty-five years later, it is the wealthiest city in the country. Despite San Jose’s size and its self-proclaimed title, however, the city remains far less known than its Bay Area sister cities, San Francisco and Oakland, both of which have had smaller populations than San Jose since 1980. Yet the history of San Jose and its metropolitan region deserves to be better known for several reasons. First, San Jose, in the heart of the Santa Clara Valley, was the first secular pueblo established by the Spanish in Alta California, in 1777, and is therefore California’s oldest city. Second, with its Native Californian population, its Hispanic Catholic first settlers, and the diversity of immigrants who arrived after the discovery of gold in 1848, San Jose has never been a place of Protestant hegemony. Despite the existence of racism and ethnocentrism, newcomers there encountered a playing field different from much of the country. That the Silicon Valley workforce has had so strong an immigrant profile is perhaps related to the fact that San Jose was born diverse. Finally, San Jose’s political and economic history is important. A small market center for the vast fruit-growing and processing industry in the Valley (as of 1930, there were at least a hundred thousand acres in orchards and dozens of canneries), San Jose entered a period of explosive growth during World War II that dramatically transformed the local economy while the city grew tenfold in geographical area and twice that in population. Local boosters, in fact, hoped and planned for it to become “the Los Angeles of the North.” Whether or not that goal was desirable, their vision, along with developments at Stanford University and enormous amounts of federal spending on defense, paved the way for the Santa Clara Valley to evolve into “Silicon Valley.”

Article

Credit Reporting and the History of Commercial Surveillance in America  

Josh Lauer

The first credit reporting organizations emerged in the United States during the 19th century to address problems of risk and uncertainty in an expanding market economy. Early credit reporting agencies assisted merchant lenders by collecting and centralizing information about the business activities and reputations of unknown borrowers throughout the country. These agencies quickly evolved into commercial surveillance networks, amassing huge archives of personal information about American citizens and developing credit rating systems to rank them. Shortly after the Civil War, separate credit reporting organizations devoted to monitoring consumers, rather than businesspeople, also began to emerge to assist credit-granting retailers. By the early 20th century, hundreds of local credit bureaus dissected the personal affairs of American consumers, forming the genesis of a national consumer credit surveillance infrastructure. The history of American credit reporting reveals fundamental links between the development of modern capitalism and contemporary surveillance society. These connections became increasingly apparent during the late 20th century as technological advances in computing and networked communication fueled the growth of new information industries, raising concerns about privacy and discrimination. These connections and concerns, however, are not new. They can be traced to 19th-century credit reporting organizations, which turned personal information into a commodity and converted individual biographies into impersonal financial profiles and risk metrics. As these disembodied identities and metrics became authoritative representations of one’s reputation and worth, they exerted real effects on one’s economic life chances and social legitimacy. While drawing attention to capitalism’s historical twin, surveillance, the history of credit reporting illuminates the origins of surveillance-based business models that became ascendant during the 21st century.

Article

The Tuskegee Syphilis Study  

Susan M. Reverby

Between 1932 and 1972, the US Public Health Service (PHS) ran the Tuskegee Study of Untreated Syphilis in the Negro Male in Macon County, Alabama, to learn more about the effects of untreated syphilis on African Americans and to see whether the standard heavy-metal treatments advocated at the time were efficacious in the disease’s late latent stage. Syphilis is a sexually transmitted infection that can also be passed from a mother to her fetus. It is contagious in its first two stages, but usually not in its third, late latent stage. Syphilis can be, although it is not always, fatal, and it usually causes serious cardiovascular or neurological damage. To study the disease, the PHS recruited 624 African American men: 439 who were diagnosed with the latent stage of the disease and 185 without the disease who were to act as controls. The men were not told they were participating in a medical experiment, nor were they asked to give their consent to be used as subjects of medical research. Instead, the PHS led the men to believe that they were being treated for their syphilis through the provision of aspirin, iron tonics, vitamins, and diagnostic spinal taps, labeled a “special treatment” for what was colloquially called “bad blood.” Indeed, even when penicillin became widely available as a cure for syphilis in the early 1950s, the researchers continued the study and tried, not always successfully, to keep the men from treatment. Although a number of health professionals raised objections to the study over the years, and thirteen articles about it were published in various medical journals, it continued unobstructed until 1972, when a journalist exposed its full implications and a national uproar ensued. The widespread media coverage resulted in a successful lawsuit, federally funded health care for the remaining men and their syphilis-positive wives and children, Congressional hearings, a federal report, and changes to the legislation governing informed consent for medical research. The government officially closed the study in 1972. In 1996, a Legacy Committee requested a formal apology from the federal government, which took place at the White House on May 16, 1997. Rumors have surrounded the study since its public exposure, especially the beliefs that the government gave healthy men syphilis, rather than recruiting men who already had the disease, and that all the men in the study were left untreated decade after decade. In its public life, the study often serves as a metaphor for mistrust of medical care and government research, memorialized in popular culture through music, plays, poems, and films.

Article

The Information Economy  

Jamie L. Pietruska

The term “information economy” first came into widespread usage during the 1960s and 1970s to identify a major transformation in the postwar American economy in which manufacturing had been eclipsed by the production and management of information. However, the information economy first identified in the mid-20th century was only one of many information economies that have been central to American industrialization, business, and capitalism for over two centuries. The emergence of information economies can be understood in two ways: as a continuous process in which information itself became a commodity, and as an uneven and contested, not inevitable, process in which economic life became dependent on various forms of information. The production, circulation, and commodification of information have historically been essential to the growth of American capitalism and to creating and perpetuating, and at times resisting, structural racial, gender, and class inequities in American economy and society. Yet information economies, while uneven and contested, also became more bureaucratized, quantified, and commodified from the 18th century to the 21st century. The history of information economies in the United States is also characterized by the importance of systems, networks, and infrastructures that link people, information, capital, commodities, markets, bureaucracies, technologies, ideas, expertise, laws, and ideologies. The materiality of information economies is historically inextricable from the production of knowledge about the economy, and the concepts of “information” and “economy” are themselves historical constructs that change over time. The history of information economies is not a teleological story of progress in which increasing bureaucratic rationality, efficiency, predictability, and profit inevitably led to the 21st-century age of Big Data. Nor is it the story of a single, coherent, uniform information economy. The creation of multiple information economies, at different scales and in different regions, was a contingent, contested, often inequitable process that did not automatically democratize access to objective information.

Article

Chemical and Biological Weapons Policy  

Thomas I. Faith

Chemical and biological weapons represent two distinct types of munitions that share some common policy implications. While chemical weapons and biological weapons differ in their development, manufacture, use, and the methods necessary to defend against them, they are commonly united in matters of policy as “weapons of mass destruction,” along with nuclear and radiological weapons. Both chemical and biological weapons have the potential to cause mass casualties, require some technical expertise to produce, and can be employed effectively by both nation-states and non-state actors. U.S. policies in the early 20th century were informed by preexisting taboos against poison weapons and by the American Expeditionary Forces’ experiences during World War I. The United States promoted restrictions on the use of chemical and biological weapons through World War II, but increased its research and development work at the outset of the Cold War. In response to domestic and international pressures during the Vietnam War, the United States drastically curtailed its chemical and biological weapons programs and began supporting international arms control efforts such as the Biological and Toxin Weapons Convention and the Chemical Weapons Convention. U.S. chemical and biological weapons policies significantly influence U.S. policies in the Middle East and the fight against terrorism.

Article

Civilian Nuclear Power  

Daniel Pope

Nuclear power in the United States has had an uneven history and faces an uncertain future. Promising in the 1950s electricity “too cheap to meter,” nuclear power has failed to come close to that goal, although it has carved out approximately a 20 percent share of American electrical output. Two decades after World War II, General Electric and Westinghouse offered electric utilities completed “turnkey” plants at a fixed cost, hoping these “loss leaders” would create a demand for further projects. During the 1970s the industry boomed, but it also brought forth a large-scale protest movement. Since then, partly because of that movement and because of the drama of the 1979 Three Mile Island accident, nuclear power has plateaued, with only one reactor completed since 1995. Several factors account for the failed promise of nuclear energy. Civilian power has never fully shaken its military ancestry or its connotations of weaponry and warfare. American reactor designs borrowed from nuclear submarines. Concerns about weapons proliferation stymied industry hopes for breeder reactors that would produce plutonium as a byproduct. Federal regulatory agencies dealing with civilian nuclear energy also have military roles. Those connections have provided some advantages to the industry, but they have also generated fears. Not surprisingly, the “anti-nukes” movement of the 1970s and 1980s was closely bound to movements for peace and disarmament. The industry’s disappointments must also be understood in a wider energy context. Nuclear power grew rapidly in the late 1960s and 1970s as domestic petroleum output shrank and environmental objections to coal came to the fore. At the same time, however, slowing economic growth and an emphasis on energy efficiency reduced demand for new power output. In the 21st century, new reactor designs and the perils of fossil-fuel-caused global warming have once again raised hopes for nuclear power, but natural gas and renewables now compete favorably against new nuclear projects. Economic factors have been the main reason that nuclear power has stalled over the last forty years. Highly capital-intensive, nuclear projects have all too often taken too long to build and cost far more than initially forecast. The lack of standard plant designs, the need for expensive safety and security measures, and the inherent complexity of nuclear technology have all contributed to nuclear power’s inability to make a persuasive case on cost. Nevertheless, nuclear power may survive and even thrive if the nation commits to curtailing fossil fuel use or if, as the Trump administration proposes, it opts for subsidies to keep reactors operating.

Article

Technology and US Foreign Relations  

Michael A. Krysko

Technology is ubiquitous in the history of US foreign relations. Throughout US history, technology has played an essential role in how a wide array of Americans have traveled to and from, learned about, understood, recorded and conveyed information about, and attempted to influence, benefit from, and exert power over other lands and peoples. The challenge for the historian is not to find where technology intersects with the history of US foreign relations, but to focus on technology without falling prey to deterministic assumptions about the inevitability of the global power and influence, or lack thereof, that the United States has exerted through the technology it has wielded. “Foreign relations” and “technology” are, in fact, two terms with extraordinarily broad connotations. “Foreign relations” is not synonymous with “diplomacy,” but encompasses all aspects and arenas of American engagement with the world. “Technology” is itself “an unusually slippery term,” notes prominent technology historian David Nye, and can refer to simple tools, more complex machines, and even more complicated and expansive systems on which the functionality of many other innovations depends. Furthermore, processes of technological innovation and proliferation, as well as patterns of use, are shaped by a dizzying array of influences embedded within the larger surrounding context, including but by no means limited to politics, economics, law, culture, international exchanges, and the environment. While some of the variables that have shaped how the United States has deployed its technological capacities were indeed distinctly American, others arose outside the United States and lay beyond any American ability to control. A technology-focused rendering of US foreign relations and global ascendancy is not, therefore, a narrative of uninterrupted progress and achievement, but an accounting of both successes and failures that illuminate how surrounding contexts and decisions have variably shaped, encouraged, and limited the technology and power Americans have wielded.

Article

Skyscrapers and Tall Buildings  

Elihu Rubin

The tall building, the most popular and conspicuous emblem of the modern American city, stands as an index of economic activity, civic aspirations, and urban development. Enmeshed in the history of American business practices and the maturation of corporate capitalism, the skyscraper is also a cultural icon that performs genuine symbolic functions. Whether viewed individually or arrayed in a “skyline,” tall buildings invite a focus on their spectacular or superlative aspects. Their patrons have searched for architectural symbols that would project a positive public image, yet the height and massing of skyscrapers were determined as much by prosaic financial calculations as by symbolic pretense. Historically, the production of tall buildings was linked to the broader flux of economic cycles, access to capital, land values, and regulatory frameworks that curbed the self-interests of individual builders in favor of public goods such as light and air. The tall building looms large for urban geographers seeking to chart the shifting terrain of the business district and for social historians of the city who examine the skyscraper’s gendered spaces and labor relations. If tall buildings provide one index of the urban and regional economy, they are also economic activities in and of themselves, linked to the growth of the professions required to plan, finance, design, construct, market, and manage these mammoth collective objects; all of these professions have vied for control over the ultimate result. Practitioners have debated the tall building’s external expression as the design challenge of the façade became more acute with the advent of the curtain wall attached to a steel frame, which eventually dematerialized entirely into sheets of reflective glass. The tall building also reflects prevailing paradigms in urban design, from the retail arcades of 19th-century skyscrapers to the blank plazas of postwar corporate modernism.

Article

Contagious Disease and Public Health in the American City  

Daniel Wilson

Contagious diseases have long posed a public health challenge for cities, going back to the ancient world. Diseases traveled over trade routes from one city to another. Cities were also crowded and often dirty, ideal conditions for the transmission of infectious disease. The Europeans who settled North America quickly established cities, especially seaports, and contagious diseases soon followed. By the late 17th century, ports like Boston, New York, and Philadelphia experienced occasional epidemics, especially of smallpox and yellow fever, usually introduced from incoming ships. Public health officials tried to prevent contagious diseases from entering the ports, most often by establishing a quarantine. These quarantines were occasionally effective, but more often the disease escaped into the cities. By the 18th century, city officials recognized an association between dirty cities and epidemic diseases. The appearance of a contagious disease usually occasioned a concerted effort to clean streets and remove garbage. By the early 19th century, these efforts gave rise to sanitary reform aimed at preventing infectious diseases. Sanitary reform went beyond cleaning streets and removing garbage to ensuring clean water supplies and effective sewage removal. By the end of the century, sanitary reform had done much to clean the cities and reduce the incidence of contagious disease. In the 20th century, two new tools were added to the public health arsenal: vaccination and antibiotics. Vaccination, first used against smallpox, was extended by scientists to numerous other viral diseases, substantially reducing their incidence. Finally, the development of antibiotics against bacterial infections in the mid-20th century enabled physicians to cure infected individuals. Contagious disease remains a problem (witness AIDS), and public health authorities still rely on quarantine, sanitary reform, vaccination, and antibiotics to keep urban populations healthy.

Article

Universities and Information Centers in U.S. Cities  

LaDale Winling

The transformation of post-industrial American life in the late 20th and early 21st centuries produced several economically robust metropolitan centers that stand as new models of urban and economic life, featuring well-educated populations engaged in professional practices in education, medical care, design and legal services, and artistic and cultural production. By the early 21st century, these cities dominated the nation’s consciousness economically and culturally, standing in for the most dynamic and progressive sectors of the economy and driven by concentrations of technical and creative talent. The origins of these academic and knowledge centers are rooted in the political economy, including investments shaped by federal policy and philanthropic ambition. Education and health care communities were, and remain, frequently economically robust, but they are also rife with racial, economic, and social inequality and riddled with resulting political tensions over development. These information communities fundamentally incubated and directed the proceeds of the new economy, but they also constrained who could access this new mode of wealth in the knowledge economy.