
Article

David Blanke

The relationship between the car and the city remains complex and involves numerous private and public forces, innovations in technology, global economic fluctuations, and shifting cultural attitudes that only rarely consider the efficiency of the automobile as a long-term solution to urban transit. The advantages of privacy, speed, ease of access, and personal enjoyment that led many to first embrace the automobile were soon shared and accentuated by transit planners as the surest means to realize the long-held ideals of urban beautification, efficiency, and accessible suburbanization. The remarkable gains in productivity provided by industrial capitalism brought these dreams within reach and individual car ownership became the norm for most American families by the middle of the 20th century. Ironically, the success in creating such a “car country” produced the conditions that again congested traffic, raised questions about the quality of urban (and now suburban) living, and further distanced the nation from alternative transit options. The “hidden costs” of postwar automotive dependency in the United States became more apparent in the late 1960s, leading to federal legislation compelling manufacturers and transit professionals to address the long-standing inefficiencies of the car. This most recent phase coincides with a broader reappraisal of life in the city and a growing recognition of the material limits to mass automobility.

Article

Tyson Reeder

The United States has shared an intricate and turbulent history with Caribbean islands and nations since its inception. In its relations with the Caribbean, the United States has displayed the dueling tendencies of imperialism and anticolonialism that characterized its foreign policy with South America and the rest of the world. For nearly two and a half centuries, the Caribbean has stood at the epicenter of some of the US government’s most controversial and divisive foreign policies. After the American Revolution severed political ties between the United States and the British West Indies, US officials and traders hoped to expand their political and economic influence in the Caribbean. US trade in the Caribbean played an influential role in the events that led to the War of 1812. The Monroe Doctrine provided a blueprint for reconciling imperial ambitions in the Caribbean with anti-imperial sentiment. During the mid-19th century, Americans debated the propriety of annexing Caribbean islands, especially Cuba. After the Spanish-American War of 1898, the US government took an increasingly imperialist approach to its relations with the Caribbean, acquiring some islands as federal territories and augmenting its political, military, and economic influence in others. Contingents of the US population and government disapproved of such imperialistic measures, and beginning in the 1930s the US government softened, but did not relinquish, its influence in the Caribbean. Between the 1950s and the end of the Cold War, US officials wrestled with how to exert influence in the Caribbean in a postcolonial world. Since the end of the Cold War, the United States has intervened in Caribbean domestic politics to enhance democracy, continuing its oscillation between democratic and imperial impulses.

Article

Carlos Montezuma was one of the most influential Indians of his day and a prominent leader among the Red Progressives of the late 19th and early 20th centuries. Born to Yavapai parents in central Arizona, he was kidnapped by O’odham (Pima) raiders at a young age, and sold soon after into the Indian slave trade that for centuries had engulfed the US-Mexico borderlands. Educated primarily at public schools in Illinois, Montezuma eventually went on to be the first Native American graduate of the University of Illinois (1884) and one of the first Native American doctors (Chicago Medical College, 1889). Montezuma was a lifelong friend of Richard Henry Pratt, the founder of the Carlisle Indian Industrial School, and he firmly believed in the importance of Indian education. He insisted that educated Indians like himself must serve as examples of what Indians were capable of achieving if given the opportunities. He became deeply involved in the pan-Indian reform movements of the day and was one of the founding members of the Society of American Indians. Montezuma had a rocky relationship with the group, however, because many in the organization found his calls for the immediate abolition of the Indian Bureau and an end to the reservation system difficult to accept. From 1916 to 1922, he published his own journal, Wassaja, in which he relentlessly assailed the Indian Bureau, the reservations, and anyone who stood in the way of Indian “progress.” But Montezuma’s most important work was as an advocate for his own people, the Yavapais of Fort McDowell, Arizona, and other Arizona Indian groups. He spent the final decade of his life working to protect their water, land, and culture, and eventually returned to his Arizona homelands to die, in 1923. Although he was largely forgotten by historians and scholars in the decades after his death, Carlos Montezuma is now correctly remembered as one of the most important figures in Native American history during the Progressive Era.

Article

Brooke Bauer

The Catawba Indian Nation of the 1750s developed from the integration of diverse Piedmont Indian people who belonged to and lived in autonomous communities along the Catawba River of North and South Carolina. Catawban-speaking Piedmont Indians experienced many processes of coalescence, where thinly populated groups joined the militarily strong Iswą Indians (Catawba proper) for protection and survival. Over twenty-five groups of Indians merged with the Iswą, creating an alliance or confederation of tribal communities. They worked together to build a unified community through kinship, traditional customs, and a shared history to form a nation, despite the effects of colonialism, which included European settlement, Indian slavery, warfare, disease, land loss, and federal termination. American settler colonialism sought to erase and exterminate Native societies through biological warfare (intentional or not), military might, seizure of Native land, and assimilation. In spite of these challenges, the Catawbas’ nation-building efforts were constant, but in 1960 the federal government terminated its relationship with the Nation. In the 1970s, the Catawba Indian Nation filed suit to reclaim its land and its federal recognition status. Consequently, the Nation received federal recognition in 1993 and became the only federally recognized tribe in the state of South Carolina. The Nation has land seven miles east of the city of Rock Hill along the Catawba River. Tribal citizenship consists of 3,400 Catawbas, including 2,400 citizens of voting age. The tribe holds elections every four years to fill five executive positions—Chief, Assistant Chief, Secretary/Treasurer, and two at-large positions. Scholarship on Southeastern Indians focuses less on the history of the Catawba Indian Nation and more on the historical narratives of the Five Civilized Tribes, which obscures the role Catawbas played in the development of the South. Finally, a comprehensive Catawba Nation history explains how the people became Catawba and, through persistence, ensured the survival of the Nation and its people.

Article

Margaret McGuinness

The Catholic Church has been a presence in the United States since the arrival of French and Spanish missionaries in the 16th and 17th centuries. The Spanish established a number of missions in what is now the western part of the United States; the most important French colony was New Orleans. Although they were a minority in the thirteen British colonies prior to the American Revolution, Catholics found ways to participate in communal forms of worship when no priest was available to celebrate Mass. John Carroll was appointed superior of the Mission of the United States of America in 1785. Four years later, Carroll was elected the first bishop in the United States; his diocese encompassed the entire country. The Catholic population of the United States began to grow during the first half of the 19th century primarily due to Irish and German immigration. Protestant America was often critical of the newcomers, believing one could not be a good Catholic and a good American at the same time. By 1850, Roman Catholicism was the largest denomination in the United States. The number of Catholics arriving in the United States declined during the Civil War but began to increase after the cessation of hostilities. Catholic immigrants during the late 19th and early 20th centuries were primarily from southern and Eastern Europe, and they were not often welcomed by a church that was dominated by Irish and Irish American leaders. At the same time that the church was expanding its network of parishes, schools, and hospitals to meet the physical and spiritual needs of the new immigrants, other Catholics were determining how their church could speak to issues of social and economic justice. Dorothy Day, Father Charles Coughlin, and Monsignor John A. Ryan are three examples of practicing Catholics who believed that the principles of Catholicism could help to solve problems related to international relations, poverty, nuclear weapons, and the struggle between labor and capital. In addition to changes resulting from suburbanization, the Second Vatican Council transformed Catholicism in the United States. Catholics experienced other changes as a decrease in the number of men and women entering religious life led to fewer priests and sisters staffing parochial schools and parishes. In the early decades of the 21st century, the church in the United States was trying to recover from the sexual abuse crisis. Visiting America in 2015, Pope Francis reminded Catholics of the important teachings of the church regarding poverty, justice, and climate change. It remains to be seen what impact his papacy will have on the future of Catholicism in the United States.

Article

The NAACP, established in 1909, was formed as an integrated organization to confront racism in the United States rather than seeing the issue as simply a southern problem. It is the longest-running civil rights organization and continues to operate today. The organization was originally known as the National Negro Committee, but the name was changed to the NAACP on May 30, 1910. Organized to promote racial equality and integration, the NAACP pursued this goal via legal cases, political lobbying, and public campaigns. Early campaigns involved lobbying for national anti-lynching legislation, pursuing desegregation through the US Supreme Court in areas such as housing and higher education, and seeking voting rights. The NAACP is renowned for the US Supreme Court case of Brown v. Board of Education (1954), which desegregated primary and secondary schools and is seen as a catalyst for the civil rights movement (1955–1968). It also undertook public education, promoting African American achievements in education and the arts to counteract racial stereotypes. The organization published a monthly journal, The Crisis, and promoted African American art forms and culture as another means to advance equality. NAACP branches were established all across the United States and became a network of information, campaigning, and finance that underpinned activism. Youth groups and university branches mobilized younger members of the community. Women were also invaluable to the NAACP in local, regional, and national decision-making processes and campaigning. The organization sought to integrate African Americans and other minorities into the American social, political, and economic model as codified by the US Constitution.

Article

Evan D. McCormick

Since gaining independence in 1823, the states comprising Central America have had a front seat to the rise of the United States as a global superpower. Indeed, more so than anywhere else, the United States has sought to use its power to shape Central America into a system that heeds US interests and abides by principles of liberal democratic capitalism. Relations have been characterized by US power wielded freely by officials and non-state actors alike to override the aspirations of Central American actors in favor of US political and economic objectives: from the days of US filibusters invading Nicaragua in search of territory; to the occupations of the Dollar Diplomacy era, designed to maintain financial and economic stability; to the covert interventions of the Cold War era. For their part, the Central American states have, at various times, sought to blunt the force of US hegemony, most effectively when coordinating their foreign policies to balance against US power. These efforts—even when not rejected by the United States—have generally been short-lived, hampered by economic dependency and political rivalries. The result is a history of US-Central American relations that wavers between confrontation and cooperation, but is remarkable for the consistency of its main element: US dominance.

Article

The central business district, often referred to as the “downtown,” was the economic nucleus of the American city in the 19th and 20th centuries. It stood at the core of urban commercial life, if not always the geographic center of the metropolis. Here was where the greatest number of offices, banks, stores, and service institutions were concentrated—and where land values and building heights reached their peaks. The central business district was also the most easily accessible point in a city, the place where public transit lines intersected and brought together masses of commuters from outlying as well as nearby neighborhoods. In the downtown, laborers, capitalists, shoppers, and tourists mingled together on bustling streets and sidewalks. Not all occupants enjoyed equal influence in the central business district. Still, as historian Jon C. Teaford explained in his classic study of American cities, the downtown was “the one bit of turf common to all,” the space where “the diverse ethnic, economic, and social strains of urban life were bound together, working, spending, speculating, and investing.” The central business district was not a static place. Boundaries shifted, expanding and contracting as the city grew and the economy evolved. So too did the primary land uses. Initially a multifunctional space where retail, wholesale, manufacturing, and financial institutions crowded together, the central business district became increasingly segmented along commercial lines in the 19th century. By the early 20th century, rising real estate prices and traffic congestion drove most manufacturing and processing operations to the periphery. Remaining behind in the city center were the bulk of the nation’s offices, stores, and service institutions. As suburban growth accelerated in the mid-20th century, many of these businesses also vacated the downtown, following the flow of middle-class, white families. Competition with the suburbs drained the central business district of much of its commercial vitality in the second half of the 20th century. It also inspired a variety of downtown revitalization schemes that tended to reinforce inequalities of race and class.

Article

In September 1962, the National Farm Workers Association (NFWA) held its first convention in Fresno, California, initiating a multiracial movement that would result in the creation of the United Farm Workers (UFW) and the first contracts for farm workers in the state of California. Led by Cesar Chavez, the union contributed a number of innovations to the art of social protest, including the most successful consumer boycott in the history of the United States. Chavez welcomed contributions from numerous ethnic and racial groups, men and women, young and old. For a time, the UFW was the realization of Martin Luther King Jr.’s beloved community—people from different backgrounds coming together to create a socially just world. During the 1970s, Chavez struggled to maintain the momentum created by the boycott as the state of California became more involved in adjudicating labor disputes under the California Agricultural Labor Relations Act (ALRA). Although Chavez and the UFW ultimately failed to establish a permanent, national union, their successes and strategies continue to influence movements for farm worker justice today.

Article

Chemical and biological weapons represent two distinct types of munitions that share some common policy implications. While chemical weapons and biological weapons are different in terms of their development, manufacture, use, and the methods necessary to defend against them, they are commonly united in matters of policy as “weapons of mass destruction,” along with nuclear and radiological weapons. Both chemical and biological weapons have the potential to cause mass casualties, require some technical expertise to produce, and can be employed effectively by both nation states and non-state actors. U.S. policies in the early 20th century were informed by preexisting taboos against poison weapons and the American Expeditionary Forces’ experiences during World War I. The United States promoted restrictions in the use of chemical and biological weapons through World War II, but increased research and development work at the outset of the Cold War. In response to domestic and international pressures during the Vietnam War, the United States drastically curtailed its chemical and biological weapons programs and began supporting international arms control efforts such as the Biological and Toxin Weapons Convention and the Chemical Weapons Convention. U.S. chemical and biological weapons policies significantly influence U.S. policies in the Middle East and the fight against terrorism.

Article

By the end of the 19th century, the medical specialties of gynecology and obstetrics established a new trend in women’s healthcare. In the 20th century, more and more American mothers gave birth under the care of a university-trained physician. The transition from laboring and delivering with the assistance of female family, neighbors, and midwives to giving birth under medical supervision is one of the most defining shifts in the history of childbirth. By the 1940s, the majority of American mothers no longer expected to give birth at home, but instead traveled to hospitals, where they sought reassurance from medical experts as well as access to pain-relieving drugs and life-saving technologies. Infant feeding followed a similar trajectory. Traditionally, infant feeding in the West had been synonymous with breastfeeding, although alternatives such as wet nursing and the use of animal milks and broths had existed as well. By the early 20th century, the experiences of women changed in relation to sweeping historical shifts in immigration, urbanization, and industrialization, and so too did their ability and interest in breastfeeding. Scientific study of infant feeding yielded increasingly safer substitutes for breastfeeding, and by the 1960s fewer than 1 in 5 mothers breastfed. In the 1940s and 1950s, however, mothers began to organize and to resist the medical management of childbirth and infant feeding. The formation of childbirth education groups helped spread information about natural childbirth methods, and the first dedicated breastfeeding support organization, La Leche League, was founded in 1956. By the 1970s, the trend toward medicalized childbirth and infant feeding that had defined the first half of the century was in significant flux. By the end of the 20th century, efforts to harmonize women’s interests in more “natural” motherhood experiences with the existing medical system led to renewed interest in midwifery, home birth, and birth centers. Despite the cultural shift in favor of fewer medical interventions, rates of cesarean sections climbed to new heights by the end of the 1990s. Similarly, although pressures on mothers to breastfeed mounted by the end of the century, the practice itself increasingly relied upon the use of technologies such as the breast pump. By the close of the century, women’s agency in pursuing more natural options proceeded in tension with the technological, social, medical, and political systems that continued to shape their options.

Article

Wilma King

Boys and girls of European and African descent in Colonial America shared commonalities initially as unfree laborers, with promises of emancipation for all. However, as labor costs and demands changed, white servitude disappeared and slavery in perpetuity prevailed for the majority of blacks in the South following the American Revolution. Children were aware of differences in their legal status, social positions, life-changing opportunities, and vulnerabilities within an environment where blackness signaled slavery or the absence of liberty, and whiteness garnered license or freedom. Slavery and freedom existed concomitantly, and relationships among children, even black ones, in North America were affected by time and place. Slave societies and societies with slaves determined the nature of interactions among enslaved and emancipated children. To be sure, few, if any, freed or free-born blacks lacked a relative or friend who was or had once been enslaved, especially in states where gradual emancipation laws liberated family members born after a specific date and left older relatives in thralldom. As a result, free blacks were never completely aloof from their enslaved contemporaries. And freedom was more meaningful if and when enjoyed by all. Just as interactions among enslaved and free black children varied, slaveholding children were sometimes benevolent and at other times brutal toward those they claimed as property. And enslaved children did not always assume subservient positions under masters and mistresses in the making. Ultimately, fields of play rather than fields of labor fostered the fairest and most enjoyable moments among slaveholding and enslaved children. Play days for enslaved girls and boys ended when they were mature enough to work outside their own abodes. As enslaved children entered the workplace, white boys of means, often within slaveholding families, engaged in formal studies, while white girls across classes received less formal education but honed skills associated with domestic arts. The paths of white and black children diverged as they reached adolescence, but there were instances when they shared facets of literacy, sometimes surreptitiously, and developed genuine friendships that mitigated the harshness of slavery. Even so, the majority of unfree children survived the furies of bondage by internalizing behavior that was acceptable for both a slave and a child.

Article

Ivón Padilla-Rodríguez

Child migration has garnered widespread media coverage in the 21st century, becoming a central topic of national political discourse and immigration policymaking. Contemporary surges of child migrants are part of a much longer history of migration to the United States. In the first half of the 20th century, millions of European and Asian child migrants passed through immigration inspection stations in the New York harbor and San Francisco Bay. Even though some accompanied and unaccompanied European child migrants experienced detention at Ellis Island, most were processed and admitted into the United States fairly quickly in the early 20th century. Few of the European child migrants were deported from Ellis Island. Predominantly accompanied Chinese and Japanese child migrants, however, like Latin American and Caribbean migrants in recent years, were more frequently subjected to family separation, abuse, detention, and deportation at Angel Island. Once inside the United States, both European and Asian children struggled to overcome poverty, labor exploitation, educational inequity, the attitudes of hostile officials, and public health problems. After World War II, Korean refugee “orphans” came to the United States under the Refugee Relief Act of 1953 and the Immigration and Nationality Act. European, Cuban, and Indochinese refugee children were admitted into the United States through a series of ad hoc programs and temporary legislation until the 1980 Refugee Act created a permanent mechanism for the admission of refugee and unaccompanied children. Exclusionary immigration laws, the hardening of US international boundaries, and the United States’ preference for refugees who fled Communist regimes made unlawful entry the only option for thousands of accompanied and unaccompanied Mexican, Central American, and Haitian children in the second half of the 20th century. Black and brown migrant and asylum-seeking children were forced to endure educational deprivation, labor trafficking, mandatory detention, deportation, and deadly abuse by US authorities and employers at US borders and inside the country.

Article

Patrick William Kelly

The relationship between Chile and the United States pivoted on the intertwined questions of how much political and economic influence Americans would exert over Chile and the degree to which Chileans could chart their own path. With its tradition of constitutional government and relative economic development, Chile established itself as a regional power player in Latin America. Unencumbered by the direct US military interventions that marked the history of the Caribbean, Central America, and Mexico, Chile was a leader in movements to promote Pan-Americanism, inter-American solidarity, and anti-imperialism. But the advent of the Cold War in the 1940s, and especially the 1959 Cuban Revolution, brought an increase in bilateral tensions. The United States turned Chile into a “model democracy” for the Alliance for Progress, but frustration over its failures to enact meaningful social and economic reform polarized Chilean society, resulting in the election of the Marxist Salvador Allende in 1970. The most contentious period in US-Chilean relations came during the Nixon administration, which worked alongside anti-Allende Chileans to destabilize Allende’s government; the Chilean military overthrew that government on September 11, 1973. The Pinochet dictatorship (1973–1990), while anti-Communist, clashed with the United States over Pinochet’s radicalization of the Cold War and the issue of Chilean human rights abuses. The Reagan administration—which came to power on a platform that rejected the Carter administration’s critique of Chile—eventually reversed course and began to support the return of democracy to Chile, which took place in 1990. Since then, Pinochet’s legacy of neoliberal restructuring of the Chilean economy looms large, overshadowed perhaps only by his unexpected role in fomenting a global culture of human rights that has ended the era of impunity for Latin American dictators.

Article

Gregg A. Brazinsky

Throughout the 19th and 20th centuries, America’s relationship with China ran the gamut from friendship and alliance to enmity and competition. Americans have long believed in China’s potential to become an important global actor, primarily in ways that would benefit the United States. The Chinese have at times embraced, at times rejected, and at times adapted to the US agenda. While some themes in this relationship have remained consistent, Sino-American interactions unquestionably broadened in the 20th century. Trade with China grew from its modest beginnings in the 19th and early 20th centuries into a critical part of the global economy by the 21st century. While Americans have often perceived China as a country that offered significant opportunities for mutual benefit, China has also been seen as a threat and rival. During the Cold War, the two competed vigorously for influence in Asia and Africa. Today we see echoes of this same competition as China continues to grow economically while expanding its influence abroad. The history of Sino-American relations illustrates a complex dichotomy of cooperation and competition; this dynamic defines the relationship today and has widespread ramifications for global politics.

Article

The Chinese were one of the few immigrant groups who brought with them a deep-rooted medical tradition. Chinese herbal doctors and herb stores appeared in California as soon as the Gold Rush began. Traditional Chinese medicine had a long history and was an important part of Chinese culture, and herbal medical knowledge and therapy were popular among Chinese immigrants. Chinese herbal doctors treated American patients as well; established herbal doctors had more white patients than Chinese patients, especially after the Chinese population declined due to the Chinese Exclusion laws. Chinese herbal medicine attracted American patients in the late 19th and early 20th centuries because Western medicine could not cure many diseases and symptoms during that period. The thriving Chinese herbal medicine business upset some doctors of Western medicine. The California State Board of Medical Examiners did not allow Chinese herbal doctors to practice as medical doctors and had them arrested for practicing without a license. Many Chinese herbal doctors managed to operate their businesses as merchants selling herbs, and they often defended their profession in court and in newspaper articles. The profession eventually came to an end after the People’s Republic of China was established in 1949 and the United States cut off imports of medicinal herbs from China in December 1950 under the Trading with the Enemy Act.

Article

Comparing Catholic and Protestant missionaries in North America can be a herculean task. It means comparing many religious groups, at least five governments, and hundreds of groups of Indians. But missions to the Indians played important roles in social, cultural, and political changes for Indians, Europeans, and Americans from the very beginning of contact in the 1500s to the present. By comparing Catholic and Protestant missions to the Indians, this article provides a better understanding of the relationship between these movements and their functions in the history of borders and frontiers, including how the missions changed both European and Indian cultures.

Article

The City Beautiful movement arose in the 1890s in response to the accumulating dirt and disorder in industrial cities, which threatened economic efficiency and social peace. City Beautiful advocates believed that better sanitation, improved circulation of traffic, monumental civic centers, parks, parkways, public spaces, civic art, and the reduction of outdoor advertising would make cities throughout the United States more profitable and harmonious. Engaging architects and planners, businessmen and professionals, and social reformers and journalists, the City Beautiful movement expressed a boosterish desire for landscape beauty and civic grandeur, but also raised aspirations for a more humane and functional city. “Mean streets make mean people,” wrote the movement’s publicist and leading theorist, Charles Mulford Robinson, encapsulating the belief in positive environmentalism that drove the movement. Combining the parks and boulevards of landscape architect Frederick Law Olmsted with the neoclassical architecture of Daniel H. Burnham’s White City at Chicago’s World’s Columbian Exposition of 1893, the City Beautiful movement also encouraged a view of the metropolis as a delicate organism that could be improved by bold, comprehensive planning. Two organizations, the American Park and Outdoor Art Association (founded in 1897) and the American League for Civic Improvements (founded in 1900), provided the movement with a national presence. But the movement also depended on the work of civic-minded women and men in nearly 2,500 municipal improvement associations scattered across the nation. Reaching its zenith in Burnham’s remaking of Washington, D.C., and his coauthored Plan of Chicago (1909), the movement slowly declined in favor of the “City Efficient” and a more technocratic city-planning profession. Aside from a legacy of still-treasured urban spaces and structures, the City Beautiful movement contributed to a range of urban reforms, from civic education and municipal housekeeping to city planning and regionalism.

Article

Daniel Pope

Nuclear power in the United States has had an uneven history and faces an uncertain future. Promising in the 1950s electricity “too cheap to meter,” nuclear power has failed to come close to that goal, although it has carved out approximately a 20 percent share of American electrical output. Two decades after World War II, General Electric and Westinghouse offered electric utilities completed “turnkey” plants at a fixed cost, hoping these “loss leaders” would create a demand for further projects. During the 1970s the industry boomed, but it also brought forth a large-scale protest movement. Since then, partly because of that movement and because of the drama of the 1979 Three Mile Island accident, nuclear power has plateaued, with only one reactor completed since 1995. Several factors account for the failed promise of nuclear energy. Civilian power has never fully shaken its military ancestry or its connotations of weaponry and warfare. American reactor designs borrowed from nuclear submarines. Concerns about weapons proliferation stymied industry hopes for breeder reactors that would produce plutonium as a byproduct. Federal regulatory agencies dealing with civilian nuclear energy also have military roles. Those connections have provided some advantages to the industry, but they have also generated fears. Not surprisingly, the “anti-nukes” movement of the 1970s and 1980s was closely bound to movements for peace and disarmament. The industry’s disappointments must also be understood in a wider energy context. Nuclear grew rapidly in the late 1960s and 1970s as domestic petroleum output shrank and environmental objections to coal came to the fore. At the same time, however, slowing economic growth and an emphasis on energy efficiency reduced demand for new power output. In the 21st century, new reactor designs and the perils of fossil-fuel-caused global warming have once again raised hopes for nuclear, but natural gas and renewables now compete favorably against new nuclear projects. Economic factors have been the main reason that nuclear has stalled in the last forty years. Highly capital intensive, nuclear projects have all too often taken too long to build and cost far more than initially forecast. The lack of standard plant designs, the need for expensive safety and security measures, and the inherent complexity of nuclear technology have all contributed to nuclear power’s inability to make its case on cost persuasively. Nevertheless, nuclear power may survive and even thrive if the nation commits to curtailing fossil fuel use or if, as the Trump administration proposes, it opts for subsidies to keep reactors operating.

Article

The 1969 Supreme Court ruling in Tinker v. Des Moines established that students in public elementary and secondary schools do not “shed their constitutional rights to freedom of speech or expression at the schoolhouse gate.” Before Tinker, students often faced punishment from school officials for their role in protests both on and off campus. A rise in civil rights protests and the role of young people in the social movements of the 1960s led to frequent conflicts between students and school administrators. Many black students were especially vocal in contesting racial discrimination at school in the two decades following the 1954 Brown v. Board of Education decision. But before Tinker, students in public elementary and secondary schools were not considered to have any constitutional rights, including the right to free expression. Some of these students brought lawsuits in response to punishments they believed unfairly disciplined them for participating in legitimate protests. The political activism of young people and developments in First Amendment law eventually brought the Constitution into the public school classroom, leading to Tinker and other cases that established students’ rights.