41–60 of 191 Results for: 20th Century: Post-1945

Article

Americans almost universally agree on the importance of education to the success of individuals and the strength of the nation. Yet they have long differed over the proper mission of government in overseeing their schools. Before 1945, these debates largely occurred at the local and state levels. Since 1945, as education has become an increasingly national and international concern, the federal government has played a larger role in the nation’s schools. As Americans gradually have come to accept a greater federal presence in elementary and secondary schools, however, members of Congress and presidents from both major parties have continued to argue over the scope and substance of the federal role. From 1945 to 1965, these arguments centered on the quest for equity between rich and poor public school pupils and between public and nonpublic school students. From 1965 to 1989, national lawmakers devoted much of their attention to the goal of excellence in public education. From 1989 to the present, they have quarreled over how best to attain equity and excellence at the same time.

Article

Employers began organizing with one another to reduce the power of organized labor in the late 19th and early 20th centuries. Irritated by strikes, boycotts, and unions’ desire to achieve exclusive bargaining rights, employers demanded the right to establish open shops, workplaces that promoted individualism over collectivism. Rather than recognize closed or union shops, employers demanded the right to hire and fire whomever they wanted, irrespective of union status. They established an open-shop movement, which was led by local, national, and trade-based employers. Some formed more inclusive “citizens’ associations,” which included clergymen, lawyers, judges, academics, and employers. Throughout the 20th century’s first three decades, this movement succeeded in busting unions, breaking strikes, and blacklisting labor activists. It united large numbers of employers and was mostly successful. The movement faced its biggest challenges in the 1930s, when a liberal political climate legitimized unions and collective bargaining. But employers never stopped organizing and fighting, and they continued to undermine the labor movement in the following decades by invoking the phrase “right-to-work,” insisting that individual laborers must enjoy freedom from so-called union bosses and compulsory unionism. Numerous states, responding to pressure from organized employers, began passing “right-to-work” laws, which made union organizing more difficult because workers were not obligated to join unions or pay their “fair share” of dues to them. The multi-decade employer-led anti-union movement succeeded in fighting organized labor at the point of production, in politics, and in public relations.

Article

By the late 19th century, American cities like Chicago and New York were marvels of the industrializing world. The shock urbanization of the previous quarter century, however, brought on a host of environmental problems. Skies were acrid with coal smoke, and streams ran fetid with raw sewage. Disease outbreaks were common, while parks and green space were rare. From the 1890s until the end of the 20th century, particular groups of urban residents responded to these hazards with a series of activist movements to reform public and private policies and practices. Those environmental burdens were never felt equally, with the working class, poor, immigrants, and minorities bearing an overwhelming share of the city’s toxic load. By the 1930s, many of the Progressive era reform efforts were finally bearing fruit. Air pollution was regulated, access to clean water improved, and even America’s smallest cities built robust networks of urban parks. But despite this invigoration of the public sphere, after World War II, for many the solution to the challenges of a dense modern city was a private choice: suburbanization. Rather than continue to work to reform and reimagine the city, they chose to leave it, retreating to the verdant (and pollution-free) greenfields at the city’s edge. These moves, encouraged and subsidized by local and federal policies, provided healthier environments for the mostly white, middle-class suburbanites, but created a new set of environmental problems for the poor, working-class, and minority residents they left behind. Drained of resources and capital, cities struggled to maintain aging infrastructure and regulate remaining industry, and then exacerbated those problems with destructive urban renewal and highway construction projects. These remaining urban residents responded with a dynamic series of activist movements that emerged out of the social and community activism of the 1960s and presaged the contemporary environmental justice movement.

Article

Rachel Rothschild

The development of nuclear technology had a profound influence on the global environment following the Second World War, with ramifications for scientific research, the modern environmental movement, and conceptualizations of pollution more broadly. Government sponsorship of studies on nuclear fallout and waste dramatically reconfigured the field of ecology, leading to the widespread adoption of the ecosystem concept and new understandings of food webs as well as biogeochemical cycles. These scientific endeavors of the atomic age came to play a key role in the formation of environmental research to address a variety of pollution problems in industrialized countries. Concern about invisible radiation served as a foundation for new ways of thinking about chemical risks for activists like Rachel Carson and Barry Commoner as well as many scientists, government officials, and the broader public. Their reservations were not unwarranted, as nuclear weapons and waste resulted in radioactive contamination of the environment around nuclear-testing sites and especially fuel-production facilities. Scholars date the start of the “Anthropocene” period, during which human activity began to have substantial effects on the environment, variously from the beginning of human farming roughly 8,000 years ago to the emergence of industrialism in the 19th century. But all agree that the advent of nuclear weapons and power has dramatically changed the potential for environmental alterations. Our ongoing attempts to harness the benefits of the atomic age while lessening its negative impacts will need to confront the substantial environmental and public-health issues that have plagued nuclear technology since its inception.

Article

Christoph Nitschke and Mark Rose

U.S. history is full of frequent and often devastating financial crises. They have coincided with business cycle downturns, but they have been rooted in the political design of markets. Financial crises have also drawn from changes in the underpinning cultures, knowledge systems, and ideologies of marketplace transactions. The United States’ political and economic development spawned, guided, and modified general factors in crisis causation. Broadly viewed, the reasons for financial crises have been recurrent in their form but historically specific in their configuration: causation has always revolved around relatively sudden reversals of investor perceptions of commercial growth, stock market gains, monetary availability, currency stability, and political predictability. The United States’ 19th-century financial crises, which happened in rapid succession, are best described as disturbances tied to market making, nation building, and empire creation. Ongoing changes in America’s financial system aided rapid national growth through the efficient distribution of credit to a spatially and organizationally changing economy. But complex political processes—whether Western expansion, the development of incorporation laws, or the nation’s foreign relations—also underlay the easy availability of credit. The relationship between systemic instability and ideas and ideals of economic growth, politically enacted, was then mirrored in the 20th century. Following the “Golden Age” of crash-free capitalism in the two decades after the Second World War, the recurrence of financial crises in American history coincided with the dominance of the market in statecraft. Banking and other crises were a product of political economy. The Global Financial Crisis of 2007–2008 not only once again changed the regulatory environment in an attempt to correct past mistakes, but also considerably broadened the scholarly discourse around financial crises.

Article

This is an advance summary of a forthcoming article in the Oxford Research Encyclopedia of American History. American food in the twentieth and twenty-first centuries is characterized by abundance. Unlike the hardscrabble existence of many earlier Americans, the “Golden Age of Agriculture” brought the bounty produced in fields across the United States to both consumers and producers. While the “Golden Age” technically ended as World War I began, larger quantities of relatively inexpensive food became the norm for most Americans as more fresh foods, rather than staple crops, made their way to urban centers and rising real wages made it easier to purchase these comestibles. The application of science and technology to food production from the field to the kitchen cabinet, or even more crucially the refrigerator by the mid-1930s, reflects the changing demographics and affluence of American society as much as it does the inventiveness of scientists and entrepreneurs. Perhaps the single most important symbol of overabundance in the United States is the postwar Green Revolution. The vast increase in agricultural production, based on improved agronomics, provoked both praise and criticism, as exemplified by Time magazine’s critique of Rachel Carson’s Silent Spring in September 1962 or, more recently, the politics of genetically modified foods. Echoing what occurred at the turn of the twentieth century, food production, politics, and policy at the turn of the twenty-first century have become a proxy for larger ideological agendas and the fractured nature of class in the United States. Battles over the following issues speak to which Americans have access to affordable, nutritious food: organic versus conventional farming, antibiotic use in meat production, dissemination of food stamps, contraction of farm subsidies, the rapid growth of “dollar stores,” alternative diets (organic, vegetarian, vegan, paleo, etc.), and, perhaps most ubiquitous of all, the “obesity epidemic.” These arguments carry moral and ethical values, as each side deems some foods and diets virtuous and others corrupting. While Americans have long held a variety of food ideologies that meld health, politics, and morality, exemplified by Sylvester Graham and John Harvey Kellogg in the nineteenth and early twentieth centuries, among others, newer constructions of these ideologies reflect concerns over the environment, rural Americans, climate change, self-determination, and the role of government in individual lives. In other words, food can be used as a lens to understand larger issues in American society while at the same time allowing historians to explore the intimate details of everyday life.

Article

Changing foodways, the consumption and production of food, access to food, and debates over food shaped the nature of American cities in the 20th century. As American cities transformed from centers of industrialization at the start of the century to post-industrial societies at its end, food cultures in urban America shifted in response to the ever-changing urban environment. Cities remained centers of food culture, diversity, and food reform despite these shifts. Growing populations and waves of immigration changed the nature of food cultures throughout the United States in the 20th century. These changes were significant, all contributing to an evolving sense of American food culture. For urban denizens, however, food choice and availability were dictated and shaped by a variety of powerful social factors, including class, race, ethnicity, gender, and laboring status. While cities possessed an abundance of food and a variety of locations in which to consume it, fresh food often remained difficult for the urban poor to obtain as the 20th century ended. As markets expanded from 1900 to 1950, regional geography became a less important factor in determining what types of foods were available. In the second half of the 20th century, even global geography became less important to food choices. Citrus fruit from the West Coast was readily available in northeastern markets near the start of the century, and off-season fruits and vegetables from South America filled shelves in grocery stores by the end of the 20th century. Urban Americans became further disconnected from their food sources, but this dislocation spurred counter-movements that embraced ideas of local, seasonal foods and a rethinking of the city’s relationship with its food sources.

Article

Jeffrey F. Taffet

In the first half of the 20th century, and more actively in the post–World War II period, the United States government used economic aid programs to advance its foreign policy interests. US policymakers generally believed that support for economic development in poorer countries would help create global stability, which would limit military threats and strengthen the global capitalist system. Aid was offered on a country-by-country basis to guide political development; its implementation reflected views about how humanity had advanced in richer countries and how it could and should similarly advance in poorer regions. Humanitarianism did play a role in driving US aid spending, but it was consistently secondary to political considerations. Overall, while funding varied over time, amounts spent were always substantial. Between 1946 and 2015, the United States offered almost $757 billion in economic assistance to countries around the world—$1.6 trillion in inflation-adjusted 2015 dollars. Assessing the impact of this spending is difficult; there has long been disagreement among scholars and politicians about how much economic growth, if any, resulted from aid spending and similar disputes about its utility in advancing US interests. Nevertheless, for most political leaders, even without solid evidence of successes, aid often seemed to be the best option for constructively engaging poorer countries and trying to create the kind of world in which the United States could be secure and prosperous.

Article

Humans have utilized American forests for a wide variety of uses from the pre-Columbian period to the present. Native Americans heavily shaped forests to serve their needs, helping to create fire ecologies in many forests. English settlers harvested these forests for trade, to clear land, and for domestic purposes. The arrival of the Industrial Revolution in the early 19th century rapidly expanded the rate of logging. By the Civil War, many areas of the Northeast were logged out. Post–Civil War forests in the Great Lakes states, the South, and then the Pacific Northwest fell with increasing speed to feed the insatiable demands of the American economy, facilitated by rapid technological innovation that allowed for growing cuts. By the late 19th century, growing concerns about the future of American timber supplies spurred the conservation movement, personified by forester Gifford Pinchot and the creation of the U.S. Forest Service with Pinchot as its head in 1905. After World War II, the Forest Service worked closely with the timber industry to cut wide swaths of the nation’s last virgin forests. These gargantuan harvests led to the growth of the environmental movement. Beginning in the 1970s, environmentalists began to use legal means to halt logging in the ancient forests, and the listing of the northern spotted owl under the Endangered Species Act was the final blow to most logging on Forest Service lands in the Northwest. Yet not only does the timber industry remain a major employer in forested parts of the nation today, but alternative forest economies have also developed around more sustainable industries such as tourism.

Article

Legal aid organizations were first created by a variety of private groups during the Civil War to provide legal advice in civil cases to the poor. The growing need for legal aid was deeply connected to industrialization, urbanization, and immigration. A variety of groups created legal aid organizations in response to labor unrest, the increasing number of women in the workforce, the founding of women’s clubs, and the slow and incomplete professionalization of the legal bar. In fact, before women could practice law or were accepted into the legal profession, a variety of middle-class women’s groups using lay lawyers provided legal aid to poor women. Yet this rich story of women’s work was later suppressed by leaders of the bar attempting to claim credit for legal aid, assert a monopoly over the practice of law, and professionalize legal assistance. Across time, the largest number of claims brought to legal aid providers involved workers trying to collect wages, domestic relations cases, and landlord-tenant issues. Until the 1960s, legal aid organizations were largely financed through private donations and philanthropic organizations. After the 1960s, the federal government provided funding to support legal aid, creating significant controversy among lawyers, legal aid providers, and activists as to what types of cases legal aid organizations could take, what services could be provided, and who was eligible. Unlike in many other countries or in criminal cases, in the United States there is no constitutional right to free counsel in civil cases. This leaves many poor and working-class people without legal advice or access to justice. Organizations providing free civil legal services to the poor are ubiquitous across the United States. They are so much a part of the modern legal landscape that it is surprising that little historical scholarship exists on such organizations. Yet the history of organized legal aid, which began during the Civil War, is a rich story that brings into view a unique range of historical actors, including women’s organizations, lawyers, social workers, community organizations, the state and federal government, and the millions of poor clients who over the last century and a half have sought legal assistance. This history of the development of legal aid is also very much a story about gender, race, professionalization, the development of the welfare state, and ultimately its slow dismantlement. In other words, the history of legal aid provides a window into the larger history of the United States while producing its own series of historical tensions, ironies, and contradictions. Although this narrative demonstrates change over time and various ruptures with the past, there are also important continuities in the history of free legal aid. Deceptively simple questions have plagued legal aid for almost a century and have also driven much of the historical scholarship on legal aid. These include: who should provide legal aid services, who should receive free legal aid, what types of cases legal aid organizations should handle, who should fund legal aid, and who benefits from legal aid.

Article

Sam Lebovic

According to the First Amendment of the US Constitution, Congress is barred from abridging the freedom of the press (“Congress shall make no law . . . abridging the freedom of speech, or of the press”). In practice, the history of press freedom is far more complicated than this simple constitutional right suggests. Over time, the meaning of the First Amendment has changed greatly. The Supreme Court largely ignored the First Amendment until the 20th century, leaving the scope of press freedom to state courts and legislatures. Since World War I, jurisprudence has greatly expanded the types of publication protected from government interference. The press now has broad rights to publish criticism of public officials, salacious material, private information, national security secrets, and much else. To understand the shifting history of press freedom, however, it is important to understand not only the expansion of formal constitutional rights but also how those rights have been shaped by such factors as economic transformations in the newspaper industry, the evolution of professional standards in the press, and the broader political and cultural relations between politicians and the press.

Article

While American gambling has a historical association with the lawlessness of the frontier and with the wasteful leisure practices of Southern planters, it was in large cities where American gambling first flourished as a form of mass leisure and as a commercial enterprise of significant scale. In the urban areas of the Mid-Atlantic, the Northeast, and the upper Midwest, for the better part of two centuries the gambling economy was deeply intertwined with municipal politics and governance, the practices of betting were a prominent feature of social life, and controversies over the presence of gambling, both legal and illegal, were at the center of public debate. In New York and Chicago in particular, but also in Cleveland, Pittsburgh, Detroit, Baltimore, and Philadelphia, gambling channeled money to municipal police forces and sustained machine politics. In the eyes of reformers, gambling corrupted governance and corroded social and economic interactions. Big city gambling has changed over time, often in a manner reflecting important historical processes and transformations in economics, politics, and demographics. Yet irrespective of such change, from the onset of Northern urbanization during the 19th century through much of the 20th century, gambling held steady as a central feature of city life and politics. From the poolrooms where recently arrived Irish New Yorkers bet on horseracing after the Civil War, to the corner stores where black and Puerto Rican New Yorkers bet on the numbers game in the 1960s, the gambling activity that covered the urban landscape produced argument and controversy, particularly with respect to drawing the line between crime and leisure and over the question of where and to what ends the money of the gambling public should be directed.

Article

Jerry Watkins

Regional variation, race, gender presentation, and class differences mean that there are many “Gay Souths.” Same-sex desire has been a feature of the human experience since the beginning, but the meanings, expressions, and ability to organize one’s life around desire have shifted profoundly since the invention of sexuality in the mid-19th century. World War II represented a key transition in gay history, as it gave many people a language for their desires. During the Cold War, government officials elided sex, race, and gender transgression with subversion and punished accordingly by state committees. These forces profoundly shaped gay social life, and rather than a straight line from closet to liberation, gays in the South have meandered. Movement rather than stasis, circulation rather than congregation, and the local rather than the stranger, as well as creative uses of space and place, mean that the gay South is distinct from the rest of the country, though not wholly unique.

Article

Throughout US history, Americans have used ideas about gender to understand power, international relations, military behavior, and the conduct of war. Since Joan Wallach Scott called on scholars in 1986 to consider gender a “useful category of analysis,” historians have looked beyond traditional diplomatic and military sources and approaches to examine cultural sources, the media, and other evidence to try to understand the ideas that Americans have relied on to make sense of US involvement in the world. From casting weak nations as female to assuming that all soldiers are heterosexual males, Americans have deployed mainstream assumptions about men’s and women’s proper behavior to justify US diplomatic and military interventions in the world. State Department pamphlets describing newly independent countries in the 1950s and 1960s featured gendered imagery like the picture of a young Vietnamese woman on a bicycle that was meant to symbolize South Vietnam, a young nation in need of American guidance. Language in news reports and government cables, as well as film representations of international affairs and war, expressed gendered dichotomies such as protector and protected, home front and battlefront, strong and weak leadership, and stable and rogue states. These and other episodes illustrate how thoroughly gender shaped important dimensions of the character and the making of US foreign policy and of historians’ examinations of diplomatic and military history.

Article

Throughout American history, gender, meaning notions of essential differences between women and men, has shaped how Americans have defined and engaged in productive activity. Work has been a key site where gendered inequalities have been produced, but work has also been a crucible for rights claims that have challenged those inequalities. Federal and state governments long played a central role in generating and upholding gendered policy. Workers and advocates have debated whether to advance laboring women’s cause by demanding equality with men or different treatment that accounted for women’s distinct responsibilities and disadvantages. Beginning in the colonial period, constructions of dependence and independence derived from the heterosexual nuclear family underscored a gendered division of labor that assigned distinct tasks to the sexes, albeit varied by race and class. In the 19th century, gendered expectations shaped all workers’ experiences of the Industrial Revolution, slavery and its abolition, and the ideology of free labor. Early 20th-century reform movements sought to beat back the excesses of industrial capitalism by defining the sexes against each other, demanding protective labor laws for white women while framing work done by women of color and men as properly unregulated. Policymakers reinforced this framework in the 1930s as they built a welfare state that was rooted in gendered and racialized constructions of citizenship. In the second half of the 20th century, labor rights claims that reasoned from the sexes’ distinctiveness increasingly gave way to assertions of sex equality, even as the meaning of that equality was contested. As the sex equality paradigm triumphed in the late 20th and early 21st centuries, seismic economic shifts and a conservative business climate narrowed the potential of sex equality laws to deliver substantive changes to workers.

Article

The late 20th century saw gender roles transformed as the so-called Second Wave of American feminism that began in the 1960s gained support. By the early 1970s public opinion increasingly favored the movement and politicians in both major political parties supported it. In 1972 Congress overwhelmingly approved the Equal Rights Amendment (ERA) and sent it to the states. Many quickly ratified, prompting women committed to traditional gender roles to organize. However, by 1975 ERA opponents led by veteran Republican activist Phyllis Schlafly, founder of Stop ERA, had slowed the ratification process, although federal support for feminism continued. Congresswoman Bella Abzug (D-NY), inspired by the United Nations’ International Women’s Year (IWY) program, introduced a bill approved by Congress that mandated state and national IWY conferences at which women would produce recommendations to guide the federal government on policy regarding women. Federal funding of these conferences (held in 1977), and the fact that feminists were appointed to organize them, led to an escalation in tensions between feminist and conservative women, and the conferences proved to be profoundly polarizing events. Feminists elected most of the delegates to the culminating IWY event, the National Women’s Conference held in Houston, Texas, and the “National Plan of Action” adopted there endorsed a wide range of feminist goals including the ERA, abortion rights, and gay rights. But the IWY conferences presented conservatives with a golden opportunity to mobilize, and anti-ERA, pro-life, and anti-gay groups banded together as never before. By the end of 1977, these groups, supported by conservative Catholics, Mormons, and evangelical and fundamentalist Protestants, had come together to form a “Pro-Family Movement” that became a powerful force in American politics. By 1980 they had persuaded the Republican Party to drop its support for women’s rights. Afterward, as Democrats continued to support feminist goals and the GOP presented itself as the defender of “family values,” national politics became more deeply polarized and bitterly partisan.

Article

Gentrification is one of the most controversial issues in American cities today. But it also remains one of the least understood. Few agree on how to define it or whether it is a boon or a curse for cities. Gentrification has changed over time and has a history dating back to the early 20th century. Historically, gentrification has had a smaller demographic impact on American cities than suburbanization or immigration. But since the late 1970s, gentrification has dramatically reshaped cities like Seattle, San Francisco, and Boston. Furthermore, districts such as the French Quarter in New Orleans, New York City’s Greenwich Village, and Georgetown in Washington, DC, have had an outsized influence on the political, cultural, and architectural history of cities. Gentrification thus must be examined alongside suburbanization as one of the major historical trends shaping the 20th-century American metropolis.

Article

During the 20th century, the black population of the United States transitioned from largely rural to mostly urban. In the early 1900s the majority of African Americans lived in rural, agricultural areas. Depictions of black people in popular culture often focused on pastoral settings, like the cotton fields of the rural South. But a dramatic shift occurred during the Great Migrations (1914–1930 and 1941–1970) when millions of rural black southerners relocated to US cities. Motivated by economic opportunities in urban industrial areas during World Wars I and II, African Americans opted to move to southern cities as well as to urban centers in the Northeast, Midwest, and West Coast. New communities emerged that contained black social and cultural institutions, and musical and literary expressions flourished. Black migrants who left the South exercised voting rights, sending the first black representatives to Congress in the 20th century. Migrants often referred to themselves as “New Negroes,” pointing to their social, political, and cultural achievements, as well as their use of armed self-defense during violent racial confrontations, as evidence of their new stance on race.

Article

Henry Kissinger was the most famous and most controversial American diplomat of the second half of the 20th century. Escaping Nazi persecution in the 1930s, serving in the American Army of occupation in Germany after 1945, and then pursuing a successful academic career at Harvard University, Kissinger had already achieved national prominence as a foreign policy analyst and defense intellectual when he was appointed national security adviser by President Richard Nixon in January 1969. Kissinger quickly became the president’s closest adviser on foreign affairs and worked with Nixon to change American foreign policy in response to the domestic upheaval caused by the Vietnam War in the late 1960s and early 1970s. Nixon and Kissinger’s initiatives, primarily détente with the Soviet Union, the opening to the People’s Republic of China, and ending American involvement in the Vietnam War, received strong domestic support and helped to bring about Nixon’s re-election landslide in 1972. In the wake of the Watergate scandal, Nixon appointed Kissinger secretary of state in August 1973. As Nixon’s capacity to govern deteriorated, Kissinger assumed all-but-presidential powers, even putting American forces on alert during the Yom Kippur War and then engaging in “shuttle diplomacy” in the Middle East, achieving the first-ever agreements between Israel and Egypt and Israel and Syria. Kissinger retained a dominating influence over foreign affairs during the presidency of Gerald Ford, even as he became a lightning rod for critics on both the left and right of the political spectrum. Although out of public office after 1977, Kissinger remained in the public eye as a foreign policy commentator, wrote three volumes of memoirs as well as other substantial books on diplomacy, and created a successful international business-consulting firm. His only subsequent governmental positions were as chair of the Commission on Central America in 1983–1984 and a brief stint on the 9/11 Commission in 2002.

Article

The Hindu Right is a dense network of organizations across the globe that promote Hindutva, or Hindu nationalism, a political ideology that advocates an ethnonationalist Hindu identity and seeks to transform India into a Hindu state governed by majoritarian norms. Hindutva ideology was first articulated in India in the 1920s, and Hindu Right groups began expanding overseas in the 1940s, coming to the United States in 1970. Collectively, the Hindu Right groups that stretch across dozens of nations in the 21st century are known as the Sangh Parivar (the family of Hindutva organizations). From within the United States, Hindu Right groups exercise power within the global Hindutva movement and place pressure on American institutions and liberal values. The major interlinked Hindu Right groups in America focus on a variety of areas, especially politics, religion, outreach, and fundraising. Among other things, they attempt to control educational materials, influence policy makers, defend caste privilege, and whitewash Hindutva violence, a critical tool for many who espouse this exclusive political ideology. The U.S.-based Hindu Right is properly understood both within the transnational context of the global Sangh Parivar and as part of the American landscape, a fertile home for more than fifty years.