1–20 of 28 results for:

  • 20th Century: Post-1945
  • Labor and Working Class History

Article

Joshua Gleich

Over the past seventy years, the American film industry has transformed from mass-producing movies to producing a limited number of massive blockbuster movies on a global scale. Hollywood film studios have moved from independent companies to divisions of media conglomerates. Theatrical attendance for American audiences has plummeted since the mid-1940s; nonetheless, American films have never been more profitable. In 1945, American films could only be viewed in theaters; now they are available in myriad forms of home viewing. Throughout, Hollywood has continued to dominate global cinema, although film and now video production reaches Americans in many other forms, from home videos to educational films. Amid declining attendance, the Supreme Court in 1948 forced the major studios to sell off their theaters. Hollywood studios instead focused their power on distribution, limiting the supply of films and focusing on expensive productions to sell on an individual basis to theaters. Growing production costs and changing audiences caused wild fluctuations in profits, leading to an industry-wide recession in the late 1960s. The studios emerged under new corporate ownership and honed their blockbuster strategy, releasing “high concept” films widely on the heels of television marketing campaigns. New technologies such as cable and VCRs offered new windows for Hollywood movies beyond theatrical release, reducing the risks of blockbuster production. Deregulation through the 1980s and 1990s allowed for the “Big Six” media conglomerates to join film, theaters, networks, publishing, and other related media outlets under one corporate umbrella. This has expanded the scale and stability of Hollywood revenue while reducing the number and diversity of Hollywood films, as conglomerates focus on film franchises that can thrive on various digital media. 
Technological change has also lowered the cost of non-Hollywood films and thus encouraged a range of alternative forms of filmmaking, distribution, and exhibition.

Article

On January 5, 2014—the fiftieth anniversary of President Lyndon Johnson’s launch of the War on Poverty—the New York Times asked a panel of opinion leaders a simple question: “Does the U.S. Need Another War on Poverty?” While the answers varied, all the invited debaters accepted the martial premise of the question—that a war on poverty had been fought and that eliminating poverty was, without a doubt, a “fight,” or a “battle.” Yet the debate over the manner—martial or not—in which the federal government and public policy have dealt with poverty in the United States remains very much open. The evolution and development of the postwar American welfare state is a story not only of a number of “wars,” or individual political initiatives, against poverty, but also about the growth of institutions within and outside government that seek to address, alleviate, and eliminate poverty and its concomitant social ills. It is a complex and at times messy story, interwoven with the wider historical trajectory of this period: civil rights, the rise and fall of a “Cold War consensus,” the emergence of a counterculture, the Vietnam War, the credibility gap, the rise of conservatism, the end of “welfare,” and the emergence of compassionate conservatism. Mirroring the broader organization of the American political system, with a relatively weak center of power and delegated authority and decision-making in fifty states, the welfare model has developed and grown over decades. Policies viewed in one era as unmitigated failures have instead over time evolved and become part of the fabric of the welfare state.

Article

In 1964, President Lyndon B. Johnson announced an unconditional “war on poverty.” On one of his first publicity tours promoting his antipoverty legislation, he traveled to cities and towns in Appalachia, which would become crucial areas for promoting and implementing the legislation. Johnson soon signed the Economic Opportunity Act, a piece of legislation that provided a structure for communities to institute antipoverty programs, from vocational services to early childhood education programs, and encouraged the creation of new initiatives. In 1965, Johnson signed the Appalachian Regional Development Act, making Appalachia the only region targeted by federal antipoverty legislation, through the creation of the Appalachian Regional Commission. The Appalachian War on Poverty was not only a set of policies created by government agencies; equally crucial to it was a series of community movements and campaigns, led by working-class people, that responded to antipoverty policies. When the War on Poverty began, the language of policymakers suggested that people living below the poverty line would be served by the programs. But as the antipoverty programs expanded and more local people became involved, they spoke openly and in political terms about poverty as a working-class issue. They drew attention to the politics of class in the region, where elites and absentee landowners became wealthy on the backs of working people. They demanded meaningful participation in shaping the War on Poverty in their communities, and, increasingly, when they used the term “poor people,” they did so as a collective class identity—working people who were poor due to a rigged economy. While many public officials focused on economic development policies, men and women living in the region began organizing around issues ranging from surface mining to labor rights and responding to poor living and working conditions. 
Taking advantage of federal antipoverty resources and the spirit of change that animated the 1960s, working-class Appalachians would help to shape the antipoverty programs at the local and regional level, creating a movement in the process. They did so as they organized around issues—including the environment, occupational safety, health, and welfare rights—and as they used antipoverty programs as a platform to address the systemic inequalities that plagued many of their communities.

Article

On August 4, 1942, the Mexican and US governments launched the binational guest worker program most commonly known as the Bracero Program. An estimated 5 million Mexican men between the ages of nineteen and forty-five separated from their families for three- to nine-month cycles at a time, depending on the duration of their labor contract, in anticipation of earning the prevailing US wage this program had promised them. They labored in US agriculture, railroad construction, and forestry with hardly any employment protections or rights in place to support themselves or the families they had left behind in Mexico. The inhumane configuration and implementation of this program prevented most of these men and their families from realizing that goal. Instead, the labor exploitation and alienation that characterized this guest worker program and their program participation paved the way for fragile transnational family relationships. The Bracero Program grew over the course of its twenty-two-year existence, and despite its negative consequences, Mexican men and their families could not afford to settle for unemployment in Mexico or to pass up US employment opportunities of any sort. The Mexican and US governments’ persistently negligent management of the program, coupled with their conveniently selective acknowledgment of the severity of the plight of Mexican women and men, consistently required Mexican men and their families to shoulder the full extent of the program’s exploitative conditions and terms.

Article

In September 1962, the National Farm Workers Association (NFWA) held its first convention in Fresno, California, initiating a multiracial movement that would result in the creation of United Farm Workers (UFW) and the first contracts for farm workers in the state of California. Led by Cesar Chavez, the union contributed a number of innovations to the art of social protest, including the most successful consumer boycott in the history of the United States. Chavez welcomed contributions from numerous ethnic and racial groups, men and women, young and old. For a time, the UFW was the realization of Martin Luther King Jr.’s beloved community—people from different backgrounds coming together to create a socially just world. During the 1970s, Chavez struggled to maintain the momentum created by the boycott as the state of California became more involved in adjudicating labor disputes under the California Agricultural Labor Relations Act (ALRA). Although Chavez and the UFW ultimately failed to establish a permanent, national union, their successes and strategies continue to influence movements for farm worker justice today.

Article

Ivón Padilla-Rodríguez

Child migration has garnered widespread media coverage in the 21st century, becoming a central topic of national political discourse and immigration policymaking. Contemporary surges of child migrants are part of a much longer history of migration to the United States. In the first half of the 20th century, millions of European and Asian child migrants passed through immigration inspection stations in the New York harbor and San Francisco Bay. Even though some accompanied and unaccompanied European child migrants experienced detention at Ellis Island, most were processed and admitted into the United States fairly quickly in the early 20th century. Few of the European child migrants were deported from Ellis Island. Predominantly accompanied Chinese and Japanese child migrants, however, like Latin American and Caribbean migrants in recent years, were more frequently subjected to family separation, abuse, detention, and deportation at Angel Island. Once inside the United States, both European and Asian children struggled to overcome poverty, labor exploitation, educational inequity, the attitudes of hostile officials, and public health problems. After World War II, Korean refugee “orphans” came to the United States under the Refugee Relief Act of 1953 and the Immigration and Nationality Act. European, Cuban, and Indochinese refugee children were admitted into the United States through a series of ad hoc programs and temporary legislation until the 1980 Refugee Act created a permanent mechanism for the admission of refugee and unaccompanied children. Exclusionary immigration laws, the hardening of US international boundaries, and the United States’ preference for refugees who fled Communist regimes made unlawful entry the only option for thousands of accompanied and unaccompanied Mexican, Central American, and Haitian children in the second half of the 20th century. 
Black and brown migrant and asylum-seeking children were forced to endure educational deprivation, labor trafficking, mandatory detention, deportation, and deadly abuse by US authorities and employers at US borders and inside the country.

Article

The US working class and the institutional labor movement were shaped by anticommunism. Anticommunism preceded the founding of the Soviet Union and the Cold War, and this early history affected the later experience. It reinforced conservative positions on union issues even in the period before the Cold War, and forged the alliances that influenced the labor movement’s direction, including the campaign to organize the South, the methods and structures of unions, and US labor’s foreign policy positions. While the Communist Party of the USA (CP) was a hierarchical organization straitjacketed by an allegiance to the Soviet Union, the unions it fostered cultivated radically democratic methods; anticommunism, by contrast, often justified opposition to militancy and obstructed progressive policies. In the hottest moments of the postwar development of domestic anticommunism, unions and their members were vilified and purged from the labor movement, forced to take loyalty oaths, and fired for their association with the CP. The Cold War in the working class removed critical perspectives on capitalism, reinforced a moderate and conservative labor officialdom, and led to conformity with the state on foreign policy issues.

Article

Employers began organizing with one another to reduce the power of organized labor in the late 19th and early 20th centuries. Irritated by strikes, boycotts, and unions’ desire to achieve exclusive bargaining rights, employers demanded the right to establish open shops, workplaces that promoted individualism over collectivism. Rather than recognize closed or union shops, employers demanded the right to hire and fire whomever they wanted, irrespective of union status. They established an open-shop movement, which was led by local, national, and trade-based employers. Some formed more inclusive “citizens’ associations,” which included clergymen, lawyers, judges, academics, and employers. Throughout the 20th century’s first three decades, this movement succeeded in busting unions, breaking strikes, and blacklisting labor activists. It united large numbers of employers and was mostly successful. The movement faced its biggest challenges in the 1930s, when a liberal political climate legitimized unions and collective bargaining. But employers never stopped organizing and fighting, and they continued to undermine the labor movement in the following decades by invoking the phrase “right-to-work,” insisting that individual laborers must enjoy freedom from so-called union bosses and compulsory unionism. Numerous states, responding to pressure from organized employers, began passing “right-to-work” laws, which made union organizing more difficult because workers were not obligated to join unions or pay their “fair share” of dues to them. The multi-decade employer-led anti-union movement succeeded in fighting organized labor at the point of production, in politics, and in public relations.

Article

Throughout American history, gender, meaning notions of essential differences between women and men, has shaped how Americans have defined and engaged in productive activity. Work has been a key site where gendered inequalities have been produced, but work has also been a crucible for rights claims that have challenged those inequalities. Federal and state governments long played a central role in generating and upholding gendered policy. Workers and advocates have debated whether to advance laboring women’s cause by demanding equality with men or different treatment that accounted for women’s distinct responsibilities and disadvantages. Beginning in the colonial period, constructions of dependence and independence derived from the heterosexual nuclear family underscored a gendered division of labor that assigned distinct tasks to the sexes, albeit varied by race and class. In the 19th century, gendered expectations shaped all workers’ experiences of the Industrial Revolution, slavery and its abolition, and the ideology of free labor. Early 20th-century reform movements sought to beat back the excesses of industrial capitalism by defining the sexes against each other, demanding protective labor laws for white women while framing work done by women of color and men as properly unregulated. Policymakers reinforced this framework in the 1930s as they built a welfare state that was rooted in gendered and racialized constructions of citizenship. In the second half of the 20th century, labor rights claims that reasoned from the sexes’ distinctiveness increasingly gave way to assertions of sex equality, even as the meaning of that equality was contested. As the sex equality paradigm triumphed in the late 20th and early 21st centuries, seismic economic shifts and a conservative business climate narrowed the potential of sex equality laws to deliver substantive changes to workers.

Article

Perhaps the most important radical labor union in U.S. history, the Industrial Workers of the World (IWW) continues to attract workers, in and beyond the United States. The IWW was founded in 1905 in Chicago—at that time, the greatest industrial city in a country that had become the world’s mightiest economy. Due to the nature of industrial capitalism in what, already, had become a global economy, the IWW and its ideals quickly became a worldwide phenomenon. The Wobblies, as members were and still are affectionately known, never were as numerically large as mainstream unions, but their influence, particularly from 1905 into the 1920s, was enormous. The IWW captured the imaginations of countless rebellious workers with its fiery rhetoric, daring tactics, and commitment to revolutionary industrial unionism. The IWW pledged to replace the “bread and butter” craft unionism of the larger, more mainstream American Federation of Labor (AFL) with massive industrial unions strong enough to take on ever-larger corporations and, ultimately, to overthrow capitalism, replacing it with a society based upon people rather than profit. In the United States, the union grew in numbers and reputation, before and during World War I, by organizing workers neglected by other unions—immigrant factory workers in the Northeast and Midwest, migratory farmworkers in the Great Plains, and mine, timber, and harvest workers out West. Unlike most other unions of that era, the IWW welcomed immigrants, women, and people of color; indeed, most U.S. institutions excluded African Americans and darker-skinned immigrants as well as women, making the IWW among the most radically inclusive institutions in the country and world. Wobbly ideas, members, and publications soon spread beyond the United States—first to Mexico and Canada, then into the Caribbean and Latin America, and to Europe, southern Africa, and Australasia in rapid succession. 
The expansion of the IWW and its ideals across the world in under a decade is a testament to the passionate commitment of its members. It also speaks to the immense popularity of anticapitalist tendencies that shared more in common with anarchism than social democracy. However, the IWW’s revolutionary program and class-war rhetoric yielded more enemies than allies, including governments, which proved devastating during and after World War I, though the union soldiered on. Even in 2020, the ideals the IWW espoused continued to resonate among a small but growing and vibrant group of workers, worldwide.

Article

The American labor movement has declined significantly since 1960. Once a powerful part of American life, bringing economic democracy to the nation, organized labor has become a shell of itself, with numbers far lower than a half-century ago. The 1960s began with a powerful movement divided on race but also deeply influenced by the civil rights movement. Deindustrialization and capital mobility cut into labor’s power after 1965 as factories closed. The rise of public sector unionism in the 1970s briefly gave labor new power, but private sector unions faced enormous internal dissension throughout that decade. The Reagan administration ushered in a new era of warfare against organized labor when the president fired the striking air traffic controllers in 1981. Soon, private sector employers engaged in brutal anti-union campaigns. Reforms within labor in the 1990s sought to renew the movement’s long tradition of organizing, but with mixed success at best. Since the 1980s, we have seen more attacks on organized labor, especially Republican-led campaigns against public sector union rights beginning in 2011 that culminated in the Supreme Court’s 2018 Janus ruling, which declared mandatory fees for public-sector workers who decline union membership unconstitutional. Labor’s decline has led to a new era of income inequality but also brought a stronger class-centric politics back into American life as everyday people seek new answers to the tenuousness of their economic lives.

Article

Donna T. Haverty-Stacke

The first Labor Day parade was held on September 5, 1882, in New York City. It, and the annual holiday demonstrations that followed in that decade and the next, resulted from the growth of the modern organized labor movement that took place in the context of the second industrial revolution. These first Labor Day celebrations also became part of the then ongoing ideological and tactical divisions within that movement. By the early 1900s, workers’ desire to enjoy the fruits of their labor by participating in popular leisure pursuits came to characterize the day. But union leaders, who considered such leisure pursuits a distraction from displays of union solidarity, continued to encourage the organization of parades. With the protections afforded to organized labor by the New Deal, and with the gains made during and after World War II (particularly among unionized white, male, industrial laborers), Labor Day parades declined further after 1945 as workers enjoyed access to mass cultural pursuits, increasingly in suburban settings. This decline was indicative of a broader loss of union movement culture that had served to build solidarity within unions, display working-class militancy to employers, and communicate the legitimacy of organized labor to the American public. From time to time since the late 1970s unions have attempted to reclaim the power of Labor Day to make concerted demands through their display of workers’ united power; but, for most Americans the holiday has become part of a three-day weekend devoted to shopping or leisure that marks the end of the summer season.

Article

If one considers all the links in the food chain—from crop cultivation to harvesting to processing to transportation to provision and service—millions of workers are required to get food from fields and farms to our grocery stores, restaurants, and kitchen tables. One out of every seven workers in the United States performs a job related in some way to food, whether it is in direct on-farm employment, in stores, in eating/drinking establishments, or in other agriculture-related sectors. According to demographic breakdowns of US food labor, people of color and immigrants (of varying legal and citizenship statuses) hold the majority of low-wage jobs in the US food system. Since the late 19th century Latinos (people of Latin American descent living in the United States) have played a tremendous role in powering the nation’s food industry. In the Southwest, Mexicans and Mexican Americans have historically worked as farmworkers, street vendors, restaurateurs, and employees in food factories. The Bracero Program (1942–1964) only strengthened the pattern of hiring Latinos as food workers by importing a steady stream of Mexican guest workers into fields, orchards, and vineyards across all regions of the United States. Meanwhile, mid-20th-century Puerto Rican agricultural guest workers served the farms and food processing factories of the Midwest and East Coast. In the late 20th and early 21st centuries, Central American food labor has become more noticeable in restaurants, the meat and seafood industries, and street food vending. It is deeply ironic, then, that the workers who help to nourish us and get our food to us go so unnourished themselves. Across the board, food laborers lack many privileges and basic rights. There is still no federal minimum wage for the almost three million farmworkers who labor in the nation’s fruit orchards, vineyards, and vegetable fields. 
Farmworkers (who are overwhelmingly Latino and undocumented) earn very low wages and face various health risks from pesticide exposure, extreme weather, a lack of nutritious, affordable food and potable water, substandard and unsanitary housing conditions, workplace abuse, unsafe transportation, and sexual harassment and assault. Other kinds of food workers—such as restaurant workers and street vendors—experience similar economic precarity and physical/social invisibility. While many of these substandard conditions exist because of employer decisions about costs and the treatment of their workers, American consumers seeking the lowest prices for food are also caught up in this cycle of exploitation. In efforts to stay competitive and profitable in what they give to grocery stores, restaurants, and the American public, farmers and food distributors trim costs wherever they can, which often negatively impacts the wages and conditions of those who are working the hardest at the bottom of the national food chain. To push back against these forms of exploitation, food entrepreneurs, worker unions, and other advocates have vocally supported Latinos in the US food industry and tried to address problems ranging from xenophobia to human trafficking.

Article

Entrepreneurship has been a basic element of Latinx life in the United States since long before the nation’s founding, varying in scale and cutting across race, class, and gender to different degrees. Indigenous forms of commerce pre-dated Spanish contact in the Americas and continued thereafter. Beginning in the 16th century, the raising, trading, and production of cattle and cattle-related products became foundational to Spanish, Mexican, and later American Southwest society and culture. By the 19th century, Latinxs in US metropolitan areas began to establish enterprises in the form of storefronts, warehouses, and factories, as well as smaller ventures including peddling. At times, they succeeded previous ethnic owners; in other moments, they established new businesses that shaped the everyday life and politics of their respective communities. Whatever the scale of their ventures, Latinx business owners continued to capitalize on the migration of Latinx people to the United States from Latin America and the Caribbean during the 20th century. These entrepreneurs entered business for different reasons, often responding to restricted or constrained labor options, though many sought the flexibility that entrepreneurship offered. Despite an increasing association between Latinx people and entrepreneurship, profits from Latinx ventures produced uneven results during the second half of the 20th century. For some, finance and business ownership has generated immense wealth and political influence. For others at the margins of society, it has remained a tool for achieving sustenance amid the variability of a racially stratified labor market. No monolithic account can wholly capture the vastness and complexity of Latinx economic activity. Latinx business and entrepreneurship remains a vital piece of the place-making and politics of the US Latinx population. This article provides an overview of major trends and pivotal moments in its rich history.

Article

Landon R. Y. Storrs

The second Red Scare refers to the fear of communism that permeated American politics, culture, and society from the late 1940s through the 1950s, during the opening phases of the Cold War with the Soviet Union. This episode of political repression lasted longer and was more pervasive than the Red Scare that followed the Bolshevik Revolution and World War I. Popularly known as “McCarthyism” after Senator Joseph McCarthy (R-Wisconsin), who made himself famous in 1950 by claiming that large numbers of Communists had infiltrated the U.S. State Department, the second Red Scare predated and outlasted McCarthy, and its machinery far exceeded the reach of a single maverick politician. Nonetheless, “McCarthyism” became the label for the tactic of undermining political opponents by making unsubstantiated attacks on their loyalty to the United States. The initial infrastructure for waging war on domestic communism was built during the first Red Scare, with the creation of an antiradicalism division within the Federal Bureau of Investigation (FBI) and the emergence of a network of private “patriotic” organizations. With capitalism’s crisis during the Great Depression, the Communist Party grew in numbers and influence, and President Franklin D. Roosevelt’s New Deal program expanded the federal government’s role in providing economic security. The anticommunist network expanded as well, most notably with the 1938 formation of the Special House Committee to Investigate Un-American Activities, which in 1945 became the permanent House Un-American Activities Committee (HUAC). Other key congressional investigation committees were the Senate Internal Security Subcommittee and McCarthy’s Permanent Subcommittee on Investigations. Members of these committees and their staff cooperated with the FBI to identify and pursue alleged subversives. 
The federal employee loyalty program, formalized in 1947 by President Harry Truman in response to right-wing allegations that his administration harbored Communist spies, soon was imitated by local and state governments as well as private employers. As the Soviets’ development of nuclear capability, a series of espionage cases, and the Korean War enhanced the credibility of anticommunists, the Red Scare metastasized from the arena of government employment into labor unions, higher education, the professions, the media, and party politics at all levels. The second Red Scare did not involve pogroms or gulags, but the fear of unemployment was a powerful tool for stifling criticism of the status quo, whether in economic policy or social relations. Ostensibly seeking to protect democracy by eliminating communism from American life, anticommunist crusaders ironically undermined democracy by suppressing the expression of dissent. Debates over the second Red Scare remain lively because they resonate with ongoing struggles to reconcile Americans’ desires for security and liberty.

Article

Margaret Garb

Housing in America has long stood as a symbol of the nation’s political values and a measure of its economic health. In the 18th century, a farmhouse represented Thomas Jefferson’s ideal of a nation of independent property owners; in the mid-20th century, the suburban house was seen as an emblem of an expanding middle class. Alongside those well-known symbols were a host of other housing forms—tenements, slave quarters, row houses, French apartments, loft condos, and public housing towers—that revealed much about American social order and the material conditions of life for many people. Since the 19th century, housing markets have been fundamental forces driving the nation’s economy and a major focus of government policies. Home construction has provided jobs for skilled and unskilled laborers. Land speculation, housing development, and the home mortgage industry have generated billions of dollars in investment capital, while ups and downs in housing markets have been considered signals of major changes in the economy. Since the New Deal of the 1930s, the federal government has buttressed the home construction industry and offered economic incentives for home buyers, giving the United States the highest home ownership rate in the world. The housing market crash of 2008 slashed property values and sparked a rapid increase in home foreclosures, especially in places like Southern California and the suburbs of the Northeast, where housing prices had ballooned over the previous two decades. The real estate crisis led to government efforts to prop up the mortgage banking industry and to assist struggling homeowners. The crisis led, as well, to a drop in rates of home ownership, an increase in rental housing, and a growth in homelessness. Home ownership remains a goal for many Americans and an ideal long associated with the American dream. The owner-occupied home—whether single-family or multifamily dwelling—is typically the largest investment made by an American family. 
Through much of the 18th and 19th centuries, housing designs varied from region to region. In the mid-20th century, mass production techniques and national building codes tended to standardize design, especially in new suburban housing. In the 18th century, the family home was a site of waged and unwaged work; it was the center of a farm, plantation, or craftsman’s workshop. Two and a half centuries later, a house was a consumer good: its size, location, and decor marked the family’s status and wealth.

Article

The relationship between organized labor and the civil rights movement proceeded along two tracks. At work, the two groups were adversaries, as civil rights groups criticized employment discrimination by the unions. But in politics, they allied. Unions and civil rights organizations partnered to support liberal legislation and to oppose conservative southern Democrats, who were as militant in opposing unions as they were fervent in supporting white supremacy. At work, unions dithered in their efforts to root out employment discrimination. Their initial enthusiasm for Title VII of the 1964 Civil Rights Act, which outlawed employment discrimination, waned as the new law increasingly violated foundational union practices by infringing on the principle of seniority, emphasizing the rights of the individual over the group, and inserting the courts into the workplace. The two souls of postwar liberalism—labor solidarity represented by unions and racial justice represented by the civil rights movement—were in conflict at work. Although the unions and civil rights activists were adversaries over employment discrimination, they united in trying to register southern blacks to vote. Black enfranchisement would end the South’s exceptionalism and the veto it exercised over liberal legislation in Congress. But the two souls of liberalism that were at odds over the meaning of fairness at work would also diverge at the ballot box. As white workers began to defect from the Democratic Party, the political coalition of black and white workers that union leaders had hoped to build was undermined from below. The divergence between the two souls of liberalism in the 1960s—economic justice represented by unions and racial justice represented by civil rights—helps explain the resurgence of conservatism that followed.

Article

In the years after the Civil War, Polish immigrants became an important part of the American working class. They actively participated in the labor movement and played key roles in industrial strikes ranging from the 1877 Railroad Strike through the rise of the CIO and the post-1945 era of prosperity. Over time, the Polish American working class became acculturated and left its largely immigrant past behind while maintaining itself as an ethnic community. It also witnessed a good deal of upward mobility, especially over several generations. This ethnic community, however, continued to be refreshed with immigrants throughout the 20th century. Like the larger American working class, Polish American workers were hard hit by changes in the industrial structure of the United States, as deindustrialization turned the industrial centers where much of the Polish American community was concentrated into the Rust Belt. Despite a radical history, many reacted by turning toward conservative causes in the late 20th and early 21st centuries.

Article

Gail Radford

Public authorities are agencies created by governments to engage directly in the economy for public purposes. They differ from standard agencies in that they operate outside the administrative framework of democratically accountable government. Since they generate their own operating income by charging users for goods and services and borrow for capital expenses based on projections of future revenues, they can avoid input from voters and the regulations that control public agencies funded by tax revenues. Institutions built on the public authority model exist at all levels of government and in every state. A few of these enterprises, such as the Tennessee Valley Authority and the Port Authority of New York and New Jersey, are well known. Thousands more toil in relative obscurity, operating toll roads and bridges, airports, transit systems, cargo ports, entertainment venues, sewer and water systems, and even parking garages. Despite their ubiquity, these agencies are not well understood. Many release little information about their internal operations. It is not even possible to say conclusively how many exist, since experts disagree about how to define them, and states do not systematically track them. One thing we do know about public authorities is that, over the course of the 20th century, these institutions became a major component of American governance. Immediately following the Second World War, they played a minor role in public finance. But by the early 21st century, borrowing by authorities constituted well over half of all public borrowing at the sub-federal level. This change means that, increasingly, the leaders of these entities, rather than elected officials, make key decisions about where and how to build public infrastructure and steer economic development in the United States.

Article

Paul Michel Taillon

Railroad workers occupy a singular place in United States history. Working in the nation’s first “big businesses,” they numbered in the hundreds of thousands, came from a wide range of ethnic and racial groups, included both men and women, and performed a wide range of often esoteric tasks. As workers in an industry that shaped the nation’s financial, technological, and political-economic development, railroaders drove the leading edge of industrialization in the 19th century and played a central role in the nation’s economy for much of the 20th. With the legends of “steel-driving” John Henry and “Cannonball” Casey Jones, railroad workers entered the national folklore as Americans pondered the benefits and costs of progress in an industrial age. Those tales highlighted the glamor and rewards, the risks and disparities, and the gender-exclusive and racially hierarchical nature of railroad work. They also offer insight into the character of railroad unionism, which, from its beginnings in the 1860s, was oriented toward craft-based, male-only, white-supremacist forms of organization. Those unions remained fragmented, but they also became among the most powerful in the US labor movement, leveraging their members’ strategic location in a central infrastructural industry, especially those who operated the trains. That strategic location also ensured that any form of collective organization—and therefore potential disruption of the national economy—would lead to significant state intervention. Thus, the epic railroad labor conflicts of the late 19th century generated the first federal labor relations laws in US history, which in turn set important precedents for 20th-century national labor relations policy. At the same time, the industry nurtured the first national all-Black, civil-rights-oriented unions, which played crucial roles in the 20th-century African American freedom struggle.
By the mid-20th century, however, with technological change and the railroads entering a period of decline, the number of railroad workers diminished, and with them, too, their once-powerful unions.