Article

Brant Abbott and Giovanni Gallipoli

This article focuses on the distribution of human capital and its implications for the accrual of economic resources to individuals and households. Human capital inequality can be thought of as measuring disparity in the ownership of labor factors of production, which are usually compensated in the form of wage income. Earnings inequality is tightly related to human capital inequality. However, it only measures disparity in payments to labor rather than dispersion in the market value of the underlying stocks of human capital. Hence, measures of earnings dispersion provide a partial and incomplete view of the underlying distribution of productive skills and of the income generated through them. Despite these shortcomings, a fairly common way to gauge the distributional implications of human capital inequality is to examine the distribution of labor income. While it is not always obvious what accounts for returns to human capital, an established approach in the empirical literature is to decompose measured earnings into permanent and transitory components. A second approach focuses on the lifetime present value of earnings. Lifetime earnings are, by definition, an ex post measure, observable only at the end of an individual's working lifetime. One limitation of this approach is that it assigns a value based on just one of the many possible realizations of human capital returns. Arguably, this ignores the option value associated with alternative, but unobserved, potential earning paths that may be valuable ex ante. Hence, ex post lifetime earnings reflect both the genuine value of human capital and the impact of the particular realization of unpredictable shocks (luck). A different but related measure focuses on the ex ante value of expected lifetime earnings, which differs from ex post (realized) lifetime earnings insofar as it accounts for the value of yet-to-be-realized payoffs along different potential earning paths. Ex ante expectations reflect how much an individual reasonably anticipates earning over the rest of their life, given their current stock of human capital, averaging over possible realizations of luck and other income shifters. The discounted value of different potential paths of future earnings can be computed using riskless or state-dependent discount factors.
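
The contrast between ex post realized lifetime earnings and their ex ante expected value can be made concrete with a small simulation. The sketch below is illustrative only: it assumes a simple permanent-transitory log-earnings process (a random-walk permanent component plus an i.i.d. transitory shock) and a constant riskless discount factor, with hypothetical parameter values.

```python
import numpy as np

def simulate_lifetime_earnings(T=40, beta=0.96, sigma_p=0.1, sigma_e=0.2,
                               p0=10.0, n_paths=10_000, seed=0):
    """Discounted lifetime earnings under log y_t = p_t + e_t,
    where p_t = p_{t-1} + u_t is the permanent component."""
    rng = np.random.default_rng(seed)
    p = np.full(n_paths, p0)                     # initial log permanent component
    pv = np.zeros(n_paths)                       # discounted sum of earnings
    for t in range(T):
        p += rng.normal(0.0, sigma_p, n_paths)   # permanent (random-walk) shock
        e = rng.normal(0.0, sigma_e, n_paths)    # transitory shock
        pv += beta**t * np.exp(p + e)            # riskless discounting
    return pv

pv = simulate_lifetime_earnings()
# Each simulated path is one ex post realization of lifetime earnings;
# averaging across paths approximates the ex ante expected value.
print(f"ex ante expected lifetime earnings: {pv.mean():,.0f}")
print(f"dispersion of ex post realizations (std): {pv.std():,.0f}")
```

The gap between any single path and the cross-path mean is the "luck" component the abstract refers to.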

Article

Leandro Prados de la Escosura and Blanca Sánchez-Alonso

In assessments of modern-day Spain's economic progress and living standards, inadequate natural resources, inefficient institutions, lack of education and entrepreneurship, and foreign dependency are frequently blamed for poor performance up to the mid-20th century, but no persuasive arguments have been provided to explain why such adverse circumstances reversed, giving way to the fast transformation that started in the 1950s. Hence, it is necessary first to ask how much economic progress Spain has achieved, and what impact it has had on living standards and income distribution, from the end of the Peninsular War to the present day, and second to provide an interpretation. Research published in the 2010s supports the view that income per person has improved remarkably, driven by increases in labor productivity, which derived, in turn, from a more intense and efficient use of physical and human capital per worker. Exposure to international competition represented a decisive element behind growth performance. From a European perspective, Spain underperformed until 1950. Thereafter, Spain's economy managed to catch up with more advanced countries until 2007. Although the distribution of the fruits of growth did not follow a linear trend, but rather a Kuznetsian inverted-U pattern, higher levels of income per capita are matched by lower inequality, suggesting that Spaniards' material well-being improved substantially during the modern era.

Article

Low- and middle-income countries (LMICs) bear a disproportionately high burden of disease in comparison to high-income countries, partly due to inequalities in the distribution of resources for health. Recent increases in health spending in these countries demonstrate a commitment to tackling the high burden of disease. However, evidence on the extent to which increased spending on health translates into better population health outcomes has been inconclusive. Some studies have reported improvements in population health with an increase in health spending, whereas others have found either no effect or an effect too limited to justify increased financial allocations to health. Differences across studies may be explained by differences in the approaches adopted to estimate returns to health spending in LMICs.
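
As a concrete illustration of one estimation approach alluded to above, the sketch below runs a country fixed-effects (within) regression of a health outcome on health spending. It is a minimal stylized example, not the specification of any particular study; the data are synthetic stand-ins, and the variable names and parameter values are assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)
n_countries, n_years = 30, 15

# Synthetic stand-in data: log health spending per capita and a log mortality
# outcome, with country-specific levels (the "fixed effects").
alpha = rng.normal(4.0, 0.5, n_countries)               # country intercepts
spend = rng.normal(5.0, 0.8, (n_countries, n_years))    # log spending
mort = alpha[:, None] - 0.3 * spend + rng.normal(0, 0.2, (n_countries, n_years))

# Within transformation: demean by country to absorb the fixed effects.
x = (spend - spend.mean(axis=1, keepdims=True)).ravel()
y = (mort - mort.mean(axis=1, keepdims=True)).ravel()

# OLS slope on demeaned data = fixed-effects estimate of the elasticity.
beta_hat = (x @ y) / (x @ x)
print(f"estimated elasticity of mortality w.r.t. spending: {beta_hat:.3f}")
```

Choices such as the outcome measure, the set of controls, and how reverse causality between income, spending, and health is handled are precisely where the studies discussed above diverge.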

Article

Miles Livingston and Lei Zhou

Credit rating agencies have developed as an information intermediary in the credit market because there are very large numbers of bonds outstanding with many different features. The Securities Industry and Financial Markets Association reports over $20 trillion of corporate bonds, mortgaged-backed securities, and asset-backed securities in the United States. The vast size of the bond markets, the number of different bond issues, and the complexity of these securities result in a massive amount of information for potential investors to evaluate. The magnitude of the information creates the need for independent companies to provide objective evaluations of the ability of bond issuers to pay their contractually binding obligations. The result is credit rating agencies (CRAs), private companies that monitor debt securities/issuers and provide information to investors about the potential default risk of individual bond issues and issuing firms. Rating agencies provide ratings for many types of debt instruments including corporate bonds, debt instruments backed by assets such as mortgages (mortgage-backed securities), short-term debt of corporations, municipal government debt, and debt issued by central governments (sovereign bonds). The three largest rating agencies are Moody’s, Standard & Poor’s, and Fitch. These agencies provide ratings that are indicators of the relative probability of default. Bonds with the highest rating of AAA have very low probabilities of default and consequently the yields on these bonds are relatively low. As the ratings decline, the probability of default increases and the bond yields increase. Ratings are important to institutional investors such as insurance companies, pension funds, and mutual funds. These large investors are often restricted to purchasing exclusively or primarily bonds in the highest rating categories. Consequently, the highest ratings are usually called investment grade. The lower ratings are usually designated as high-yield or “junk bonds.” There is a controversy about the possibility of inflated ratings. Since issuers pay rating agencies for providing ratings, there may be an incentive for the rating agencies to provide inflated ratings in exchange for fees. In the U.S. corporate bond market, at least two and often three agencies provide ratings. Multiple ratings make it difficult for one rating agency to provide inflated ratings. Rating agencies are regulated by the Securities and Exchange Commission to ensure that agencies follow reasonable procedures.

Article

In the early 21st century, the U.S. economy stood at or very near the top of any ranking of the world's economies, more obviously so in terms of gross domestic product (GDP), but also when measured by GDP per capita. The current standing of any country reflects three things: how well off it was when it began modern economic growth, how long it has been growing, and how rapidly productivity increased each year. Americans are inclined to think that it was the last of these items that accounted for their country's success. And there is some truth to the notion that America's lofty status was due to the continual increases in the efficiency of its factors of production—but that is not the whole story. The rate at which the U.S. economy has grown over its long history—roughly 1.5% per year measured by output per capita—has been modest in comparison with most other advanced nations. The high value of GDP per capita in the United States is due in no small part to the fact that it was already among the world's highest back in the early 19th century, when the new nation was poised to begin modern economic growth. The United States was also an early starter, and so has experienced growth for a very long time—longer than almost every other nation in the world. The sustained growth in real GDP per capita began sometime in the period 1790 to 1860, although the exact timing of the transition, and even its nature, are still uncertain. Continual efforts to improve the statistical record have narrowed down the time frame in which the transition took place and improved our understanding of the forces that facilitated the transition, but questions remain. In order to understand how the United States made the transition from a slow-growing British colony to a more rapidly advancing, free-standing economy, it is necessary to know more precisely when it made that transition.
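
The three-factor decomposition is simple compounding: current income per capita equals the starting level times (1 + g)^T, so a high starting level and a long growth span can outweigh a faster growth rate. The sketch below makes the point with purely hypothetical numbers.

```python
# Current income = starting level * (1 + g)**T; all figures hypothetical.
early_starter = 2_000 * (1 + 0.015) ** 160  # high start, modest growth, long span
late_starter = 800 * (1 + 0.025) ** 100     # low start, faster growth, shorter span

print(f"early starter: {early_starter:,.0f}")  # ~21,600: modest growth, early lead
print(f"late starter:  {late_starter:,.0f}")   # ~9,500: faster growth, later start
```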

Article

In order to secure effective service access, coverage, and impact, it is increasingly recognized that the introduction of novel health technologies such as diagnostics, drugs, and vaccines may require additional investment to address the constraints under which many health systems operate. Health-system constraints include shortages of health workers, ineffective supply chains, and inadequate information systems, as well as organizational constraints such as weak incentives and poor service integration. Decision makers may be faced with the question of whether to invest in a new technology, including the specific health-system strengthening needed to ensure effective implementation; or they may be seeking to optimize resource allocation across a range of interventions, including investment in broad health-system functions or platforms. Investment in measures to address health-system constraints therefore increasingly needs to undergo economic evaluation, but this poses several methodological challenges for health economists, particularly in the context of low- and middle-income countries. Designing the appropriate analysis to inform investment decisions concerning new technologies that incorporate health-system investment can be broken down into several steps. First, the analysis needs to comprehensively outline the interface between the new intervention and the system through which it is to be delivered, in order to identify the relevant constraints and the measures needed to relax them. Second, the analysis needs to be rooted in a theoretical approach that appropriately characterizes constraints and considers joint investment in the health system and the technology. Third, the analysis needs to consider how the overarching priority-setting process influences the scope and output of the analysis, informing the way in which complex evidence is used to support the decision, including how to represent and manage system-wide trade-offs. Finally, there are several ways in which decision-analytic models can be structured and parameterized in a context of data scarcity around constraints. This article draws together current approaches to health-system thinking with the emerging literature on analytical approaches to integrating health-system constraints into economic evaluation in order to guide economists through these four issues. It aims to contribute to a more health-system-informed approach to both appraising the cost-effectiveness of new technologies and setting priorities across a range of program activities.
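
The joint-investment logic of the first two steps can be sketched as a standard incremental cost-effectiveness comparison, ICER = ΔC/ΔE. The scenario and numbers below are hypothetical: a new technology delivered through a constrained supply chain achieves little on its own, while pairing it with system strengthening raises both cost and effective coverage.

```python
def icer(cost_new, eff_new, cost_old, eff_old):
    """Incremental cost-effectiveness ratio: extra cost per extra unit of
    health effect (e.g., cost per DALY averted)."""
    return (cost_new - cost_old) / (eff_new - eff_old)

# Hypothetical scenario: a new vaccine delivered through a constrained system.
base_cost, base_eff = 1_000_000, 0       # status quo (no new technology)

# Technology alone: a weak supply chain limits coverage, so the effect is small.
tech_cost, tech_eff = 1_400_000, 2_000   # DALYs averted

# Technology plus supply-chain strengthening: costlier, but far more effective.
joint_cost, joint_eff = 1_700_000, 5_000

print(f"technology alone: {icer(tech_cost, tech_eff, base_cost, base_eff):,.0f} per DALY averted")
print(f"with system strengthening: {icer(joint_cost, joint_eff, base_cost, base_eff):,.0f} per DALY averted")
```

In this stylized case the joint package has the lower cost per DALY averted, illustrating why appraising a technology in isolation from its delivery constraints can misrank investment options.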

Article

During the 18th and 19th centuries, medical spending in the United States rose slowly, on average about 0.25% faster than gross domestic product (GDP), and varied widely between rural and urban regions. Accumulating scientific advances caused spending to accelerate by 1910. From 1930 to 1955, rapid per-capita income growth accommodated major medical expansion while keeping the health share of GDP almost constant. During the 1950s and 1960s, prosperity and investment in research, the workforce, and hospitals caused a rapid surge in spending and consolidated a truly national health system. Excess growth rates (above GDP growth) were above +5% per year from 1966 to 1970, which would have doubled the health-sector share within fifteen years had growth not moderated, falling under +3% in the 1980s, +2% in the 1990s, and +1.5% since 2005. The question of when national health expenditure growth can be brought into line with GDP and made sustainable for the long run is still open. A review of historical data over three centuries forces confrontation with issues regarding what to include and how long events continue to affect national health accounting and policy. Empirical analysis at a national scale over multiple decades fails to support the position that the commonly discussed variables (obesity, aging, mortality rates, coinsurance) cause significant shifts in expenditure trends. What does become clear is that there are long and variable lags before macroeconomic and technological events affect spending: three to six years for business cycles and multiple decades for major recessions, scientific discoveries, and organizational change. Health-financing mechanisms, such as employer-based health insurance, Medicare, and the Affordable Care Act (Obamacare), are seen to be both cause and effect, taking years to develop and affecting spending for decades to come.
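
The doubling claim follows from compound excess growth: a share growing g percentage points faster than GDP doubles in roughly ln(2)/ln(1 + g) years. A quick check of the excess rates cited above, under the simplifying assumption that each rate is held constant:

```python
import math

# Years for the health share of GDP to double at a constant excess growth rate.
for g in (0.05, 0.03, 0.02, 0.015):
    years = math.log(2) / math.log(1 + g)
    print(f"excess growth {g:+.1%}: share doubles in about {years:.1f} years")
```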