1–10 of 51 results for: Macroeconomics and Monetary Economics


Macroeconomic Announcement Premium  

Hengjie Ai, Ravi Bansal, and Hongye Guo

The macroeconomic announcement premium refers to the fact that a large fraction of the equity market risk premium is realized on a small number of trading days with significant macroeconomic announcements. Examples include monetary policy announcements by the Federal Open Market Committee, unemployment/non-farm payroll reports, the Producer Price Index published by the U.S. Bureau of Labor Statistics, and gross domestic product estimates reported by the U.S. Bureau of Economic Analysis. During the period 1961–2023, roughly 44 days per year with macroeconomic announcements account for more than 71% of the aggregate equity market risk compensation. The existence of the macroeconomic announcement premium has important implications for modeling risk preferences in economics and finance, and it provides strong support for non-expected utility analysis. Ai and Bansal demonstrate that the existence of the macroeconomic announcement premium implies that investors’ preferences cannot have an expected utility representation and must satisfy generalized risk sensitivity, a property shared by many non-expected utility models, such as the maxmin expected utility of Gilboa and Schmeidler, the recursive utility of Epstein and Zin, and the robust control preferences of Hansen and Sargent. Because the amount of risk compensation is proportional to the magnitude of variation in marginal utility, the macroeconomic announcement premium highlights information as the most important driver of marginal utility. This observation has profound implications for many economic analyses that rely on modeling either time-series variation or cross-sectional heterogeneity in marginal utility across agents, such as consumption risk sharing, the trade-off between equality and efficiency, and exchange rate variation. The link between macroeconomic policy announcements and financial market risk compensation is an important direction for future research.


Foreign Exchange Intervention  

Helen Popper

The practice of central bank foreign exchange intervention for a time ran ahead of either compelling theoretical explanations of its use or persuasive empirical evidence of its effectiveness. Research accelerated when the emerging economy crises of the 1990s and the early 2000s brought fresh data in the form of urgent experimentation with foreign exchange intervention and related policies, and the financial crisis of 2008 propelled serious treatment of financial frictions into models of intervention. Current foreign exchange intervention models combine financial frictions with relevant externalities: with the aggregate demand and pecuniary externalities that inform macroeconomic models more broadly, and with the trade-related learning externalities that are particularly relevant for developing and emerging economies. These models characteristically allow for normative evaluation of the use of foreign exchange intervention, although most (but not all) do so from a single economy perspective. Empirical advances reflect the advantages of more variation in the use of foreign exchange intervention, better data, and novel econometric approaches to addressing endogeneity. Foreign exchange intervention is now widely viewed as influencing exchange rates at least to some extent, and sustained one-sided intervention and its corresponding reserve accumulation appear to play a role in moderating exchange rate fluctuations and in reducing the likelihood of damaging consequences of financial crises. Key avenues for future research include sorting out which frictions and externalities matter most, and where foreign exchange intervention—and perhaps international cooperation—properly fits (if at all) into the blend of policies that might appropriately address the externalities.


Econometric Methods for Business Cycle Dating  

Máximo Camacho Alonso and Lola Gadea

Over time, the reference cycle of an economy is determined by a sequence of unobservable business cycle turning points that partition the calendar into non-overlapping episodes of expansions and recessions. Dating these turning points supports economic analysis and is useful for economic agents, whether policymakers, investors, or academics. In the interest of transparency and reproducibility, statistical frameworks that automatically date turning points from a set of coincident economic indicators have been the source of remarkable advances in this research area. These methods fall into several broad categories. Depending on the assumptions made about the data-generating process, dating methods are either parametric or non-parametric. There are two main approaches to dealing with multivariate data sets: average-then-date and date-then-average. The former focuses on computing a reference series for the aggregate economy, usually by averaging the indicators across the cross-sectional dimension; the global turning points are then dated on the aggregate indicator using one of the business cycle dating models available in the literature. The latter consists of dating the peaks and troughs in a set of coincident business cycle indicators separately and locating the reference cycle in those periods where the individual turning points cohere. In the early 21st century, the literature has shown that future work on dating the reference cycle will have to deal with a set of challenges. First, the new tools that have become available are increasingly sophisticated and may enlarge the existing academic–practitioner gap. Compiling the codes that implement the dating methods and facilitating their practical implementation may reduce this gap.
Second, the pandemic shock that hit economies worldwide led most industrialized countries to record, in 2020, the sharpest falls and largest rebounds in national economic indicators since records began. In the presence of such influential observations, the outcomes of dating methods could misrepresent the actual reference cycle, especially in the case of parametric approaches. Exploring non-parametric approaches, big data sources, and the classification ability offered by machine learning methods could help improve the performance of dating analyses.
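As an illustration of the non-parametric branch of this literature, the following sketch dates turning points as strict local maxima and minima of a single aggregate indicator, in the spirit of the Bry–Boschan routine. The function name, window choice, and series are illustrative, and the full algorithm adds censoring rules on phase and cycle lengths that are omitted here:

```python
import numpy as np

def date_turning_points(series, window=2):
    """Mark period t as a peak (trough) if it is a strict local maximum
    (minimum) relative to `window` periods on each side."""
    peaks, troughs = [], []
    for t in range(window, len(series) - window):
        neighborhood = series[t - window: t + window + 1]
        if (neighborhood < series[t]).sum() == 2 * window:
            peaks.append(t)
        elif (neighborhood > series[t]).sum() == 2 * window:
            troughs.append(t)
    return peaks, troughs

# Stylized quarterly log-GDP: expansion, recession, recovery.
gdp = np.concatenate([np.linspace(0.0, 1.0, 12),      # expansion
                      np.linspace(1.0, 0.7, 5)[1:],   # recession
                      np.linspace(0.7, 1.2, 8)[1:]])  # recovery
peaks, troughs = date_turning_points(gdp)
print("peaks:", peaks, "troughs:", troughs)  # peaks: [11] troughs: [15]
```

Under the date-then-average approach, this routine would be run on each coincident indicator separately, with the reference cycle placed where the individual turning points cohere.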


Time Consistent Policies and Quasi-Hyperbolic Discounting  

Łukasz Balbus, Kevin Reffett, and Łukasz Woźny

In dynamic choice models, dynamic inconsistency of preferences is a situation in which a decision-maker’s preferences change over time. Optimal plans under such preferences are time inconsistent if a decision-maker has no incentive to follow in the future the (previously chosen) optimal plan. A typical example of dynamic inconsistency is the case of present-bias preferences, where there is a repeated preference for smaller present rewards over larger future rewards. The study of dynamic choice by decision-makers who possess dynamically inconsistent preferences has long been a focal point of work in behavioral economics. The experimental and empirical literatures both point to the importance of various forms of present bias. The canonical model of dynamically inconsistent preferences exhibiting present bias is the quasi-hyperbolic discounting model: a dynamic choice model in which standard exponential discounting is modified by adding an impatience parameter that additionally discounts the immediately succeeding period. A central problem in the analytical study of decision-makers with dynamically inconsistent preferences is how to model their choices in sequential decision problems. One general answer is to characterize and compute (if they exist) constrained optimal plans that are optimal among the set of time consistent sequential plans. Time consistent plans, or policies (TCPs), are those among the set of feasible plans that will actually be followed, and not reoptimized, by agents whose preferences change over time.
Results on the existence, uniqueness, and characterization of stationary, or time-invariant, TCPs in a class of consumption-savings problems with quasi-hyperbolic discounting are presented, together with a discussion of how to compute TCPs in some extensions of the model; the generalized Bellman equation operator approach plays a central role. This approach provides sufficient conditions for the existence of time consistent solutions and facilitates their computation. Importantly, the generalized Bellman approach can also be related to a common first-order approach in the literature known as the generalized Euler equation approach. By constructing sufficient conditions for continuously differentiable TCPs on the primitives of the model, sufficient conditions under which a generalized Euler equation approach is valid can be provided. There are other important facets of TCPs, including sufficient conditions for the existence of monotone comparative statics in interesting parameters of the decision environment, as well as extensions of the generalized Bellman approach that allow for unbounded returns and general certainty equivalents. In addition, the case of a multidimensional state space, as well as a general self-generation method for characterizing nonstationary TCPs, must be considered.
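The quasi-hyperbolic (beta–delta) discounting scheme described above is simple to write down. The sketch below (function name illustrative) computes the discount weights and exposes the source of the time inconsistency: the effective discount factor between today and tomorrow is beta * delta, while between any two future periods it is just delta, so a plan made today about the future is revisited once that future arrives:

```python
def quasi_hyperbolic_weights(beta, delta, horizon):
    """Discount weights in the quasi-hyperbolic (beta-delta) model:
    weight 1 on the current period and beta * delta**t on period t >= 1.
    beta < 1 is the present-bias (impatience) parameter; beta = 1
    recovers standard exponential discounting."""
    return [1.0 if t == 0 else beta * delta**t for t in range(horizon)]

qh = quasi_hyperbolic_weights(beta=0.7, delta=0.95, horizon=5)

# Adjacent-period discount ratios: beta * delta today vs. tomorrow,
# but only delta between future periods; this wedge drives inconsistency.
print("today vs. tomorrow:", qh[1] / qh[0])
print("t=1 vs. t=2:", qh[2] / qh[1])
```

With beta = 1 both ratios collapse to delta and the standard exponential model, which is time consistent, is recovered.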


Central Bank Monetary Policy and Consumer Credit Markets  

Xudong An, Larry Cordell, Raluca A. Roman, and Calvin Zhang

Central banks around the world use monetary policy tools to promote economic growth and stability; for example, in the United States, the Federal Reserve (Fed) uses federal funds rate adjustments, quantitative easing (QE) or tightening, forward guidance, and other tools “to promote effectively the goals of maximum employment, stable prices, and moderate long-term interest rates.” Changes in monetary policy affect both businesses and consumers. For consumers, changes in monetary policy affect bank credit supply, refinancing activity, and home purchases, which in turn affect household consumption and thus economic growth and price stability. The Fed’s rate cuts and QE programs during COVID-19 led to historically low interest rates, which spurred a huge wave of refinancings. However, the pass-through of rate savings in the mortgage market declined during the pandemic. The weaker pass-through can be linked to the extraordinary growth of shadow bank mortgage lenders during the COVID-19 pandemic: shadow bank mortgage lenders charged mortgage borrowers higher rates and fees, so a higher shadow bank market share means a smaller overall pass-through of rate savings to mortgage borrowers. These shadow banks did, however, provide convenience to consumers and originated loans faster than banks. That convenience and speed can be valuable to borrowers and important in transmitting monetary policy in a timelier way, especially during a crisis.


Stochastic Volatility in Bayesian Vector Autoregressions  

Todd E. Clark and Elmar Mertens

Vector autoregressions with stochastic volatility (SV) are widely used in macroeconomic forecasting and structural inference. The SV component of the model conveniently allows for time variation in the variance-covariance matrix of the model’s forecast errors. In turn, that feature of the model generates time variation in predictive densities. The models are most commonly estimated with Bayesian methods, most typically Markov chain Monte Carlo methods, such as Gibbs sampling. Equation-by-equation methods developed since 2018 enable the estimation of models with large variable sets at much lower computational cost than the standard approach of estimating the model as a system of equations. The Bayesian framework also facilitates the accommodation of mixed frequency data, non-Gaussian error distributions, and nonparametric specifications. With advances made in the 21st century, researchers are also addressing some of the framework’s outstanding challenges, particularly the dependence of estimates on the ordering of variables in the model and reliable estimation of the marginal likelihood, which is the fundamental measure of model fit in Bayesian methods.
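The SV mechanism can be sketched in a few lines for a single equation's forecast error. The specification below assumes an AR(1) law of motion for the log variance with illustrative parameter values; actual specifications vary across papers (random walks are also common):

```python
import numpy as np

rng = np.random.default_rng(1)

# Stochastic volatility for one equation's forecast error:
# e_t = exp(h_t / 2) * z_t, with log variance h_t following an AR(1).
T = 500
phi, sigma_eta = 0.98, 0.15   # persistence and vol-of-vol of log variance
h = np.zeros(T)
for t in range(1, T):
    h[t] = phi * h[t - 1] + sigma_eta * rng.normal()

errors = np.exp(h / 2) * rng.normal(size=T)

# Time variation in h translates into time variation in the predictive
# variance, which a constant-variance VAR cannot capture.
print("std, first half:", errors[:250].std(), "second half:", errors[250:].std())
```

In a full BVAR with SV, each equation carries its own log-variance process, and the Gibbs sampler alternates between drawing the VAR coefficients conditional on the volatilities and the volatility paths conditional on the coefficients.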


The History of Central Banks  

Eric Monnet

The historical evolution of the role of central banks has been shaped by two major characteristics of these institutions: they are banks and they are linked—in various legal, administrative, and political ways—to the state. The history of central banking is thus an analysis of how central banks have ensured or failed to ensure the stability of the value of money and the credit system while maintaining supportive or conflicting relationships with governments and private banks. Opening the black box of central banks is necessary to understanding the political economy issues that emerge from the implementation of monetary and credit policy and why, in addition to macroeconomic effects, these policies have major consequences on the structure of financial systems and the financing of public debt. It is also important to read the history of the evolution of central banks since the end of the 19th century as a game of countries wanting to adopt a dominant institutional model. Each historical period was characterized by a dominant model that other countries imitated (or pretended to imitate while retaining substantial national characteristics) with a view to greater international political and financial integration. Recent academic research has explored several issues that underline the importance of central banks to the development of the state and the financial system and to macroeconomic fluctuations: (a) the origin of central banks; (b) their role as lenders of last resort and banking supervisors; (c) the justifications and consequences of the domestic macroeconomic policy objectives (inflation, output, etc.) of central banks (monetary policy); (d) the special loans of central banks and their role in the allocation of credit (credit policy); (e) the legal and political links between the central bank and the government (independence); (f) the role of central banks concerning exchange rates and the international monetary system; and (g) the production of economic research and statistics.


Housing and Macroeconomics  

Charles Ka Yui Leung

The earlier literature on macroeconomics focused on determining aggregate variables such as gross domestic product (GDP), the inflation rate, and the unemployment rate. It had little interaction with the literature on housing. The importance of housing in the macroeconomy has recently been recognized, and the macro-housing field is developing. The recent literature addresses several policy-relevant issues that matter for both the macroeconomics and housing strands of the literature. One significant development is research on the rental market, as a considerable portion of the world’s population are renters. For instance, the impact of some macroeconomic policies depends on how easily a unit can be converted between rental and owner-occupied status. Just as failure to keep up with mortgage payments on owner-occupied housing can lead to bankruptcy, failure to pay rent as contracted can lead to eviction. The literature has started to investigate the causes and costs of such displacement. Some authors also explore whether public rental housing is a desirable policy. Another active research area is affordability. Some people can afford to rent but not to own housing in some cities, and some may move to places where they can become homeowners. The literature has started to explore such interactions between locational choice and tenure choice (i.e., whether to rent or to own). The durability of housing makes it a long-term investment; hence, the timing and pricing of current housing transactions depend on expectations of future prices. Moreover, current transactions in the housing market can, in turn, affect future prices. Therefore, self-fulfilling prophecies are possible, and it is crucial to study the formation and evolution of expectations in the housing market. Some researchers have taken up these challenges and made progress. Last but not least, the literature has extended from a single-market to a multi-market setting.
An emerging literature studies local housing and labor markets, such as at the county level, and produces results that challenge conventional wisdom. In response, a few authors have developed sophisticated multi-regional dynamic general equilibrium models that match the cross-sectional and time-series facts while maintaining the forward-looking assumption of the macroeconomics tradition. These new models also help identify shocks that are not directly observable to econometricians yet are essential for accounting for cross-sectional economic facts, bringing the models closer to reality. In sum, the recent developments in the macro-housing literature are exciting and encouraging, and they will accompany scholars on the journey toward evidence-based public policy.


Tariffs and the Macroeconomy  

Xiangtao Meng, Katheryn N. Russ, and Sanjay R. Singh

For hundreds of years, policymakers and academics have puzzled over how to add up the effects of trade and trade barriers on economic activity. The literature is vast. Trade theory generally focuses on the question of whether trade or trade barriers, like tariffs, make people and firms better off, using models of the real economy operating at full employment and a net-zero trade balance. These models yield powerful fundamental intuition but are not well equipped to address issues such as capital accumulation, the role of exchange rate depreciation, monetary policy, intertemporal optimization by consumers, or current account deficits, which permeate policy debates over tariffs. The literature on open-economy macroeconomics provides additional tools to address some of these issues, but neither literature has yet been able to answer definitively the question of what impact tariffs have on infant industries, current account deficits, unemployment, or inequality, which remain open empirical questions. Trade economists have only begun to understand how multiproduct retailers affect who ultimately pays tariffs and are still struggling to model unemployment meaningfully in a tractable way conducive to fast or uniform application to policy analysis, while macro approaches overlook sectoral complexity. The field’s understanding of the importance of endogenous capital investment is growing, but it has not internalized the importance of the same intertemporal trade-offs between savings and consumption for assessing the distributional impacts of trade on households. Dispersion across assessments of the impacts of the U.S.–China trade war illustrates the frontiers that economists face in assessing the macroeconomic impacts of tariffs.


Financial Bubbles in History  

William Quinn and John Turner

Financial bubbles constitute some of history’s most significant economic events, but academic research into the phenomenon has often been narrow, with an excessive focus on whether bubble episodes invalidate or confirm the efficient markets hypothesis. The literature on the topic has also been somewhat siloed, with theoretical, experimental, qualitative, and quantitative methods used to develop relatively discrete bodies of research. In order to overcome these deficiencies, future research needs to move beyond the rational/irrational dichotomy and holistically examine the causes and consequences of bubbles. Future research in financial bubbles should thus use a wider range of investigative tools to answer key questions or attempt to synthesize the findings of multiple research programs. There are three areas in particular that future research should focus on: the role of information in a bubble, the aftermath of bubbles, and possible regulatory responses. While bubbles are sometimes seen as an inevitable part of capitalism, there have been long historical eras in which they were extremely rare, and these eras are likely to contain lessons for alleviating the negative effects of bubbles in the 21st century. Finally, the literature on bubbles has tended to neglect certain regions, and future research should hunt for undiscovered episodes outside of Europe and North America.