Article
Adaptive Learning in Macroeconomics
George W. Evans and Bruce McGough
While rational expectations (RE) remains the benchmark paradigm in macroeconomic modeling, bounded rationality, especially in the form of adaptive learning, has become a mainstream alternative. Under the adaptive learning (AL) approach, economic agents in dynamic, stochastic environments are modeled as adaptive learners forming expectations and making decisions based on forecasting rules that are updated in real time as new data become available. Their decisions are then coordinated each period via the economy’s markets and other relevant institutional architecture, resulting in a time-path of economic aggregates. In this way, the AL approach introduces additional dynamics into the model—dynamics that can be used to address myriad macroeconomic issues and concerns, including, for example, empirical fit and the plausibility of specific rational expectations equilibria.
AL can be implemented as reduced-form learning, that is, learning applied at the aggregate level, or alternatively as agent-level learning, which includes pre-aggregation analysis of boundedly rational decision making and is discussed by Evans and McGough in a companion contribution to this Encyclopedia.
Typically, learning agents are assumed to use estimated linear forecast models, and a central formulation of AL is least-squares learning, in which agents recursively update their estimated model as new data become available. Key questions include whether AL will converge over time to a specified RE equilibrium (REE), in which case we say the REE is stable under AL; when it is, it is also of interest to examine what type of learning dynamics are observed en route. When multiple REE exist, stability under AL can act as a selection criterion, and global dynamics can involve switching between local basins of attraction. In models with indeterminacy, AL can be used to assess whether agents can learn to coordinate their expectations on sunspots.
The key analytical concepts and tools are the E-stability principle, together with the E-stability differential equations, and the theory of stochastic recursive algorithms (SRAs). While analysis of SRAs is in general quite technical, application of the E-stability principle is often straightforward.
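As a concrete illustration, the following minimal sketch simulates least-squares learning in a stylized self-referential model, y_t = μ + α E_{t−1} y_t + η_t; the model, parameter values, and intercept-only forecast rule are illustrative assumptions, not taken from the article. The recursively estimated belief converges to the REE value μ/(1 − α) exactly when the E-stability condition (here, α < 1) holds.

```python
import numpy as np

# Illustrative sketch (assumed model, not from the article):
# y_t = mu + alpha * E_{t-1} y_t + eta_t.  Agents forecast with an
# intercept-only model, E_{t-1} y_t = a_{t-1}, and update a_t by
# recursive least squares (here, a recursive sample mean) with
# decreasing gain 1/t.

rng = np.random.default_rng(0)
mu, alpha, T = 2.0, 0.5, 5000      # alpha < 1, so the REE is E-stable
a = 0.0                             # initial belief
for t in range(1, T + 1):
    forecast = a                               # expectation under current belief
    y = mu + alpha * forecast + rng.normal()   # actual law of motion
    a += (1.0 / t) * (y - a)                   # RLS update

print(a, mu / (1 - alpha))  # belief is close to the REE value 4.0
```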
In addition to equilibrium analysis in macroeconomic models, AL has many applications. In particular, AL has strong implications for the conduct of monetary and fiscal policy, has been used to explain asset price dynamics, has been shown to improve the fit of estimated dynamic stochastic general equilibrium (DSGE) models, and has proven useful in explaining experimental outcomes.
Article
Agent-Level Adaptive Learning
George W. Evans and Bruce McGough
Adaptive learning is a boundedly rational alternative to rational expectations that is increasingly used in macroeconomics, monetary economics, and financial economics. The agent-level approach can be used to provide microfoundations for adaptive learning in macroeconomics.
Two central issues of bounded rationality are simultaneously addressed at the agent level: replacing fully rational expectations of key variables with econometric forecasts, and making boundedly optimal decisions based on those forecasts. The real business cycle (RBC) model provides a useful laboratory for exhibiting alternative implementations of the agent-level approach. Specific implementations include shadow-price learning (and its anticipated-utility counterpart, iterated shadow-price learning), Euler-equation learning, and long-horizon learning. For each implementation, the path of the economy is obtained by aggregating the boundedly rational agent-level decisions.
A linearized RBC model can be used to illustrate the effects of fiscal policy. For example, simulations can be used to illustrate the impact of a permanent increase in government spending and to highlight the similarities and differences among the various implementations of agent-level learning. These results can also be used to expose the differences among agent-level learning, reduced-form learning, and rational expectations.
The different implementations of agent-level adaptive learning have differing advantages. A major advantage of shadow-price learning is its ease of implementation within the nonlinear RBC model. Compared to reduced-form learning, which is widely used because of its ease of application, agent-level learning both provides microfoundations, which ensure robustness to the Lucas critique, and provides the natural framework for applications of adaptive learning in heterogeneous-agent models.
Article
Bayesian Vector Autoregressions: Applications
Silvia Miranda-Agrippino and Giovanni Ricco
Bayesian vector autoregressions (BVARs) are standard multivariate autoregressive models routinely used in empirical macroeconomics and finance for structural analysis, forecasting, and scenario analysis in an ever-growing number of applications.
A preeminent field of application of BVARs is forecasting. BVARs with informative priors have often proved to be superior tools compared to standard frequentist/flat-prior VARs. In fact, VARs are highly parameterized autoregressive models whose number of parameters grows with the square of the number of variables times the number of lags included. Prior information, in the form of prior distributions on the model parameters, helps in forming sharper posterior distributions of the parameters, conditional on an observed sample. Hence, BVARs can be effective in reducing parameter uncertainty and improving forecast accuracy.
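To fix orders of magnitude (an illustrative calculation, not drawn from the article): with n variables and p lags, each of the n equations has np + 1 conditional-mean coefficients, so the system has n(np + 1) = n²p + n in total; a moderate VAR with n = 20 and p = 4 already involves 20² × 4 + 20 = 1,620 coefficients, to be estimated from what is typically only a few hundred observations, which is precisely the setting in which prior shrinkage helps.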
This feature in particular has favored the use of Bayesian techniques to address “big data” problems, in what is arguably one of the most active frontiers in the BVAR literature. Large-information BVARs have in fact proven to be valuable tools to handle empirical analysis in data-rich environments.
BVARs are also routinely employed to produce conditional forecasts and scenario analysis. Of particular interest for policy institutions, these applications permit evaluating the “counterfactual” time evolution of the variables of interest conditional on a predetermined path for some other variables, such as the path of interest rates over a certain horizon.
The “structural interpretation” of estimated VARs as the data generating process of the observed data requires the adoption of strict “identifying restrictions.” From a Bayesian perspective, such restrictions can be seen as dogmatic prior beliefs about some regions of the parameter space that determine the contemporaneous interactions among variables and for which the data are uninformative. More generally, Bayesian techniques offer a framework for structural analysis through priors that incorporate uncertainty about the identifying assumptions themselves.
Article
Bayesian Vector Autoregressions: Estimation
Silvia Miranda-Agrippino and Giovanni Ricco
Vector autoregressions (VARs) are linear multivariate time-series models able to capture the joint dynamics of multiple time series. Bayesian inference treats the VAR parameters as random variables, and it provides a framework to estimate the “posterior” probability distribution of the model parameters by combining the information provided by a sample of observed data with prior information derived from a variety of sources, such as other macro or micro datasets, theoretical models, other macroeconomic phenomena, or introspection.
In empirical work in economics and finance, informative prior probability distributions are often adopted. These are intended to summarize stylized representations of the data generating process. For example, “Minnesota” priors, one of the most commonly adopted macroeconomic priors for the VAR coefficients, express the belief that an independent random-walk model for each variable in the system is a reasonable “center” for beliefs about their time-series behavior. Other commonly adopted priors, the “single-unit-root” and the “sum-of-coefficients” priors, are used to enforce beliefs about relations among the VAR coefficients, such as the existence of cointegrating relationships among variables or of independent unit roots.
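In one common parameterization (a sketch using standard hyperparameter notation that is assumed here, not necessarily the article's), the Minnesota prior on the lag-$\ell$ coefficient matrices $A_\ell$ of a VAR sets

$$
\mathbb{E}\big[(A_\ell)_{ij}\big] = \begin{cases} 1, & i = j,\ \ell = 1,\\ 0, & \text{otherwise,} \end{cases}
\qquad
\operatorname{Var}\big[(A_\ell)_{ij}\big] = \begin{cases} \lambda^2/\ell^2, & i = j,\\ \theta \,(\lambda^2/\ell^2)\,(\sigma_i^2/\sigma_j^2), & i \neq j, \end{cases}
$$

so that beliefs are centered on independent random walks, the overall tightness $\lambda$ and a cross-variable factor $\theta < 1$ control the shrinkage, coefficients on more distant lags are shrunk harder at rate $1/\ell^2$, and the residual-variance ratio $\sigma_i^2/\sigma_j^2$ adjusts for differences in the scale of the variables.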
Priors for macroeconomic variables are often adopted as “conjugate prior distributions”—that is, distributions that yield a posterior distribution in the same family as the prior—in the form of Normal-Inverse-Wishart distributions, which are the conjugate priors for the likelihood of a VAR with normally distributed disturbances. Conjugate priors allow direct sampling from the posterior distribution and fast estimation. When this is not possible, numerical techniques such as Gibbs and Metropolis-Hastings sampling algorithms are adopted.
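As a sketch of how direct sampling works in the conjugate case (illustrative code under a flat Normal-Inverse-Wishart prior; the variable names and the flat-prior simplification are assumptions, and informative priors would simply replace the OLS moments with prior-augmented ones):

```python
import numpy as np
from scipy.stats import invwishart

def niw_posterior_draws(Y, X, n_draws=1000, rng=None):
    """Direct Monte Carlo draws from the Normal-Inverse-Wishart posterior
    of the VAR Y = X B + U with Gaussian disturbances and a flat prior."""
    if rng is None:
        rng = np.random.default_rng(0)
    T, k = X.shape
    n = Y.shape[1]
    XtX_inv = np.linalg.inv(X.T @ X)
    B_hat = XtX_inv @ X.T @ Y                 # OLS = posterior mean here
    S = (Y - X @ B_hat).T @ (Y - X @ B_hat)   # residual scatter matrix
    draws = []
    for _ in range(n_draws):
        # Sigma | Y  ~  Inverse-Wishart(S, T - k)
        Sigma = invwishart.rvs(df=T - k, scale=S, random_state=rng)
        # vec(B) | Sigma, Y  ~  Normal(vec(B_hat), Sigma kron (X'X)^{-1})
        V = np.kron(Sigma, XtX_inv)
        b = rng.multivariate_normal(B_hat.flatten(order="F"), V)
        draws.append((b.reshape(k, n, order="F"), Sigma))
    return draws
```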
Bayesian techniques allow for the estimation of an ever-expanding class of sophisticated autoregressive models, including conventional fixed-parameter VAR models; large VARs incorporating hundreds of variables; panel VARs, which permit analyzing the joint dynamics of multiple time series of heterogeneous and interacting units; and VAR models that relax the assumption of fixed coefficients, such as time-varying-parameter, threshold, and Markov-switching VARs.
Article
Bootstrapping in Macroeconometrics
Helmut Herwartz and Alexander Lange
Unlike traditional first-order asymptotic approximations, the bootstrap is a simulation method for solving inferential problems in statistics and econometrics (e.g., constructing confidence intervals, generating critical values for test statistics) conditional on the available sample information. Even though econometric theory by now provides sophisticated central limit theory covering various data characteristics, bootstrap approaches are of particular appeal when establishing asymptotic pivotalness of (econometric) diagnostics is infeasible or requires rather complex assessments of estimation uncertainty. Moreover, empirical macroeconomic analysis is typically constrained by short- to medium-sized time windows of sample information, and convergence of macroeconometric model estimates toward their asymptotic limits is often slow. Consistent bootstrap schemes have the potential to improve empirical significance levels in macroeconometric analysis and, moreover, can avoid explicit assessments of estimation uncertainty. In addition, as time-varying (co)variance structures and unmodeled serial correlation patterns are frequently diagnosed in macroeconometric analysis, more advanced bootstrap techniques (e.g., the wild bootstrap, the moving-block bootstrap) have been developed to account for nonpivotalness as a result of such data characteristics.
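As an illustration of one such scheme, the following is a minimal sketch of a moving-block bootstrap for the mean of a serially correlated series; the block length, the AR(1) data-generating process, and the function names are illustrative choices, not from the article:

```python
import numpy as np

def moving_block_bootstrap(x, stat=np.mean, block_len=10, n_boot=2000, rng=None):
    """Moving-block bootstrap: resample overlapping blocks (preserving
    short-run dependence within blocks), glue them together, and
    recompute the statistic on each pseudo-sample."""
    if rng is None:
        rng = np.random.default_rng(0)
    T = len(x)
    blocks = np.lib.stride_tricks.sliding_window_view(x, block_len)
    n_blocks = int(np.ceil(T / block_len))
    stats = np.empty(n_boot)
    for b in range(n_boot):
        idx = rng.integers(0, len(blocks), size=n_blocks)  # blocks with replacement
        stats[b] = stat(np.concatenate(blocks[idx])[:T])   # trim to original length
    return stats

# Example: percentile interval for the mean of an AR(1) series
rng = np.random.default_rng(1)
e = rng.normal(size=300)
x = np.empty(300)
x[0] = e[0]
for t in range(1, 300):
    x[t] = 0.5 * x[t - 1] + e[t]
print(np.percentile(moving_block_bootstrap(x), [2.5, 97.5]))
```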
Article
Capital Controls: A Survey of the New Literature
Alessandro Rebucci and Chang Ma
This paper reviews selected post–Global Financial Crisis theoretical and empirical contributions on capital controls and identifies three theoretical motives for the use of capital controls: pecuniary externalities in models of financial crises, aggregate demand externalities in New Keynesian models of the business cycle, and terms of trade manipulation in open-economy models with pricing power. Pecuniary and demand externalities offer the most compelling case for the adoption of capital controls, but macroprudential policy can also address the same distortions. So capital controls generally are not the only instrument that can do the job. If evaluated through the lens of the new theories, the empirical evidence reviewed suggests that capital controls can have the intended effects, even though the extant literature is inconclusive as to whether the effects documented amount to a net gain or loss in welfare terms. Terms of trade manipulation also provides a clear-cut theoretical case for the use of capital controls, but this motive is less compelling because of the spillover and coordination issues inherent in the use of controls on capital flows for this purpose. Perhaps not surprisingly, only a handful of countries have used capital controls in a countercyclical manner, while many adopted macroprudential policies. This suggests that capital control policy might entail additional costs other than increased financing costs, such as signaling the poor quality of future policies, leakages, and spillovers.
Article
Central Bank Monetary Policy and Consumer Credit Markets
Xudong An, Larry Cordell, Raluca A. Roman, and Calvin Zhang
Central banks around the world use monetary policy tools to promote economic growth and stability; for example, in the United States, the Federal Reserve (Fed) uses federal funds rate adjustments, quantitative easing (QE) or tightening, forward guidance, and other tools “to promote effectively the goals of maximum employment, stable prices, and moderate long-term interest rates.” Changes in monetary policy affect both businesses and consumers. For consumers, changes in monetary policy affect bank credit supply, refinancing activity, and home purchases, which in turn affect household consumption and thus economic growth and price stability. The Fed’s rate cuts and QE programs during COVID-19 led to historically low interest rates, which spurred a huge wave of refinancings. However, the pass-through of rate savings in the mortgage market declined during the pandemic. The weaker pass-through can be linked to the extraordinary growth of shadow bank mortgage lenders during the COVID-19 pandemic: shadow bank mortgage lenders charged mortgage borrowers higher rates and fees; therefore, a higher shadow bank market share means a smaller overall pass-through of rate savings to mortgage borrowers. It is important to note that these shadow banks did provide convenience to consumers, and they originated loans faster than banks. The convenience and speed could be valuable to borrowers and important in transmitting monetary policy in a timelier way, especially during a crisis.
Article
China’s Housing Policy and Housing Boom and Their Macroeconomic Impacts
Kaiji Chen
The house price boom that has been present in most Chinese cities since the early 2000s has triggered substantial interest in the role that China’s housing policy plays in its housing market and macroeconomy, with an extensive literature employing both empirical and theoretical perspectives developed over the past decade. This research finds that the privatization of China’s housing market, which encouraged households living in state-owned housing to purchase their homes at prices far below their market value, contributed to a rapid increase in homeownership beginning in the mid-1990s. Housing market privatization also has led to a significant increase in both housing and nonhousing consumption, but these benefits are unevenly distributed across households. With the policy goal of making homeownership affordable for the average household, the Housing Provident Fund contributes positively to homeownership rates. By contrast, the effectiveness of housing policies to make housing affordable for low-income households has been weaker in recent years. Moreover, a large body of empirical research shows that an unintended consequence of housing market privatization has been a persistent increase in housing prices since the early 2000s, which has been accompanied by soaring land prices, high vacancy rates, and high price-to-income and price-to-rent ratios. The literature has differing views regarding the sustainability of China’s housing boom. On the theoretical front, economists find that rising housing demand, due to both consumption and investment purposes, is important to understanding China’s prolonged housing boom, and that land-use policy, which influences the supply side of the housing market, lies at the center of China’s housing boom. However, regulatory policies, such as housing purchase restrictions and property taxes, have had mixed effects on the housing market in different cities. In addition to China’s housing policy and its direct effects on the nation’s housing market, research finds that China’s housing policy impacts its macroeconomy via the transmission of house price dynamics into the household and corporate sectors. High housing prices have a heterogeneous impact on the consumption and savings of different types of households but tend to discourage household labor supply. Meanwhile, rising house prices encourage housing investment by non–real-estate firms, which crowds out nonhousing investment, lowers the availability of noncollateralized business loans, and reduces productive efficiency via the misallocation of capital and managerial talent.
Article
The Cointegrated VAR Methodology
Katarina Juselius
The cointegrated VAR (CVAR) approach combines differences of variables with cointegration among them and by doing so allows the user to study both long-run and short-run effects in the same model. The CVAR describes an economic system where variables have been pushed away from long-run equilibria by exogenous shocks (the pushing forces) and where short-run adjustment forces pull them back toward long-run equilibria (the pulling forces). In this model framework, basic assumptions underlying a theory model can be translated into testable hypotheses on the order of integration and cointegration of key variables and their relationships. The set of hypotheses describes the empirical regularities we would expect to see in the data if the long-run properties of a theory model are empirically relevant.
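In standard notation (a sketch of the usual vector error-correction form; the symbols are conventional choices rather than necessarily the article's), the CVAR for a vector $x_t$ can be written

$$
\Delta x_t = \alpha \beta' x_{t-1} + \sum_{i=1}^{k-1} \Gamma_i \, \Delta x_{t-i} + \mu + \varepsilon_t,
$$

where the columns of $\beta$ define the long-run cointegrating relations, $\alpha$ contains the adjustment coefficients that measure the pulling forces back toward equilibrium, the $\Gamma_i$ capture short-run dynamics, and the common stochastic trends, the pushing forces, are driven by the cumulated shocks $\alpha_\perp' \sum_{s=1}^{t} \varepsilon_s$.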
Article
Consumer Debt and Default: A Macro Perspective
Florian Exler and Michèle Tertilt
Consumer debt is an important means for consumption smoothing. In the United States, 70% of households own a credit card, and 40% borrow on it. When borrowers cannot (or do not want to) repay their debts, they can declare bankruptcy, which provides additional insurance in tough times. Since the 2000s, up to 1.5% of households declared bankruptcy per year. Clearly, the option to default affects borrowing interest rates in equilibrium. Consequently, when assessing (welfare) consequences of different bankruptcy regimes or providing policy recommendations, structural models with equilibrium default and endogenous interest rates are needed. At the same time, many questions are quantitative in nature: the benefits of a certain bankruptcy regime critically depend on the nature and amount of risk that households bear. Hence, models for normative or positive analysis should quantitatively match some important data moments.
Four important empirical patterns are identified: First, since 1950, consumer debt has risen constantly, and it amounted to 25% of disposable income by 2016. Defaults have risen since the 1980s. Interestingly, interest rates remained roughly constant over the same time period. Second, borrowing and default clearly depend on age: both measures exhibit a distinct hump, peaking around 50 years of age. Third, ownership of credit cards and borrowing clearly depend on income: high-income households are more likely to own a credit card and to use it for borrowing. However, this pattern was stronger in the 1980s than in the 2010s. Finally, interest rates became more dispersed over time: the number of observed interest rates more than quadrupled between 1983 and 2016.
These data have clear implications for theory: First, considering the importance of age, life cycle models seem most appropriate when modeling consumer debt and default. Second, bankruptcy must be costly to support any debt in equilibrium. While many types of costs are theoretically possible, only partial repayment requirements are able to quantitatively match the data on filings, debt levels, and interest rates simultaneously. Third, to account for the long-run trends in debts, defaults, and interest rates, several quantitative theory models identify a credit expansion along the intensive and extensive margin as the most likely source. This expansion is a consequence of technological advancements.
Many of the quantitative macroeconomic models in this literature assess the welfare effects of proposed reforms or of granting bankruptcy at all. These welfare consequences critically hinge on the types of risk that households face: because households incur unforeseen expenditures, not-too-stringent bankruptcy laws are typically found to be welfare superior both to banning bankruptcy (or making it extremely costly) and to extremely lax bankruptcy rules.
There are very promising opportunities for future research related to consumer debt and default. Newly available data in the United States and internationally, more powerful computational resources allowing for more complex modeling of household balance sheets, and new loan products are just some of many promising avenues.