Article
The Evolution of Forecast Density Combinations in Economics
Knut Are Aastveit, James Mitchell, Francesco Ravazzolo, and Herman K. van Dijk
Increasingly, professional forecasters and academic researchers in economics present model-based and subjective or judgment-based forecasts that are accompanied by some measure of uncertainty. In its most complete form, this measure is a probability density function for future values of the variable or variables of interest. At the same time, combinations of forecast densities are being used to integrate information coming from multiple sources such as experts, models, and large micro-data sets. Given the increased relevance of forecast density combinations, this article explores their genesis and evolution both inside and outside economics. A fundamental density combination equation is specified, which shows that various frequentist as well as Bayesian approaches give different specific contents to this density. In its simplest case, the combination is a restricted finite mixture that gives fixed, equal weights to the various individual densities. The specification of the fundamental density combination equation has been made more flexible in the recent literature, evolving from simple average weights to optimized weights and on to “richer” procedures that allow for time variation, learning features, and model incompleteness. The recent history and evolution of forecast density combination methods, together with their potential and benefits, are illustrated in the policymaking environment of central banks.
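To make the simplest case concrete, here is a minimal sketch of an equal-weight linear opinion pool, the restricted finite mixture described above. It is an illustration rather than the authors' code; the two Gaussian component densities are assumptions standing in for, say, a model-based and a survey-based forecast.

```python
# Equal-weight linear opinion pool: a finite mixture of individual
# predictive densities with fixed, equal weights (illustrative sketch).
import numpy as np
from scipy import stats

def equal_weight_pool(densities, y):
    """Evaluate the equal-weight combination of the given pdfs at points y."""
    w = 1.0 / len(densities)              # fixed, equal weights
    return w * sum(d(y) for d in densities)

# Two hypothetical individual forecast densities for the same variable.
model_density = stats.norm(loc=2.0, scale=1.0).pdf
survey_density = stats.norm(loc=2.5, scale=0.5).pdf

y_grid = np.linspace(-2.0, 6.0, 401)
combined = equal_weight_pool([model_density, survey_density], y_grid)

# The combination is itself a proper density: it integrates to ~1.
print(combined.sum() * (y_grid[1] - y_grid[0]))
```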
Article
Methodology of Macroeconometrics
Aris Spanos
The current discontent with the dominant macroeconomic theory paradigm, known as Dynamic Stochastic General Equilibrium (DSGE) models, calls for an appraisal of the methods and strategies employed in studying and modeling macroeconomic phenomena using aggregate time series data. The appraisal pertains to the effectiveness of these methods and strategies in accomplishing the primary objective of empirical modeling: to learn from data about phenomena of interest. The co-occurring developments in macroeconomics and econometrics since the 1930s provide the backdrop for the appraisal, with the Keynes vs. Tinbergen controversy at center stage. The overall appraisal is that the DSGE paradigm gives rise to estimated structural models that are both statistically and substantively misspecified, yielding untrustworthy evidence that contributes very little, if anything, to real learning from data about macroeconomic phenomena. A primary contributor to the untrustworthiness of evidence is the traditional econometric perspective of viewing empirical modeling as curve-fitting (structural models), guided by impromptu error term assumptions and evaluated on goodness-of-fit grounds. Regrettably, excellent fit is neither necessary nor sufficient for the reliability of inference and the trustworthiness of the ensuing evidence. Recommendations on how to improve the trustworthiness of empirical evidence revolve around a broader, model-based (non-curve-fitting) modeling framework that attributes cardinal roles to both theory and data without undermining the credibility of either source of information. Two crucial distinctions hold the key to securing the trustworthiness of evidence. The first distinguishes between modeling (specification, misspecification testing, and respecification) and inference, and the second between a substantive (structural) model and a statistical model (the probabilistic assumptions imposed on the particular data). These distinctions enable one to establish statistical adequacy (the validity of the probabilistic assumptions) before relating the statistical model to the structural model and posing questions of interest to the data. The greatest enemy of learning from data about macroeconomic phenomena is not the absence of an alternative and more coherent empirical modeling framework, but the illusion that foisting highly formal structural models on the data can give rise to such learning just because their construction and curve-fitting rely on seemingly sophisticated tools. Regrettably, applying sophisticated tools to a statistically and substantively misspecified DSGE model does nothing to restore the trustworthiness of the evidence stemming from it.
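As a concrete illustration of the misspecification-testing step named above, the sketch below fits a simple AR(1) statistical model and tests two of its probabilistic assumptions (no residual autocorrelation, Gaussian errors) before any inference is drawn. The AR(1) specification, the simulated data, and the choice of tests are illustrative assumptions, not Spanos's framework in full.

```python
# Misspecification testing sketch: check a statistical model's
# probabilistic assumptions before using it for inference.
import numpy as np
from scipy import stats
from statsmodels.tsa.ar_model import AutoReg
from statsmodels.stats.diagnostic import acorr_ljungbox

rng = np.random.default_rng(0)
y = np.zeros(300)
for t in range(1, 300):                    # simulate an AR(1) process
    y[t] = 0.6 * y[t - 1] + rng.normal()

res = AutoReg(y, lags=1).fit()             # tentative statistical model
resid = res.resid

print(acorr_ljungbox(resid, lags=[10]))    # residual autocorrelation test
print(stats.jarque_bera(resid))            # residual normality test
# Small p-values would signal statistical inadequacy: the assumptions are
# rejected, and respecification is needed before posing substantive questions.
```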
Article
Reduced Rank Regression Models in Economics and Finance
Gianluca Cubadda and Alain Hecq
Reduced rank regression (RRR) has been extensively employed for modelling economic and financial time series. The main goal of RRR is to specify and estimate models that are capable of reproducing common dynamics among variables, as in the serial correlation common feature and the multivariate autoregressive index model. Although cointegration analysis is likely the most prominent example of the use of RRR in econometrics, a large body of research is aimed at detecting and modelling co-movements in time series that are stationary or that have been stationarized after proper transformations. The motivations for the use of RRR in time series econometrics include dimension reduction, which simplifies complex dynamics and thus eases interpretation, as well as the pursuit of efficiency gains in both estimation and prediction. Via the final equation representation, RRR also establishes the nexus between multivariate time series and parsimonious marginal ARIMA (autoregressive integrated moving average) models. RRR’s drawback, which is common to all dimension-reduction techniques, is that the underlying rank restrictions may or may not be present in the data.
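For concreteness, here is a minimal sketch of the classical least-squares reduced rank regression estimator (with an identity weight matrix): the unrestricted OLS coefficients are projected onto the leading right singular vectors of the OLS fit, which imposes the rank restriction. The dimensions and simulated data are assumptions for illustration.

```python
# Reduced rank regression: estimate B in Y = X B + E subject to rank(B) <= r.
import numpy as np

def rrr(X, Y, r):
    """Rank-r least-squares reduced rank regression estimate of B."""
    B_ols = np.linalg.lstsq(X, Y, rcond=None)[0]   # unrestricted OLS
    # Right singular vectors of the OLS fit span the reduced-rank space.
    _, _, Vt = np.linalg.svd(X @ B_ols, full_matrices=False)
    V_r = Vt[:r].T
    return B_ols @ V_r @ V_r.T                     # project onto rank r

rng = np.random.default_rng(1)
n, p, q, r = 500, 6, 4, 2
B_true = rng.normal(size=(p, r)) @ rng.normal(size=(r, q))  # true rank-r B
X = rng.normal(size=(n, p))
Y = X @ B_true + 0.1 * rng.normal(size=(n, q))

B_hat = rrr(X, Y, r)
print(np.linalg.matrix_rank(B_hat))                # 2: the imposed rank
print(np.linalg.norm(B_hat - B_true) / np.linalg.norm(B_true))  # small error
```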