Received: 30 July 2017 Revised: 25 January 2018 Accepted: 31 January 2018
DOI: 10.1002/for.2520
RESEARCH ARTICLE
A test of the joint efficiency of macroeconomic forecasts using multivariate random forests
Christoph Behrens | Christian Pierdzioch | Marian Risse
Department of Economics, Helmut Schmidt University, Hamburg, Germany
Correspondence
Christian Pierdzioch, Department of Economics, Helmut Schmidt University, Holstenhofweg 85, PO Box 700822, 22008 Hamburg, Germany.
Email: c.pierdzioch@hsu-hh.de
Funding information
Deutsche Forschungsgemeinschaft (German Science Foundation), Project Macroeconomic Forecasting in Great Crises, Grant/Award Number: FR 2677/4-1
Abstract
We contribute to recent research on the joint evaluation of the properties of macroeconomic forecasts in a multivariate setting. The specific property of forecasts that we are interested in is their joint efficiency. We study the joint efficiency of forecasts by means of multivariate random forests, which we use to model the links between forecast errors and predictor variables in a forecaster's information set. We then use permutation tests to study whether the Mahalanobis distance between the predicted forecast errors for the growth and inflation forecasts of four leading German economic research institutes and actual forecast errors is significantly smaller than under the null hypothesis of forecast efficiency. We reject joint efficiency in several cases, but also document heterogeneity across research institutes with regard to the joint efficiency of their forecasts.
KEYWORDS
forecast efficiency, multivariate random forests, macroeconomic forecasts
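
To make the testing strategy described in the abstract concrete, the following sketch shows one way such a permutation test could be set up in Python. It uses scikit-learn's multi-output RandomForestRegressor as a stand-in for the multivariate random forests employed in the paper; the function names, the permutation scheme (shuffling the rows of the predictor matrix), and all tuning choices are illustrative assumptions rather than the authors' implementation.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor


def mean_mahalanobis(actual, predicted):
    """Average Mahalanobis distance between actual and predicted error vectors."""
    diff = actual - predicted
    cov_inv = np.linalg.pinv(np.cov(actual, rowvar=False))
    return np.mean([np.sqrt(d @ cov_inv @ d) for d in diff])


def joint_efficiency_test(X, errors, n_perm=500, seed=0):
    """X: (T, k) array of information-set predictors; errors: (T, 2) array of
    growth and inflation forecast errors. Returns the observed distance and a
    one-sided permutation p-value (small values speak against joint efficiency)."""
    rng = np.random.default_rng(seed)

    def fit_and_score(X_):
        rf = RandomForestRegressor(n_estimators=500, oob_score=True,
                                   random_state=seed)
        rf.fit(X_, errors)
        # Use out-of-bag predictions to avoid judging the fit purely in-sample.
        return mean_mahalanobis(errors, rf.oob_prediction_)

    observed = fit_and_score(X)
    # Null of efficiency: predictors carry no information about the errors,
    # approximated here by shuffling the rows of X before refitting.
    null_dist = np.array([fit_and_score(X[rng.permutation(len(X))])
                          for _ in range(n_perm)])
    p_value = np.mean(null_dist <= observed)
    return observed, p_value
```

In this sketch, a small p-value indicates that the Mahalanobis distance between out-of-bag predicted and actual forecast errors is unusually small relative to the shuffled data, that is, the information-set variables help predict the errors, which speaks against joint efficiency.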
1 INTRODUCTION
Researchers typically use three concepts to characterize the quality of forecasts: bias, efficiency, and accuracy. In a univariate setting, the bias and efficiency of forecasts can be studied using a least-squares regression model by testing whether a constant and the lagged forecast error (weak efficiency) and other variables in a forecaster's information set (strong efficiency) have predictive value for the forecast error (see, e.g., Fildes & Stekler, 2002; Holden & Peel, 1990). The accuracy of forecasts can be assessed in a univariate setting by means of the (root) mean-squared error or some other popular accuracy metric (for a detailed analysis of the pros and cons of various metrics, see Hyndman & Koehler, 2006). While the evaluation of forecasts in a univariate setting offers many important insights, researchers have become increasingly interested in recent years in evaluating the quality of forecasts in a multivariate setting because forecasters often publish forecasts for more than one variable.
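
To fix ideas, the univariate bias and efficiency tests referred to above can be summarized in a single forecast-error regression; the notation below is a stylized, generic version (exact timing conventions differ across studies):

$$
e_{t+h} \;=\; \alpha \;+\; \beta\, e_{t} \;+\; \gamma' x_{t} \;+\; u_{t+h},
$$

where $e_{t+h}$ denotes the forecast error, $e_{t}$ the most recent observed forecast error, and $x_{t}$ a vector of variables in the forecaster's information set at the time the forecast is made. Unbiasedness and weak efficiency correspond to the restriction $\alpha=\beta=0$, while strong efficiency additionally requires $\gamma=0$ for any admissible choice of $x_{t}$.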
Researchers have developed various new techniques tailored to study the quality of a vector of forecasts in a multivariate setting. Sinclair, Stekler, and Carnow (2012) generalize the Holden and Peel (1990) approach to a multivariate setting. They estimate a vector autoregressive (VAR) model on Survey of Professional Forecasters consensus forecasts of gross domestic product (GDP) growth, inflation, and the unemployment rate to study the bias and weak efficiency of forecasts, and use the Mahalanobis distance to measure forecast accuracy. Sinclair and Stekler (2013) use this approach, which combines a VAR model and the Mahalanobis distance, to assess the quality of initial estimates of macroeconomic variables, and Sinclair, Stekler, and Carnow (2015) use the same approach to study the bias and accuracy of a vector of forecasts published by the Federal Reserve. Other researchers have used the Mahalanobis distance as a metric of multivariate forecast dispersion (Banternghansa & McCracken, 2009) and as a metric to rank forecasters in a multivariate setting (Bauer, Eisenbeis, Waggoner, & Zha, 2003; Eisenbeis, Waggoner,
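
In these multivariate evaluations, the Mahalanobis distance aggregates a vector of forecast errors into a scalar accuracy measure by weighting the errors with the inverse of their covariance matrix; one standard formulation (with generic notation, not taken from any particular study) is

$$
D_t \;=\; \sqrt{\mathbf{e}_t' \, \Sigma^{-1} \, \mathbf{e}_t},
$$

where $\mathbf{e}_t$ collects the forecast errors for the individual variables at time $t$ and $\Sigma$ is their covariance matrix, so that larger values of $D_t$ indicate a less accurate vector of forecasts.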