Marriott Marquis, Catalina
Hosted By:
Econometric Society
Macroeconometrics
Paper Session
Sunday, Jan. 5, 2020 8:00 AM - 10:00 AM (PST)
- Chair: Lutz Kilian, Federal Reserve Bank of Dallas
Using Arbitrary Precision Arithmetic to Sharpen Identification Analysis for DSGE Models
Abstract
This paper is at the intersection of macroeconomics and modern computer arithmetic. It applies arbitrary precision arithmetic to resolve practical difficulties arising in the identification analysis of log-linearized DSGE models. The main focus is on the methods of Qu and Tkachenko (2012, 2017), since that framework appears to be the most comprehensive to date. Working with this arithmetic, we develop the following three-step procedure for analyzing local and global identification. (1) The DSGE model solution algorithm is modified so that all the relevant objects are computed as multiprecision entities, allowing for indeterminacy. (2) The rank condition and the Kullback-Leibler distance are computed using arbitrary precision Gauss-Legendre quadrature. (3) Minimization is carried out by combining double precision global and arbitrary precision local search algorithms, with the convergence criterion set according to the chosen precision level, so that whether the minimized value equals zero can be examined reliably. In an application to a model featuring monetary and fiscal policy interactions (Leeper, 1991; Tan and Walker, 2015), we find that the arithmetic removes all ambiguity in the analysis. As a result, we reach clear conclusions showing observational equivalence both within the same policy regime and across different policy regimes under generic parameter values. We further illustrate the application of the method to medium-scale DSGE models by considering the model of Schmitt-Grohé and Uribe (2012), where the use of extended precision again helps remove ambiguity in cases where near observational equivalence is detected.
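To make step (2) concrete, here is a minimal sketch of arbitrary precision Gauss-Legendre quadrature applied to a frequency-domain Kullback-Leibler distance, using the mpmath Python library. The scalar AR(1) spectral densities, the 50-digit precision setting, and all function names are illustrative assumptions, not the paper's actual model objects or code.

```python
# Illustrative sketch only: a scalar frequency-domain KL distance between
# two AR(1) spectra, integrated by Gauss-Legendre quadrature at 50 digits.
from mpmath import mp, mpf, cos, log, pi, quad

mp.dps = 50  # work with 50 significant decimal digits

def ar1_spectrum(omega, rho, sigma2):
    """Spectral density of an AR(1) process at frequency omega."""
    return sigma2 / (2 * pi * (1 - 2 * rho * cos(omega) + rho ** 2))

def kl_distance(theta0, theta1):
    """Scalar frequency-domain KL distance between two AR(1) spectra."""
    def integrand(omega):
        ratio = ar1_spectrum(omega, *theta0) / ar1_spectrum(omega, *theta1)
        return ratio - log(ratio) - 1
    # Gauss-Legendre quadrature carried out at the chosen precision level.
    return quad(integrand, [-pi, pi], method='gauss-legendre') / (4 * pi)

# Distinct parameters give a strictly positive distance ...
print(kl_distance((mpf('0.5'), mpf('1')), (mpf('0.6'), mpf('1'))))
# ... identical parameters give zero to roughly 50 digits, so a minimized
# value can be compared directly against the working precision.
print(kl_distance((mpf('0.5'), mpf('1')), (mpf('0.5'), mpf('1'))))
```

Because the working precision is explicit, a minimized distance can be checked against the precision level itself, which is the idea behind the convergence criterion in step (3).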
Local Projections and VARs Estimate the Same Impulse Responses
Abstract
We prove that local projections (LPs) and Vector Autoregressions (VARs) estimate the same impulse responses. This nonparametric result requires only unrestricted lag structures. We discuss several implications: (i) LP and VAR estimators are not conceptually separate procedures; instead, they belong to a spectrum of dimension reduction techniques with a common estimand but different bias-variance properties. (ii) VAR-based structural estimation can equivalently be performed using LPs, and vice versa. (iii) Structural estimation with an instrument (proxy) can be carried out by ordering the instrument first in a recursive VAR, even under non-invertibility. (iv) Linear VARs are as robust to non-linearities as linear LPs.
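The equivalence can be checked numerically. The sketch below assumes a simple bivariate VAR(1) data-generating process (all settings are illustrative, not taken from the paper): it computes the recursive VAR-implied impulse response of the second variable to the shock ordered first, and the corresponding LP coefficients; both are normalized so the impact response of the first variable equals one, and in a long sample the two sets of estimates nearly coincide.

```python
# Illustrative sketch only: LP and VAR estimates of the same impulse response.
import numpy as np

rng = np.random.default_rng(0)
T, p, H = 100_000, 4, 8                    # long sample, p lags, horizons 0..H
A = np.array([[0.5, 0.2], [0.1, 0.4]])
y = np.zeros((T, 2))
for t in range(1, T):                      # simulate y_t = A y_{t-1} + e_t
    y[t] = A @ y[t - 1] + rng.standard_normal(2)

def ols(Y, X):
    """Least-squares coefficients of Y on X."""
    return np.linalg.lstsq(X, Y, rcond=None)[0]

# --- VAR(p): reduced-form OLS, Cholesky shock ordered first ---
X = np.hstack([y[p - j - 1:T - j - 1] for j in range(p)])   # lags 1..p at each t
B = ols(y[p:], X)                                           # (2p x 2) coefficients
U = y[p:] - X @ B                                           # reduced-form residuals
L = np.linalg.cholesky(U.T @ U / len(U))
Phi = [np.eye(2)]                                           # reduced-form MA terms
for h in range(1, H + 1):
    Phi.append(sum(B[2 * j:2 * j + 2].T @ Phi[h - 1 - j] for j in range(min(h, p))))
var_irf = [(Phi[h] @ L)[1, 0] / L[0, 0] for h in range(H + 1)]

# --- LPs: regress y2_{t+h} on y1_t, controlling for the same p lags ---
lp_irf = []
for h in range(H + 1):
    Yh = y[p + h:, 1]                                       # y2 led h periods
    Xh = np.column_stack([y[p:T - h, 0], X[:T - p - h]])
    lp_irf.append(ols(Yh, Xh)[0])

print(np.round(var_irf, 3))   # the two vectors nearly coincide:
print(np.round(lp_irf, 3))    # same estimand, different finite-sample noise
```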
Boosting the Hodrick-Prescott Filter
Abstract
The Hodrick-Prescott (HP) filter is one of the most widely used econometric methods in applied macroeconomic research. The technique is nonparametric and seeks to decompose a time series into a trend and a cyclical component unaided by economic theory or prior trend specification. Like all nonparametric methods, the HP filter depends critically on a tuning parameter that controls the degree of smoothing. Yet in contrast to modern nonparametric methods and applied work with these procedures, empirical practice with the HP filter almost universally relies on standard settings for the tuning parameter that have been suggested largely by experimentation with macroeconomic data and heuristic reasoning about the form of economic cycles and trends. As recent research has shown, standard settings may not adequately remove trends, particularly stochastic trends, in economic data. This paper proposes an easy-to-implement practical procedure that iterates the HP smoother, intended to make the filter a smarter smoothing device for trend estimation and trend elimination. We call this iterated HP technique the boosted HP filter in view of its connection to L2-boosting in machine learning. The paper develops limit theory to show that the boosted HP filter asymptotically recovers trend mechanisms that involve unit root processes, deterministic polynomial drifts, and polynomial drifts with structural breaks -- the most common trends that appear in macroeconomic data and current modeling methodology. In doing so, the boosted filter provides a new mechanism for consistently estimating multiple structural breaks. A stopping criterion is used to automate the iterative HP algorithm, making it a data-determined method that is ready for modern data-rich environments in economic research. The methodology is illustrated using three real data examples that highlight the differences between simple HP filtering, the data-determined boosted filter, and an alternative autoregressive approach. These examples show that the boosted HP filter is helpful in analyzing a large collection of heterogeneous macroeconomic time series that manifest various degrees of persistence, trend behavior, and volatility.
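Since the HP trend is a linear smoother S = (I + lam*D'D)^{-1} applied to the data, boosting amounts to reapplying the filter to the leftover cycle, so the m-iteration trend is (I - (I - S)^m) y. The sketch below implements this iteration with a fixed iteration count for simplicity; the paper instead automates the number of iterations with a data-determined stopping criterion, and the simulated series and all settings here are illustrative assumptions.

```python
# Illustrative sketch only: the boosted HP filter as an iterated linear smoother.
import numpy as np

def hp_smoother_matrix(T, lam=1600.0):
    """HP smoother S with trend = S @ y, where S = (I + lam*D'D)^{-1}."""
    D = np.diff(np.eye(T), n=2, axis=0)        # (T-2) x T second-difference matrix
    return np.linalg.inv(np.eye(T) + lam * D.T @ D)

def boosted_hp(y, lam=1600.0, iterations=5):
    """Iterate the HP filter on the remaining cycle (L2-boosting)."""
    S = hp_smoother_matrix(len(y), lam)
    cycle = y.copy()
    for _ in range(iterations):
        cycle = cycle - S @ cycle              # strip what HP still sees as trend
    return y - cycle, cycle                    # (boosted trend, cycle)

# Example: a stochastic (unit-root) trend with drift, which a single HP
# pass tends to under-remove; the boosted trend typically tracks it more closely.
rng = np.random.default_rng(1)
T = 200
trend = np.cumsum(0.2 + rng.standard_normal(T))
y = trend + rng.standard_normal(T)
hp_trend = hp_smoother_matrix(T) @ y
bhp_trend, _ = boosted_hp(y, iterations=10)
print(np.mean((hp_trend - trend) ** 2), np.mean((bhp_trend - trend) ** 2))
```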
JEL Classifications
- C3 - Multiple or Simultaneous Equation Models; Multiple Variables