As the Federal Reserve’s 2015 Comprehensive Capital Analysis and Review stress testing exercise moves to its conclusion, a steady stream of well-intended but incorrect models is coming into public view. In particular, many analysts have been using lagged default probabilities as inputs to their 13-quarter stress tests, a modeling strategy that Professors Joshua Angrist and Jörn-Steffen Pischke label “forbidden models” in their classic econometrics text “Mostly Harmless Econometrics: An Empiricist’s Companion” (2009).
Using quotes from Angrist and Pischke, we explain why such models, however well intended, are usually invalid and unacceptable from a model validation point of view.
Lagged Variables: The Differences between Forecasting GDP and a CCAR Stress Test
The use of lagged variables in econometric forecasting has been standard practice for more than five decades. A standard application dating back to the late 1960s and 1970s is a large multi-equation system that forecasts economic statistics like gross domestic product many years forward. It is safe to say that lagged variables were almost always present in such models, and the best-practice econometrics for their use and estimation is very well known. The key aspect of this traditional use of lagged variables is that the system of equations progressively simulates the value of variables forward based only on (a) inputs that were known at time zero and (b) successive forecasts of how those variables have changed. At no time are we presented with inputs from mixed points in time, as we are in the Fed’s CCAR process.
The CCAR process induces analysts to mix data from time zero (for example, company-specific inputs like financial ratios, stock returns, and time zero default probabilities) with the values of 28 macro-economic factors (and functions of them) at 13 forward points in time: 1 through 13 quarters forward. The question at hand is this: in estimating the default probability of company ABC 2 quarters forward, is it legitimate to use a forecast of ABC Company’s default probability in quarter 1 as an input to the function that estimates ABC’s default probability in quarter 2 in each of the Fed’s three stress test scenarios?
The function employed by many analysts for linking the default probability at quarter 1 with macro factors known at quarter 2 is a two-stage procedure:
First stage: forecast the default probability in period 1, PD(1), typically using a lagged specification of the generic form

PD(1) = a + b PD(0)
where the coefficients a and b are estimated using standard econometric techniques. Other variables, typically macro factors, may also be present as explanatory variables. There is nothing wrong with this standard use of lagged data known at time zero to make a one-period forecast; if this were the only econometric exercise, the model would be unobjectionable. Unfortunately, there is another stage to the estimation process:
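As a concrete illustration, a stage-1 regression of this kind can be estimated by ordinary least squares. The sketch below uses simulated data and hypothetical variable names; it is not the estimation used by any particular CCAR filer.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated quarterly default probability history for one cohort of firms
# (hypothetical data; in practice PD(0) comes from a default model or vendor data).
pd_lag = rng.uniform(0.001, 0.05, size=200)                  # PD(0), the lagged value
pd_now = 0.002 + 0.9 * pd_lag + rng.normal(0, 0.001, 200)    # PD(1), true a = 0.002, b = 0.9

# Stage 1: fit PD(1) = a + b * PD(0) by ordinary least squares.
X = np.column_stack([np.ones_like(pd_lag), pd_lag])
a_hat, b_hat = np.linalg.lstsq(X, pd_now, rcond=None)[0]

print(round(a_hat, 4), round(b_hat, 2))
```

On its own, this one-period regression is the legitimate, traditional use of a lagged variable; the trouble described below arises only when its fitted values are pushed into a non-linear second stage.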
Second stage: using one of the functions below, combine the forecasted quarter 1 default probability with the Federal Reserve’s macro factors in quarter 2 for a revised estimate of the default probability in quarter 2, PD(2). The functions normally used for the second stage take one of the following generic forms, where Xi(2) denotes the macro factors at quarter 2:

ln[PD(2)] = c + d PD(1) + Σi ei Xi(2)
PD(2) = N[c + d PD(1) + Σi ei Xi(2)]
PD(2) = L[c + d PD(1) + Σi ei Xi(2)]
The function “ln” is the natural log, “N” is the cumulative normal distribution function, and “L” is the cumulative logistic function. The natural log assures us that the estimated default probability can never be negative. The latter two functions assure us that the estimated default probabilities can never be outside of the range from 0% to 100% and represent best practice. This process is continued for 13 quarters forward. In each subsequent quarter, the lagged default probability is simulated by a stage 1 function.
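The bounding properties of the three candidate functions are easy to verify directly. This is a minimal sketch (the function names are ours) showing how each keeps the fitted default probability in a legitimate range:

```python
import math

def pd_log(z):
    """Log-link form: ln(PD) = z, so PD = exp(z) is always positive."""
    return math.exp(z)

def pd_probit(z):
    """Probit form: PD = N(z), the cumulative normal distribution, always in (0, 1)."""
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

def pd_logit(z):
    """Logistic form: PD = L(z) = 1 / (1 + exp(-z)), always in (0, 1)."""
    return 1.0 / (1.0 + math.exp(-z))

# Whatever the linear index z, the probit and logistic forms stay between 0 and 1,
# and the log-link form stays positive (though it can exceed 1 for large z).
for z in (-5.0, 0.0, 5.0):
    assert pd_log(z) > 0.0
    assert 0.0 < pd_probit(z) < 1.0
    assert 0.0 < pd_logit(z) < 1.0

print(pd_probit(0.0), pd_logit(0.0))
```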
The fact that the estimation has two stages is not necessarily a problem. When the first and second stages are both linear, two-stage least squares regression jointly estimates both linear functions. The problem stems from the fact that at least one of our two stages is non-linear.
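The core of the non-linearity problem is a Jensen’s-inequality effect: for a non-linear function f, f(E[x]) is not E[f(x)], so substituting the stage-1 point forecast for the uncertain lagged default probability biases the stage-2 output. A minimal sketch with illustrative numbers only (the noise scale and fitted value are hypothetical):

```python
import numpy as np

rng = np.random.default_rng(1)

# Uncertain stage-1 outcome: the realized linear index is the fitted value
# plus regression noise (hypothetical scale).
fitted = -3.0                                   # stage-1 fitted value of the linear index
draws = fitted + rng.normal(0.0, 1.0, 100_000)  # realizations around that fitted value

logistic = lambda z: 1.0 / (1.0 + np.exp(-z))

plug_in = logistic(fitted)            # f(E[x]): what the two-stage shortcut computes
expected = logistic(draws).mean()     # E[f(x)]: what a correct treatment targets

# In the convex region of the logistic curve, the plug-in value understates the mean.
assert plug_in < expected
print(round(float(plug_in), 4), round(float(expected), 4))
```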
Angrist and Pischke on “Forbidden Models”
Angrist and Pischke describe the consequences of this modeling choice. Angrist is a professor of economics at MIT and was listed by Thomson Reuters as being on the short list for the Nobel Prize in 2013. Pischke is a professor at the London School of Economics. Here is their comment on the validity of this modeling strategy from Angrist and Pischke (2009, p. 190):
“Forbidden regressions were forbidden by MIT professor Jerry Hausman in 1975, and while they occasionally resurface in an under-supervised thesis, they are still technically off-limits. A forbidden regression crops up when researchers apply 2SLS [two stage least squares] reasoning directly to non-linear models.”
They go on to say (2009, p. 192):
“As a rule, naively plugging in first-stage fitted values in non-linear models is a bad idea. This includes models with a non-linear second stage as well as those where the [conditional expectation function] for the first stage is non-linear.”
Earlier (p. 122), Angrist and Pischke address the simpler case where both the first stage and the second stage are linear:
“The [two stage least squares] name notwithstanding, we don’t usually construct 2SLS estimates in two steps. For one thing, the resulting standard errors are wrong…Typically, we let specialized software routines…do the calculation for us. This gets the standard errors right and helps to avoid other mistakes.”
We refer interested readers to Section 4.6 of Angrist and Pischke (2009) for the full details of the problems with this pseudo-two stage approach to generating 13 quarter scenarios for the default probabilities of ABC Company.
Proper Econometric Procedures and Stress Testing
Angrist and Pischke devote much of Chapter 4 to estimation using a proper two-stage approach to the econometrics. In the case of CCAR projections, however, we feel strongly that a two-stage approach is not necessary. Two issues must be addressed at the outset. The first is to select a functional form (the three candidates above) that generates simulated default probabilities between 0% and 100% in each of the 13 quarters for which we need estimates. We call this function f(x). The second is to avoid using candidate variables in the relevant regressions that are themselves statistical estimates, which eliminates the stage-1 equation from our procedures.

Let us assume we have a large default probability database like that of Kamakura Risk Information Services. For each company in the database, assume that we know the default probability at time zero PD(0), the net income to assets ratio at time zero NI/A(0), and the one-quarter excess return on the stock at time zero R(0). One approach that avoids a two-stage estimation problem like the forbidden regressions above is to use the logistic, probit, or non-linear regression functions in standard statistical software to fit one relationship for each quarter k, explaining the default probability at quarter k with the macro factors at quarter k and the company-specific inputs at time zero. In generic form, the system of equations that generates the relevant 13 quarters of forecasts as a function of our time zero inputs and the 28 macro factors Xi specified in CCAR is

PD(k) = f[a(k) + b(k) PD(0) + c(k) NI/A(0) + d(k) R(0) + Σi e(k,i) Xi(k)],  k = 1, 2, …, 13

The macro factors are not lagged unless the econometric process reveals that lags are helpful. Negative numbers in parentheses denote lagged values, which effectively cause the time zero values to be used as explanatory variables.
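The one-regression-per-horizon approach can be sketched as follows. This is a simplified illustration with simulated data and hypothetical variable names (two time-zero inputs and a single macro factor standing in for the full 28); the point is the structure: a separate logistic regression is fitted for each forecast quarter, so no fitted value from one equation ever enters another.

```python
import numpy as np

rng = np.random.default_rng(42)
n_firms, n_quarters = 5000, 13

# Hypothetical time-zero, company-specific inputs.
pd0 = rng.uniform(0.001, 0.10, n_firms)      # PD(0)
ni_a0 = rng.normal(0.01, 0.02, n_firms)      # NI/A(0)

# One hypothetical macro factor value per scenario quarter.
macro = rng.normal(0.0, 1.0, n_quarters)

def fit_logistic(X, y, iters=50):
    """Fit a logistic regression by Newton-Raphson; no chained fitted values."""
    beta = np.zeros(X.shape[1])
    for _ in range(iters):
        p = 1.0 / (1.0 + np.exp(-np.clip(X @ beta, -30, 30)))
        grad = X.T @ (y - p)
        hess = (X * (p * (1.0 - p))[:, None]).T @ X
        beta += np.linalg.solve(hess, grad)
    return beta

models = []
for k in range(n_quarters):
    # Simulated default flag in quarter k, driven by time-zero inputs
    # and the quarter-k macro factor (illustrative coefficients).
    z = -4.0 + 20.0 * pd0 - 10.0 * ni_a0 + 0.5 * macro[k]
    y = (rng.uniform(size=n_firms) < 1.0 / (1.0 + np.exp(-z))).astype(float)
    X = np.column_stack([np.ones(n_firms), pd0, ni_a0])
    models.append(fit_logistic(X, y))

# One coefficient vector per quarter; no lagged, simulated PD appears anywhere.
print(len(models), models[0].round(2))
```

Because each of the 13 regressions uses only quantities that are known (time-zero inputs) or specified by the scenario (quarter-k macro factors), the forbidden-regression problem never arises.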
In practice, as the time horizon lengthens it becomes more and more likely that the time zero inputs lose their statistical significance and are dropped from the model.
Conclusions
Well-intentioned modelers executing the required CCAR stress tests often simulate default probabilities forward using the lagged default probability from the previous period as an input. Unfortunately, this mixes the current values of the macroeconomic variables specified by the Federal Reserve with a first-stage projection of lagged default probabilities in a non-linear two-stage regression. This specification, as explained by Angrist and Pischke, is a “forbidden model” that should be ruled invalid for use in the CCAR process.
A series of individual non-linear regressions is a much more attractive alternative because it avoids the problems of a non-linear two-stage process and relies only on information that is known (not simulated) for CCAR forecasting over the 13 quarter time horizon. We provide an example in a forthcoming note.
References
Angrist, Joshua D. and Jörn-Steffen Pischke, Mostly Harmless Econometrics: An Empiricist’s Companion, Princeton University Press, Princeton, New Jersey, 2009.
Appendix: Professor Pischke on Causation and Correlation
Professor Pischke provided this helpful visual aid on his website at the London School of Economics:
Copyright ©2014 Donald van Deventer