Modeling Default for Credit Portfolio Management and CDO Valuation: A Menu of Alternatives

04/18/2009 04:29 AM

Our recent post on the problems with the copula approach for CDO valuation (The Copula Approach to CDO Valuation: A Post Mortem, April 9, 2009) summarizes the areas where simplifying assumptions produced tractable answers that had little link to reality.  If the copula approach has serious problems, what are the alternatives?  We explore that question in this post.

The Basel II capital accords have been much criticized as irrelevant to the prediction and resolution of the current credit crisis.  Within the hundreds of pages of Basel documents, however, one prescription stands out.  In the context of credit risk analysis, the Basel II documents are very clear that it’s not enough to have a model that one believes to be good.  One has to prove that it’s better than the alternatives, and the only way to do that is to implement multiple models and to measure their performance.  As the April 9 post suggests, modeling error can be significant, so we shouldn’t conclude that even the best model is perfect and needs no improvement.  Indeed, continuous improvement is essential to best practice risk management.

In that spirit, we outline a series of alternative approaches for modeling default given default probabilities from any source, whether they come from a Merton-based model or a more modern reduced form approach.  Given a source of default probabilities that has been proven accurate, what are the modeling choices for simulating default?  Here is a comprehensive menu of alternatives:

1.  Constant default probabilities with random occurrence of defaults
a. No default correlation
b. Non-random drift of default probabilities to match a term structure of default probabilities
c. Copula method defaults with default events driven by a common macro factor

2. Random default probabilities
a.  Historical random sampling
b.  Macro factor driven default probabilities, including the explicit incorporation of the random impacts of idiosyncratic risk

Kamakura Corporation has implemented all of these techniques in both Kamakura Risk Manager and the web-based credit portfolio management tool KRIS-cdo. Here we note that the use of default modeling techniques should be the same for both traditional credit portfolio management of fixed income (and equity) securities and for collateralized debt obligations.  After all, traditional credit portfolio management is just the simulation of a CDO where the loss bands run from 0 to 100%.
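
To make that equivalence concrete, here is a minimal sketch in Python of how a tranche’s loss is just the portfolio loss restricted to a band, so that the whole portfolio is the 0 to 100% “tranche.”  The attachment and detachment points are hypothetical illustrations, not values from any actual deal.

```python
# Sketch: a CDO tranche's loss is the portfolio loss clipped to its
# attachment/detachment band; traditional credit portfolio management
# is the special case of a single 0%-100% band.
# The band boundaries below are hypothetical.

def tranche_loss(portfolio_loss_pct: float, attach: float, detach: float) -> float:
    """Fraction of the tranche notional lost, given portfolio loss in [0, 1]."""
    band = detach - attach
    return min(max(portfolio_loss_pct - attach, 0.0), band) / band

# A 10% portfolio loss wipes out a 0-3% equity tranche, wipes out a
# 3-7% mezzanine tranche, and erodes 30% of a 7-17% senior band.
for a, d in [(0.00, 0.03), (0.03, 0.07), (0.07, 0.17), (0.00, 1.00)]:
    print(f"{a:.0%}-{d:.0%} tranche loss: {tranche_loss(0.10, a, d):.0%}")
```

We now discuss each of the five modeling alternatives above.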

1a.  Constant default probabilities with no default correlation

This is the simplest approach of the five alternatives mentioned above, but it has a number of advantages.  First, it provides a simple baseline for comparison with other approaches that may provide less intuitive answers.  It’s intuitive, easy to understand, and an aid to understanding the rest of the menu above.  The assumption that default probabilities are constant is also a common one, particularly in the copula approach (approach 1c), so from that perspective it’s not a rare modeling choice.

This approach has a number of disadvantages, however, which are very serious.  First, it will severely underestimate the worst case losses, not a career-enhancing feature for most risk analysts.  Second, this means that it will overvalue the most senior CDO tranches, a very common problem in the current credit crisis. Third, Kamakura’s view is that this approach is so simple that it’s not likely to be approved for use by auditors or regulators, although both groups seem to have averted their eyes when scrutinizing the copula approach for many years. Fourth, even though the approach is simple overall, the choice of maturity for the default probability is not obvious.  Generally, the right maturity is the one that best matches the maturity of the assets being analyzed.  It’s more complicated than that, however, because the term structure of default probabilities can be upward or downward sloping, and it rises and falls over the business cycle.
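
A minimal sketch of this approach, assuming a hypothetical constant monthly default probability, equal position weights, and 100% loss given default:

```python
# Sketch of approach 1a: constant monthly default probability with
# independent (uncorrelated) defaults.  All parameter values are
# hypothetical illustrations.
import numpy as np

rng = np.random.default_rng(0)
n_names, n_months, n_paths = 100, 60, 100_000
monthly_pd = 0.002   # assumed constant monthly default probability

# With independence, the chance a name defaults at least once over the
# horizon is 1 - (1 - pd)^n_months, and the number of defaults in an
# equally weighted portfolio is Binomial.
p_horizon = 1.0 - (1.0 - monthly_pd) ** n_months
n_defaults = rng.binomial(n_names, p_horizon, size=n_paths)
loss_pct = n_defaults / n_names   # equal weights, 100% loss given default

print(f"mean loss: {loss_pct.mean():.2%}")
print(f"99.9th percentile loss: {np.quantile(loss_pct, 0.999):.2%}")
```

Even with 100,000 scenarios, the resulting binomial loss distribution has a thin tail, which is the source of the underestimation of worst case losses discussed above.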

1b. Non-random drift in default probabilities

As noted in the paragraph above, the choice of the correct maturity of default probabilities is not a “no-brainer” if a constant default probability is assumed.  To avoid this problem, many analysts allow the default probabilities to drift over time in a non-random manner to fit the term structure of default probabilities.  In the KRIS default probability service, for example, there is a one-month default probability and 59 forward one-month default probabilities that are assembled to form the term structure of default probabilities.  This method uses those 60 monthly default probabilities rather than assuming one constant level.  Please note, however, that although the default probabilities drift, they are not random.  They are 100% determined by the term structure of defaults that prevails at time zero.  Many analysts make the mistake of assuming that the term structure of credit default swap spreads is a term structure of default probabilities when performing this form of analysis.  That’s a serious error, as we point out in our March 18 blog entry (Credit Default Swaps and Default Probabilities: What is the Linkage?).
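
A minimal sketch of the resulting calculation, assuming a hypothetical upward-sloping term structure of forward one-month default probabilities fixed at time zero and retaining the independence assumption:

```python
# Sketch of approach 1b: the monthly default probability drifts along a
# deterministic term structure set at time zero, rather than staying at
# one constant level.  The term structure below is hypothetical.
import numpy as np

rng = np.random.default_rng(0)
n_names, n_paths = 100, 100_000

# Hypothetical upward-sloping term structure of 60 forward one-month PDs.
forward_pds = np.linspace(0.001, 0.004, 60)

# Still assuming independence: survival compounds across the 60 months.
p_horizon = 1.0 - np.prod(1.0 - forward_pds)
n_defaults = rng.binomial(n_names, p_horizon, size=n_paths)
loss_pct = n_defaults / n_names   # equal weights, 100% loss given default

print(f"horizon default probability: {p_horizon:.2%}")
print(f"mean loss: {loss_pct.mean():.2%}")
```

We now turn to the pros and cons of this approach.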

The advantages of this approach are many.  Like constant default probabilities, it provides a simple and intuitive approach that is a useful benchmark for assessing the realism of other techniques.  It avoids the potentially dangerous need to choose just one number for each counterparty that is held constant over the modeling period.  Finally, it has the advantage of precedent, as many of the most thoughtful users of the copula method embedded this drift in their analysis.

The disadvantages of this approach are serious.  It underestimates the worst case losses and overvalues the most senior tranches of CDOs because of the implicit assumption that defaults are not correlated.

1c. Copula method defaults with defaults driven by a common factor

The third method is the copula approach, which we discussed in our post on April 9, 2009, “The Copula Approach to CDO Valuation: A Post Mortem.” We refer readers to that post for a very detailed discussion of the simplifying assumptions that made the copula method popular, tractable and wrong.
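
For concreteness, here is a minimal sketch of the one-factor Gaussian copula mechanics; the horizon default probability and asset correlation are assumed values, and the code illustrates the general technique rather than any particular vendor’s implementation.

```python
# Sketch of approach 1c: a one-factor Gaussian copula.  One common
# factor Z per scenario plus idiosyncratic noise drives *when* defaults
# occur, while the default probability itself stays constant.
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(0)
n_names, n_paths = 100, 50_000
pd_horizon, rho = 0.05, 0.30   # assumed horizon PD and asset correlation

# A name defaults when its latent variable falls below the barrier
# implied by the horizon default probability.
barrier = norm.ppf(pd_horizon)
z = rng.standard_normal((n_paths, 1))            # common factor
eps = rng.standard_normal((n_paths, n_names))    # idiosyncratic shocks
latent = np.sqrt(rho) * z + np.sqrt(1.0 - rho) * eps
loss_pct = (latent < barrier).mean(axis=1)

print(f"mean loss: {loss_pct.mean():.2%}")
print(f"99.9th percentile loss: {np.quantile(loss_pct, 0.999):.2%}")
```

Note that the single parameter rho is the model’s only lever for pricing every tranche, which is precisely the implication criticized below.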

The advantages of the copula method are its widespread use, the simplicity of the assumption that default probabilities are constant, and the intuitive linkages to the 1974 Merton model of risky debt.

The disadvantages of the copula approach are many.  First, it, too, dramatically underestimates worst case losses and overvalues senior tranches of CDOs.  Second, it has been widely criticized in many respected publications, including a page 1 story in the Wall Street Journal on August 12, 2005.  This makes it very difficult to defend as a modeling choice when auditors or regulators raise questions about it.  Third, the copula method erroneously implies that one common macro factor drives the timing of defaults, not the level of default probabilities.  Finally, like Black-Scholes, the model implies that the correlation with this common factor should correctly price all tranches of a CDO, but the evidence from the marketplace has been that this implication is dramatically untrue.

2a.  Historical random sampling

Many analysts who come from a traditional fixed income background are new to the historical sampling approach, which is traditional in equity return modeling.  In this approach, a date from the past is randomly chosen, say January 1992.  Default probabilities from that month are assigned to all counterparties and default/no default is simulated for month 1 in the analysis.  For month 2, another date is randomly chosen, say April 2006.  Default probabilities from that point in time are chosen and default/no default is simulated for month 2.  This continues over the full modeling period.  This approach is “bog standard” for equity return modeling, in part because there is no “model” of equity returns that meets many analysts’ need for accuracy, other than history itself.
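
A sketch of historical random sampling follows; the default probability history below is fabricated for illustration, whereas in practice it would come from an actual default probability database.

```python
# Sketch of approach 2a: for each simulated month, draw a random
# historical month and apply that month's default probabilities to all
# counterparties.  The history matrix here is fabricated.
import numpy as np

rng = np.random.default_rng(0)
n_names, n_months, n_hist, n_paths = 100, 60, 200, 10_000

# Fabricated history: one monthly default probability per (month, name).
hist_pds = rng.uniform(0.0005, 0.01, size=(n_hist, n_names))

loss_pct = np.empty(n_paths)
for p in range(n_paths):
    months = rng.integers(0, n_hist, size=n_months)   # scrambled history
    pd_path = hist_pds[months]                        # (n_months, n_names)
    defaulted = (rng.random(pd_path.shape) < pd_path).any(axis=0)
    loss_pct[p] = defaulted.mean()   # equal weights, 100% loss given default

print(f"mean loss: {loss_pct.mean():.2%}")
```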

The principal advantage of this method is its wide acceptance by equity portfolio managers.  On average, it will more accurately reflect the level of default probabilities than an arbitrary choice of default probabilities prevailing at one point in time, which is then held constant.

This approach has two disadvantages.  First, it “scrambles” the business cycle so that extreme losses from the worst point in the cycle are dispersed over time, rather than bunched together as they would be in a typical recession.  Second, this technique does not explicitly identify the macro factors driving default, so it is incapable of successfully answering the most important credit modeling question: “What’s the hedge?”

2b.  Macro factor driven default probabilities

For background on this approach, please see the Kamakura blog on “reduced reduced form models” posted on March 19, 2009. This approach fits an equation linking historical default variations to macro factors and a single idiosyncratic risk factor for each counterparty in the portfolio.  These equations are available by subscription to KRIS-cdo for 24,000 public firms in 30 countries.  The macro factors are then simulated forward, along with the idiosyncratic risk factor, to produce a default probability in period N.  Default/no default is then simulated.
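
A minimal sketch of the mechanics, using an assumed logistic link, random-walk macro factor dynamics, and illustrative coefficients rather than the actual KRIS-cdo specification:

```python
# Sketch of approach 2b: macro factors plus an idiosyncratic factor
# drive each name's monthly default probability through a logistic
# link.  Dynamics, link, and coefficients are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)
n_names, n_months, n_paths = 100, 60, 5_000

# Hypothetical per-name logistic coefficients on two macro factors.
intercept = rng.normal(-6.5, 0.3, n_names)
beta = rng.normal(0.5, 0.2, (n_names, 2))

loss_pct = np.empty(n_paths)
for p in range(n_paths):
    macro = np.cumsum(rng.normal(0.0, 0.3, (n_months, 2)), axis=0)  # random-walk macro factors
    idio = rng.normal(0.0, 0.5, (n_months, n_names))                # idiosyncratic factor
    logit = intercept + macro @ beta.T + idio                       # (n_months, n_names)
    monthly_pd = 1.0 / (1.0 + np.exp(-logit))
    defaulted = (rng.random(monthly_pd.shape) < monthly_pd).any(axis=0)
    loss_pct[p] = defaulted.mean()

print(f"mean loss: {loss_pct.mean():.2%}")
print(f"99.9th percentile loss: {np.quantile(loss_pct, 0.999):.2%}")
```

Because all counterparties share the same simulated macro factor path within a scenario, losses bunch together in bad scenarios rather than being dispersed over time.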

The advantages of this approach are many.  First, it is consistent with the Federal Reserve and FDIC-mandated stress tests announced in March 2009.  Second, it is consistent with modern reduced form modeling of default risk.  Third, it explicitly recognizes that multiple factors drive default risk, unlike the copula method, which assumes only one common macro factor.  In the KRIS-cdo implementation, for example, macro factors are selected from a menu of 40 factors, and the typical company’s default probabilities are driven by 5-10 of them.  Fourth, it allows macro factor hedges to be derived from stress tests, analogous to the Black-Scholes “delta.”  Fifth, the approach is consistent with the FDIC Loss Distribution Model authored by Robert A. Jarrow with four co-authors and published on December 10, 2003.  Finally, the solution is produced by a straightforward Monte Carlo simulation.

One disadvantage of this method is that it goes beyond the capabilities of typical spreadsheet software; it requires a rich multi-period default-adjusted simulation capability like that in KRIS-cdo or Kamakura Risk Manager.  A second disadvantage is that it makes it very important to choose the right characteristics of the macro factors simulated forward.

For questions or comments on this post, please contact us at info@kamakuraco.com.

Donald R. van Deventer
Kamakura Corporation
Honolulu, April 20, 2009
