Donald R. Van Deventer, Ph.D.

Don founded Kamakura Corporation in April 1990 and currently serves as Co-Chair, Center for Applied Quantitative Finance, Risk Research and Quantitative Solutions at SAS. Don’s focus at SAS is quantitative finance, credit risk, asset and liability management, and portfolio management for the most sophisticated financial services firms in the world.



Yield Curve Smoothing: Nelson-Siegel versus Spline Technologies, Part 1

07/21/2009 04:30 AM

The 1987 yield curve formulation proposed by Charles R. Nelson and Andrew F. Siegel remains very popular among financial market participants and central bank economists. This post explains, in a nontechnical way, the kinds of errors that result from use of the Nelson-Siegel formulation and what techniques provide a superior result.

On January 22, 2004, one of us posted an article entitled “Evaluating Yield Curve Smoothing Techniques with Implications for Credit Spreads” on RiskCenter.com (see http://www.riskcenter.com/story.php?id=7985) explaining the key criteria for “best” in creating yield curves, the fundamental building block of all financial analysis.  In the FAS 157 era, in which accuracy is paramount, yield curve smoothing is more important than ever.  A number of approaches have been proposed for creating a smooth zero coupon yield curve and the related forward rate and zero coupon bond price curves.  One of the challenges in deciding how to do this is the literally infinite number of possibilities.  Consider the following example, where we have four observable continuously compounded zero coupon bond yields:

1 day: 5%
1 year: 6.3%
10 years: 6.9%
30 years: 7.4%

There is an infinite number of yield curves that can be drawn through these four points.  This problem is not unique to finance.  In fact, millions of children have been enthralled by computer-animated movies in which computer graphics artists use sophisticated tools to build images, most of which feature smooth surfaces, in an automated way.  These tools, for the most part, come from the spline family.  A Google search on +”computer graphics” +spline returns more than 182,000 web pages, one of which describes the book An Introduction to Splines for Use in Computer Graphics and Geometric Modeling by Richard Bartels, John Beatty, and Brian Barsky (Elsevier, 2009).  Instead of these flexible modeling systems, financial economists have, for the most part, proposed an array of analytically convenient, simplified functional forms from among the infinite number that would fit the simple example above.  J. Huston McCulloch proposed the use of cubic splines in the 1970s.  In the early 1980s, Gifford Fong and Oldrich Vasicek proposed exponential splines.  In 1987, Charles Nelson and Andrew Siegel proposed a four-parameter forward rate model containing two exponential terms (the full citation appears in the reference list below).
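For reference, the model can be stated compactly.  In the common restatement of the 1987 paper (the notation here is ours), the instantaneous forward rate and the implied continuously compounded zero coupon yield at maturity t are governed by the four parameters β0, β1, β2, and τ:

```latex
% Nelson-Siegel (1987) instantaneous forward rate, four parameters:
f(t) = \beta_0 + \beta_1 e^{-t/\tau} + \beta_2 \,\frac{t}{\tau}\, e^{-t/\tau}

% The zero coupon yield is the average of the forward rates from 0 to t:
y(t) = \beta_0
     + \beta_1 \,\frac{1 - e^{-t/\tau}}{t/\tau}
     + \beta_2 \left( \frac{1 - e^{-t/\tau}}{t/\tau} - e^{-t/\tau} \right)
```

Because there are only four free parameters, the family can reproduce at most four observed yields exactly, a point that matters in the comparisons that follow.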


In the article, the authors make it clear on page 479 that their objective is simplicity rather than accuracy:

“It is also quite clear from figure 4 that no set of values of the parameters
would fit the data perfectly, nor is it our objective to find a model
that would do so. A more highly parameterized model that could follow
all the wiggles in the data is less likely to predict well, in our view, than
a more parsimonious model that assumes more smoothness in the
underlying relation than one observes in the data.”

The objective was simplicity: the authors wished to generate a curve that could be easily parameterized with very few coefficients, rather than create a more flexible, more accurate, but admittedly more complex relationship.  The objective when estimating functions with splines is accuracy, not parsimony: splines can be used to consistently estimate any functional relationship, and are not constrained to fit a given form.

A useful analogy to the Nelson-Siegel model and its objectives is the copula approach to CDO pricing.  Copulas are easily parameterized by a single number, but they operate only within a strict set of assumptions (restrictions on distributions, fixed correlation coefficients, and so on), and when those assumptions fail, the models produce prices that verge on nonsense.

Defining “the best” yield curve

Rather than arbitrarily “assuming more smoothness than the data,” can we define a set of criteria for choosing an optimal yield curve technology?  There are several:

•    Accuracy, defined as an exact fit to all data that is “good”
•    Ease of Use
•    Best among all accurate alternatives by some criterion
•    Implied pricing must be consistent with no arbitrage

In the computer graphics area, the family of splines is the standard choice.  To understand why, we turn to Adams and van Deventer (1994), van Deventer and Imai (1997, which contains a corrected version of Adams and van Deventer), and van Deventer, Imai and Mesler (2004).  As those authors show, it is a mathematical fact that the smoothest line that can be drawn between the four yield curve points above is a cubic spline, a series of three cubic functions of years to maturity.  Indeed, van Deventer and Imai can be interpreted to show that the insights in Huston McCulloch’s article are completely consistent with the literature from computer graphics.

Splines can also be applied to zero coupon bond prices.  Fong and Vasicek (1982) criticized the cubic spline approach, saying it resulted in implausible forward rates that were not smooth; Shea (1985) criticized the exponential spline approach of Fong and Vasicek for the same reason.  Adams and van Deventer, as corrected in van Deventer and Imai, derived the maximum smoothness forward rate approach with an important contribution by Oldrich Vasicek.  The authors used Vasicek’s insights and went through these steps:

•    “Best” was defined as the function which produces the smoothest forward rate curve which (a) is consistent with the “good” data and (b) is consistent with constraints on the first or second derivatives on both sides of the yield curve
•    The mathematical expression for smoothness was borrowed from the computer graphics literature and engineering
•    The functional relationship which provides “maximum smoothness” was derived.
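The smoothness criterion in the steps above can be written explicitly.  Following the presentation in the references cited (the notation is ours), the forward rate curve f is chosen to minimize integrated squared curvature, subject to repricing each observed continuously compounded zero coupon yield y_i at maturity t_i:

```latex
\min_{f}\; \int_0^{T} \left[ f''(s) \right]^2 ds
\qquad \text{subject to} \qquad
\int_0^{t_i} f(s)\, ds = t_i \, y_i , \qquad i = 1, \dots, n
```

The solution of this constrained variational problem is the piecewise quartic forward rate function described in the next paragraph.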

It turns out that the smoothest forward rate function is a quartic spline of forward rates.  As shown in van Deventer and Imai, this maximum smoothness forward rate function can be derived with a simple matrix inversion.

Comparing the Maximum Smoothness Forward Rate Approach to Nelson Siegel

Both the maximum smoothness forward rate approach and the Nelson Siegel approach have been widely implemented.  The maximum smoothness approach and various cubic splines are now used in enterprise risk management software that is used in over 30 countries.  As indicated in a 2005 paper by the Bank for International Settlements, the Nelson Siegel approach is the dominant choice of European Central Banks.  Confining ourselves to the situation where all of the input data to the smoothing process is “good,” how do the two techniques compare?  We compare them here:

Accuracy:
The maximum smoothness forward rate approach can fit yield curves generated from any number of “good” data points.  The Nelson-Siegel approach cannot.  At best, Nelson-Siegel can perfectly fit yield curves with 4 observations, since there are 4 parameters, and some yield curve shapes may simply be inconsistent with the Nelson-Siegel functional form.

Ease of Estimation:
The maximum smoothness forward rate approach requires one to invert a matrix.  The Nelson-Siegel approach requires a two-step process that combines iterating on one parameter with estimating the best fitting values for the other three parameters by ordinary least squares.  Accurately calculating confidence intervals and executing hypothesis tests with such an iterative procedure is an equally involved process.  Alternatively, a non-linear equation fitting technique that minimizes the pricing errors can be used to select the best fitting parameters, though calculating confidence intervals remains relatively complex.  Most observers would agree that one matrix inversion is easier than the Nelson-Siegel procedure.
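The two-step procedure just described can be sketched in a few lines.  This is our own illustrative implementation, not code from any of the cited sources: for each candidate decay parameter tau on a grid, the remaining three coefficients are fitted by ordinary least squares, and the tau with the smallest sum of squared errors wins.

```python
import numpy as np

def ns_design(t, tau):
    """Nelson-Siegel yield regressors for a fixed decay parameter tau."""
    x = t / tau
    slope = (1.0 - np.exp(-x)) / x        # loading on beta1
    curvature = slope - np.exp(-x)        # loading on beta2
    return np.column_stack([np.ones_like(t), slope, curvature])

def fit_nelson_siegel(t, y, tau_grid):
    """Two-step fit: iterate over tau, solve beta0..beta2 by OLS at each step."""
    best = None
    for tau in tau_grid:
        X = ns_design(t, tau)
        beta, *_ = np.linalg.lstsq(X, y, rcond=None)
        sse = float(np.sum((X @ beta - y) ** 2))
        if best is None or sse < best[2]:
            best = (tau, beta, sse)
    return best

t = np.array([1.0 / 365.0, 1.0, 10.0, 30.0])
y = np.array([0.05, 0.063, 0.069, 0.074])
tau, beta, sse = fit_nelson_siegel(t, y, np.linspace(0.1, 10.0, 100))
print(f"tau = {tau:.2f}, betas = {beta}, sum of squared errors = {sse:.2e}")
```

Contrast this search-plus-regression loop with the single matrix inversion required by the maximum smoothness approach.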

Consistency with No-Arbitrage:
Filipovic (“A Note on the Nelson-Siegel Family,” Mathematical Finance, 1999) shows that the Nelson-Siegel family, in the words of Robert Jarrow, is “inconsistent with arbitrage free term structure models.”  The full citation appears in the reference list below.
In addition to the academic implications of this conclusion, in practical implementation it means that one could assemble a portfolio of long and short positions that seem to guarantee profit with no risk.  This apparent profit, however, is simply the result of errors in yield curve smoothing with the Nelson-Siegel approach.

The maximum smoothness forward rate technique does not suffer from this inconsistency, as it can fit any yield curve and any functional form that is consistent with no arbitrage.

We now turn to the case where the data is “bad” in some dimension.

The Case of Bad Data

Some argue that the Nelson-Siegel approach is superior for markets in the early stages of development, where data are sparse.  In fact, the maximum smoothness forward rate approach was developed for exactly this reason in Japan in the early 1990s.  At that time, there was active trading in only one Japanese government bond issue, the cheapest bond to deliver on JGB futures contracts.  In this case, the maximum smoothness forward rate technique can still be applied, say, by fitting the one traded issue and the overnight rate.  The Nelson-Siegel approach, however, requires four parameters.  What can we do with only two data points?  One has to apply some criterion, like smoothness, to choose among the many Nelson-Siegel parameter combinations that would fit those two points.

Another type of “bad data” is the case where a perfect fit to the observable points is impossible.  This could come about because of a lack of simultaneity in quotes, bid-offered spreads, or simple data errors.  Here is an example:

1 day: 5%
1 year: 6.3%
10 years: 6.9%
10 years: 7.1%
30 years: 7.4%

We have two inconsistent quotes at the 10 year point.  How do the two techniques compare in this regard?  If one minimizes the mispricing errors, the maximum smoothness forward rate approach will always win (or tie): even if the Nelson Siegel functional form is right by coincidence, the maximum smoothness forward rate approach will replicate its shape in the limit.  The converse is not true.

Another type of bad data is illustrated by this example:

1 day: 5%
1 year: 0.3%
10 years: 6.9%
30 years: 7.4%

Clearly, the one year quotation is highly unlikely to be “good data.”  The Nelson-Siegel approach won’t be able to fit these points exactly, and therefore the resulting yield curve will be smoother than the maximum smoothness forward rate function that DOES fit the data exactly.  That’s not an apples-to-apples comparison, however.  The correct solution is to throw out the bad data (the one year observation) and THEN fit the two functions.  In this case, of course, the maximum smoothness forward rate approach will fit the data perfectly and produce the smoothest forward rate function.  The Nelson-Siegel approach will not.
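Even a crude automated screen can perform the throw-out-the-bad-data step.  The sketch below is our own illustration, not a method from the cited literature: it repeatedly drops the interior quote that deviates most from the straight line through its neighbors, until every remaining quote is within a tolerance (200 basis points here).

```python
import numpy as np

def screen_quotes(t, y, tol=0.02):
    """Greedily drop the single worst interior quote -- the one farthest from
    the straight line through its neighbors -- until all deviations are
    within `tol`.  A crude screen for illustration only."""
    t, y = np.asarray(t, dtype=float), np.asarray(y, dtype=float)
    while len(t) > 2:
        dev = np.zeros(len(t))
        for i in range(1, len(t) - 1):
            # Yield implied by linear interpolation between the two neighbors
            w = (t[i] - t[i - 1]) / (t[i + 1] - t[i - 1])
            dev[i] = abs(y[i] - ((1.0 - w) * y[i - 1] + w * y[i + 1]))
        worst = int(np.argmax(dev))
        if dev[worst] <= tol:
            break
        t, y = np.delete(t, worst), np.delete(y, worst)
    return t, y

t = np.array([1.0 / 365.0, 1.0, 10.0, 30.0])
y = np.array([0.05, 0.003, 0.069, 0.074])   # the 1-year quote is implausible
t_clean, y_clean = screen_quotes(t, y)
print("surviving maturities:", t_clean)
```

Run on the example above, the screen discards the implausible 0.3% one-year quote and keeps the other three points, which either technique can then fit.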

Many authors have proposed fully automated methods for selecting the “good data” from a collection of data points, some of which are bad.  Indeed, success in finance depends as much on being able to tell the difference between good data and bad data as on identifying good models and good technology, though identifying reliable data is a topic for another blog post.

In a separate blog post, we will compare the Nelson Siegel approach and the maximum smoothness forward rate approach using a worked example.  In the meantime, we close with some key references on yield curve smoothing.


Kenneth J. Adams and Donald R. van Deventer, 1994, Fitting Yield Curves and Forward Rate Curves with Maximum Smoothness, The Journal of Fixed Income, June 1994, 52-62.

Bank for International Settlements, Monetary and Economic Department, “Zero Coupon Yield Curves: Technical Documentation,” 2005.

Mark Buono, Russell B. Gregory-Allen, and Uzi Yaari, 1992, The Efficacy of Term Structure Estimation Techniques: A Monte Carlo Study,  The Journal of Fixed Income 1, 52-59.

Damir Filipovic, “A Note on the Nelson-Siegel Family,” Mathematical Finance, October, 1999, pp. 349-359.

F. B. Hildebrand,  1987, Introduction to Numerical Analysis (Dover Publications Inc., New York).

J. Huston McCulloch, 1975, The Tax-Adjusted Yield Curve, Journal of Finance 30, 811-829.

Charles R. Nelson and Andrew F. Siegel, “Parsimonious Modeling of Yield Curves” The Journal of Business, Vol. 60, No. 4. (Oct., 1987), pp. 473-489.

P. M. Prenter, 1989, Splines and Variational Methods (John Wiley & Sons, New York).

H. R. Schwartz, 1989, Numerical Methods: A Comprehensive Introduction (John Wiley & Sons, New York).

Gary S. Shea, 1985, Term Structure Estimation with Exponential Splines,  Journal of Finance 40, 319-325.

Donald R. van Deventer and Kenji Imai, Financial Risk Analytics: A Term Structure Model Approach for Banking, Insurance, and Investment Management, Irwin Professional Publishing, Chicago, 1997.

Donald R. van Deventer, Kenji Imai, and Mark Mesler, Advanced Financial Risk Management, John Wiley & Sons, 2004. Translated into modern Chinese and published by China Renmin University Press, Beijing, 2007.  See especially chapters 8 and 18.

Oldrich A. Vasicek and H. Gifford Fong, 1982, Term Structure Modeling Using Exponential Splines,  Journal of Finance 37, 339-56.

Comments and suggestions are welcomed at info@kamakuraco.com.

Sean Klein and Donald R. van Deventer
Kamakura Corporation
Honolulu, July 21, 2009


