Ridge regression


In the previous discussion of least squares procedures we noted that the ordinary least squares solution to an over-determined set of equations modeled as:

y = Xb + e

has the general form:

b = (XᵀX)⁻¹Xᵀy

However, if the design matrix, X, is singular or near-singular, the cross-product matrix XᵀX cannot be inverted and the parameters cannot be determined. One approach to solving such problems is to amend the inversion expression in such a way as to reduce its proximity to singularity. In ridge regression this is achieved by augmenting the expression with an adjustment or ridge factor, k:

b = (XᵀX + kI)⁻¹Xᵀy

where I is the identity matrix.
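As a sketch of the idea (not taken from the original text), the ridge estimator can be computed directly with NumPy. The data below are invented for illustration: two nearly collinear predictors make XᵀX close to singular, so the ordinary least squares solution (k = 0) is poorly behaved while a small ridge factor stabilizes it:

```python
import numpy as np

def ridge_estimate(X, y, k):
    """Return the ridge regression coefficients (X'X + kI)^{-1} X'y,
    using a linear solve rather than an explicit matrix inverse."""
    p = X.shape[1]
    return np.linalg.solve(X.T @ X + k * np.eye(p), X.T @ y)

# Illustrative data (an assumption, not from the text): two predictors
# that are almost perfectly collinear, so X'X is nearly singular.
rng = np.random.default_rng(0)
x1 = rng.normal(size=50)
x2 = x1 + 1e-6 * rng.normal(size=50)
X = np.column_stack([x1, x2])
y = x1 + rng.normal(scale=0.1, size=50)   # true relationship: y depends on x1

b_ols = ridge_estimate(X, y, 0.0)    # ordinary least squares (k = 0): unstable
b_ridge = ridge_estimate(X, y, 0.1)  # small ridge adjustment: well conditioned
print("OLS:  ", b_ols)
print("Ridge:", b_ridge)
```

With near-collinear columns the OLS coefficients explode into large values of opposite sign, while the ridge coefficients remain moderate; their sum still recovers the combined effect of the two predictors.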

If k=0 there is no adjustment, but for small values of k (0<k<1) a matrix that might otherwise have been impossible to invert becomes better conditioned. The resulting parameter estimates are biased, but may yield an improved least squares fit compared with ordinary least squares, assuming the latter can be obtained at all. In ridge regression the ridge parameter is incremented from very small values until the estimated (standardized) parameter values stabilize. A ridge trace (a plot of the coefficients against the ridge parameter) may help to identify this stabilization pattern. Marquardt and Snee (1975, [MAR1]) provide an excellent discussion of ridge regression with worked examples from industry and agriculture.
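The ridge-trace procedure described above can be sketched as follows. This is an illustrative assumption rather than the source's own example: standardized coefficients are recomputed over a grid of ridge factors k, and one watches for the values to settle down (in a real analysis these would be plotted rather than printed):

```python
import numpy as np

# Illustrative data (an assumption, not from the text): a strongly
# collinear pair of predictors, standardized as in a ridge trace.
rng = np.random.default_rng(1)
n = 60
x1 = rng.normal(size=n)
x2 = x1 + 0.01 * rng.normal(size=n)
X = np.column_stack([x1, x2])
X = (X - X.mean(axis=0)) / X.std(axis=0)   # standardize the predictors
y = x1 + rng.normal(scale=0.2, size=n)
y = y - y.mean()

# Recompute the coefficients over a grid of ridge factors k.
trace = []
for k in [0.0, 0.001, 0.01, 0.05, 0.1, 0.5]:
    b = np.linalg.solve(X.T @ X + k * np.eye(2), X.T @ y)
    trace.append((k, b))
    print(f"k={k:<6} b1={b[0]:8.3f}  b2={b[1]:8.3f}")
```

As k grows, the wild opposite-signed coefficients produced by the near-singular case at k=0 shrink toward stable, similar values; the smallest k at which the trace flattens out is the usual choice of ridge parameter.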


[MAR1] Marquardt D W, Snee R D (1975) Ridge Regression in Practice. The American Statistician, 29(1), 3–20