
# Polynomial regression

In the previous section on simple linear regression we noted that the expression shown below is also described as linear, since it remains linear in the coefficients, β, even if it is not linear in the predictor variable or variables, x. For example, with a single predictor variable we can write a pth order polynomial expression of the form:

$$y = \beta_0 + \beta_1 x + \beta_2 x^2 + \dots + \beta_p x^p + \varepsilon$$
or more generally:

$$y = \sum_{j=0}^{p} \beta_j x^j + \varepsilon$$

This can be written in matrix form as $\mathbf{y} = \mathbf{X}\boldsymbol{\beta} + \boldsymbol{\varepsilon}$, as before, but in this case we have:

$$\boldsymbol{\beta} = \left[\beta_0, \beta_1, \dots, \beta_p\right]^T$$
This is a particular case of the more general linear regression model in which the design matrix X is given by:

$$\mathbf{X} = \begin{bmatrix} 1 & x_1 & x_1^2 & \cdots & x_1^p \\ 1 & x_2 & x_2^2 & \cdots & x_2^p \\ \vdots & \vdots & \vdots & \ddots & \vdots \\ 1 & x_n & x_n^2 & \cdots & x_n^p \end{bmatrix}$$
As with linear regression, the parameters of a polynomial can be determined by ordinary least squares fitting, assuming there are more observations than parameters (n > p + 1), with the solution being:

$$\hat{\boldsymbol{\beta}} = \left(\mathbf{X}^T\mathbf{X}\right)^{-1}\mathbf{X}^T\mathbf{y}$$
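As a concrete sketch of the ordinary least squares fit (the data, variable names and polynomial order here are illustrative assumptions, not taken from the text), the design matrix and solution can be computed with NumPy:

```python
import numpy as np

# Illustrative data: a noisy cubic (an assumption of this example)
rng = np.random.default_rng(0)
x = np.linspace(0.0, 4.0, 25)
y = 1.0 + 0.5 * x - 0.2 * x**3 + rng.normal(0.0, 0.5, x.size)

p = 3                                      # order of the polynomial
X = np.vander(x, p + 1, increasing=True)   # design matrix: columns 1, x, x^2, x^3

# OLS solution b = (X'X)^(-1) X'y, computed via a numerically
# stabler least-squares solve rather than an explicit inverse
beta, *_ = np.linalg.lstsq(X, y, rcond=None)

# np.polyfit fits the same model (its coefficients come highest power first)
assert np.allclose(beta, np.polyfit(x, y, p)[::-1])
```

Solving the least-squares problem directly, rather than forming (XᵀX)⁻¹ explicitly, is the usual practice because the columns of the Vandermonde matrix are highly correlated.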
Also, as with simple linear regression, a standard polynomial regression is a global procedure, i.e. the parameters determined are constant across the full domain of the independent variable. For polynomials of order p > 3 this can result in some very undesirable effects, both within the sample domain (i.e. the range over which the parameters were estimated from the data) and, more particularly, outside it. Estimating the true value of the dependent variable within the sample domain for an unsampled value x = a is a form of interpolation; estimating it beyond the sample range is a form of extrapolation or prediction.

When extrapolating with higher-order polynomials it is very common for the results to diverge rapidly from acceptable values and behavior, and this is one reason why alternative forms of curve fitting are often used. A further issue is the interpretation of the parameters, since the individual polynomial terms are often highly correlated. A general recommendation is to opt for the lowest-order polynomial fit that makes sense for the problem at hand, to accept that the procedure is global and interpret it in that context, and to use the resulting fitted curve for interpolation only, not extrapolation.
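This misbehavior is easy to reproduce. A common textbook illustration, used here as an assumption of this example rather than taken from the text, is Runge's function 1/(1 + 25x²) fitted with an order-20 polynomial at 21 equally spaced points:

```python
import numpy as np

def f(x):
    # Runge's function (illustrative choice for this example)
    return 1.0 / (1.0 + 25.0 * x**2)

x = np.linspace(-1.0, 1.0, 21)
p20 = np.polynomial.Polynomial.fit(x, f(x), 20)

# With order = points - 1 the fitted curve passes (to rounding)
# through every sample, so it looks perfect at the nodes...
assert np.allclose(p20(x), f(x), atol=1e-6)

# ...yet it oscillates between nodes near the edges of the sample
# domain, and just beyond the sample range it diverges rapidly
print(p20(0.96), f(0.96))   # in-sample wiggle near the boundary
print(p20(1.2), f(1.2))     # extrapolation far from the true value
```

A lower-order fit would track this function much less closely at the nodes but behave far more sensibly between and beyond them, which is the trade-off the recommendation above describes.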

Polynomial regression (and linear regression) can be modified from a global to a local operation in a number of ways. The simplest involves identifying clear break points in the sample data and modeling the data on either side of the break(s). For example, with soil samples taken from areas of differing vegetation cover, the dependent variable (e.g. moisture content) may exhibit distinct breaks that correspond to the different forms of cover (grassland, woodland etc.). Another option, increasingly used for relatively dense datasets where non-parametric data fitting rather than parametric modeling is important, is known as LOESS. This approach was developed by Cleveland and Devlin (1988, [CLE1]) and is a form of local regression (it consists of a series of regressions on subsets of the data) coupled with weighting that emphasizes nearby points over points that are further away. The result is a computationally intensive fit that is smooth and close to the data across the sample domain. In spatial analysis, geographically weighted regression (GWR) is a similar technique applied specifically to spatial problems.
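The local, weighted fitting idea can be sketched in a few lines. This is a minimal illustrative implementation, not Cleveland and Devlin's algorithm; the function name `loess`, the span parameter `frac`, the nearest-neighbour window and the tricube weights follow common LOESS practice but are assumptions of this sketch:

```python
import numpy as np

def loess(x, y, x_eval, frac=0.5):
    """Local linear regression with tricube weights: a minimal
    sketch of the LOESS idea (assumes distinct x values)."""
    n = len(x)
    k = max(2, int(np.ceil(frac * n)))      # points used in each local fit
    fitted = np.empty(len(x_eval))
    for i, x0 in enumerate(x_eval):
        d = np.abs(x - x0)
        idx = np.argsort(d)[:k]             # k nearest neighbours of x0
        h = d[idx].max()                    # local bandwidth
        w = (1.0 - (d[idx] / h) ** 3) ** 3  # tricube weights, zero at window edge
        # weighted least squares for a local intercept and slope
        A = np.column_stack([np.ones(k), x[idx]])
        W = np.diag(w)
        beta = np.linalg.solve(A.T @ W @ A, A.T @ W @ y[idx])
        fitted[i] = beta[0] + beta[1] * x0  # evaluate the local line at x0
    return fitted
```

For example, `loess(x, y, x, frac=0.3)` returns a smoothed version of a noisy series; larger values of `frac` give smoother but more biased fits, which is why the repeated local regressions make the method computationally intensive compared with a single global fit.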

References

[CLE1] Cleveland W S, Devlin S J (1988) Locally Weighted Regression: An Approach to Regression Analysis by Local Fitting. J of the American Statistical Association, 83(403), 596–610