Degrees of freedom
The term degrees of freedom, often denoted DF or df, was introduced by R. A. Fisher in 1925. The simplest way to understand the concept is to consider a simple arithmetic situation, such as the sum of a set of n positive integers. If you know the total, or the mean value (the total divided by n), then once any n-1 of the integers have been chosen the remaining value is fixed: it must bring the sum up to the known total. In other words, you are free to choose n-1 of the numbers, so there are n-1 degrees of freedom. If you know two such independent facts about the set of numbers, there are n-2 degrees of freedom.

For certain probability distributions, notably the t-distribution, F-distribution and chi-square distribution, the number of degrees of freedom is the parameter (or parameters) that determines the shape of the distribution, so a df value is required whenever these distributions are used in hypothesis testing and inference.
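To make this concrete, the short Python sketch below (with illustrative values only) first shows that fixing the total of five numbers leaves only four free choices, and then uses the same n-1 as the df parameter of the t-distribution in a one-sample t-test:

```python
import numpy as np
from scipy import stats

# A set of n values with a known total: choose any n-1 freely,
# and the last value is forced, so there are n-1 degrees of freedom.
total = 100
free_choices = [12, 7, 31, 25]       # n-1 = 4 values chosen freely
forced = total - sum(free_choices)   # the 5th value is determined
print(forced)                        # 25

# The same n-1 appears as the df parameter of the t-distribution
# when testing a mean using the sample standard deviation.
x = np.array([4.1, 3.8, 5.0, 4.4, 4.7])   # illustrative sample
n = len(x)
t_stat = (x.mean() - 4.0) / (x.std(ddof=1) / np.sqrt(n))  # ddof=1: one df used by the mean
p_value = 2 * stats.t.sf(abs(t_stat), df=n - 1)           # t-distribution with n-1 df
print(t_stat, p_value)
```

Note the ddof=1 argument: estimating the mean from the data uses up one degree of freedom, which is why the sample standard deviation divides by n-1 rather than n.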
In some situations (certain forms of statistical modeling) degrees of freedom are needed in a computation, but the correct value to use is not immediately obvious. Typically a procedure is applied, often involving the trace of a matrix (such as the hat or smoothing matrix), that produces a value described as the effective degrees of freedom.
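One common instance of this idea is ridge regression, which produces fitted values of the form y_hat = Hy for a matrix H, and whose effective degrees of freedom are usually taken to be trace(H). The numpy sketch below (synthetic data and an assumed penalty value, chosen only for illustration) computes that quantity:

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(50, 6))   # 50 observations, 6 predictors (synthetic data)
lam = 2.5                      # ridge penalty; larger values shrink the fit harder

# For a linear smoother y_hat = H y, the effective degrees of freedom
# are commonly defined as trace(H). For ridge regression,
# H = X (X'X + lam*I)^{-1} X'.
H = X @ np.linalg.solve(X.T @ X + lam * np.eye(X.shape[1]), X.T)
edf = np.trace(H)
print(edf)   # between 0 and 6; equals 6 (the ordinary least-squares df) when lam = 0
```

As the penalty lam grows, trace(H) falls below the number of predictors, reflecting the fact that the penalized fit is less flexible than an unconstrained one.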