public class OLS extends java.lang.Object implements Regression<double[]>, java.io.Serializable
The OLS estimator is consistent when the independent variables are exogenous and there is no multicollinearity, and optimal in the class of linear unbiased estimators when the errors are homoscedastic and serially uncorrelated. Under these conditions, the method of OLS provides minimum-variance mean-unbiased estimation when the errors have finite variances.
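To make the estimator concrete, here is a minimal standalone sketch (independent of this class; the class name OlsSketch is illustrative) that fits a single-predictor model with the closed-form least-squares solution:

```java
// Standalone illustration of the OLS solution for one predictor:
// slope = cov(x, y) / var(x), intercept = mean(y) - slope * mean(x).
public class OlsSketch {
    /** Returns {intercept, slope} fitted by ordinary least squares. */
    public static double[] fit(double[] x, double[] y) {
        int n = x.length;
        double mx = 0.0, my = 0.0;
        for (int i = 0; i < n; i++) { mx += x[i]; my += y[i]; }
        mx /= n; my /= n;
        double sxy = 0.0, sxx = 0.0;
        for (int i = 0; i < n; i++) {
            sxy += (x[i] - mx) * (y[i] - my);
            sxx += (x[i] - mx) * (x[i] - mx);
        }
        double slope = sxy / sxx;
        double intercept = my - slope * mx;
        return new double[] { intercept, slope };
    }

    public static void main(String[] args) {
        // Data lie exactly on y = 1 + 2x, so OLS recovers intercept 1, slope 2.
        double[] x = { 1, 2, 3, 4 };
        double[] y = { 3, 5, 7, 9 };
        double[] beta = fit(x, y);
        System.out.println(beta[0] + " " + beta[1]);
    }
}
```

The OLS class below solves the same minimization problem for many predictors, via QR or SVD factorization rather than this closed form.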
There are several different frameworks in which the linear regression model can be cast in order to make the OLS technique applicable. Each of these settings produces the same formulas and the same results; the only difference is the interpretation and the assumptions that have to be imposed in order for the method to give meaningful results. The choice of the applicable framework depends mostly on the nature of the data at hand and on the inference task to be performed.
Least squares corresponds to the maximum likelihood criterion if the experimental errors have a normal distribution and can also be derived as a method of moments estimator.
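The equivalence with maximum likelihood under normal errors can be seen in one line of algebra (a standard derivation, sketched here for context):

```latex
\text{If } y_i = x_i^\top \beta + \varepsilon_i,\quad
\varepsilon_i \sim \mathcal{N}(0, \sigma^2) \text{ i.i.d., then}
\\
\ell(\beta, \sigma^2)
  = -\frac{n}{2}\log(2\pi\sigma^2)
    - \frac{1}{2\sigma^2}\sum_{i=1}^{n} \bigl(y_i - x_i^\top \beta\bigr)^2 .
```

For any fixed \(\sigma^2\), the log-likelihood is maximized in \(\beta\) exactly when the residual sum of squares \(\sum_i (y_i - x_i^\top \beta)^2\) is minimized, i.e., by the least-squares estimator.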
Once a regression model has been constructed, it may be important to confirm the goodness of fit of the model and the statistical significance of the estimated parameters. Commonly used checks of goodness of fit include the R-squared, analysis of the pattern of residuals, and hypothesis testing. Statistical significance can be checked by an F-test of the overall fit, followed by t-tests of individual parameters.
Interpretations of these diagnostic tests rest heavily on the model assumptions. Although examination of the residuals can be used to invalidate a model, the results of a t-test or F-test are sometimes more difficult to interpret if the model's assumptions are violated. For example, if the error term does not have a normal distribution, in small samples the estimated parameters will not follow normal distributions, complicating inference. With relatively large samples, however, a central limit theorem can be invoked such that hypothesis testing may proceed using asymptotic approximations.
Modifier and Type    Class and Description
static class         OLS.Trainer
                     Trainer for linear regression by ordinary least squares.
Constructor and Description
OLS(double[][] x, double[] y)
    Constructor.
OLS(double[][] x, double[] y, boolean SVD)
    Constructor.
Modifier and Type    Method and Description
double               adjustedRSquared()
                     Returns the adjusted R^{2} statistic.
double[]             coefficients()
                     Returns the linear coefficients (without intercept).
int                  df()
                     Returns the degrees of freedom of the residual standard error.
double               error()
                     Returns the residual standard error.
double               ftest()
                     Returns the F-statistic of the goodness-of-fit test.
double               intercept()
                     Returns the intercept.
double               predict(double[] x)
                     Predicts the dependent variable of an instance.
double               pvalue()
                     Returns the p-value of the goodness-of-fit test.
double[]             residuals()
                     Returns the residuals, that is, the response minus the fitted values.
double               RSquared()
                     Returns the R^{2} statistic.
double               RSS()
                     Returns the residual sum of squares.
java.lang.String     toString()
double[][]           ttest()
                     Returns the t-test of the coefficients (including intercept).
Methods inherited from class java.lang.Object:
clone, equals, finalize, getClass, hashCode, notify, notifyAll, wait, wait, wait

Methods inherited from interface Regression:
predict
public OLS(double[][] x, double[] y)
Constructor.
Parameters:
x - a matrix containing the explanatory variables. NO NEED to include a constant column of 1s for bias.
y - the response values.

public OLS(double[][] x, double[] y, boolean SVD)
Constructor.
Parameters:
x - a matrix containing the explanatory variables. NO NEED to include a constant column of 1s for bias.
y - the response values.
SVD - If true, use SVD to fit the model. Otherwise, use QR decomposition. SVD is slower than QR but can handle rank-deficient matrices.

public double[][] ttest()
public double[] coefficients()
public double intercept()
public double[] residuals()
public double RSS()
public double error()
public int df()
public double RSquared()
In the case of ordinary least-squares regression, R^{2} increases as we increase the number of variables in the model (R^{2} will not decrease). This illustrates a drawback to one possible use of R^{2}, where one might try to include more variables in the model until "there is no more improvement". This leads to the alternative approach of looking at the adjusted R^{2}.
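The adjustment can be illustrated with the standard formula adjR^{2} = 1 - (1 - R^{2})(n - 1)/(n - p - 1), where n is the sample size and p the number of predictors (a textbook definition, not necessarily the exact internal computation of this class; the class name AdjustedR2 is illustrative):

```java
// Standard adjusted R-squared: penalizes each additional predictor,
// so a useless variable can lower the adjusted value even though
// the raw R-squared never decreases.
public class AdjustedR2 {
    /** n = sample size, p = number of predictors (excluding intercept). */
    public static double adjust(double r2, int n, int p) {
        return 1.0 - (1.0 - r2) * (n - 1) / (double) (n - p - 1);
    }

    public static void main(String[] args) {
        // Same raw fit r2 = 0.9 on n = 12 observations:
        // with p = 1 predictor:  1 - 0.1 * 11 / 10 = 0.89
        // with p = 2 predictors: 1 - 0.1 * 11 / 9  ~ 0.878 (penalized)
        System.out.println(adjust(0.9, 12, 1));
        System.out.println(adjust(0.9, 12, 2));
    }
}
```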
public double adjustedRSquared()
public double ftest()
public double pvalue()
public double predict(double[] x)
Predicts the dependent variable of an instance.
Specified by:
predict in interface Regression<double[]>
Parameters:
x - the instance.

public java.lang.String toString()
Overrides:
toString in class java.lang.Object