public class GaussianProcessRegression<T> extends java.lang.Object implements Regression<T>, java.io.Serializable
A Gaussian process can be used as a prior probability distribution over functions in Bayesian inference. Given any set of N points in the desired domain of your functions, take a multivariate Gaussian whose covariance matrix parameter is the Gram matrix of N points with some desired kernel, and sample from that Gaussian. Inference of continuous values with a Gaussian process prior is known as Gaussian process regression.
The fitting is performed in the reproducing kernel Hilbert space with the "kernel trick". The loss function is squared error. This also arises as the kriging estimate of a Gaussian random field in spatial statistics.
A significant problem with Gaussian process prediction is that it typically scales as O(n^{3}). For large problems (e.g. n > 10,000), both storing the Gram matrix and solving the associated linear systems are prohibitive on modern workstations. An extensive range of proposals has been suggested to deal with this problem. A popular approach is the reduced-rank approximation of the Gram matrix, known as the Nyström approximation. Greedy approximation is another popular approach that uses an active set of training points of size m selected from the training set of size n > m. We assume that it is impossible to search for the optimal subset of size m due to combinatorics. The points in the active set could be selected randomly, but in general we might expect better performance if the points are selected greedily w.r.t. some criterion. Recently, researchers have proposed relaxing the constraint that the inducing variables must be a subset of training/test cases, turning the discrete selection problem into one of continuous optimization.
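The cubic cost comes from factorizing the n-by-n regularized Gram matrix. The following self-contained sketch (illustrative only, not Smile's implementation; the class name, the RBF kernel, and the hyperparameters are assumptions) shows what an exact-GP fit does conceptually: solve (K + lambda*I) alpha = y by Cholesky decomposition, then predict with f(x) = sum_i alpha_i k(x, x_i).

```java
class GpSketch {
    // RBF kernel k(a, b) = exp(-||a-b||^2 / (2 sigma^2)); sigma is an assumed hyperparameter.
    static double rbf(double[] a, double[] b, double sigma) {
        double d2 = 0;
        for (int i = 0; i < a.length; i++) {
            double d = a[i] - b[i];
            d2 += d * d;
        }
        return Math.exp(-d2 / (2 * sigma * sigma));
    }

    // Fit coefficients alpha by solving (K + lambda I) alpha = y.
    static double[] fit(double[][] x, double[] y, double sigma, double lambda) {
        int n = x.length;
        double[][] a = new double[n][n];
        for (int i = 0; i < n; i++) {
            for (int j = 0; j < n; j++) {
                a[i][j] = rbf(x[i], x[j], sigma);
            }
            a[i][i] += lambda; // shrinkage/regularization on the diagonal
        }
        // Cholesky factorization A = L L^T: this loop is the O(n^3) bottleneck.
        double[][] l = new double[n][n];
        for (int i = 0; i < n; i++) {
            for (int j = 0; j <= i; j++) {
                double s = a[i][j];
                for (int k = 0; k < j; k++) s -= l[i][k] * l[j][k];
                l[i][j] = (i == j) ? Math.sqrt(s) : s / l[j][j];
            }
        }
        // Forward substitution L z = y, then back substitution L^T alpha = z.
        double[] z = new double[n];
        for (int i = 0; i < n; i++) {
            double s = y[i];
            for (int k = 0; k < i; k++) s -= l[i][k] * z[k];
            z[i] = s / l[i][i];
        }
        double[] alpha = new double[n];
        for (int i = n - 1; i >= 0; i--) {
            double s = z[i];
            for (int k = i + 1; k < n; k++) s -= l[k][i] * alpha[k];
            alpha[i] = s / l[i][i];
        }
        return alpha;
    }

    // Prediction is a kernel expansion over the training points.
    static double predict(double[][] x, double[] alpha, double[] q, double sigma) {
        double f = 0;
        for (int i = 0; i < x.length; i++) f += alpha[i] * rbf(x[i], q, sigma);
        return f;
    }
}
```

The Nyström and active-set methods described above exist precisely to avoid forming and factorizing the full n-by-n matrix in this sketch.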
Nested Class Summary

Modifier and Type    Class and Description
static class         GaussianProcessRegression.Trainer<T>
                     Trainer for Gaussian Process for Regression.
Constructor Summary

Constructor and Description
GaussianProcessRegression(T[] x, double[] y, MercerKernel<T> kernel, double lambda)
    Constructor.
GaussianProcessRegression(T[] x, double[] y, T[] t, MercerKernel<T> kernel, double lambda)
    Constructor.
Method Summary

Modifier and Type    Method and Description
double[]             coefficients()
                     Returns the coefficients.
double               predict(T x)
                     Predicts the dependent variable of an instance.
double               shrinkage()
                     Returns the shrinkage parameter.
Methods inherited from class java.lang.Object:
clone, equals, finalize, getClass, hashCode, notify, notifyAll, toString, wait, wait, wait

Methods inherited from interface Regression:
predict
Constructor Detail

GaussianProcessRegression

public GaussianProcessRegression(T[] x, double[] y, MercerKernel<T> kernel, double lambda)
Constructor.
Parameters:
x - the training dataset.
y - the response variable.
kernel - the Mercer kernel.
lambda - the shrinkage/regularization parameter.

GaussianProcessRegression

public GaussianProcessRegression(T[] x, double[] y, T[] t, MercerKernel<T> kernel, double lambda)
Constructor.
Parameters:
x - the training dataset.
y - the response variable.
t - the inducing inputs, which are pre-selected samples acting as the active set of regressors. In the simple case, these can be chosen randomly from the training set or as the centers of k-means clustering.
kernel - the Mercer kernel.
lambda - the shrinkage/regularization parameter.

Method Detail

coefficients

public double[] coefficients()
Returns the coefficients.

shrinkage

public double shrinkage()
Returns the shrinkage parameter.

predict

public double predict(T x)
Predicts the dependent variable of an instance.
Specified by:
predict in interface Regression<T>
Parameters:
x - the instance.
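As the documentation of the t parameter notes, in the simple case the inducing inputs can be chosen randomly from the training set. A minimal helper sketching that selection (the class and method names here are hypothetical, not part of Smile's API):

```java
import java.util.ArrayList;
import java.util.Collections;
import java.util.List;
import java.util.Random;

class InducingPoints {
    // Pick m inducing inputs uniformly at random from the training set x.
    // A fixed seed makes the selection reproducible.
    static <T> List<T> randomSubset(List<T> x, int m, long seed) {
        List<T> copy = new ArrayList<>(x);
        Collections.shuffle(copy, new Random(seed));
        return copy.subList(0, m);
    }
}
```

The selected subset would then be passed as the t argument of the second constructor; centers from k-means clustering, as also mentioned above, typically cover the input space more evenly than a random draw.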