Class RBFNetwork<T>

Type Parameters:
T - the data type of samples.

All Implemented Interfaces:
Serializable, ToDoubleFunction<T>, Regression<T>
In its basic form, a radial basis function network takes the form

y(x) = Σ w_{i} φ(||x - c_{i}||)

where the approximating function y(x) is represented as a sum of N radial basis functions φ, each associated with a different center c_{i} and weighted by an appropriate coefficient w_{i}. For the distance, one usually chooses the Euclidean distance. Because the approximating function is linear in the weights, the weights w_{i} can be estimated using the matrix methods of linear least squares.
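Since the model above is linear in the weights, fitting reduces to solving a linear system. Below is a minimal self-contained sketch with Gaussian basis functions and the centers taken to be the training points themselves; all class and method names are illustrative, not the smile API:

```java
// Sketch of the basic RBF model y(x) = sum_i w_i * phi(|x - c_i|), with
// Gaussian basis phi(r) = exp(-(r/r0)^2) and centers c_i = training points.
// Weights come from solving the linear system Phi * w = y.
public class RbfSketch {
    static double phi(double r, double r0) {
        return Math.exp(-(r / r0) * (r / r0));
    }

    // Solve a * w = b by Gaussian elimination with partial pivoting.
    static double[] solve(double[][] a, double[] b) {
        int n = b.length;
        for (int k = 0; k < n; k++) {
            int p = k;
            for (int i = k + 1; i < n; i++)
                if (Math.abs(a[i][k]) > Math.abs(a[p][k])) p = i;
            double[] t = a[k]; a[k] = a[p]; a[p] = t;
            double tb = b[k]; b[k] = b[p]; b[p] = tb;
            for (int i = k + 1; i < n; i++) {
                double f = a[i][k] / a[k][k];
                for (int j = k; j < n; j++) a[i][j] -= f * a[k][j];
                b[i] -= f * b[k];
            }
        }
        double[] w = new double[n];
        for (int i = n - 1; i >= 0; i--) {
            double s = b[i];
            for (int j = i + 1; j < n; j++) s -= a[i][j] * w[j];
            w[i] = s / a[i][i];
        }
        return w;
    }

    public static void main(String[] args) {
        double[] centers = {0.0, 1.0, 2.0, 3.0};  // c_i: here, the training points
        double[] y = {0.0, 1.0, 4.0, 9.0};        // targets (y = x^2 sampled)
        double r0 = 1.0;

        // Design matrix Phi[i][j] = phi(|x_i - c_j|)
        int n = centers.length;
        double[][] Phi = new double[n][n];
        for (int i = 0; i < n; i++)
            for (int j = 0; j < n; j++)
                Phi[i][j] = phi(Math.abs(centers[i] - centers[j]), r0);
        double[] w = solve(Phi, y.clone());

        // y(x) = sum_i w_i * phi(|x - c_i|); exact at a training point
        double yhat = 0;
        for (int i = 0; i < n; i++)
            yhat += w[i] * phi(Math.abs(2.0 - centers[i]), r0);
        System.out.println(Math.abs(yhat - 4.0) < 1e-6);  // prints "true"
    }
}
```

With as many centers as samples the system is square and the model interpolates the data; with fewer centers than samples, Phi is rectangular and the weights would come from least squares instead.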
The points c_{i} are often called the centers of the RBF network. They can be randomly selected from the training data, learned by a clustering method (e.g. k-means), or learned together with the weights in a supervised learning process (e.g. error-correction learning).
Popular choices for φ include the Gaussian function and the so-called thin plate splines. The advantage of thin plate splines is that their conditioning is invariant under scaling. The Gaussian, multiquadric and inverse multiquadric are infinitely smooth and involve a scale or shape parameter, r_{0} > 0. Decreasing r_{0} tends to flatten the basis function. For a given function, the quality of approximation may strongly depend on this parameter. In particular, increasing r_{0} has the effect of better conditioning (the separation distance of the scaled points increases).
A variant on RBF networks is the normalized radial basis function (NRBF) network, in which the basis functions are required to sum to unity. NRBF arises more naturally from a Bayesian statistical perspective. However, there is no evidence that either method is consistently superior to the other.
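The normalization can be sketched as follows: each basis response is divided by the sum of all responses, so the normalized activations sum to one and the output becomes a weighted average of the w_{i}. Illustrative code only, not the smile API:

```java
// Sketch of the normalized (NRBF) variant: activations h_i = phi_i / sum_j phi_j
// sum to one, making y(x) a convex combination of the weights w_i.
public class NrbfSketch {
    public static void main(String[] args) {
        double[] centers = {0.0, 1.0, 2.0};
        double[] w = {1.0, -2.0, 0.5};
        double x = 0.7, r0 = 1.0;

        // Raw Gaussian basis responses and their sum
        double[] phi = new double[centers.length];
        double sum = 0;
        for (int i = 0; i < centers.length; i++) {
            double r = Math.abs(x - centers[i]);
            phi[i] = Math.exp(-(r / r0) * (r / r0));
            sum += phi[i];
        }

        // Normalized activations: divide by the sum, accumulate the output
        double norm = 0, y = 0;
        for (int i = 0; i < phi.length; i++) {
            double h = phi[i] / sum;  // normalized basis value
            norm += h;
            y += w[i] * h;
        }
        System.out.println(Math.abs(norm - 1.0) < 1e-12);  // prints "true"
    }
}
```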
References
 Simon Haykin. Neural Networks: A Comprehensive Foundation (2nd edition). 1999.
 T. Poggio and F. Girosi. Networks for approximation and learning. Proc. IEEE 78(9):1484-1487, 1990.
 Nabil Benoudjit and Michel Verleysen. On the kernel widths in radial-basis function networks. Neural Processing Letters, 2003.
Nested Class Summary

Nested classes/interfaces inherited from interface smile.regression.Regression:
Regression.Trainer<T, M extends Regression<T>>
Constructor Summary

RBFNetwork(RBF<T>[] rbf, double[] w, boolean normalized) - Constructor.
Method Summary
static RBFNetwork<double[]> fit(double[][] x, double[] y, Properties params) - Fits an RBF network.
static <T> RBFNetwork<T> fit(T[] x, double[] y, RBF<T>[] rbf) - Fits an RBF network.
static <T> RBFNetwork<T> fit(T[] x, double[] y, RBF<T>[] rbf, boolean normalized) - Fits an RBF network.
boolean isNormalized() - Returns true if the model is normalized.
double predict(T x) - Predicts the dependent variable of an instance.

Methods inherited from class java.lang.Object
clone, equals, finalize, getClass, hashCode, notify, notifyAll, toString, wait, wait, wait
Methods inherited from interface smile.regression.Regression
applyAsDouble, online, predict, predict, predict, update, update, update

Constructor Details

RBFNetwork

Constructor.

Parameters:
rbf - the radial basis functions.
w - the weights of RBFs.
normalized - true if this is a normalized RBF network.


Method Details

fit

Fits an RBF network.

Type Parameters:
T - the data type of samples.
Parameters:
x - the training dataset.
y - the response variable.
rbf - the radial basis functions.
Returns:
the model.

fit

Fits an RBF network.

Type Parameters:
T - the data type of samples.
Parameters:
x - the training dataset.
y - the response variable.
rbf - the radial basis functions.
normalized - true for the normalized RBF network.
Returns:
the model.

fit

Fits an RBF network.

Parameters:
x - the training samples.
y - the response variable.
params - the hyperparameters.
Returns:
the model.

isNormalized

public boolean isNormalized()

Returns true if the model is normalized.

Returns:
true if the model is normalized.

predict

Description copied from interface: Regression

Predicts the dependent variable of an instance.

Specified by:
predict in interface Regression<T>
Parameters:
x - an instance.
Returns:
the predicted value of the dependent variable.
