public class RBFNetwork<T> extends java.lang.Object implements Classifier<T>
In its basic form, a radial basis function network takes the form
y(x) = Σ w_{i} φ(||x - c_{i}||)
where the approximating function y(x) is represented as a sum of N radial basis functions φ, each associated with a different center c_{i} and weighted by an appropriate coefficient w_{i}. The distance ||x - c_{i}|| is usually the Euclidean distance. Because the approximating function is linear in the weights, the weights w_{i} can be estimated by the matrix methods of linear least squares.
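The weight estimation described above can be sketched in a few lines. This is a minimal, self-contained illustration for scalar inputs, not the library's implementation; the class and method names (`RbfLeastSquares`, `fitWeights`) are hypothetical, and it solves the normal equations (Φ^T Φ) w = Φ^T y by Gaussian elimination for clarity rather than using a numerically robust decomposition.

```java
import java.util.function.DoubleUnaryOperator;

public class RbfLeastSquares {
    // Evaluate y(x) = sum_i w[i] * phi(|x - c[i]|) for scalar inputs.
    static double evaluate(double x, double[] centers, double[] w, DoubleUnaryOperator phi) {
        double y = 0.0;
        for (int i = 0; i < centers.length; i++) {
            y += w[i] * phi.applyAsDouble(Math.abs(x - centers[i]));
        }
        return y;
    }

    // Fit w by least squares on the design matrix Phi[j][i] = phi(|x[j] - c[i]|),
    // solving the normal equations (Phi^T Phi) w = Phi^T y.
    static double[] fitWeights(double[] x, double[] y, double[] centers, DoubleUnaryOperator phi) {
        int n = x.length, k = centers.length;
        double[][] Phi = new double[n][k];
        for (int j = 0; j < n; j++)
            for (int i = 0; i < k; i++)
                Phi[j][i] = phi.applyAsDouble(Math.abs(x[j] - centers[i]));

        // Build the augmented system A = [Phi^T Phi | Phi^T y].
        double[][] A = new double[k][k + 1];
        for (int a = 0; a < k; a++) {
            for (int b = 0; b < k; b++)
                for (int j = 0; j < n; j++)
                    A[a][b] += Phi[j][a] * Phi[j][b];
            for (int j = 0; j < n; j++)
                A[a][k] += Phi[j][a] * y[j];
        }

        // Gaussian elimination with partial pivoting.
        for (int p = 0; p < k; p++) {
            int max = p;
            for (int r = p + 1; r < k; r++)
                if (Math.abs(A[r][p]) > Math.abs(A[max][p])) max = r;
            double[] t = A[p]; A[p] = A[max]; A[max] = t;
            for (int r = p + 1; r < k; r++) {
                double f = A[r][p] / A[p][p];
                for (int col = p; col <= k; col++) A[r][col] -= f * A[p][col];
            }
        }

        // Back substitution.
        double[] w = new double[k];
        for (int p = k - 1; p >= 0; p--) {
            double s = A[p][k];
            for (int col = p + 1; col < k; col++) s -= A[p][col] * w[col];
            w[p] = s / A[p][p];
        }
        return w;
    }
}
```

If the training targets are generated exactly by the model, the least-squares solution recovers the true weights.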
The centers c_{i} can be randomly selected from the training data, learned by a clustering method (e.g. k-means), or learned together with the weight parameters in a supervised learning process (e.g. error-correction learning).
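To make the clustering option concrete, here is a minimal Lloyd's-algorithm sketch that picks k centers from 1-D data. It is illustrative only (the `KMeansCenters` class is hypothetical, the seeding with the first k points is naive, and it is not the library's k-means), but it shows how cluster means become basis-function centers.

```java
import java.util.Arrays;

public class KMeansCenters {
    // Pick k RBF centers from 1-D data by Lloyd's algorithm.
    // Naively seeds with the first k points (assumes they are distinct).
    static double[] centers(double[] x, int k, int iters) {
        double[] c = Arrays.copyOf(x, k);
        for (int it = 0; it < iters; it++) {
            double[] sum = new double[k];
            int[] count = new int[k];
            // Assign each point to its nearest center.
            for (double xi : x) {
                int best = 0;
                for (int j = 1; j < k; j++)
                    if (Math.abs(xi - c[j]) < Math.abs(xi - c[best])) best = j;
                sum[best] += xi;
                count[best]++;
            }
            // Move each center to the mean of its assigned points.
            for (int j = 0; j < k; j++)
                if (count[j] > 0) c[j] = sum[j] / count[j];
        }
        return c;
    }
}
```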
Popular choices for φ include the Gaussian function and the so-called thin plate splines. The advantage of thin plate splines is that their conditioning is invariant under scalings. The Gaussian, multiquadric and inverse multiquadric are infinitely smooth and involve a scale or shape parameter, r_{0} > 0. Decreasing r_{0} tends to flatten the basis function. For a given function, the quality of approximation may strongly depend on this parameter. In particular, increasing r_{0} has the effect of better conditioning (the separation distance of the scaled points increases).
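For reference, the basis functions named above have these standard forms as functions of the distance r (a sketch; the `BasisFunctions` class name is illustrative and not part of the library):

```java
public class BasisFunctions {
    // Gaussian: phi(r) = exp(-(r/r0)^2)
    static double gaussian(double r, double r0) {
        return Math.exp(-(r * r) / (r0 * r0));
    }

    // Thin plate spline: phi(r) = r^2 log(r), with phi(0) = 0 by convention.
    static double thinPlateSpline(double r) {
        return r <= 0 ? 0 : r * r * Math.log(r);
    }

    // Multiquadric: phi(r) = sqrt(r^2 + r0^2)
    static double multiquadric(double r, double r0) {
        return Math.sqrt(r * r + r0 * r0);
    }

    // Inverse multiquadric: phi(r) = 1 / sqrt(r^2 + r0^2)
    static double inverseMultiquadric(double r, double r0) {
        return 1.0 / Math.sqrt(r * r + r0 * r0);
    }
}
```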
A variant of RBF networks is the normalized radial basis function (NRBF) network, in which the basis functions are required to sum to unity. NRBF arises more naturally from a Bayesian statistical perspective. However, there is no evidence that either the NRBF method is consistently superior to the RBF method, or vice versa.
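The normalization amounts to dividing each basis response by the sum of all responses, so the effective basis values at any input sum to one. A minimal sketch with Gaussian bases (the `NormalizedRbf` class is illustrative, not the library's code):

```java
public class NormalizedRbf {
    // Normalized RBF output: y(x) = sum_i w[i]*phi_i(x) / sum_j phi_j(x).
    // Because the normalized bases sum to one, the output is a convex-like
    // combination of the weights; if all weights are equal, y(x) is constant.
    static double evaluate(double x, double[] c, double[] w, double r0) {
        double num = 0.0, den = 0.0;
        for (int i = 0; i < c.length; i++) {
            double phi = Math.exp(-Math.pow((x - c[i]) / r0, 2));
            num += w[i] * phi;
            den += phi;
        }
        return num / den;
    }
}
```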
SVMs with a Gaussian kernel have a structure similar to that of RBF networks with Gaussian radial basis functions. However, the SVM approach "automatically" solves the network complexity problem, since the size of the hidden layer is obtained as a result of the QP procedure. Hidden neurons and support vectors correspond to each other, so the center-selection problem of the RBF network is also solved: the support vectors serve as the basis function centers. It has been reported that, with a similar number of support vectors/centers, an SVM shows better generalization performance than an RBF network when the training data set is relatively small. On the other hand, an RBF network gives better generalization performance than an SVM on large training data.
See Also:
RadialBasisFunction, SVM, MLP, Serialized Form

Constructor and Description

RBFNetwork(int k, RBF<T>[] rbf, Matrix w, boolean normalized)
Constructor.

RBFNetwork(int k, RBF<T>[] rbf, Matrix w, boolean normalized, IntSet labels)
Constructor.

Modifier and Type  Method and Description 

static <T> RBFNetwork<T>
fit(T[] x, int[] y, RBF<T>[] rbf)
Fits a RBF network.

static <T> RBFNetwork<T>
fit(T[] x, int[] y, RBF<T>[] rbf, boolean normalized)
Fits a RBF network.

boolean 
isNormalized()
Returns true if the model is normalized.

int 
predict(T x)
Predicts the class label of an instance.

Methods inherited from class java.lang.Object:
clone, equals, finalize, getClass, hashCode, notify, notifyAll, toString, wait, wait, wait

Methods inherited from interface Classifier:
applyAsDouble, applyAsInt, predict, score
public RBFNetwork(int k, RBF<T>[] rbf, Matrix w, boolean normalized)
Constructor.
Parameters:
k - the number of classes.
rbf - the radial basis functions.
w - the weights of the RBFs.
normalized - true if this is a normalized RBF network.

public static <T> RBFNetwork<T> fit(T[] x, int[] y, RBF<T>[] rbf)
Fits a RBF network.
Parameters:
x - training samples.
y - training labels in [0, k), where k is the number of classes.
rbf - the radial basis functions.

public static <T> RBFNetwork<T> fit(T[] x, int[] y, RBF<T>[] rbf, boolean normalized)
Fits a RBF network.
Parameters:
x - training samples.
y - training labels in [0, k), where k is the number of classes.
rbf - the radial basis functions.
normalized - true for the normalized RBF network.

public boolean isNormalized()
Returns true if the model is normalized.

public int predict(T x)
Predicts the class label of an instance.
Specified by:
predict in interface Classifier<T>
Parameters:
x - the instance to be classified.
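For classification, the network computes one linear combination of the basis responses per class and predicts the class with the largest score. A minimal sketch of that argmax step (the `RbfClassify` class and its plain-array representation of the weight matrix are illustrative, not the library's internals):

```java
public class RbfClassify {
    // Class scores are linear in the basis responses:
    //   score[c] = sum_i W[c][i] * phi[i]
    // where phi[i] is the i-th radial basis function evaluated at the input.
    // predict returns the class with the highest score.
    static int predict(double[] phi, double[][] W) {
        int best = 0;
        double bestScore = Double.NEGATIVE_INFINITY;
        for (int c = 0; c < W.length; c++) {
            double s = 0.0;
            for (int i = 0; i < phi.length; i++) s += W[c][i] * phi[i];
            if (s > bestScore) { bestScore = s; best = c; }
        }
        return best;
    }
}
```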