public class SVM&lt;T&gt; extends java.lang.Object implements OnlineClassifier&lt;T&gt;, SoftClassifier&lt;T&gt;, java.io.Serializable

Type Parameters:
T - the type of input object.
If there exists no hyperplane that can perfectly split the positive and negative instances, the soft margin method will choose a hyperplane that splits the instances as cleanly as possible, while still maximizing the distance to the nearest cleanly split instances.
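The trade-off described above is captured by the soft-margin objective, 0.5 * ||w||^2 + C * sum_i max(0, 1 - y_i * (w . x_i + b)): the first term maximizes the margin, the second penalizes instances that violate it. A minimal sketch in plain Java (not part of this class's API; the data and parameter values are illustrative):

```java
public class SoftMarginObjective {
    // Soft-margin objective for a linear classifier:
    // 0.5 * ||w||^2 + C * sum_i max(0, 1 - y_i * (w . x_i + b))
    static double objective(double[][] x, int[] y, double[] w, double b, double C) {
        double norm2 = 0.0;
        for (double wj : w) norm2 += wj * wj;

        double hinge = 0.0;
        for (int i = 0; i < x.length; i++) {
            double f = b;
            for (int j = 0; j < w.length; j++) f += w[j] * x[i][j];
            hinge += Math.max(0.0, 1.0 - y[i] * f);  // slack of instance i
        }
        return 0.5 * norm2 + C * hinge;
    }

    public static void main(String[] args) {
        double[][] x = {{2.0}, {-2.0}, {0.5}};  // the last point violates the margin
        int[] y = {+1, -1, -1};
        double[] w = {1.0};
        // A larger C penalizes the margin violation more heavily.
        System.out.println(objective(x, y, w, 0.0, 1.0));   // prints 2.0
        System.out.println(objective(x, y, w, 0.0, 10.0));  // prints 15.5
    }
}
```

The choice of C thus controls how much the optimizer is willing to accept misclassified or margin-violating instances in exchange for a wider margin.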
Nonlinear SVMs are created by applying the kernel trick to maximum-margin hyperplanes. The resulting algorithm is formally similar, except that every dot product is replaced by a nonlinear kernel function. This allows the algorithm to fit the maximum-margin hyperplane in a transformed feature space. The transformation may be nonlinear and the transformed space may be high-dimensional. For example, the feature space corresponding to the Gaussian kernel is a Hilbert space of infinite dimension. Thus, although the classifier is a hyperplane in the high-dimensional feature space, it may be nonlinear in the original input space. Maximum margin classifiers are well regularized, so the infinite dimension does not spoil the results.
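As a concrete illustration of the kernel trick, a trained kernel SVM classifies by the decision function f(x) = sum_i alpha_i * y_i * k(x_i, x) + b over the support vectors, where only kernel evaluations appear, never explicit feature-space coordinates. A self-contained sketch with a Gaussian kernel (the support vectors, coefficients, and gamma below are illustrative, unrelated to this class's internals):

```java
public class KernelDecision {
    // Gaussian (RBF) kernel: k(u, v) = exp(-gamma * ||u - v||^2)
    static double gaussian(double[] u, double[] v, double gamma) {
        double d2 = 0.0;
        for (int j = 0; j < u.length; j++) {
            double d = u[j] - v[j];
            d2 += d * d;
        }
        return Math.exp(-gamma * d2);
    }

    // Decision value f(x) = sum_i alpha_i * y_i * k(sv_i, x) + b.
    // Linear in the implicit feature space, but nonlinear in the
    // original input space.
    static double decision(double[][] sv, int[] y, double[] alpha,
                           double b, double gamma, double[] x) {
        double f = b;
        for (int i = 0; i < sv.length; i++) {
            f += alpha[i] * y[i] * gaussian(sv[i], x, gamma);
        }
        return f;
    }

    public static void main(String[] args) {
        double[][] sv = {{0.0, 0.0}, {2.0, 2.0}};  // illustrative support vectors
        int[] y = {+1, -1};
        double[] alpha = {1.0, 1.0};
        double f = decision(sv, y, alpha, 0.0, 0.5, new double[]{0.1, 0.0});
        // The query point is much closer to the positive support vector.
        System.out.println(f > 0 ? +1 : -1);  // prints 1
    }
}
```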
The effectiveness of SVM depends on the selection of the kernel, the kernel's parameters, and the soft margin parameter C. Given a kernel, the best combination of C and the kernel's parameters is often selected by a grid search with cross-validation.
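A grid search of this kind is just an exhaustive loop over candidate parameter pairs, keeping the one with the best cross-validation score. A minimal sketch (plain Java; the score function below is a toy placeholder standing in for "train an SVM with these parameters and return the cross-validated accuracy"):

```java
import java.util.function.BiFunction;

public class GridSearchSketch {
    // Pick the (C, gamma) pair maximizing a cross-validation score.
    static double[] gridSearch(double[] cGrid, double[] gammaGrid,
                               BiFunction<Double, Double, Double> cvScore) {
        double best = Double.NEGATIVE_INFINITY;
        double[] bestParams = null;
        for (double c : cGrid) {
            for (double g : gammaGrid) {
                double score = cvScore.apply(c, g);
                if (score > best) {
                    best = score;
                    bestParams = new double[]{c, g};
                }
            }
        }
        return bestParams;
    }

    public static void main(String[] args) {
        // Exponentially spaced grids are conventional for C and gamma.
        double[] cGrid = {0.1, 1.0, 10.0, 100.0};
        double[] gammaGrid = {0.01, 0.1, 1.0};
        // Toy score that peaks at C = 10, gamma = 0.1.
        double[] best = gridSearch(cGrid, gammaGrid,
                (c, g) -> -Math.abs(Math.log10(c) - 1.0) - Math.abs(Math.log10(g) + 1.0));
        System.out.println(best[0] + " " + best[1]);  // prints "10.0 0.1"
    }
}
```

In practice each `cvScore` evaluation would train and validate a full SVM, so the grids are usually kept coarse (e.g. powers of 10) and refined around the best cell.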
The dominant approach for creating multiclass SVMs is to reduce the single multiclass problem into multiple binary classification problems. Common methods for such a reduction build binary classifiers which distinguish between (i) one of the labels and the rest (one-versus-all) or (ii) every pair of classes (one-versus-one). Classification of new instances in the one-versus-all case is done by a winner-takes-all strategy, in which the classifier with the highest output function assigns the class. In the one-versus-one approach, classification is done by a max-wins voting strategy: every classifier assigns the instance to one of its two classes, the vote for the assigned class is increased by one, and finally the class with the most votes determines the instance's classification.
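The max-wins voting step for the one-versus-one reduction can be sketched as follows (plain Java; the pairwise classifier outputs are given directly rather than computed by actual SVMs):

```java
public class MaxWinsVoting {
    // One-versus-one prediction: each of the k*(k-1)/2 binary classifiers
    // votes for one of its two classes; the class with the most votes wins.
    // pairwise[i][j] (i < j) is the winner (either i or j) reported by the
    // classifier trained on classes i and j.
    static int maxWins(int k, int[][] pairwise) {
        int[] votes = new int[k];
        for (int i = 0; i < k; i++) {
            for (int j = i + 1; j < k; j++) {
                votes[pairwise[i][j]]++;
            }
        }
        int best = 0;
        for (int c = 1; c < k; c++) {
            if (votes[c] > votes[best]) best = c;
        }
        return best;
    }

    public static void main(String[] args) {
        // Three classes, three pairwise classifiers:
        // (0 vs 1) -> 1, (0 vs 2) -> 2, (1 vs 2) -> 1.
        int[][] pairwise = new int[3][3];
        pairwise[0][1] = 1;
        pairwise[0][2] = 2;
        pairwise[1][2] = 1;
        System.out.println(maxWins(3, pairwise));  // prints 1 (two votes)
    }
}
```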
Modifier and Type    Class and Description
static class         SVM.Multiclass
                     The type of multiclass SVMs.
static class         SVM.Trainer&lt;T&gt;
                     Trainer for support vector machines.
Constructor and Description
SVM(MercerKernel&lt;T&gt; kernel, double C)
    Constructor of binary SVM.
SVM(MercerKernel&lt;T&gt; kernel, double Cp, double Cn)
    Constructor of binary SVM.
SVM(MercerKernel&lt;T&gt; kernel, double C, double[] weight, SVM.Multiclass strategy)
    Constructor of multiclass SVM.
SVM(MercerKernel&lt;T&gt; kernel, double C, int k, SVM.Multiclass strategy)
    Constructor of multiclass SVM.

Modifier and Type    Method and Description
void                 finish()
                     Processes support vectors until convergence.
void                 learn(T[] x, int[] y)
                     Trains the SVM with the given dataset for one epoch.
void                 learn(T[] x, int[] y, double[] weight)
                     Trains the SVM with the given dataset for one epoch.
void                 learn(T x, int y)
                     Updates the classifier online with a new training instance.
void                 learn(T x, int y, double weight)
                     Updates the classifier online with a new training instance.
int                  predict(T x)
                     Predicts the class label of an instance.
int                  predict(T x, double[] prob)
                     Predicts the class label of an instance and also calculates a posteriori probabilities.
SVM                  setTolerance(double tol)
                     Sets the tolerance of the convergence test.
void                 trainPlattScaling(T[] x, int[] y)
                     After calling finish, the user should call this method to train Platt scaling to estimate a posteriori probabilities.

public SVM(MercerKernel&lt;T&gt; kernel, double C)
Constructor of binary SVM.
Parameters:
kernel - the kernel function.
C - the soft margin penalty parameter.

public SVM(MercerKernel&lt;T&gt; kernel, double Cp, double Cn)
Constructor of binary SVM.
Parameters:
kernel - the kernel function.
Cp - the soft margin penalty parameter for positive instances.
Cn - the soft margin penalty parameter for negative instances.

public SVM(MercerKernel&lt;T&gt; kernel, double C, int k, SVM.Multiclass strategy)
Constructor of multiclass SVM.
Parameters:
kernel - the kernel function.
C - the soft margin penalty parameter.
k - the number of classes.
strategy - the multiclass classification strategy.

public SVM(MercerKernel&lt;T&gt; kernel, double C, double[] weight, SVM.Multiclass strategy)
Constructor of multiclass SVM.
Parameters:
kernel - the kernel function.
C - the soft margin penalty parameter.
weight - class weight. Must be positive. The soft margin penalty of class i will be weight[i] * C.
strategy - the multiclass classification strategy.

public SVM setTolerance(double tol)
Sets the tolerance of the convergence test.
Parameters:
tol - the tolerance of convergence test.

public void learn(T x, int y)
Updates the classifier online with a new training instance.
Specified by:
learn in interface OnlineClassifier&lt;T&gt;
Parameters:
x - training instance.
y - training label.

public void learn(T x, int y, double weight)
Updates the classifier online with a new training instance.
Parameters:
x - training instance.
y - training label.
weight - instance weight. Must be positive. The soft margin penalty parameter for the instance will be weight * C.

public void learn(T[] x, int[] y)
Trains the SVM with the given dataset for one epoch. Afterwards, the user should call finish() to further process support vectors.
Parameters:
x - training instances.
y - training labels in [0, k), where k is the number of classes.

public void learn(T[] x, int[] y, double[] weight)
Trains the SVM with the given dataset for one epoch. Afterwards, the user should call finish() to further process support vectors.
Parameters:
x - training instances.
y - training labels in [0, k), where k is the number of classes.
weight - instance weight. Must be positive. The soft margin penalty parameter for instance i will be weight[i] * C.

public void finish()
Processes support vectors until convergence.

public void trainPlattScaling(T[] x, int[] y)
After calling finish, the user should call this method to train Platt scaling to estimate a posteriori probabilities.
Parameters:
x - training samples.
y - training labels.

public int predict(T x)
Predicts the class label of an instance.
Specified by:
predict in interface Classifier&lt;T&gt;
Parameters:
x - the instance to be classified.

public int predict(T x, double[] prob)
Predicts the class label of an instance and also calculates a posteriori probabilities.
Specified by:
predict in interface SoftClassifier&lt;T&gt;
Parameters:
x - the instance to be classified.
prob - the array to store a posteriori probabilities on output.
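For background on what trainPlattScaling computes: Platt scaling fits a sigmoid that maps raw decision values f to probabilities, P(y = +1 | f) = 1 / (1 + exp(A*f + B)). A minimal sketch of applying such a sigmoid (the A and B values below are illustrative; Platt's actual fitting procedure, which estimates A and B from held-out decision values, is not shown):

```java
public class PlattSigmoid {
    // Map a raw SVM decision value f to an estimated probability
    // P(y = +1 | f) = 1 / (1 + exp(a * f + b)).
    // Fitting typically yields a < 0, so larger f means higher probability.
    static double probability(double f, double a, double b) {
        return 1.0 / (1.0 + Math.exp(a * f + b));
    }

    public static void main(String[] args) {
        double a = -2.0, b = 0.0;  // illustrative fitted parameters
        System.out.println(probability(2.0, a, b));   // confident positive, near 1
        System.out.println(probability(0.0, a, b));   // prints 0.5 (on the boundary)
        System.out.println(probability(-2.0, a, b));  // confident negative, near 0
    }
}
```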