public class LogisticRegression extends java.lang.Object implements SoftClassifier<double[]>, java.io.Serializable
Goodness-of-fit tests such as the likelihood ratio test are available as indicators of model appropriateness, as is the Wald statistic to test the significance of individual independent variables.
Logistic regression has many analogies to ordinary least squares (OLS) regression. Unlike OLS regression, however, logistic regression does not assume linearity of relationship between the raw values of the independent variables and the dependent, does not require normally distributed variables, does not assume homoscedasticity, and in general has less stringent requirements.
Compared with linear discriminant analysis, logistic regression also has several advantages, chief among them the weaker distributional assumptions noted above.
Logistic regression also has strong connections with neural networks and maximum entropy modeling. For example, binary logistic regression is equivalent to a one-layer, single-output neural network with a logistic activation function trained under log loss. Similarly, multinomial logistic regression is equivalent to a one-layer, softmax-output neural network.
Logistic regression estimation also obeys the maximum entropy principle, and thus logistic regression is sometimes called "maximum entropy modeling", and the resulting classifier the "maximum entropy classifier".
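The neural-network equivalence above can be made concrete. The following is a hypothetical, self-contained sketch (not part of this class's API; the class name `SigmoidUnit` and the weights are illustrative only): a binary logistic regression score is exactly a one-layer, single-output network with a logistic activation.

```java
// Illustrative sketch: binary logistic regression as a one-layer sigmoid unit.
public class SigmoidUnit {
    // Logistic activation: maps the linear score to a probability in (0, 1).
    static double sigmoid(double z) {
        return 1.0 / (1.0 + Math.exp(-z));
    }

    // P(y = 1 | x) under binary logistic regression with weights w and bias b.
    static double probability(double[] w, double b, double[] x) {
        double z = b;
        for (int i = 0; i < w.length; i++) z += w[i] * x[i];
        return sigmoid(z);
    }

    public static void main(String[] args) {
        double[] w = {2.0, -1.0};  // illustrative weights, not fitted values
        double p = probability(w, 0.0, new double[]{0.0, 0.0});
        System.out.println(p);     // 0.5: the point lies on the decision boundary
    }
}
```

Training this unit under log loss recovers exactly the maximum-likelihood logistic regression estimate, which is the equivalence the text describes.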
See Also:
NeuralNetwork, Maxent, LDA, Serialized Form

Nested Class Summary

Modifier and Type    Class and Description
static class         LogisticRegression.Trainer
                     Trainer for logistic regression.
Constructor Summary

Constructor and Description
LogisticRegression(double[][] x, int[] y)
    Constructor.
LogisticRegression(double[][] x, int[] y, double lambda)
    Constructor.
LogisticRegression(double[][] x, int[] y, double lambda, double tol, int maxIter)
    Constructor.

Method Summary

Modifier and Type    Method and Description
double               loglikelihood()
                     Returns the log-likelihood of the model.
int                  predict(double[] x)
                     Predicts the class label of an instance.
int                  predict(double[] x, double[] posteriori)
                     Predicts the class label of an instance and also calculates
                     a posteriori probabilities.

Constructor Detail

public LogisticRegression(double[][] x, int[] y)
Parameters:
x - training samples.
y - training labels in [0, k), where k is the number of classes.

public LogisticRegression(double[][] x, int[] y, double lambda)
Parameters:
x - training samples.
y - training labels in [0, k), where k is the number of classes.
lambda - λ > 0 gives a "regularized" estimate of linear weights which often has superior generalization performance, especially when the dimensionality is high.

public LogisticRegression(double[][] x, int[] y, double lambda, double tol, int maxIter)
Parameters:
x - training samples.
y - training labels in [0, k), where k is the number of classes.
lambda - λ > 0 gives a "regularized" estimate of linear weights which often has superior generalization performance, especially when the dimensionality is high.
tol - the tolerance for stopping iterations.
maxIter - the maximum number of iterations.

Method Detail

public double loglikelihood()
Returns the log-likelihood of the model.
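The lambda parameter in the constructors above trades likelihood against weight magnitude. The documentation does not state the exact penalty form, so the following self-contained sketch only illustrates the common L2 (ridge) variant, J(w) = -loglikelihood(w) + (λ/2)·||w||²; the class name `RidgePenalty` and the numbers are hypothetical.

```java
// Illustrative sketch of an L2 penalty term (assumed form, not confirmed
// by this class's documentation): larger lambda penalizes large weights
// more strongly, which is what shrinks the fitted coefficients.
public class RidgePenalty {
    static double penalty(double lambda, double[] w) {
        double sq = 0.0;
        for (double wi : w) sq += wi * wi;  // ||w||^2
        return 0.5 * lambda * sq;
    }

    public static void main(String[] args) {
        double[] w = {3.0, -4.0};             // ||w||^2 = 25
        System.out.println(penalty(0.1, w));  // small penalty
        System.out.println(penalty(1.0, w));  // tenfold larger penalty
    }
}
```

Minimizing the penalized objective rather than the raw negative log-likelihood is what the constructors call a "regularized" estimate; it tends to generalize better when the dimensionality is high, as the parameter description notes.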
public int predict(double[] x)
Specified by:
predict in interface Classifier<double[]>
Parameters:
x - the instance to be classified.

public int predict(double[] x, double[] posteriori)
Specified by:
predict in interface SoftClassifier<double[]>
Parameters:
x - the instance to be classified.
posteriori - the array to store a posteriori probabilities on output.
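Since multinomial logistic regression is equivalent to a softmax-output network (see the class description), a soft prediction of this kind can be sketched as a softmax over per-class linear scores. The following is a hypothetical, self-contained illustration, not this class's implementation; the class name `SoftmaxPredict` and the weights are made up.

```java
// Illustrative sketch of a soft classifier: fills posteriori with softmax
// probabilities of per-class linear scores and returns the best class.
public class SoftmaxPredict {
    static int predict(double[][] w, double[] x, double[] posteriori) {
        int k = w.length;
        double max = Double.NEGATIVE_INFINITY;
        for (int c = 0; c < k; c++) {
            double z = 0.0;                    // linear score of class c
            for (int i = 0; i < x.length; i++) z += w[c][i] * x[i];
            posteriori[c] = z;
            if (z > max) max = z;
        }
        double sum = 0.0;
        for (int c = 0; c < k; c++) {          // subtract max for stability
            posteriori[c] = Math.exp(posteriori[c] - max);
            sum += posteriori[c];
        }
        int best = 0;
        for (int c = 0; c < k; c++) {
            posteriori[c] /= sum;              // normalize to probabilities
            if (posteriori[c] > posteriori[best]) best = c;
        }
        return best;
    }

    public static void main(String[] args) {
        double[][] w = { {1.0, 0.0}, {0.0, 1.0}, {-1.0, -1.0} }; // hypothetical weights
        double[] posteriori = new double[3];
        int label = predict(w, new double[]{2.0, 0.5}, posteriori);
        System.out.println(label);  // index of the most probable class
    }
}
```

As with predict(double[], double[]) above, the caller supplies the output array, so the probabilities for every class are available alongside the returned label.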