Class CRF
- All Implemented Interfaces:
Serializable
A CRF is a Markov random field that is trained discriminatively. Therefore it is not necessary to model the distribution over the always-observed variables, which makes it possible to include arbitrarily complicated features of the observed variables in the model.
This class implements an algorithm that trains CRFs via gradient tree boosting. In tree boosting, the CRF potential functions are represented as weighted sums of regression trees, which provide compact representations of feature interactions, so the algorithm never needs to enumerate the potentially large parameter space explicitly. As a result, gradient tree boosting scales linearly in the order of the Markov model and in the order of the feature interactions, rather than exponentially as in previous algorithms based on iterative scaling and gradient descent.
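The additive structure behind tree boosting can be sketched with a toy example. The following is a minimal, self-contained illustration, not Smile's implementation: it boosts depth-1 regression stumps under squared loss on a 1-D feature, whereas CRF training fits each tree to the functional gradient of the conditional log-likelihood. The boosted function F(x) = sum over m of shrinkage * tree_m(x) has the same form in both cases.

```java
import java.util.ArrayList;
import java.util.List;

// Toy sketch of gradient tree boosting (NOT Smile's CRF trainer):
// depth-1 regression stumps, squared loss, 1-D feature. In CRF training
// each round instead fits a tree to the functional gradient of the
// conditional log-likelihood, but the additive model is the same.
public class BoostingSketch {
    // A regression stump: threshold on x, constant prediction on each side.
    public record Stump(double threshold, double left, double right) {
        public double predict(double x) { return x < threshold ? left : right; }
    }

    // Fit the least-squares stump to the given targets (the residuals).
    public static Stump fitStump(double[] x, double[] target) {
        Stump best = null;
        double bestErr = Double.POSITIVE_INFINITY;
        for (double t : x) {
            double sl = 0, nl = 0, sr = 0, nr = 0;
            for (int i = 0; i < x.length; i++) {
                if (x[i] < t) { sl += target[i]; nl++; } else { sr += target[i]; nr++; }
            }
            double left = nl > 0 ? sl / nl : 0, right = nr > 0 ? sr / nr : 0;
            double err = 0;
            for (int i = 0; i < x.length; i++) {
                double p = x[i] < t ? left : right;
                err += (target[i] - p) * (target[i] - p);
            }
            if (err < bestErr) { bestErr = err; best = new Stump(t, left, right); }
        }
        return best;
    }

    public static void main(String[] args) {
        double[] x = {0, 1, 2, 3, 4, 5};
        double[] y = {0, 0, 1, 1, 3, 3};
        double shrinkage = 0.5;              // learning rate in (0, 1]
        int ntrees = 20;                     // number of boosting iterations
        List<Stump> trees = new ArrayList<>();
        double[] f = new double[x.length];   // current boosted prediction F(x)
        for (int m = 0; m < ntrees; m++) {
            // Residuals are the negative gradient of the squared loss.
            double[] residual = new double[x.length];
            for (int i = 0; i < x.length; i++) residual[i] = y[i] - f[i];
            Stump tree = fitStump(x, residual);
            trees.add(tree);
            // F(x) is the shrinkage-weighted sum of the fitted trees.
            for (int i = 0; i < x.length; i++) f[i] += shrinkage * tree.predict(x[i]);
        }
        double loss = 0;
        for (int i = 0; i < x.length; i++) loss += (y[i] - f[i]) * (y[i] - f[i]);
        System.out.println("training squared loss after boosting: " + loss);
    }
}
```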
References
- J. Lafferty, A. McCallum and F. Pereira. Conditional random fields: Probabilistic models for segmenting and labeling sequence data. ICML, 2001.
- Thomas G. Dietterich, Guohua Hao, and Adam Ashenfelter. Gradient Tree Boosting for Training Conditional Random Fields. JMLR, 2008.
Constructor Summary
CRF(StructType schema, RegressionTree[][] potentials, double shrinkage)
    Constructor.
Method Summary
static CRF  fit(Tuple[][] sequences, int[][] labels)
            Fits a CRF model.
static CRF  fit(Tuple[][] sequences, int[][] labels, int ntrees, int maxDepth, int maxNodes, int nodeSize, double shrinkage)
            Fits a CRF model.
static CRF  fit(Tuple[][] sequences, int[][] labels, Properties params)
            Fits a CRF model.
int[]       predict(Tuple[] x)
            Returns the most likely label sequence given the feature sequence by the forward-backward algorithm.
int[]       viterbi(Tuple[] x)
            Labels a sequence with the Viterbi algorithm.
Constructor Details

CRF
CRF(StructType schema, RegressionTree[][] potentials, double shrinkage)
Constructor.
Parameters:
    schema - the schema of features.
    potentials - the potential functions.
    shrinkage - the learning rate.
Method Details

viterbi
Labels a sequence with the Viterbi algorithm. The Viterbi algorithm returns the whole-sequence labeling that has the maximum probability, which makes sense in applications (e.g. part-of-speech tagging) that require coherent sequential labeling. In contrast, the forward-backward algorithm labels a sequence by predicting each position individually; this usually produces better per-position accuracy, although the resulting sequence may not be coherent.
Parameters:
    x - the sequence.
Returns:
    the sequence labels.
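As a rough illustration of what viterbi computes, here is a self-contained sketch of Viterbi decoding over a label chain. The interface is a simplification assumed for this example, not this class's internals: per-position label scores and label-transition scores are passed in directly, whereas in the CRF they would come from the learned potential functions.

```java
// Sketch of Viterbi decoding (simplified, assumed interface -- not CRF internals).
// emission[t][y]: score of label y at position t.
// transition[p][y]: score of moving from label p to label y.
public class ViterbiSketch {
    public static int[] viterbi(double[][] emission, double[][] transition) {
        int n = emission.length, k = emission[0].length;
        double[][] delta = new double[n][k]; // best score of any path ending in (t, y)
        int[][] back = new int[n][k];        // argmax predecessor label
        delta[0] = emission[0].clone();
        for (int t = 1; t < n; t++) {
            for (int y = 0; y < k; y++) {
                double best = Double.NEGATIVE_INFINITY;
                int arg = 0;
                for (int p = 0; p < k; p++) {
                    double s = delta[t - 1][p] + transition[p][y];
                    if (s > best) { best = s; arg = p; }
                }
                delta[t][y] = best + emission[t][y];
                back[t][y] = arg;
            }
        }
        // Backtrack from the best final label to recover the whole sequence.
        int[] path = new int[n];
        int bestY = 0;
        for (int y = 1; y < k; y++) if (delta[n - 1][y] > delta[n - 1][bestY]) bestY = y;
        path[n - 1] = bestY;
        for (int t = n - 1; t > 0; t--) path[t - 1] = back[t][path[t]];
        return path;
    }
}
```

Because the whole path is scored jointly, a strong transition score can override a weak per-position score, which is exactly the coherence property described above.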
predict
Returns the most likely label sequence given the feature sequence by the forward-backward algorithm.
Parameters:
    x - a sequence.
Returns:
    the most likely label sequence.
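A sketch of the forward-backward (posterior) decoding that predict performs: each position is assigned the label with the largest marginal probability. As in the Viterbi sketch, the interface is an assumption for illustration, with fixed toy scores rather than the CRF's tree-based potentials, and unnormalized potentials are multiplied directly; a real implementation would work in log space or rescale for numerical stability.

```java
// Sketch of forward-backward posterior decoding (simplified, assumed interface).
// emission[t][y] and transition[p][y] are log-scores; potentials are exp(score).
public class ForwardBackwardSketch {
    public static int[] posteriorDecode(double[][] emission, double[][] transition) {
        int n = emission.length, k = emission[0].length;
        double[][] psi = new double[k][k];   // transition potentials
        for (int a = 0; a < k; a++)
            for (int b = 0; b < k; b++) psi[a][b] = Math.exp(transition[a][b]);
        double[][] phi = new double[n][k];   // emission potentials
        for (int t = 0; t < n; t++)
            for (int y = 0; y < k; y++) phi[t][y] = Math.exp(emission[t][y]);

        // Forward pass: alpha[t][y] sums over all prefixes ending in label y.
        double[][] alpha = new double[n][k];
        alpha[0] = phi[0].clone();
        for (int t = 1; t < n; t++)
            for (int y = 0; y < k; y++)
                for (int p = 0; p < k; p++)
                    alpha[t][y] += alpha[t - 1][p] * psi[p][y] * phi[t][y];

        // Backward pass: beta[t][y] sums over all suffixes starting from label y.
        double[][] beta = new double[n][k];
        for (int y = 0; y < k; y++) beta[n - 1][y] = 1;
        for (int t = n - 2; t >= 0; t--)
            for (int y = 0; y < k; y++)
                for (int s = 0; s < k; s++)
                    beta[t][y] += psi[y][s] * phi[t + 1][s] * beta[t + 1][s];

        // At each position, pick the label with the largest marginal,
        // which is proportional to alpha[t][y] * beta[t][y].
        int[] labels = new int[n];
        for (int t = 0; t < n; t++) {
            int best = 0;
            for (int y = 1; y < k; y++)
                if (alpha[t][y] * beta[t][y] > alpha[t][best] * beta[t][best]) best = y;
            labels[t] = best;
        }
        return labels;
    }
}
```

Unlike Viterbi, each position is decided independently by its marginal, so the output need not be a single globally coherent path.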
fit
Fits a CRF model.
Parameters:
    sequences - the training data.
    labels - the training sequence labels.
Returns:
    the model.
fit
Fits a CRF model.
Parameters:
    sequences - the training data.
    labels - the training sequence labels.
    params - the hyper-parameters.
Returns:
    the model.
fit
public static CRF fit(Tuple[][] sequences, int[][] labels, int ntrees, int maxDepth, int maxNodes, int nodeSize, double shrinkage)
Fits a CRF model.
Parameters:
    sequences - the training data.
    labels - the training sequence labels.
    ntrees - the number of trees/iterations.
    maxDepth - the maximum depth of each tree.
    maxNodes - the maximum number of leaf nodes in each tree.
    nodeSize - the minimum number of instances in a node below which the tree will not split; nodeSize = 5 generally gives good results.
    shrinkage - the shrinkage parameter in (0, 1] that controls the learning rate of the procedure.
Returns:
    the model.