Package smile.math.distance
Class JensenShannonDistance
java.lang.Object
smile.math.distance.JensenShannonDistance
- All Implemented Interfaces:
Serializable, ToDoubleBiFunction<double[],double[]>, Distance<double[]>, Metric<double[]>
The Jensen-Shannon divergence is a popular method of measuring the
similarity between two probability distributions. It is also known
as information radius or total divergence to the average.
The Jensen-Shannon divergence is a symmetrized and smoothed version of the Kullback-Leibler divergence. It is defined by

J(P||Q) = (D(P||M) + D(Q||M)) / 2

where M = (P+Q)/2 and D(·||·) is the Kullback-Leibler divergence. Unlike the Kullback-Leibler divergence, it is always finite.
The square root of the Jensen-Shannon divergence is a metric, and it is this square root that this class computes.
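A minimal sketch of this computation, following the definition above: form M = (P+Q)/2, compute the two Kullback-Leibler terms, average them, and take the square root. The class name, helper names, and the choice of natural logarithm below are assumptions for illustration, not taken from this page.

public final class JensenShannonSketch {
    // D(p||q): Kullback-Leibler divergence; zero entries of p contribute nothing.
    static double kl(double[] p, double[] q) {
        double d = 0.0;
        for (int i = 0; i < p.length; i++) {
            if (p[i] > 0.0 && q[i] > 0.0) {
                d += p[i] * Math.log(p[i] / q[i]);
            }
        }
        return d;
    }

    // Square root of the Jensen-Shannon divergence, which is the metric.
    static double jensenShannonDistance(double[] p, double[] q) {
        double[] m = new double[p.length];
        for (int i = 0; i < p.length; i++) {
            m[i] = 0.5 * (p[i] + q[i]);   // M = (P + Q) / 2
        }
        double js = 0.5 * (kl(p, m) + kl(q, m));
        return Math.sqrt(js);
    }

    public static void main(String[] args) {
        double[] p = {0.1, 0.4, 0.5};
        double[] q = {0.3, 0.3, 0.4};
        System.out.println(jensenShannonDistance(p, q));  // a value in [0, sqrt(ln 2)]
    }
}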
Constructor Details
JensenShannonDistance
public JensenShannonDistance()
Constructor.
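A brief usage sketch, assuming that, as declared by the Distance<double[]> contract listed above, the distance is evaluated through a d(double[], double[]) method returning the square root of the Jensen-Shannon divergence; the surrounding class name is illustrative only.

import smile.math.distance.JensenShannonDistance;

public class JensenShannonExample {
    public static void main(String[] args) {
        // Two discrete probability distributions over the same support.
        double[] p = {0.1, 0.4, 0.5};
        double[] q = {0.3, 0.3, 0.4};

        // d(x, y) is assumed from the Distance<double[]> interface;
        // it is expected to return the metric described above.
        JensenShannonDistance jsd = new JensenShannonDistance();
        System.out.println(jsd.d(p, q));
    }
}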
Method Details