Package smile.manifold
Class MDS
java.lang.Object
smile.manifold.MDS
Classical multidimensional scaling, also known as principal coordinates
analysis. Given a matrix of dissimilarities (e.g. pairwise distances), MDS
finds a set of points in a low-dimensional space whose inter-point distances
approximate the dissimilarities well. The dissimilarities are not restricted
to the Euclidean distance metric; however, when Euclidean distances are used,
classical MDS is equivalent to PCA.
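A minimal usage sketch (not part of the original documentation; the dissimilarity values below are made up for illustration), showing how a model is fitted and its fields read:

    import smile.manifold.MDS;

    public class MDSExample {
        public static void main(String[] args) {
            // Symmetric dissimilarity matrix with a zero diagonal (plain distances, not squared).
            double[][] proximity = {
                {0.0, 1.0, 2.0, 3.0},
                {1.0, 0.0, 1.5, 2.5},
                {2.0, 1.5, 0.0, 1.2},
                {3.0, 2.5, 1.2, 0.0}
            };

            MDS mds = MDS.of(proximity);             // 2-dimensional embedding by default
            double[][] points = mds.coordinates;     // one row of principal coordinates per object
            double[] explained = mds.proportion;     // proportion of variance per component
            System.out.println(points.length + " points embedded in " + points[0].length + " dimensions");
            System.out.println("variance in first component: " + explained[0]);
        }
    }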
Field Summary
Modifier and Type   Field         Description
final double[][]    coordinates   The principal coordinates.
final double[]      proportion    The proportion of variance contained in each principal component.
final double[]      scores        The component scores.
Constructor Summary
Constructor                                                           Description
MDS(double[] scores, double[] proportion, double[][] coordinates)     Constructor.
Method Summary
Modifier and Type   Method                                                Description
static MDS          of(double[][] proximity)                              Fits the classical multidimensional scaling.
static MDS          of(double[][] proximity, int k)                       Fits the classical multidimensional scaling.
static MDS          of(double[][] proximity, int k, boolean positive)     Fits the classical multidimensional scaling.
static MDS          of(double[][] proximity, Properties params)           Fits the classical multidimensional scaling.
Field Details

scores
public final double[] scores
The component scores.

coordinates
public final double[][] coordinates
The principal coordinates.

proportion
public final double[] proportion
The proportion of variance contained in each principal component.
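An informal illustration (not from the original documentation) of how the fields can be read, assuming proximity is a dissimilarity matrix like the one in the sketch above:

    MDS mds = MDS.of(proximity);
    double[][] embedding = mds.coordinates;              // the embedded points, one row per object
    double cumulative = 0.0;
    for (int i = 0; i < mds.proportion.length; i++) {
        cumulative += mds.proportion[i];                  // running share of explained variance
        System.out.printf("component %d: cumulative proportion %.3f%n", i + 1, cumulative);
    }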
Constructor Details

MDS
public MDS(double[] scores, double[] proportion, double[][] coordinates)
Constructor.
Parameters:
scores - the component scores.
proportion - the proportion of variance contained in each principal component.
coordinates - the principal coordinates.
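The constructor simply wraps precomputed results and is normally called by the of factory methods rather than by user code. A contrived sketch with hypothetical values:

    double[]   scores      = {4.2, 1.3};                              // hypothetical component scores
    double[]   proportion  = {0.76, 0.24};                            // hypothetical variance proportions
    double[][] coordinates = {{0.1, 0.2}, {-0.3, 0.4}, {0.2, -0.6}};  // hypothetical 2-D coordinates
    MDS mds = new MDS(scores, proportion, coordinates);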
Method Details

of
public static MDS of(double[][] proximity)
Fits the classical multidimensional scaling, mapping the original data into a 2-dimensional Euclidean space.
Parameters:
proximity - the non-negative proximity matrix of dissimilarities. The diagonal should be zero and all other elements should be positive and symmetric. For a pairwise distance matrix, it should contain the plain distances, not the squared distances.
Returns:
the model.
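A brief usage sketch (not from the original documentation) that builds the proximity matrix from raw observations; it assumes smile.math.MathEx.distance returns the Euclidean distance between two vectors (any other distance function would serve the same purpose):

    import smile.manifold.MDS;
    import smile.math.MathEx;

    double[][] data = {{1, 2}, {3, 4}, {5, 0}, {2, 2}};      // made-up raw observations
    int n = data.length;
    double[][] proximity = new double[n][n];
    for (int i = 0; i < n; i++) {
        for (int j = 0; j < n; j++) {
            proximity[i][j] = MathEx.distance(data[i], data[j]);  // plain distance, not squared
        }
    }
    MDS mds = MDS.of(proximity);              // maps the objects into 2-dimensional space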
of
public static MDS of(double[][] proximity, int k)
Fits the classical multidimensional scaling.
Parameters:
proximity - the non-negative proximity matrix of dissimilarities. The diagonal should be zero and all other elements should be positive and symmetric. For a pairwise distance matrix, it should contain the plain distances, not the squared distances.
k - the dimension of the projection.
Returns:
the model.
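A brief sketch (not from the original documentation), assuming proximity is a symmetric dissimilarity matrix over at least four objects; in classical MDS the usable dimension is limited by the number of positive eigenvalues of the centered dissimilarity matrix:

    MDS mds = MDS.of(proximity, 3);           // embed in 3 dimensions instead of the default 2
    double[][] embedding = mds.coordinates;   // one row per object, 3 columns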
of
public static MDS of(double[][] proximity, Properties params)
Fits the classical multidimensional scaling.
Parameters:
proximity - the non-negative proximity matrix of dissimilarities. The diagonal should be zero and all other elements should be positive and symmetric. For a pairwise distance matrix, it should contain the plain distances, not the squared distances.
params - the hyperparameters.
Returns:
the model.
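A brief sketch (not from the original documentation). The Properties key names below follow Smile's usual "smile.<algorithm>.<option>" naming convention but are assumptions; consult the library source for the keys this method actually reads:

    import java.util.Properties;

    Properties params = new Properties();
    params.setProperty("smile.mds.k", "3");             // assumed key for the projection dimension
    params.setProperty("smile.mds.positive", "false");  // assumed key for the additive-constant option
    MDS mds = MDS.of(proximity, params);                // proximity as in the earlier sketches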
of
public static MDS of(double[][] proximity, int k, boolean positive)
Fits the classical multidimensional scaling.
Parameters:
proximity - the non-negative proximity matrix of dissimilarities. The diagonal should be zero and all other elements should be positive and symmetric. For a pairwise distance matrix, it should contain the plain distances, not the squared distances.
k - the dimension of the projection.
positive - if true, estimate an appropriate constant to be added to all the dissimilarities, apart from the self-dissimilarities, that makes the learning matrix positive semi-definite. The other formulation of the additive constant problem is as follows: if the proximities are measured on an interval scale with no natural origin, then the dissimilarities do not directly correspond to distances in the Euclidean space used to represent the objects. In this case, we can estimate a constant c such that proximity + c may be treated as ratio data, which may also reduce the dimensionality of the Euclidean space required to represent the objects.
Returns:
the model.
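A brief sketch (not from the original documentation), assuming proximity holds interval-scale dissimilarities that may not be exactly Euclidean:

    // positive = true asks MDS to estimate an additive constant that makes the
    // underlying matrix positive semi-definite before computing the embedding.
    MDS mds = MDS.of(proximity, 2, true);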