Class Matrix.EVD

java.lang.Object
smile.math.matrix.Matrix.EVD
All Implemented Interfaces:
Serializable
Enclosing class:
Matrix

public static class Matrix.EVD extends Object implements Serializable
Eigenvalue decomposition. Eigen decomposition is the factorization of a matrix into a canonical form, whereby the matrix is represented in terms of its eigenvalues and eigenvectors:

     A = V*D*V^-1
 
If A is symmetric, then A = V*D*V' where the eigenvalue matrix D is diagonal and the eigenvector matrix V is orthogonal.

Given a linear transformation A, a non-zero vector x is defined to be an eigenvector of the transformation if it satisfies the eigenvalue equation


     A x = λ x
 
for some scalar λ. In this situation, the scalar λ is called an eigenvalue of A corresponding to the eigenvector x.

The word eigenvector formally refers to the right eigenvector, which is defined by the above eigenvalue equation A x = λ x, and is the most commonly used eigenvector. However, the left eigenvector exists as well, and is defined by x A = λ x.
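As a concrete illustration, the sketch below computes the decomposition of a small symmetric matrix and checks the eigenvalue equation A x = λ x for the first right eigenvector. It is a minimal sketch, assuming the factory method Matrix.of(double[][]) and the decomposition method eigen() found in recent Smile releases; only the wr and Vr fields it reads are documented on this page, so adapt the construction and decomposition calls if your version differs.

     import smile.math.matrix.Matrix;

     public class EvdDemo {
         public static void main(String[] args) {
             // A small symmetric matrix, so the eigenvalues are real and V is orthogonal.
             double[][] a = {
                 {4.0, 1.0, 0.0},
                 {1.0, 3.0, 1.0},
                 {0.0, 1.0, 2.0}
             };
             Matrix A = Matrix.of(a);      // assumed factory method
             Matrix.EVD evd = A.eigen();   // assumed decomposition call

             double lambda = evd.wr[0];    // first eigenvalue (not necessarily the largest)

             // Check A x = lambda x for the first right eigenvector (column 0 of Vr).
             for (int i = 0; i < 3; i++) {
                 double ax = 0.0;
                 for (int j = 0; j < 3; j++) {
                     ax += a[i][j] * evd.Vr.get(j, 0);
                 }
                 System.out.printf("(A x)_%d = %.6f, lambda x_%d = %.6f%n",
                         i, ax, i, lambda * evd.Vr.get(i, 0));
             }
         }
     }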

Let A be a real n-by-n matrix with strictly positive entries a_ij > 0. Then the following statements hold (a small numerical sketch follows the list).

  1. There is a positive real number r, called the Perron-Frobenius eigenvalue, such that r is an eigenvalue of A and any other eigenvalue λ (possibly complex) is strictly smaller than r in absolute value, |λ| < r.
  2. The Perron-Frobenius eigenvalue is simple: r is a simple root of the characteristic polynomial of A. Consequently, both the right and the left eigenspaces associated with r are one-dimensional.
  3. There exists a left eigenvector v of A associated with r (row vector) having strictly positive components. Likewise, there exists a right eigenvector w associated with r (column vector) having strictly positive components.
  4. The left eigenvector v (respectively, the right eigenvector w) associated with r is the only eigenvector with strictly positive components, i.e., every other eigenvector of A has at least one component that is not positive.
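
As noted above, these statements can be illustrated numerically with a small sketch in plain Java (independent of the Smile API): power iteration x ← A x / ||A x|| on a matrix with strictly positive entries converges to the positive right eigenvector w, and the Rayleigh quotient converges to the Perron-Frobenius eigenvalue r.

     // Minimal power-iteration sketch in plain Java (not part of the Smile API).
     public class PerronDemo {
         public static void main(String[] args) {
             double[][] a = {{2, 1, 1}, {1, 3, 1}, {1, 1, 4}};  // strictly positive entries
             double[] x = {1, 1, 1};
             double lambda = 0;
             for (int iter = 0; iter < 100; iter++) {
                 // y = A x
                 double[] y = new double[x.length];
                 for (int i = 0; i < a.length; i++)
                     for (int j = 0; j < a[i].length; j++)
                         y[i] += a[i][j] * x[j];
                 // Normalize to unit length.
                 double norm = 0;
                 for (double v : y) norm += v * v;
                 norm = Math.sqrt(norm);
                 for (int i = 0; i < y.length; i++) y[i] /= norm;
                 // Rayleigh quotient estimate of the dominant eigenvalue.
                 lambda = 0;
                 for (int i = 0; i < a.length; i++) {
                     double ax = 0;
                     for (int j = 0; j < a[i].length; j++) ax += a[i][j] * y[j];
                     lambda += y[i] * ax;
                 }
                 x = y;
             }
             System.out.printf("r = %.6f, w = (%.4f, %.4f, %.4f)%n", lambda, x[0], x[1], x[2]);
         }
     }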

A stochastic matrix, probability matrix, or transition matrix is used to describe the transitions of a Markov chain. A right stochastic matrix is a square matrix each of whose rows consists of non-negative real numbers, with each row summing to 1. A left stochastic matrix is a square matrix whose columns consist of non-negative real numbers whose sum is 1. A doubly stochastic matrix is a square matrix in which all entries are non-negative and every row and every column sums to 1. A stationary probability vector π is defined as a vector that does not change under application of the transition matrix; that is, it is a left eigenvector of the probability matrix, associated with eigenvalue 1: πP = π. The Perron-Frobenius theorem ensures that such a vector exists and that the largest eigenvalue associated with a stochastic matrix is always 1. For a matrix with strictly positive entries, this vector is unique. In general, however, there may be several such vectors.
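
The stationary vector can be computed by repeatedly applying the transition matrix. Below is a minimal sketch in plain Java (independent of the Smile API) that iterates π ← πP on a right stochastic matrix with strictly positive entries; by the Perron-Frobenius theorem the result is the unique stationary distribution.

     // Minimal sketch in plain Java: stationary distribution of a right stochastic matrix.
     public class StationaryDemo {
         public static void main(String[] args) {
             double[][] p = {
                 {0.8, 0.1, 0.1},
                 {0.2, 0.7, 0.1},
                 {0.1, 0.3, 0.6}
             };
             double[] pi = {1.0 / 3, 1.0 / 3, 1.0 / 3};   // start from the uniform distribution
             for (int iter = 0; iter < 1000; iter++) {
                 double[] next = new double[pi.length];
                 for (int j = 0; j < p.length; j++)
                     for (int i = 0; i < p.length; i++)
                         next[j] += pi[i] * p[i][j];      // (pi P)_j
                 pi = next;
             }
             // pi now satisfies pi P = pi, i.e. it is a left eigenvector of P with eigenvalue 1.
             System.out.printf("pi = (%.4f, %.4f, %.4f)%n", pi[0], pi[1], pi[2]);
         }
     }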

  • Field Summary

    Fields
    Modifier and Type    Field    Description
    final Matrix         Vl       The left eigenvectors.
    final Matrix         Vr       The right eigenvectors.
    final double[]       wi       The imaginary part of eigenvalues.
    final double[]       wr       The real part of eigenvalues.
  • Constructor Summary

    Constructors
    Constructor
    Description
    EVD(double[] wr, double[] wi, Matrix Vl, Matrix Vr)
    Constructor.
    EVD(double[] w, Matrix V)
    Constructor.
  • Method Summary

    Modifier and Type    Method    Description
    Matrix               diag()    Returns the block diagonal eigenvalue matrix whose diagonal entries are the real parts of the eigenvalues, with the positive imaginary parts on the lower subdiagonal and the negative imaginary parts on the upper subdiagonal.
    Matrix.EVD           sort()    Sorts the eigenvalues in descending order and reorders the corresponding eigenvectors.

    Methods inherited from class java.lang.Object

    clone, equals, finalize, getClass, hashCode, notify, notifyAll, toString, wait, wait, wait
  • Field Details

    • wr

      public final double[] wr
      The real part of eigenvalues. The eigenvalues and eigenvectors are not necessarily in sorted order; the sort() method puts the eigenvalues in descending order and reorders the corresponding eigenvectors.
    • wi

      public final double[] wi
      The imaginary part of eigenvalues.
    • Vl

      public final Matrix Vl
      The left eigenvectors.
    • Vr

      public final Matrix Vr
      The right eigenvectors.
  • Constructor Details

    • EVD

      public EVD(double[] w, Matrix V)
      Constructor.
      Parameters:
      w - eigenvalues.
      V - eigenvectors.
    • EVD

      public EVD(double[] wr, double[] wi, Matrix Vl, Matrix Vr)
      Constructor.
      Parameters:
      wr - the real part of eigenvalues.
      wi - the imaginary part of eigenvalues.
      Vl - the left eigenvectors.
      Vr - the right eigenvectors.
  • Method Details

    • diag

      public Matrix diag()
      Returns the block diagonal eigenvalue matrix whose diagonal entries are the real parts of the eigenvalues, with the positive imaginary parts on the lower subdiagonal and the negative imaginary parts on the upper subdiagonal.
      Returns:
      the diagonal eigenvalue matrix.
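      For example, a complex conjugate pair of eigenvalues a ± bi stored at consecutive indices i and i+1 produces, per the description above, the 2-by-2 diagonal block

           a  -b
           b   a

      while a purely real eigenvalue contributes a single diagonal entry.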
    • sort

      public Matrix.EVD sort()
      Sorts the eigenvalues in descending order and reorders the corresponding eigenvectors.
      Returns:
      sorted eigen decomposition.
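      A minimal usage sketch, built from the constructors and fields documented on this page (Matrix.eye(int) for the placeholder eigenvector matrix is an assumption based on recent Smile releases):

           import smile.math.matrix.Matrix;

           public class SortDemo {
               public static void main(String[] args) {
                   double[] wr = {1.0, 5.0, 3.0};   // eigenvalues in no particular order
                   double[] wi = {0.0, 0.0, 0.0};   // purely real spectrum for this illustration
                   Matrix V = Matrix.eye(3);        // assumed identity factory; placeholder eigenvector columns
                   Matrix.EVD evd = new Matrix.EVD(wr, wi, V, V).sort();
                   // Per the contract above, evd.wr is now in descending order: {5.0, 3.0, 1.0},
                   // and the columns of evd.Vl and evd.Vr are reordered to match.
                   System.out.println(java.util.Arrays.toString(evd.wr));
               }
           }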