Class               Description

IsoMap              Isometric feature mapping.
LaplacianEigenmap   Laplacian eigenmap.
LLE                 Locally linear embedding.
TSNE                t-distributed stochastic neighbor embedding.
UMAP                Uniform manifold approximation and projection.

Some prominent approaches are locally linear embedding (LLE), Hessian LLE, Laplacian eigenmaps, and LTSA. These techniques construct a low-dimensional data representation using a cost function that retains local properties of the data, and can be viewed as defining a graph-based kernel for kernel PCA. More recently, techniques have been proposed that, instead of defining a fixed kernel, try to learn the kernel using semidefinite programming. The most prominent example of such a technique is maximum variance unfolding (MVU). The central idea of MVU is to exactly preserve all pairwise distances between nearest neighbors (in the inner product space), while maximizing the distances between points that are not nearest neighbors.
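The "cost function that retains local properties" idea can be made concrete with LLE. The sketch below is a minimal NumPy illustration of the two classic LLE steps, not any particular library's implementation; the function name, regularization constant, and defaults are our own choices. Each point is reconstructed as an affine combination of its nearest neighbors, and the embedding is then read off the bottom eigenvectors of (I - W)ᵀ(I - W):

```python
import numpy as np

def lle(X, n_neighbors=5, n_components=2, reg=1e-3):
    """Minimal locally linear embedding sketch (illustrative only)."""
    n = X.shape[0]
    # squared pairwise distances, used only to find nearest neighbors
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    W = np.zeros((n, n))
    for i in range(n):
        idx = np.argsort(d2[i])[1:n_neighbors + 1]   # skip the point itself
        Z = X[idx] - X[i]                            # neighbors centered on x_i
        C = Z @ Z.T                                  # local covariance
        C += reg * np.trace(C) * np.eye(n_neighbors) # regularize for stability
        w = np.linalg.solve(C, np.ones(n_neighbors))
        W[i, idx] = w / w.sum()                      # weights sum to 1
    # Embedding: bottom eigenvectors of M = (I - W)^T (I - W),
    # discarding the constant eigenvector (smallest eigenvalue).
    I = np.eye(n)
    M = (I - W).T @ (I - W)
    _, vecs = np.linalg.eigh(M)
    return vecs[:, 1:n_components + 1]
```

Because the weights W depend only on each point's local neighborhood, the embedding preserves local geometry while ignoring global distances, which is exactly the property shared by the graph-based methods listed above.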
An alternative approach to neighborhood preservation is to minimize a cost function that measures differences between distances in the input and output spaces. Important examples of such techniques include classical multidimensional scaling (which is identical to PCA), Isomap (which uses geodesic distances in the data space), diffusion maps (which use diffusion distances in the data space), t-SNE (which minimizes the divergence between distributions over pairs of points), and curvilinear component analysis.