Constraints on the Neighborhood Size in LLE

Zhengming MA  Jing CHEN  Shuaibin LIAN  

Publication
IEICE TRANSACTIONS on Information and Systems   Vol.E94-D   No.8   pp.1636-1640
Publication Date: 2011/08/01
Online ISSN: 1745-1361
DOI: 10.1587/transinf.E94.D.1636
Print ISSN: 0916-8532
Type of Manuscript: PAPER
Category: Pattern Recognition
Keyword: 
locally linear embedding,  nonlinear dimensionality reduction,  manifold learning,  principal component analysis



Summary: 
Locally linear embedding (LLE) is a well-known method for nonlinear dimensionality reduction. The mathematical proof and experimental results presented in this paper show that the neighborhood sizes in LLE must be smaller than the dimensions of the input data spaces; otherwise, LLE degenerates from a nonlinear method for dimensionality reduction into a linear one. Furthermore, when the neighborhood sizes are larger than the dimensions of the input data spaces, the solutions to LLE are not unique, and the addition of some regularization method is often proposed. The experimental results presented in this paper show that the regularization method is not robust: regularization parameters that are too large or too small cannot unwrap the S-curve, and although a moderate regularization parameter can unwrap the S-curve, the relative distances in the input data are distorted in the unwrapping. Therefore, in order for LLE to fully exploit its advantage in nonlinear dimensionality reduction and to avoid non-unique solutions, the best approach is to make sure that the neighborhood sizes are smaller than the dimensions of the input data spaces.
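
The non-uniqueness described in the summary can be illustrated numerically. In LLE, the reconstruction weights at each point are obtained by solving a linear system built from the local Gram matrix of the neighbors; when the neighborhood size k exceeds the input dimension D, that Gram matrix is rank-deficient, so the weight system is underdetermined and regularization becomes necessary. The following NumPy sketch (the data, point, and parameter choices are illustrative, not taken from the paper) checks the rank of the local Gram matrix for k > D:

```python
import numpy as np

rng = np.random.default_rng(0)
D, k = 3, 10                        # input dimension D and neighborhood size k, with k > D
X = rng.standard_normal((200, D))   # illustrative data set in R^D

x = X[0]                            # an arbitrary query point
# Indices of the k nearest neighbors of x (excluding x itself)
dists = np.linalg.norm(X - x, axis=1)
idx = np.argsort(dists)[1:k + 1]
Z = X[idx] - x                      # k x D matrix of centered neighbors

# Local Gram matrix used when solving for the reconstruction weights
C = Z @ Z.T                         # k x k, but rank(C) <= min(k, D) = D
rank = np.linalg.matrix_rank(C)
print(rank, k)                      # rank <= D < k: the weight system is singular
```

Because rank(C) is at most D < k, the system C w = 1 defining the reconstruction weights has infinitely many solutions, which is exactly the situation the paper argues should be avoided by choosing k < D rather than patched with a regularization term.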