Computing Reviews
Bayesian nonlinear principal component analysis using random fields
Lian H. IEEE Transactions on Pattern Analysis and Machine Intelligence 31(4): 749-754, 2009. Type: Article
Date Reviewed: Aug 19 2009

The complexity of data involved in business and scientific applications has increased tremendously in recent years, and this trend appears set to continue. Thus, there is an urgent need for tools that can analyze and characterize large, high-dimensional datasets. One approach is to reduce the dimensionality of the data, so that it becomes easier for humans to visualize and for computers to process.

A classic technique for the dimensionality reduction problem is principal component analysis (PCA). Given observations in a high-dimensional space, PCA computes a projection matrix with which the observations can be projected onto a lower-dimensional (latent) space with minimum distortion. Since the projection operation is linear, the technique is not effective when the underlying data exhibits curved (nonlinear) structure. Many tools have recently been proposed to extend classic PCA to handle nonlinear structures. This paper presents one such tool.
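
To make the projection step concrete, here is a minimal sketch of classic linear PCA in Python with NumPy; the data, dimensions, and variable names are illustrative assumptions, not taken from the paper. The top-k right singular vectors of the centered data form the projection matrix, and projecting onto them minimizes squared reconstruction error among all rank-k linear projections.

```python
import numpy as np

# Toy data: 200 observations in a 5-dimensional space (illustrative only).
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 5))

# Center the data, then take the top-k right singular vectors as the
# projection matrix.
k = 2
Xc = X - X.mean(axis=0)
_, _, Vt = np.linalg.svd(Xc, full_matrices=False)
W = Vt[:k].T                         # 5 x 2 projection matrix
Z = Xc @ W                           # latent (low-dimensional) coordinates
X_hat = Z @ W.T + X.mean(axis=0)     # reconstruction in the original space

print("mean squared reconstruction error:", np.mean((X - X_hat) ** 2))
```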

Like other nonlinear PCA extensions, the proposed technique allows a different projection matrix at each observation point, resulting in a nonlinear latent space that is locally linear. The difference is that all model parameters, including the set of projection matrices and the latent variables, are given prior probability distributions, and inference over these parameters is carried out in a fully Bayesian fashion. One advantage is that no neighborhood structure needs to be assigned a priori at each observation; it is derived as part of the inference. A disadvantage is the high computational cost, as the inference requires an expensive Markov chain Monte Carlo (MCMC) process.
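
The paper's random-field formulation is considerably richer than what can be shown here. Purely as a rough illustration of fully Bayesian inference by MCMC, the sketch below runs a random-walk Metropolis sampler over the latent coordinates of a simplified probabilistic PCA model with a single fixed projection matrix and isotropic Gaussian priors; the model, priors, and proposal scale are assumptions made for illustration, not the author's formulation.

```python
import numpy as np

rng = np.random.default_rng(1)

# Simplified probabilistic PCA: x_i ~ N(W z_i, sigma^2 I), z_i ~ N(0, I).
# In the reviewed paper, the projection varies per observation and carries
# a random-field prior; here a single W is held fixed to keep things short.
n, d, k, sigma = 50, 4, 2, 0.5
W_true = rng.normal(size=(d, k))
Z_true = rng.normal(size=(n, k))
X = Z_true @ W_true.T + sigma * rng.normal(size=(n, d))

def log_post(Z, W):
    # Unnormalized log-posterior over the latent coordinates Z.
    resid = X - Z @ W.T
    return -0.5 * np.sum(resid ** 2) / sigma ** 2 - 0.5 * np.sum(Z ** 2)

# Random-walk Metropolis over Z, with W held at its true value.
Z = np.zeros((n, k))
lp = log_post(Z, W_true)
samples = []
for it in range(2000):
    Z_prop = Z + 0.05 * rng.normal(size=Z.shape)
    lp_prop = log_post(Z_prop, W_true)
    if np.log(rng.uniform()) < lp_prop - lp:
        Z, lp = Z_prop, lp_prop
    if it >= 1000:                   # discard burn-in
        samples.append(Z.copy())

Z_mean = np.mean(samples, axis=0)    # posterior-mean latent coordinates
print("posterior mean latent shape:", Z_mean.shape)
```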

This interesting approach addresses both self-organization and dimensionality reduction. However, the simple (isotropic) prior models may limit its potential and make the technique ineffective at handling highly nonlinear structures.

Reviewer: T. Kubota | Review #: CR137213 (1003-0304)
Categories: Statistical (I.5.1); Markov Processes (G.3)
Other reviews under "Statistical":

A formulation and comparison of two linear feature selection techniques applicable to statistical classification
Young D., Odell P. Pattern Recognition 17(3): 331-337, 1984. Type: Article. Reviewed: Mar 1 1985

Remarks on some statistical properties of the minimum spanning forest
Dubes R., Hoffman R. Pattern Recognition 19(1): 49-53, 1986. Type: Article. Reviewed: Dec 1 1987

Statistical pattern recognition--early development and recent progress
Chen C. International Journal of Pattern Recognition and Artificial Intelligence 1(1): 43-51, 1987. Type: Article. Reviewed: May 1 1988
