Computing Reviews
Independent component analysis: a tutorial introduction
Stone J., MIT Press, Cambridge, MA, 2004. Type: Book (9780262693158)
Date Reviewed: Mar 23 2005

This excellent book introduces the independent component analysis (ICA) method through examples from signal processing. It is addressed to beginners as well as to professionals, engineers, and scientists.

The book is divided into 11 chapters, arranged into five parts. The first part, “Independent Component Analysis and Blind Source Separation,” is introductory and contains two overview chapters. The overview starts with an intuitive example: multiple sound signals mixed and recorded through multiple microphones. The goal of ICA is to reconstruct the source signals from the mixed signals alone. ICA is presented as part of the larger field of “blind source separation,” where little is known about the nature of the source signals. The main assumptions are explained here in simple terms; for example, mixed signals differ from source signals in being more Gaussian and more complex. Applications, mentioned only briefly in this part, are detailed in a separate chapter later in the book.
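The mixing model behind the microphone example can be sketched in a few lines. The following is my own illustration, not code from the book (whose examples are in MATLAB): two sources s are combined by a mixing matrix A into the recorded signals x = As, and ICA's task is to estimate an unmixing matrix from x alone. Here the true inverse of A is used simply to show that the model is invertible.

```python
import numpy as np

# Hypothetical illustration (not code from the book) of the mixing model
# behind the cocktail-party example: x = A s, where each row of x is what
# one microphone records.
t = np.linspace(0, 1, 1000)
s = np.vstack([np.sin(2 * np.pi * 5 * t),   # source 1: a sine tone
               2 * (t * 3 % 1) - 1])        # source 2: a sawtooth
A = np.array([[0.6, 0.4],                   # mixing matrix (unknown in ICA)
              [0.3, 0.7]])
x = A @ s                                   # the mixed (recorded) signals

# ICA must estimate an unmixing matrix from x alone; here the true
# inverse of A is used only to show that the mixing model is invertible.
s_hat = np.linalg.inv(A) @ x
print(np.allclose(s_hat, s))                # True
```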

The second part, “The Geometry of Mixtures,” contains three chapters, which introduce the essential mathematical concepts. One chapter is dedicated to mathematical representations of signals: vectors, vector variables, and operations with vectors. Another presents linear transformations, inner products, matrices, geometric transformations, and orthogonal projections. The last chapter defines the basic statistical concepts used in ICA: histograms, probability density, the central limit theorem, cumulative distribution functions, moments, mean, variance, skewness, and kurtosis.
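The moment-based statistics listed above are straightforward to compute; a brief sketch (my own, in Python/SciPy rather than the book's MATLAB) shows them for two test signals and hints at why kurtosis matters for ICA:

```python
import numpy as np
from scipy.stats import skew, kurtosis

# Sketch (mine, not the book's code) of the moment-based statistics the
# chapter covers, computed for a sub-Gaussian and a Gaussian signal.
rng = np.random.default_rng(1)
uniform = rng.uniform(-1, 1, 100_000)   # sub-Gaussian signal
gauss = rng.normal(0, 1, 100_000)       # Gaussian signal

for name, sig in [("uniform", uniform), ("gauss", gauss)]:
    print(f"{name:8s}"
          f" mean={np.mean(sig):+.3f}"      # first moment
          f" var={np.var(sig):.3f}"         # second central moment
          f" skew={skew(sig):+.3f}"         # third standardized moment
          f" kurt={kurtosis(sig):+.3f}")    # excess kurtosis (0 for a Gaussian)

# The uniform signal has excess kurtosis near -1.2, the Gaussian near 0;
# by the central limit theorem, mixtures of non-Gaussian sources drift
# toward 0, the effect that ICA exploits.
```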

The third and most substantial part is called “Methods for Blind Source Separation.” Chapter 6 describes the projection pursuit version of ICA, which extracts source signals by making each extracted signal as non-Gaussian as possible. Chapter 7 introduces independence and entropy in statistical terms, along with the infomax version of ICA, which maximizes the entropy of the extracted signals after they have been passed through a specific model of the joint cumulative distribution function (CDF); the model assumes that the joint CDF factorizes into the product of the marginal CDFs, that is, that the sources are independent. As a method equivalent to infomax, the author describes the use of maximum likelihood estimation in the separation process. In chapter 8, “Complexity Pursuit,” the Kolmogorov approach to complexity is discussed; in contrast to projection pursuit, this version of ICA does not ignore signal structure. Chapter 9 describes the gradient ascent method, used to find the parameters that maximize a merit function such as the likelihood or a complexity measure. The last chapter of this part discusses principal component analysis (PCA) and factor analysis (FA). PCA assumes that source signals are Gaussian and statistically uncorrelated, a weaker condition than ICA’s independence. FA is viewed as PCA with an additional model for noise, in which the number of extracted factors can differ from the number of observed signals.
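Chapters 6 and 9 combine naturally: non-Gaussianity (here, the magnitude of kurtosis) serves as the merit function, and gradient ascent climbs it. The sketch below is my own minimal illustration of that combination, not the book's code, extracting one of two uniform sources from their whitened mixtures:

```python
import numpy as np

# Minimal sketch (my own, not the book's code): projection pursuit by
# gradient ascent, extracting one source by maximizing |kurtosis| of the
# projection y = w.z over unit vectors w, where z are whitened mixtures.
rng = np.random.default_rng(2)
n = 50_000
s = rng.uniform(-1, 1, (2, n))              # two sub-Gaussian sources
A = np.array([[0.8, 0.3], [0.2, 0.9]])      # mixing matrix
x = A @ s                                   # observed mixtures

# Whiten: linearly transform x so its covariance is the identity.
x = x - x.mean(axis=1, keepdims=True)
d, E = np.linalg.eigh(np.cov(x))
z = E @ np.diag(d ** -0.5) @ E.T @ x

w = rng.normal(size=2)
w /= np.linalg.norm(w)
for _ in range(200):
    y = w @ z
    kurt = np.mean(y ** 4) - 3 * np.mean(y ** 2) ** 2
    # Gradient of |kurt| w.r.t. w; the radial part is removed by renormalizing.
    grad = 4 * np.sign(kurt) * (z * y ** 3).mean(axis=1)
    w += 0.1 * grad
    w /= np.linalg.norm(w)                  # stay on the unit sphere

y = w @ z
c0 = abs(np.corrcoef(y, s[0])[0, 1])        # |correlation| with source 1
c1 = abs(np.corrcoef(y, s[1])[0, 1])        # |correlation| with source 2
print(c0, c1)                               # one near 1, the other near 0
```

The extracted signal matches one source up to sign and scale, the usual ICA ambiguities; which source is found depends on the random starting point.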

Part 4 contains only one chapter, “Applications of ICA.” The applications discussed include voice extraction, electroencephalograms, functional magnetic resonance imaging, fetal heart monitoring, and learning stereo disparity.

Finally, there are the appendices; the mathematics is more concentrated here, and is accompanied by MATLAB code examples. At the end of the book, readers will find a list of resources, recommended readings, and references.

This is a clearly written and well-organized book. The concepts it explains are supported by essential proofs and helpful graphical diagrams. The text strikes a balance between general statistical theory, signal theory, and worked examples of ICA. Even the repetitions, inherent in an introductory book, clarify and unveil new facets of the concepts, making the book a real pleasure to read.

Reviewer: Adrian Pasculescu (Review #: CR131028)
Categories: Multivariate Statistics (G.3); Connectionism And Neural Nets (I.2.6); Singular Value Decomposition (G.1.3); Numerical Linear Algebra (G.1.3)
Other reviews under "Multivariate Statistics":

An algorithm and fortran program for multivariate LAD (l1 of l2) regression. Kaufman E., Taylor G., Mielke P., Berry K. Computing 68(3): 275-287, 2002. Type: Article. Reviewed: Jan 21 2003

Influence diagnostics for generalized linear mixed models: applications to clustered data. Xiang L., Tse S., Lee A. Computational Statistics & Data Analysis 40(4): 759-774, 2002. Type: Article. Reviewed: Apr 17 2003

Asymptotic theory for multivariate GARCH processes. Comte F., Lieberman O. Journal of Multivariate Analysis 84(1): 61-84, 2003. Type: Article. Reviewed: Jul 31 2003

Reproduction in whole or in part without permission is prohibited.   Copyright 1999-2024 ThinkLoud®