Computing Reviews
A constrained growing grid neural clustering model
Hung C. Applied Intelligence 43(1): 15-31, 2015. Type: Article
Date Reviewed: Aug 24 2015

The growing grid (GG) algorithm reduces the dimensional complexity of a dataset, irrespective of its topological structure. The model grows its units at a fixed rate and refines its topological structure in a growing stage and a fine-tuning stage, with larger and smaller learning rates, respectively. The paper under review constrains this learning process.
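
To make the two-stage process concrete, the following Python sketch shows a generic growing grid loop: the map grows at a fixed rate under a larger learning rate, then fine-tunes under a smaller one. This is a minimal illustration of the algorithm as described above, not the author's implementation; the function name, the parameters, and the column-only insertion heuristic are assumptions made for brevity.

```python
import numpy as np

def growing_grid(X, max_units=36, lr_grow=0.1, lr_tune=0.01, epochs=20, seed=0):
    """Sketch of a generic growing grid: grow a 2x2 map until max_units,
    then fine-tune with a smaller learning rate. Illustrative only."""
    rng = np.random.default_rng(seed)
    W = rng.standard_normal((2, 2, X.shape[1]))   # unit weight vectors
    err = np.zeros((2, 2))                        # accumulated error per unit

    def train(lr):
        for _ in range(epochs):
            for x in X:
                d = np.linalg.norm(W - x, axis=2)             # distance to every unit
                r, c = np.unravel_index(d.argmin(), d.shape)  # best matching unit (BMU)
                err[r, c] += d[r, c]
                # move the BMU and its 4-connected grid neighbours toward x
                for dr, dc in ((0, 0), (-1, 0), (1, 0), (0, -1), (0, 1)):
                    rr, cc = r + dr, c + dc
                    if 0 <= rr < W.shape[0] and 0 <= cc < W.shape[1]:
                        W[rr, cc] += lr * (x - W[rr, cc])

    while W.shape[0] * W.shape[1] < max_units:
        train(lr_grow)                            # growing stage: larger learning rate
        r, c = np.unravel_index(err.argmax(), err.shape)
        # grow at a fixed rate: insert a new column beside the highest-error unit
        # (a full GG also inserts rows; columns only here, to keep the sketch short)
        neighbour = min(c + 1, W.shape[1] - 1)
        W = np.insert(W, c + 1, (W[:, c] + W[:, neighbour]) / 2, axis=1)
        err = np.zeros(W.shape[:2])
    train(lr_tune)                                # fine-tuning stage: smaller learning rate
    return W
```

Calling growing_grid(X) on an (n_samples, n_features) array returns the trained grid of weight vectors.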

According to the author, “The constrained GG [CGG] emphasizes the effect of the lateral connections between output units in a grid, and neutralizes the effect on the distance between the input vector and neighbors of the best matching unit (BMU).” To test the constrained neural learning rule, he further extends GG to fine-tuned CGG and full CGG, which constrain the input vectors only at the fine-tuning stage and at both the growing and fine-tuning stages, respectively. CGG is a new approach to implementing the GG algorithm; this is the paper’s core contribution.
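
The quoted rule suggests that neighbor updates should be driven by the lateral grid connections rather than by each neighbor's own distance to the input. The sketch below contrasts the standard rule with one plausible reading of that constraint, in which the neighbors of the BMU learn from the BMU's weight vector instead of from the input; the exact constrained rule is specified in the paper, so this version is an assumption for illustration only.

```python
import numpy as np

def update_standard(W, x, lr, h):
    """Standard SOM/GG rule: every unit moves toward the input x,
    weighted by its neighbourhood value h (shape: rows x cols)."""
    return W + lr * h[..., None] * (x - W)

def update_constrained(W, x, bmu, lr, h):
    """One plausible reading of the constrained rule: the BMU still learns
    from the input, but its neighbours learn from the BMU through the
    lateral grid connections, so the input-to-neighbour distance no longer
    drives their updates. Assumed, not taken from the paper."""
    W = W.copy()
    r, c = bmu
    W[r, c] += lr * (x - W[r, c])                 # BMU moves toward the input
    mask = np.ones(W.shape[:2], dtype=bool)
    mask[r, c] = False
    # neighbours move toward the (updated) BMU weight, not toward x
    W[mask] += lr * h[mask][:, None] * (W[r, c] - W[mask])
    return W
```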

For evaluation, “the average quantization error (AQE), error entropy, average map scope, BMU activation rate (BAR), and topographic error (TE) are evaluated for the explanation ability, smooth hyper surface, border effect, BMU activation, and topology preservation of the GG algorithm, respectively.” The author uses 15 datasets and compares models including the self-organizing map (SOM), constrained SOM, GG, fine-tuned CGG, and full CGG. Using t-tests for statistical significance, he shows that the full CGG model outperforms the models with the original learning rule, which makes this paper an interesting read for those working in this area.
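
Two of the five measures, AQE and TE, have widely used definitions that are easy to state in code: AQE averages each input's distance to its BMU, and TE counts the inputs whose two best-matching units are not adjacent on the grid. The sketch below implements those common definitions; whether the paper uses exactly these formulations is an assumption.

```python
import numpy as np

def aqe_and_te(X, W):
    """Average quantization error (AQE) and topographic error (TE) for a
    grid map W of shape (rows, cols, dim), under their usual definitions."""
    rows, cols, dim = W.shape
    flat = W.reshape(-1, dim)
    coords = np.array([(r, c) for r in range(rows) for c in range(cols)])
    qe, topo_err = 0.0, 0
    for x in X:
        d = np.linalg.norm(flat - x, axis=1)
        first, second = np.argsort(d)[:2]        # best and second-best units
        qe += d[first]                            # quantization error of x
        # TE counts x as an error if the two best units are not grid neighbours
        if np.abs(coords[first] - coords[second]).sum() > 1:
            topo_err += 1
    return qe / len(X), topo_err / len(X)
```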

The author further considers five future issues:

(1) how the approach can be applied to neural gas (NG), growing neural gas (GNG), GNG with utility (GNG-U), grow when required (GWR), dynamic adaptive self-organizing hybrid (DASH), and so on;

(2) exploring and using receiver operating characteristics and other datasets;

(3) using a twist-free mechanism to avoid building a twisted topological map;

(4) learning length and effectiveness of topological preservation; and

(5) having online and offline specifications of algorithms.

Furthermore, the author proposes using and comparing constrained neural learning in offline as well as online mapping algorithms.

Reviewer: Lalit Saxena. Review #: CR143720 (1511-0982)
Categories: Artificial Intelligence (I.2); Algorithms (I.5.3); Neural Nets (I.5.1); Clustering (I.5.3)
Other reviews under "Artificial Intelligence":
Theory of genetic algorithms. Schmitt L. Theoretical Computer Science 259(1-2): 1-61, 2001. Type: Article (Mar 1 2002)
Artificial intelligence: a modern approach. Russell S., Norvig P., Pearson Education, 2003. 1132 pp. Type: Book (9780137903955), Reviews: (1 of 2) (Jul 16 2003)
Artificial intelligence: a modern approach. Russell S., Norvig P., Pearson Education, 2003. 1132 pp. Type: Book (9780137903955), Reviews: (2 of 2) (Jan 6 2005)
