Computing Reviews
Artificial neural networks - ICANN 2006: 16th International Conference, Athens, Greece, September 10-14, 2006, Proceedings, Part I (Lecture Notes in Computer Science 4131)
Kollias S., Staffylopatis A., Duch W., Oja E., Springer-Verlag New York, Inc., Secaucus, NJ, 2006. 1008 pp. Type: Book (9783540386254)
Date Reviewed: Feb 22 2008

Since 1991, the International Conference on Artificial Neural Networks (ICANN) has been the premier gathering for its field. It is supported by a number of societies, including the European Neural Network Society (ENNS), the International Neural Network Society (INNS), the Japanese Neural Network Society (JNNS), and the Institute of Electrical and Electronics Engineers (IEEE) Computational Intelligence Society (CIS). Part 1 collects papers presented at ICANN 2006, held in Athens, Greece, in September 2006. For this meeting, 208 of the 475 submitted papers were accepted for presentation and for subsequent publication in the proceedings, an acceptance rate of roughly 44 percent. The structure of the proceedings parallels that of the conference, which contained 21 regular sessions and ten special sessions.

Although it would be beyond the scope of a review to discuss every paper in these proceedings, it is useful for the reader to know the overall topics covered in this volume. These include: ensemble learning and learning algorithms; self-organization of neural networks (NNs); connectionist cognitive science and hybrid architectures; computational neuroscience; and neural dynamics and complex systems. The editors divided the proceedings into two volumes so that each volume covers roughly half of the sections and papers. The editors should be congratulated on accomplishing what is clearly such a huge task.

When describing an environment for a statistical or neural learning algorithm, a vector of features or attributes is provided from which the target value is determined or predicted. The description often involves many variables, and, for reasons of accuracy and efficiency, reorganizing or removing features can help the learning algorithm focus on the essential variables that capture the target value under analysis.
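To make this concrete (the example is not drawn from any paper in the volume), a minimal feature-selection sketch in Python might score each feature by its absolute correlation with the target and keep only the top-scoring columns; the toy data, the scoring rule, and the helper name select_top_k are illustrative assumptions.

    import numpy as np

    # Toy data: 100 samples, 5 features; only the first two actually drive the target.
    rng = np.random.default_rng(0)
    X = rng.normal(size=(100, 5))
    y = 3.0 * X[:, 0] - 2.0 * X[:, 1] + 0.1 * rng.normal(size=100)

    # Score each feature by its absolute Pearson correlation with the target,
    # then keep the k highest-scoring columns (a simple filter-style selection).
    def select_top_k(X, y, k):
        scores = np.array([abs(np.corrcoef(X[:, j], y)[0, 1]) for j in range(X.shape[1])])
        keep = np.argsort(scores)[::-1][:k]
        return X[:, keep], keep

    X_reduced, kept = select_top_k(X, y, k=2)
    print("kept feature indices:", kept)  # expected to favor columns 0 and 1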

The first six papers describe various methods and approaches for accomplishing this. Kwak and Kim use a statistical approach, whereas Kurszynski et al. use a probabilistic approach with genetic algorithms. The second section’s six papers deal with learning algorithms. Ioannou et al. present a fascinating algorithm for adaptively training an entire NN to extract facial expressions, coupled with audio modalities, in order to determine the emotional content of a person’s delivery of a speech. The third section, a continuation of learning algorithms, consists of five papers and concentrates on the construction and architecture of NNs. Advances in NN learning was a special session at the conference, and the nine papers accepted for this session are presented in Section 4. In these papers, classic approaches, such as k-separability, kernel methods, entropy minimization, and Hebbian learning, are enhanced for better results and performance.
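As a generic illustration of the Hebbian idea behind that session (a sketch only, not a reconstruction of any accepted paper), Oja's stabilized Hebbian update, w <- w + eta * y * (x - y * w) with y = w . x, drives the weight vector toward the direction of largest variance in the data; the data and learning rate below are assumptions.

    import numpy as np

    # Oja's rule: a normalized Hebbian update that converges toward the
    # first principal component of zero-mean data.
    rng = np.random.default_rng(1)
    data = rng.normal(size=(1000, 2)) @ np.array([[3.0, 0.0], [0.0, 0.5]])  # anisotropic cloud
    data -= data.mean(axis=0)

    w = rng.normal(size=2)
    eta = 0.01
    for x in data:
        y = w @ x                   # postsynaptic activity
        w += eta * y * (x - y * w)  # Hebbian growth with built-in normalization

    print("learned direction:", w / np.linalg.norm(w))  # roughly (+/-1, 0)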

Ensemble learning algorithms, presented in Section 5, play an important role in building NN architectures for learning. Ensembles train a diverse set of predictors, often on different subsets of examples, and construct a decision function by combining their outputs. Mohammed et al. consider incremental learning that introduces new concept classes as new data arrive. Ñanculef et al. incorporate local diversity into the learning process. Prudêncio and Ludermir construct the decision function by deriving weights for a linear combination of forecasts. The next section, containing four papers, is dedicated to a special session on random NNs and stochastic agents. This is aptly followed by a section on hybrid architectures, containing six papers. The section on self-organization, with eight papers, is a blend of theoretical musings and practical applications on this most important topic.
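The idea of deriving weights for a linear combination of forecasts can be sketched as follows; this is not Prudêncio and Ludermir's method, only a generic example in which the weights for three simple base forecasters are fit by least squares on the first half of a toy series and then applied to the held-out second half. The series and the base forecasters are assumptions for illustration.

    import numpy as np

    rng = np.random.default_rng(2)
    t = np.arange(300, dtype=float)
    y = np.sin(t / 15.0) + 0.1 * rng.normal(size=t.size)

    # Three simple one-step-ahead base forecasts for y[i], built from past values.
    i = np.arange(3, y.size)
    f_naive = y[i - 1]                                  # persistence
    f_mean3 = (y[i - 1] + y[i - 2] + y[i - 3]) / 3.0    # short moving average
    f_trend = 2.0 * y[i - 1] - y[i - 2]                 # linear extrapolation
    F = np.column_stack([f_naive, f_mean3, f_trend])
    target = y[i]

    # Derive combination weights on the first half by least squares,
    # then apply the fixed weights to the second half.
    half = F.shape[0] // 2
    w, *_ = np.linalg.lstsq(F[:half], target[:half], rcond=None)
    combined = F[half:] @ w

    print("weights:", np.round(w, 3))
    print("combined MSE:", np.mean((combined - target[half:]) ** 2))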

The remainder of the proceedings contains a number of sections that group papers under an overall theme. The first four sections concentrate on neuroscience and related issues, with a total of 32 papers: connectionist cognitive science (six papers); a special session on cognitive networks (nine papers); neural dynamics and complex systems (nine papers); and computational neuroscience (eight papers). Robotic applications span the subsequent two sections, with nine and six papers, respectively. The final section, with seven papers, considers specialized NNs inspired by biological models.

Reviewer:  Minette Carl Review #: CR135292 (0812-1162)
General (I.2.0)
Conference Proceedings (A.0)
Self-Modifying Machines (F.1.1)
General (F.1.0)
General (I.5.0)
General (I.6.0)
Other reviews under "General":
Artificial experts: social knowledge and intelligent machines. Collins H., MIT Press, Cambridge, MA, 1990. Type: Book (9780262031684). Apr 1 1991
Catalogue of artificial intelligence techniques. Bundy A., Springer-Verlag New York, Inc., New York, NY, 1990. Type: Book (9780387529592). Aug 1 1991
Knowledge and inference. Nagao M., Academic Press Prof., Inc., San Diego, CA, 1990. Type: Book (9780125136624). Oct 1 1991
