Computing Reviews
Multilayer neural networks: a generalized net perspective
Krawczak M., Springer Publishing Company, Incorporated, New York, NY, 2013. 194 pp. Type: Book (978-3-319-00247-7)
Date Reviewed: Apr 16 2014

While many books have been published on multilayer neural networks, this one distinguishes itself with new insights into the structure and functioning of these systems, stated in terms of more general concepts and methods drawn from the theory of dynamical systems. The book aims to provide a framework for embedding multilayer neural networks, viewed as multistage systems, into an extension of Petri net theory called the theory of generalized nets. Concepts and tools from the generalized net methodology yield new descriptions of the functioning of discrete systems, such as the simulation of neural networks and of systems of neural networks, together with different classes of learning algorithms.

The book comprises seven chapters. Following a brief presentation of the basics of the classical theory of multilayer networks, the fundamentals of generalized nets, together with several algebraic and operator features, are summarized in chapter 2. According to the types of connections among the subsystems of a multilayer neural network, the author classifies these systems into three categories: systems without aggregation, where each neuron is treated as a subsystem; systems where the neurons are aggregated within each layer; and fully aggregated systems, where neither separate neurons nor layers are distinguished.
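
To give a flavor of the generalized net machinery the book builds on, here is a deliberately simplified sketch of my own (not the book's formalism, which also includes temporal components, place capacities, and token characteristics): a transition routes tokens from input places to output places according to a set of predicates.

    # Illustrative toy: a generalized-net transition reduced to input places,
    # output places, and predicates deciding where each token may move.
    class Transition:
        def __init__(self, inputs, outputs, predicates):
            # predicates[(i, j)] is the condition under which a token in
            # input place i may pass to output place j.
            self.inputs, self.outputs, self.predicates = inputs, outputs, predicates

        def fire(self, places):
            """Move tokens from input to output places where a predicate holds."""
            for i in self.inputs:
                for token in list(places[i]):
                    for j in self.outputs:
                        pred = self.predicates.get((i, j))
                        if pred is not None and pred(token):
                            places[i].remove(token)
                            places[j].append(token)
                            break

    # Toy usage: route a neuron's net input to an "activated" or an "idle" place.
    places = {"net_input": [0.7, -0.3], "activated": [], "idle": []}
    t = Transition(["net_input"], ["activated", "idle"],
                   {("net_input", "activated"): lambda tok: tok > 0,
                    ("net_input", "idle"): lambda tok: tok <= 0})
    t.fire(places)  # -> {"net_input": [], "activated": [0.7], "idle": [-0.3]}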

The simulation process of neural networks is next described in terms of the generalized net methodology, with the execution steps presented in some detail for each of these classes. The final section of chapter 3 is devoted to systems of neural networks, where the properties of some specific operators make it possible to describe their simulation process. Chapter 4 examines supervised learning, stated as the minimization of a mean error criterion function. The classic gradient descent learning schemes, namely the delta rule, the generalized delta rule, and back-propagation (BP), are briefly recalled. In the fourth section, the author provides an interesting description of the BP learning algorithm in terms of generalized nets.
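
As a rough reminder of the kind of learning rule the chapter recalls, here is a minimal NumPy sketch under standard textbook assumptions (sigmoid units, squared error); it is my own illustration, not code from the book.

    # One BP (generalized delta rule) gradient-descent step for a two-layer
    # sigmoid network with squared-error criterion J = (1/2)*||y - d||^2.
    import numpy as np

    def sigmoid(u):
        return 1.0 / (1.0 + np.exp(-u))

    def bp_step(x, d, W1, W2, eta=0.1):
        # Forward pass: hidden-layer and output-layer activations.
        h = sigmoid(W1 @ x)
        y = sigmoid(W2 @ h)
        # Backward pass: error signals (deltas) for each layer.
        delta2 = (y - d) * y * (1.0 - y)          # output layer
        delta1 = (W2.T @ delta2) * h * (1.0 - h)  # hidden layer
        # Gradient-descent updates: delta times the layer's input, scaled by eta.
        W2 = W2 - eta * np.outer(delta2, h)
        W1 = W1 - eta * np.outer(delta1, x)
        return W1, W2

    # Toy usage: one update for a 3-2-1 network on a single training pair.
    rng = np.random.default_rng(0)
    W1, W2 = rng.normal(size=(2, 3)), rng.normal(size=(1, 2))
    x, d = np.array([0.5, -1.0, 0.2]), np.array([1.0])
    W1, W2 = bp_step(x, d, W1, W2)

The delta terms are the error signals that the generalized delta rule propagates backward through the layers, and they are the quantities any step-by-step description of BP, generalized net or otherwise, must keep track of.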

The developments presented in chapter 5 aim to establish that the learning process of a neural network can be viewed as a multistage control problem. Consequently, the problem of finding an optimal set of weights can be treated as an optimal control problem and can therefore be solved by dynamic programming. A series of interesting developments appears in the fifth section of this chapter, where good approximations of the weights are derived using nonlinear programming methods built on dynamic programming, also known as differential dynamic programming. The last section of the chapter describes the first-order differential dynamic programming algorithm.
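
To make the multistage-control reading concrete, here is a hedged sketch in my own notation (the book's formulation may differ in detail): with the layer outputs x^k playing the role of states and the weight matrices W^k the role of controls,

    \[ x^{k+1} = F\big(W^{k} x^{k}\big), \qquad k = 0, 1, \dots, K-1, \qquad
       \min_{W^{0},\dots,W^{K-1}} J = \tfrac{1}{2}\,\big\| x^{K} - d \big\|^{2}, \]

where F applies the activation function componentwise and d is the desired output. Dynamic programming then works backward through the layers via the recursion

    \[ V_{K}\big(x^{K}\big) = \tfrac{1}{2}\,\big\| x^{K} - d \big\|^{2}, \qquad
       V_{k}\big(x^{k}\big) = \min_{W^{k}} V_{k+1}\big( F(W^{k} x^{k}) \big), \]

which is the structure that differential dynamic programming approximates around a nominal trajectory of layer outputs.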

An approximate linearization of the neuron output is proposed in chapter 6 by introducing a small gain parameter into the expression of the sigmoidal output function. The modified output function allows the learning problem to be solved using linear-quadratic system optimization techniques. Chapter 7 covers adjoint neural networks, where the learning problem is treated in terms of graph theory.
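
To illustrate the small-gain idea of chapter 6 (assuming, purely for the example, the standard logistic activation; the exact modified function used in the book may differ), a small gain parameter lambda keeps each unit near the linear part of its characteristic:

    \[ y = f(u) = \frac{1}{1 + e^{-\lambda u}} \;\approx\; \frac{1}{2} + \frac{\lambda}{4}\, u
       \quad \text{for small } \lambda, \]

so every neuron's input-output map becomes nearly affine, which is what makes linear-quadratic optimization machinery applicable to the learning problem.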

The long list of bibliographical references contains the most representative papers and books published in this area. Moreover, the final section of each chapter includes comments concerning the evolution of ideas together with the most outstanding published work in the respective areas. The developments presented in the book are both interesting and important, and open new perspectives for research in the area of simulation and learning processes in multilayer neural networks.

From both a theoretical and a practical point of view, the book is of real value to researchers in the field of neural networks. It is also useful for computer science and engineering students.

Reviewer: L. State
Review #: CR142183 (1407-0504)
Self-Modifying Machines (F.1.1 ... )
Connectionism And Neural Nets (I.2.6 ... )
Other reviews under "Self-Modifying Machines":

Complex systems dynamics
Weisbuch G., Ryckebushe S. (trans.), Addison-Wesley Longman Publishing Co., Inc., Boston, MA, 1991. Type: Book (9780201528879). Date: Dec 1 1991

On the computational efficiency of symmetric neural networks
Wiedermann J., Theoretical Computer Science 80(2): 337-345, 1991. Type: Article. Date: Mar 1 1992

Introduction to the theory of neural computation
Hertz J., Krogh A., Palmer R., Addison-Wesley Longman Publishing Co., Inc., Boston, MA, 1991. Type: Book (9780201503951). Date: Jan 1 1993

more...
