Although intended as a course book, this work can also provide ideas for research in a new approach to artificial neural networks. The main goal, recognizing 2^n combinations with n neurons, immediately suggests some variation of the Hopfield model. Similarities exist, but the approach taken to analyze and express the so-called memory model neural network is different. Readers acquainted with automatic control theory will be most comfortable with the mathematical concepts used throughout the text, since they are based on the Lyapunov theory of stability in dynamical systems. Each desired state is stored, rather than learned in the sense used in more common models. The system represents a memory as a constant attractor trajectory, which, given a suitable choice of some parameters (such as gain), is the only stable trajectory. The common problem of spurious attractors (as in Hopfield memory) is alleviated, as is the problem of a state equidistant from two attractors. The speed of convergence, a crucial parameter for the practical use of the network, is high.
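For readers unfamiliar with the baseline the reviewer contrasts against, the following is a minimal sketch (not from the book) of a classical Hopfield associative memory, assuming bipolar patterns and the Hebbian outer-product storage rule. It also illustrates the spurious-attractor problem mentioned above: by the sign symmetry of the update rule, the negation of any stored pattern is itself a stable state.

```python
import numpy as np

def store(patterns):
    """Build a Hopfield weight matrix from bipolar (+1/-1) patterns
    using the Hebbian outer-product rule, with a zeroed diagonal."""
    n = patterns.shape[1]
    W = sum(np.outer(p, p) for p in patterns).astype(float)
    np.fill_diagonal(W, 0.0)  # no self-connections
    return W / n

def recall(W, state, steps=20):
    """Iterate the synchronous sign-update dynamics from an initial
    state; ties are broken toward +1."""
    s = state.copy()
    for _ in range(steps):
        s = np.where(W @ s >= 0, 1, -1)
    return s

if __name__ == "__main__":
    # Two orthogonal 8-bit patterns; corrupt one bit and recover it.
    p1 = np.array([1, 1, 1, 1, -1, -1, -1, -1])
    p2 = np.array([1, -1, 1, -1, 1, -1, 1, -1])
    W = store(np.stack([p1, p2]))
    noisy = p1.copy()
    noisy[0] = -1
    print(np.array_equal(recall(W, noisy), p1))   # recovers p1
    print(np.array_equal(recall(W, -p1), -p1))    # -p1 is a spurious attractor
```

A network like this stores only on the order of 0.14n patterns reliably, which is the capacity limitation the book's memory model, with its 2^n goal, sets out to overcome.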
A virtue of the book is that it contains solutions to all the exercises. Surprisingly, the author has chosen to use spreadsheets (such as Lotus 1-2-3) for exercises requiring numerical computations. This decision is probably a matter of personal taste, and a good appendix is supplied to help the reader. Another appendix discusses dynamical systems theory. After an introductory chapter 0 on the neural network approach to problem solving, chapter 1 treats neural networks as dynamical systems. The next two chapters, “Hypergraphs and Neural Networks” and “The Memory Model,” delve more deeply into the idea of dynamical systems. Chapter 4 presents code and image recognition applications. Chapter 5 extends the material in chapter 1 to develop a denser storage strategy. The final chapter discusses applications to some problems in operations research.