This book presents a statistical treatment of a common neural network structure, the multilayer perceptron, in language familiar to working statisticians. Several questions arise when statisticians are confronted with such a model, and the book tries to provide thorough answers. Moreover, by concentrating on the pattern recognition aspects of neural networks, the author is able to treat many important practical topics in greater depth.
The book is organized into 13 chapters. Chapters 1 through 5 describe the multilayer perceptron model (MPM) and present comparative studies with other statistical models used for classification tasks. Chapters 6 and 7 describe possible adaptations of the MPM to settings “with a large number of classes” and to “some image problems.” Chapters 8 and 9 are dedicated to analyzing the robustness of the MPM by considering the influence curves for its weights. Chapters 10 through 13 illustrate extensions and adjustments of the MPM: how to make the model more robust, how to use spectral data, and how to modify the model weights during the fitting procedure.
Some chapters include complements and exercises that are very useful for understanding several abstract ideas. Unfortunately, no answers to the exercises are provided; this is my only major complaint.
This volume is valuable for scientists and practitioners interested in neural networks used for pattern recognition. Its most important contribution is the solid mathematical approach, which is a sign of the field's increasing maturity.
In summary, the book offers a deep perspective on a widely used topic, and its emphasis on the pattern recognition theme makes it strongly recommended to the artificial intelligence (AI) community.