Computing Reviews
Brains, machines, and mathematics (2nd ed.)
Arbib M., Springer-Verlag New York, Inc., New York, NY, 1987. Type: Book (9780387965390)
Date Reviewed: Sep 1 1988

The second edition of Brains, machines, and mathematics follows the first edition by 23 years. This has been a complex time in the areas addressed by this book; paradigms have come and gone or changed substantially. Although the purpose of this latest monograph is not to be a chronicle of those events, some of the shifts of direction are inextricably linked to the topics of the book, and several of them are discussed explicitly. The purpose of the book is to provide an update on the topics of the first edition and to discuss salient and common features of brains, machines, and mathematics. There are few who could undertake a project of this nature and bring more experience and competence to the task than Michael Arbib. Original references that he has written or coauthored are cited at the end of each chapter.

This book is most accessible to those having a reasonable knowledge of set theory, predicate calculus, and partial differential equations. Set theory is accorded a brief appendix, and predicate calculus is given a brief review, but there is no similar aid for the encounter with partial differential equations in the chapter on learning networks. For those without a background in these areas, some details will remain obscure, but many of the discussions and general ideas will nonetheless be understandable without major difficulty.

Changes from the first edition include a diminution of the space directly devoted to cybernetics, although its dissolution into other areas is discussed briefly. The material on the correction of errors in communication and computation, including Shannon’s communication theory and his fundamental theorem for a discrete noisy channel, has been eliminated. In the second edition a short chapter on historical perspective has been added, there is a briefer version of the material on neural nets and finite automata in chapter 2, and the material on feedback and pattern recognition has been reorganized. There are also entirely new chapters on learning networks (chapter 5), Turing machines and effective computations (chapter 6), and automata that construct (and can self-reproduce) as well as compute (chapter 7). Both editions end with a chapter on Gödel’s incompleteness theorem (chapter 5 in the first edition, chapter 8 in the second). These chapters end with a persuasive discussion of the mind-machine controversy, in which Arbib espouses his philosophy that Gödel’s work is as limiting for brains (minds) as it is for machines. Scriven’s contributions are quoted and acknowledged in the first edition but are not mentioned in the second (although Arbib’s philosophy, which is described in considerable detail, seems to be the same as Scriven’s).

Some specific comments about the second edition follow. Chapter 2, which covers neural nets and finite automata, and chapter 3, which covers feedback and realization, are quite brief and omit much relevant material that could have been included. Nevertheless, the material selected is definitional, with just enough suggestion as to its applicability for later use.

Chapter 4, on pattern recognition, is somewhat more inclusive, although the topics are severely restricted to the direct goals of the monograph, as expected. The discussion of perceptrons includes the work of Minsky and Papert [1], which Arbib puts in proper perspective. The work of Winograd [2], Spira [3], and Spira and Arbib [4] on computation times is also discussed, providing insights not available in other monographs known to me.

The discussion of connectionism in chapter 5 begins with a suitable motivation from biological systems and some simple examples from the recent literature (cf. [5,6]). Current topics discussed (albeit very briefly) include synaptic matrices, which provide for distributed, associative memory, and autoassociative nets, which have a continuous output, with the output fed back to the input. Hopfield nets and Boltzmann machines are also discussed, and the concepts of hidden units and energy are introduced. Learning is introduced using the Hebb rule and Francis Crick’s suggestion to prevent saturation of synaptic weights. Back-propagation is also introduced as a paradigm for learning, with an associated convergence theorem. The descriptions of connectionist models are extremely brief but are sufficient to impart some of the major concepts.

Chapters 6 and 7 cover topics that are more foundational than current (Turing machines and automata, respectively), but that nevertheless contribute to the overall objectives of the monograph by providing support for the discussion of Gödel’s work and the brain-machine controversy in chapter 8.
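The synaptic-matrix and Hebb-rule ideas mentioned above can be illustrated with a minimal sketch, not taken from the book: bipolar patterns stored in a Hopfield-style weight matrix by the Hebbian outer-product rule and recalled by a thresholded matrix-vector product (function names `train_hebb` and `recall` are my own, for illustration only).

```python
# Illustrative sketch (not from Arbib's text): a Hebbian "synaptic matrix"
# acting as a distributed, associative memory of bipolar (+1/-1) patterns.

def train_hebb(patterns):
    """Hebb rule: W[i][j] = sum over stored patterns of x[i] * x[j]."""
    n = len(patterns[0])
    W = [[0] * n for _ in range(n)]
    for x in patterns:
        for i in range(n):
            for j in range(n):
                if i != j:              # no self-connections, as in Hopfield nets
                    W[i][j] += x[i] * x[j]
    return W

def recall(W, x, steps=5):
    """Synchronous update x <- sign(W x); the net settles into a stored pattern."""
    for _ in range(steps):
        x = [1 if sum(w * v for w, v in zip(row, x)) >= 0 else -1 for row in W]
    return x

# Store one pattern, then recover it from a corrupted probe.
p = [1, -1, 1, -1, 1, -1]
W = train_hebb([p])
noisy = [-1, -1, 1, -1, 1, -1]          # first element flipped
print(recall(W, noisy))                 # prints [1, -1, 1, -1, 1, -1]
```

The recall step is the error-correcting behavior the review alludes to: a probe near a stored pattern is pulled back to it, because each unit sums evidence from all the others through the synaptic matrix.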
For those desiring more detailed expositions of the material in this monograph, Arbib has authored or coauthored five other recent and relevant books (for example, see [7]). Relationships between this monograph and those books are discussed briefly in the preface to the second edition.

In summary, this monograph is a brief but wide-reaching exposition of some important issues in brain theory (and of artificial intelligence, in a fundamental sense) and the association between brain theory and machines. Its scope is surprising in light of its brevity. It provides suitable introductions to several areas important to the modeling of brains, and it delineates some of the research issues involved.

Reviewer: D. W. Dearholt
Review #: CR112337
1) Minsky, M. L., and Papert, S. Perceptrons: An Introduction to Computational Geometry. The MIT Press, Cambridge, MA, 1969.
2) Winograd, S. On the time required to perform multiplication. J. ACM 14, 4 (1967), 798–802. See <CR> 9, 5 (May 1968), Rev. 14,376.
3) Spira, P. M. The time required for group multiplication. J. ACM 16 (1969), 235–243. See <CR> 10, 11 (Nov. 1969), Rev. 17,920.
4) Spira, P. M., and Arbib, M. A. Computation times for finite groups, semigroups and automata. In Proceedings of the IEEE 8th Annual Symposium on Switching and Automata Theory (1967), 291–295.
5) Rumelhart, D., and McClelland, J. (Eds.) Parallel distributed processing: explorations in the microstructure of cognition. The MIT Press/Bradford Books, Cambridge, MA, 1986.
6) McClelland, J. L., and Rumelhart, D. E. An interactive activation model of context effects in letter perception: Part 1. An account of basic findings. Psychol. Rev. 88 (1981), 375–407.
7) Arbib, M. A. Computers and the cybernetic society. Academic Press, Orlando, FL, 1984.
Models Of Computation (F.1.1)
Human Information Processing (H.1.2 ...)
General (I.2.0)
Mathematical Logic (F.4.1)
Other reviews under "Models Of Computation":
- Communication and concurrency. Milner R., Prentice-Hall, Inc., Upper Saddle River, NJ, 1989. Type: Book (9780131150072). Reviewed Jan 1 1990.
- The social metaphor for distributed processing. Stark W., Kotin L. Journal of Parallel and Distributed Computing 7(1): 125-147, 1989. Type: Article. Reviewed Dec 1 1990.
- The language of machines. Floyd R., Beigel R. (ed), Computer Science Press, Inc., New York, NY, 1994. Type: Book (9780716782667). Reviewed Jun 1 1996.
