Computing Reviews

Learning in parallel networks: simulating learning in a probabilistic system
Hinton G. (ed) BYTE 10(4): 265-273, 1985. Type: Article
Date Reviewed: 11/01/85

This is a short, popular account of some current work in machine learning, introduced with a description of convergence in perceptron-like networks. The author discusses Hopfield nets, which seek “minimum energy” patterns such that active units have maximal interfacilitation, and his own Boltzmann Machines, in which unit activation is probabilistic and the network settles to a “thermal equilibrium” in the presence of such background noise. The reference list contains a couple of more substantial papers.
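The probabilistic update rule the review alludes to can be summarized in a short sketch. The Python fragment below is illustrative only, assuming simple 0/1 units, a symmetric weight matrix with zero diagonal, and a geometric annealing schedule; none of it is drawn from the BYTE article. Each unit is switched on with probability sigma(energy gap / T), so at high temperature the network explores noisily and, as T falls, it settles toward a "minimum energy" pattern, approximating the thermal equilibrium described above.

    import numpy as np

    rng = np.random.default_rng(0)

    def settle(W, s, T=2.0, cooling=0.95, steps=500):
        """Hypothetical sketch of probabilistic settling in a binary network.

        W : symmetric weight matrix with zero diagonal
        s : initial 0/1 state vector
        """
        s = s.copy()
        for _ in range(steps):
            i = rng.integers(len(s))           # visit units in random order
            gap = W[i] @ s                     # energy gap from turning unit i on
            p_on = 1.0 / (1.0 + np.exp(-gap / T))
            s[i] = 1 if rng.random() < p_on else 0
            T = max(T * cooling, 0.05)         # anneal toward low temperature
        return s

Driving T to zero reduces the same rule to the deterministic threshold update of a Hopfield net, which is the connection between the two models that the article exploits.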

Reviewer: J. R. Sampson
Review #: CR109534
