Computing Reviews
An introduction to machine learning (2nd ed.)
Kubat M., Springer International Publishing, New York, NY, 2017. 348 pp. Type: Book (978-3-319-63912-3)
Date Reviewed: Mar 9 2018

In his introduction, the author states that “machine learning has come of age.” In many ways this is true, if the scope and methods are taken to be those established in the book. From the point of view of recent research, however, and especially of industrial initiatives, it is not true in terms of approaches, performance, and scalability.

The volume has 17 chapters, starting at a basic level with the design of a two-class classifier and then moving on to k-nearest neighbor classifiers, neural networks, decision trees, task attributes, unsupervised learning around k-means, rule sets, genetic learning, and reinforcement learning. The progressive, pedagogical style is appreciated: chapters contain sections entitled “What Have You Learned” and “Solidifying Your Knowledge,” along with some examples or applications and, occasionally, exercises.

But to a very large extent, the content matches the state of research in the 1980s, as reflected in a survey like the one by Jain et al. [1]. The range of approaches and corresponding analysis does not match established textbooks from that period [2,3]; for example, image understanding is not covered and learning set size is not addressed from a sample statistics viewpoint.

Looking forward, the volume does not address, at an introductory level, the many other facets that machine learning has recently incorporated from its predecessors, pattern recognition and knowledge representation: data sorting and partitioning for induction, incremental graph relational methods, annealing, dynamic clusters with or without subspaces, the huge variety of learning algorithms found in freeware, and so on. This evolution was driven by data diversity and scalability, as well as by the need to characterize the learned classes at several levels.

In summary, this volume offers a pedagogical and refreshing overhaul of older established textbooks in pattern recognition and neural networks, but misses out on recent developments and their data structures. In the bibliography, most of the references date from well before the middle of the last decade. There is a short index.

Reviewer: Prof. L.-F. Pau, CBS. Review #: CR145908 (1806-0287)
1) Jain, A. K.; Duin, R. P. W.; Mao, J. Statistical pattern recognition: a review. IEEE Transactions on Pattern Analysis and Machine Intelligence 22, 1 (2000), 4–37.
2) Duda, R. O.; Hart, P. E. Pattern classification and scene analysis. John Wiley & Sons, New York, NY, 1973.
3) Devijver, P. A.; Kittler, J. Pattern recognition: a statistical approach. Prentice Hall, London, UK, 1982.
Learning (I.2.6)
 
Other reviews under "Learning":
Learning in parallel networks: simulating learning in a probabilistic system
Hinton G. (ed) BYTE 10(4): 265-273, 1985. Type: Article
Nov 1 1985
Macro-operators: a weak method for learning
Korf R. Artificial Intelligence 26(1): 35-77, 1985. Type: Article
Feb 1 1986
Inferring (mal) rules from pupils’ protocols
Sleeman D. Progress in artificial intelligence, Orsay, France, 1985. Type: Proceedings
Dec 1 1985
