Computing Reviews
A generalized class of certainty and information measures.
van der Lubbe J., Boekee D. Information Sciences 32(3): 187-215, 1984. Type: Article
Date Reviewed: Jan 1 1985

This paper, apparently derived from the first author's PhD thesis [1], takes as its starting point a definition of information based on the certainty, rather than the usual uncertainty, of a given experimental outcome. This leads to three different parametric information measure functions: logarithmic (à la Shannon's [2]), linear, and hyperbolic. These functions have acceptable analytic properties, and various choices of the parameters yield, as special cases, many other information measures that have appeared over the years in the (mathematical) information sciences (including those of Rényi [3] and Vajda [4], among others). The authors thus properly claim a conceptual unification of diverse information measures.

The authors are, however, silent on how (if at all) this theory might be used in the field of computer science. The most obvious possibility is in pattern recognition, where extended ranges of the functions provide more sensitivity to small changes in domain values. More generally, computer scientists should be aware that there are other ways to measure information than Shannon's base-2 logarithm. See [5] for yet another possibility, which might be of interest to AI types interested in commonsense inferencing. This paper is not obviously related to work in the physics of information (e.g., [6]) or similar thermodynamic issues.
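To give a concrete sense of how a single parametric family can unify several measures, consider the Rényi entropy [3] of order alpha, which recovers Shannon's entropy [2] in the limit as alpha approaches 1. The Python sketch below is illustrative only: the function names and sample distribution are this sketch's own assumptions, and the paper's certainty-based logarithmic, linear, and hyperbolic functions are not reproduced here.

    import numpy as np

    def shannon_entropy(p, base=2.0):
        # H(p) = -sum p_i log p_i, in units set by `base` (bits for base 2)
        p = np.asarray(p, dtype=float)
        p = p[p > 0]  # convention: 0 log 0 = 0
        return float(-np.sum(p * np.log(p)) / np.log(base))

    def renyi_entropy(p, alpha, base=2.0):
        # H_alpha(p) = log(sum p_i^alpha) / (1 - alpha); alpha -> 1 gives Shannon
        if alpha == 1.0:
            return shannon_entropy(p, base)
        p = np.asarray(p, dtype=float)
        p = p[p > 0]
        return float(np.log(np.sum(p ** alpha)) / ((1.0 - alpha) * np.log(base)))

    p = [0.5, 0.25, 0.125, 0.125]
    print(shannon_entropy(p))                  # 1.75 bits
    for alpha in (0.5, 0.999, 1.001, 2.0):
        print(alpha, renyi_entropy(p, alpha))  # approaches 1.75 near alpha = 1

Varying a single parameter thus sweeps through a continuum of measures, which is the flavor of unification the authors claim for their certainty-based families.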

REFERENCES

[1] Van der Lubbe, J. C. A. A generalized probabilistic theory of the measurement of certainty and information. PhD thesis, Dept. of Electrical Engineering, Delft Univ. of Technology, Delft, The Netherlands, 1981.
[2] Shannon, C. E. A mathematical theory of communication. Bell Syst. Tech. J. 27 (1948), 379–423, 623–656.
[3] Rényi, A. On measures of entropy and information. In Proc. of the Fourth Berkeley Symposium on Mathematical Statistics and Probability (1960), 547–561.
[4] Vajda, I. Bounds on the minimal error probability on checking a finite or countable number of hypotheses. Problemy Peredachi Informatsii 4 (1968), 9–19.
[5] Dretske, F. I. Knowledge and the flow of information. MIT Press, Cambridge, MA, 1981. See CR Rev. 8401-0009.
[6] Robinson, A. L. Computing without dissipating energy. Science 223 (March 1984), 1164–1166.

Reviewer:  M. Manthey Review #: CR108788
Information Theory (H.1.1 ... )
 
Other reviews under "Information Theory":
Information in the enterprise
Darnton G., Giacoletto S., Digital Press, Newton, MA, 1992. Type: Book (9780131761735)
Sep 1 1993
Information theory for information technologists
Usher M., Macmillan Press Ltd., Basingstoke, UK, 1984. Type: Book (9780333367032)
Sep 1 1985
Thirty years of information theory
Tribus M., John Wiley & Sons, Inc., New York, NY, 1983. Type: Book (9780471887171)
Jul 1 1987
more...
