This paper, apparently derived from the first author's PhD thesis [1], takes as its starting point a definition of information based on the certainty, rather than the usual uncertainty, of a given experimental outcome. This leads to three different, and parametric, information measure functions: logarithmic (à la Shannon's [2]), linear, and hyperbolic. These functions have acceptable analytic properties, and various choices for the parameters yield, as special cases, many other information measures which have appeared over the years in the (mathematical) information sciences (including those of Rényi [3] and Vajda [4], among others). The authors thus properly claim a conceptual unification of diverse information measures.

The authors are, however, silent on how (if at all) this theory might be used in the field of computer science. The most obvious possibility is in pattern recognition, where the extended ranges of the functions provide more sensitivity to small changes in domain values. More generally, computer scientists should be aware that there are other ways to measure information than Shannon's base-2 logarithm; see [5] for yet another possibility, which might be of interest to AI types interested in common-sense inferencing. This paper is not obviously related to work in the physics of information (e.g., [6]) or similar thermodynamic issues.
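The paper's own parametric measures are not reproduced in this review, but the two cited special cases are standard. As a minimal sketch, the following compares Shannon's entropy [2] with the Rényi entropy of order α [3], which recovers Shannon's measure in the limit α → 1 (the distribution `p` here is an arbitrary example, not taken from the paper):

```python
import math

def shannon_entropy(p):
    """Shannon entropy in bits: H(p) = -sum_i p_i * log2(p_i)."""
    return -sum(x * math.log2(x) for x in p if x > 0)

def renyi_entropy(p, alpha):
    """Renyi entropy of order alpha (alpha > 0, alpha != 1):
    H_a(p) = log2(sum_i p_i**alpha) / (1 - alpha)."""
    return math.log2(sum(x ** alpha for x in p)) / (1 - alpha)

p = [0.5, 0.25, 0.25]
print(shannon_entropy(p))        # 1.5 bits
print(renyi_entropy(p, 2))       # order-2 ("collision") entropy, ~1.415 bits
print(renyi_entropy(p, 1.0001))  # approaches the Shannon value as alpha -> 1
```

Varying α changes how strongly the measure weights likely versus unlikely outcomes, which is the kind of extra sensitivity alluded to above for pattern-recognition applications.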
REFERENCES
[1] VAN DER LUBBE, J. C. A. A generalized probabilistic theory of the measurement of certainty and information, PhD thesis, Dept. of Electrical Engineering, Delft Univ. of Technology, Delft, The Netherlands, 1981.
[2] SHANNON, C. E. The mathematical theory of communication, Bell Syst. Tech. J. 27, 1948, 379–423, 623–656.
[3] RÉNYI, A. On measures of entropy and information, in Proc. of the Fourth Berkeley Symposium on Mathematical Statistics and Probability, (1960), 547–561.
[4] VAJDA, I. Bounds on the minimal error probability on checking a finite or countable number of hypotheses, Problemy Peredachi Informatsii 4, 1968, 9–19.
[5] DRETSKE, F. I. Knowledge and the flow of information, MIT Press, Cambridge, MA, 1981. See CR Rev. 8401-0009.
[6] ROBINSON, A. L. Computing without dissipating energy, Science 223, (March 1984), 1164–1166.