Computing Reviews
Information is energy: definition of a physically based concept of information
Pagel L., Springer International Publishing, Cham, Switzerland, 2023. 238 pp. Type: Book (3658408618)
Date Reviewed: Apr 8 2024

Shannon’s theory of information is one of the cornerstones of modern computer science (CS). The formalism he developed for quantifying information content and channel capacity was previously known from statistical thermodynamics as entropy, and the significance of this formal alignment between computation and thermodynamics has invited frequent discussion and speculation. A practitioner of either discipline who is interested in this alignment might well collect a notebook of thoughts and references on the subject, and Professor Pagel’s book is an excellent example of what such a notebook might look like. Rich in citations from other researchers and wide-ranging in the subjects it discusses (including CS, physics, biology, philosophy, self-organization, consciousness, and cosmology), it engages the reader in an ongoing conversation with different strands of thought around its theme. This diffuse organization also means that the book does not present a detailed development of a unified theory of its own. But it is a valuable compendium of thoughts and references that can both stimulate and support the development of a more complete theory.
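
For readers who have not seen the alignment spelled out, the two formulas can be set side by side (these are the standard textbook forms; the notation here is this reviewer's gloss, not taken from the book):

    H = -\sum_i p_i \log_2 p_i        (Shannon entropy, in bits)
    S = -k_B \sum_i p_i \ln p_i       (Gibbs entropy, in joules per kelvin)

Up to the constant conversion factor k_B \ln 2, the two expressions are identical, and that formal identity is what underlies the discussion the book surveys.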

The title, Information is energy, intentionally challenges the longstanding view in information science that information is entropy. In physics, entropy is measured in joules (energy) per kelvin (temperature), the same units as heat capacity. At a constant temperature, a change in entropy thus corresponds directly to a transfer of energy, and Pagel emphasizes the importance of recognizing this in computation. He asserts early on, “The central property of information is transferability,” and “Information that is not transmissible ... can be considered irrelevant or non-existent.” This commitment leads directly to an analysis of the quantum mechanical limits on information transfer imposed by the Heisenberg uncertainty principle, and to the definition of information as entropy per unit time.
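
A rough dimensional sketch of that last step (this reviewer's gloss on the style of argument, not a quotation of the book's derivation): if a carrier must make one bit distinguishable within a transfer time \Delta t, the energy-time uncertainty relation

    \Delta E \cdot \Delta t \ge \hbar / 2

implies a minimum energy on the order of E \sim \hbar / (2 \Delta t) committed to the transfer. The faster a bit must move, the more energy its carrier requires, which is the sense in which defining information dynamically, as entropy per unit time, ties it to energy rather than to entropy alone.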

An introductory chapter surveys the domains impacted by the concepts to be explored. Chapter 2 argues that information must be defined physically, in view of its transmissibility, relying heavily on quantum theoretical foundations. Chapter 3 surveys a wide range of definitions and theories of information, including algorithmic information theory, the relation between information and knowledge, and semantic versus syntactic information. Chapter 4 offers a comparable review of the various contexts in which entropy has been applied, not only in information theory and thermodynamics but also in quantum mechanics. Chapter 5 returns to the theme of dynamic information to explore how the notion of information transfer can be applied to thermodynamic systems, thus completing the conceptual cycle begun by importing the thermodynamic concept of entropy into information theory.

Chapters 6 through 8 explore the applicability of these concepts more broadly to other fields. Chapter 6 discusses structure formation through entropy export in open systems. Chapter 7 speculates on the nature of consciousness and the contribution of information theory, broadly understood, to this problem, while chapter 8 considers relativistic limits on the movement of information. The book closes with a brief summary, a bibliography of 112 items through 2022, and a very brief index.

The wide range of disciplines that the book integrates makes it challenging to read, and the treatment often focuses on comparing the contributions and approaches of different researchers who have confronted the subject rather than on developing a single coherent theory. But it is unparalleled as a collection of references on the relation between computation and physics, and it will certainly contribute to further, more systematic work in this field.

Reviewer: H. Van Dyke Parunak
Review #: CR147739
Information Theory (H.1.1 ...)
Physics (J.2 ...)
Other reviews under "Information Theory":
A generalized class of certainty and information measures. van der Lubbe J., Boekee D. Information Sciences 32(3): 187-215, 1984. Type: Article. Date reviewed: Jan 1 1985
Information in the enterprise. Darnton G., Giacoletto S., Digital Press, Newton, MA, 1992. Type: Book (9780131761735). Date reviewed: Sep 1 1993
Information theory for information technologists. Usher M., Macmillan Press Ltd., Basingstoke, UK, 1984. Type: Book (9780333367032). Date reviewed: Sep 1 1985
