Shannon’s information theory has attracted researchers, academics, practitioners, and everyday users of information alike. Broadly, Shannon linked information to surprise by tying it to the predictability of probabilistic outcomes: the less likely an event, the more informative its occurrence. Yet despite progress in applying information theory, and the importance attached to predicting surprises, we still encounter surprises daily. The complexity of our lives, our environment, and the processes by which society is governed gives ample indication that such surprises should be expected regularly, and that continuous effort is needed to address them.
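Shannon's link between probability and surprise is usually quantified as self-information, −log₂ p(x), with entropy as the expected surprise over a distribution. As a minimal sketch (the function names here are illustrative, not from the book):

```python
import math

def self_information(p: float) -> float:
    """Surprise (in bits) of an outcome with probability p: -log2(p)."""
    return -math.log2(p)

def entropy(probs) -> float:
    """Expected surprise (Shannon entropy) of a distribution, in bits."""
    return sum(p * self_information(p) for p in probs if p > 0)

# Rarer outcomes carry more surprise: a 1-in-4 event is more
# informative than a 1-in-2 event.
print(self_information(0.5))   # 1.0 bit
print(self_information(0.25))  # 2.0 bits
print(entropy([0.5, 0.5]))     # 1.0 bit for a fair coin flip
```

This captures the intuition in the paragraph above: a highly predictable event carries little information, while an improbable one carries a great deal.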
This book elaborates on the foundations of information theory and Shannon’s contributions to the field, and introduces novelty as an additional, distinguishing conceptual dimension alongside information and surprise. Readers will be pleased to find foundational insights into statistical, probabilistic, and descriptive approaches to analyzing information, surprise, and the role of novelty. The author explains each characteristic of novelty, information, and surprise through mathematical and statistical theorems that will interest researchers and students pursuing complex modeling grounded in information theory.
Applying information theory to real-life situations is not new, though the demand for it is ever increasing. While the role of information and its services is well appreciated, the book enriches the foundations of information theory with the scope to relate them to practical situations. A real-life case study, with the roles of novelty, information, and surprise narrated through it, would have helped the reader appreciate the concepts more fully.
The book is organized into 17 chapters grouped into six parts. Part 1 exposes the reader to basic concepts of probability theory, information, and surprise. Part 2 delves into coding theory and information transmission, while Part 3 covers interesting discussions of information rate and channel capacity. Part 4 presents insights into novelty and information with conditioning, mutuality, and gains. Part 5, perhaps the most interesting part of the book, discusses the processing and transmission of information and relates it to neuroscience. Part 6 discusses generalized information theory. Together, these parts succinctly guide the reader through the underpinnings of information and coding theory.
The author sets out to identify additional sources of subjectivity in probability and information theory, and successfully presents arguments relating novelty, information, and surprise. Readers interested in probability, set theory, coding, and information theory will enjoy this book, though it may disappoint those looking for applications of the relevant subjective and objective concepts. Students, however, will appreciate the organized probing of the concepts and can deepen their understanding by working through the systematically presented questions at the end of each chapter.