Tackling unexplored issues in timed automata and timed languages, this paper formalizes the “quantitative analysis of the size of these languages and of the information content of timed words.”
The purpose of the presented research is to formalize entropy (the exponential growth rate of the number of words with n symbols) for deterministic timed automata. The authors’ solution replaces cardinality, used in classical approaches, with volume, yielding a definition of (volumetric) entropy. This measure represents the average quantity of information per event in a timed word of the language. The proposed method also draws a distinction between thick and thin timed automata, based on their Zeno-like behavior. Further, a positive integral operator is introduced that yields convergent numerical procedures and spectral and symbolic characterizations of entropy.
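To make the classical notion being generalized concrete: for an untimed regular language, entropy is the base-2 logarithm of the spectral radius of the adjacency matrix of a deterministic automaton recognizing it. The sketch below illustrates this standard fact on a textbook example; the automaton and helper function are illustrative choices, not taken from the paper under review.

```python
import numpy as np

def language_entropy(adjacency: np.ndarray) -> float:
    """Entropy of a regular language: log2 of the spectral radius
    (Perron eigenvalue) of the automaton's adjacency matrix."""
    eigenvalues = np.linalg.eigvals(adjacency)
    return float(np.log2(max(abs(eigenvalues))))

# Example: the golden-mean language of words over {0, 1} with no two
# consecutive 1s. States: "last symbol was 0" and "last symbol was 1".
golden_mean = np.array([[1.0, 1.0],
                        [1.0, 0.0]])
print(language_entropy(golden_mean))  # log2 of the golden ratio, ~0.694
```

The timed generalization reviewed here replaces this count-based matrix with a positive integral operator on volumes, whose spectral radius plays the same role.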
This very thorough paper situates its contribution in detail: it surveys classical work on the entropy of regular languages and methods for ruling out pathologies in timed automata, states the problem in strictly mathematical terms, and then presents the solution with equal precision, including appendices, several pages long, with proofs. It is very good reading for scholars and advanced students interested in the theory of automata and timed languages.