The standard approach to constructing a lexical analyzer automatically is to begin with a specification of the lexical elements as regular expressions and to translate the collection of regular expressions into a finite state automaton. The authors begin by describing an algorithm that performs this translation using lazy evaluation techniques. They then provide additional algorithms to modify the existing translation when a regular expression is added to the collection or an existing regular expression is removed from it.
The work is a well-written and interesting exercise in applying lazy incremental programming techniques to a standard compiler construction tool. It makes a natural companion piece to another paper by the same authors on generating parsers incrementally [1]. Whether there is any great need for an incremental scanner generator is, however, a major question in my eyes. I consider the existing tools to be blindingly fast, and I would not notice the difference if an even faster incremental generation technique were used. An even more important issue is that the generated scanner executes lazily: in effect, the entries in the tables defining the finite state machine are created only as the scanner is used. The implication is that the generated scanner runs significantly more slowly than it would if lazy techniques were not employed.