This paper examines the use of second-order decision tables for the compact representation of knowledge. They serve in two modes: as a compact representation of expert knowledge, and as a compressed representation of learned rules that acts as a hypothesis for classifying large data sets. Their similarity to relational tables suggests both a natural visualization and a rich set of operations for compressing a simple flat table into a second-order table with multivalued condition attributes.
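To make the idea concrete, the following is a minimal sketch of one such compression: rows of a flat table that share a decision and differ in exactly one condition attribute are merged, turning that attribute's cell into a set of values. The `merge_step` and `compress` helpers are hypothetical illustrations, not the operations defined in the paper.

```python
from itertools import combinations

def merge_step(rows):
    """Merge one pair of rows that share a decision and differ in exactly
    one condition attribute; condition cells are frozensets of values.
    Returns True if a merge was performed."""
    for (c1, d1), (c2, d2) in combinations(rows, 2):
        if d1 != d2:
            continue
        diff = [i for i, (a, b) in enumerate(zip(c1, c2)) if a != b]
        if len(diff) == 1:  # differ in a single condition attribute
            i = diff[0]
            merged = tuple(c1[j] | c2[j] if j == i else c1[j]
                           for j in range(len(c1)))
            rows.remove((c1, d1))
            rows.remove((c2, d2))
            rows.append((merged, d1))
            return True
    return False

def compress(flat_rows):
    """Compress a flat table [((cond1, cond2, ...), decision), ...]
    into a second-order table with multivalued condition cells."""
    # lift each scalar condition value to a singleton set
    rows = [(tuple(frozenset([v]) for v in cond), dec)
            for cond, dec in flat_rows]
    while merge_step(rows):
        pass
    return rows
```

For example, the flat rows `(("sunny", "no"), "play")` and `(("overcast", "no"), "play")` collapse into a single second-order row whose first condition cell is `{"sunny", "overcast"}`. This greedy pairwise merge is only a toy: it ignores the equivalence- and consistency-preservation checks that the paper's operations enforce.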
For knowledge restructuring, a set of equivalence-preserving compression operations is defined and used to reduce such flat tables to second-order decision tables. Of particular interest for knowledge systems is the use of the second-order decision table representation as a simple classification hypothesis, for which an additional set of consistency-preserving operations is employed. The author discusses the results of experiments with second-order relation compression for extraction of rules (SORCER) on data sets from the Machine Learning Repository.
SORCER is a supervised learning system that induces second-order decision tables from a given database. This paper demonstrates that even a simple induction process, based on decision-table reduction, classifies the given data sets accurately.