The process of building intelligence into AI engines is replete with problems. In applying predicate calculus, we encounter many of them, including the frame problem, the qualification problem, and the ramification problem. This paper addresses the so-called expensive-rule problem in explanation-based learning (EBL): in some cases, solving a problem is computationally more expensive after learning than before. The authors postulate that this expense is introduced inadvertently and unnecessarily by the learning algorithms, and they propose a method for identifying and eliminating its source. The key step is to formulate the EBL process as a sequence of transformations and to estimate the cost of each step in order to identify the culprits. This transformational analysis is presented in terms of EBL-SOAR, an implementation of EBL in SOAR, a public-domain architecture that combines general problem-solving abilities with a learning mechanism called “chunking.” Using this analysis, the authors identify three sources of expense and postulate that all three can be traced back to what they call “loss of information during learning.”
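The paper's analysis is analytical rather than empirical, but the underlying idea of attributing cost to individual transformation steps in a pipeline can be illustrated with a small sketch. The step names and functions below are hypothetical stand-ins, not the authors' actual transformations:

```python
import time

def apply_pipeline(steps, problem):
    """Apply each transformation in turn, recording the cost of each step."""
    costs = {}
    state = problem
    for name, step in steps:
        start = time.perf_counter()
        state = step(state)
        costs[name] = time.perf_counter() - start
    return state, costs

# Toy transformations standing in for stages of a learning pipeline
# (hypothetical names, for illustration only).
steps = [
    ("explain",    lambda s: s + ["explanation"]),
    ("generalize", lambda s: s + ["generalized rule"]),
    ("match",      lambda s: s + ["match result"]),
]

result, costs = apply_pipeline(steps, ["problem"])
# The step with the largest recorded cost is the candidate "culprit."
expensive = max(costs, key=costs.get)
```

Once per-step costs are isolated this way, an unexpectedly expensive step points to where the learning algorithm introduces overhead.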
This long paper, divided into seven sections, is easy to read. A brief tutorial on EBL and SOAR is followed by material on measuring the cost of learning. Sections 4 and 5, the heart of the paper, present the transformational analysis in detail.