Truth maintenance systems provide caches of beliefs and inferences that support explanations and search. Traditionally, the cost of using a TMS is monotonic growth in the size of this cache. In some applications this cost is too high; for example, intelligent learning environments may require students to explore many alternatives, swelling the cache until performance becomes unacceptable. This paper describes an algorithm for fact garbage collection that retains the explanation-generating capabilities of a TMS while eliminating this growth in storage overhead. We describe the application context that motivated this work and the properties of applications that benefit from this technique. We present the algorithm, showing how to balance the tradeoff between maintaining a useful cache and reclaiming storage, and analyze its complexity. We demonstrate that this algorithm can eliminate monotonic storage growth, thus making it more practical to field large-scale TMS-based systems.
John O. Everett, Kenneth D. Forbus
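To make the idea concrete, the following is a minimal, hypothetical sketch (not the paper's algorithm): a toy justification-based fact cache whose collect routine marks the facts reachable from a chosen set of still-relevant conclusions and sweeps the rest, while keeping enough justification structure to explain the retained facts. The names FactCache, collect, and explain are illustrative assumptions, not APIs from the paper.

```python
# Hypothetical illustration of fact garbage collection in a TMS-style cache.
# Assumption: each fact is cached with justifications (tuples of antecedent
# facts; the empty tuple marks a premise), so explanations can be rebuilt.

class FactCache:
    def __init__(self):
        # fact -> list of justifications that derive it
        self.justifications = {}

    def add_premise(self, fact):
        self.justifications.setdefault(fact, []).append(())

    def add_derivation(self, fact, antecedents):
        # Record that `fact` was derived from `antecedents`.
        self.justifications.setdefault(fact, []).append(tuple(antecedents))

    def explain(self, fact):
        # Return one derivation tree for `fact`, or None if it is not cached.
        justs = self.justifications.get(fact)
        if not justs:
            return None
        antecedents = justs[0]
        return (fact, [self.explain(a) for a in antecedents])

    def collect(self, roots):
        # Mark every fact reachable through justifications from the facts we
        # still care about, then sweep the rest. The cache/storage tradeoff
        # shows up in the choice of `roots`: a larger root set keeps more of
        # the cache, a smaller one reclaims more storage.
        marked = set()
        stack = [f for f in roots if f in self.justifications]
        while stack:
            fact = stack.pop()
            if fact in marked:
                continue
            marked.add(fact)
            for just in self.justifications[fact]:
                stack.extend(a for a in just if a not in marked)
        self.justifications = {f: js for f, js in self.justifications.items()
                               if f in marked}


if __name__ == "__main__":
    cache = FactCache()
    cache.add_premise("A")
    cache.add_premise("B")
    cache.add_derivation("C", ["A", "B"])   # C follows from A and B
    cache.add_derivation("D", ["A"])        # D is an abandoned exploration
    cache.collect(roots=["C"])              # reclaim facts not supporting C
    print(cache.explain("C"))               # explanation survives collection
    print(cache.explain("D"))               # None: D was garbage-collected
```

In this toy version, explanations for collected facts are lost outright; the contribution the abstract describes is precisely how to reclaim storage without giving up the TMS's explanation-generating capability.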