Learning a task from a single demonstration is a significant challenge because the observed sequence is an inherently incomplete representation of the procedure, one that is specific to the current situation, and observation-based machine-learning techniques are not effective without multiple examples. However, when a demonstration is accompanied by a natural language explanation, the language provides a rich source of information about the relationships between the steps in the procedure and the decision making that led to them. In this paper, we present a one-shot task learning system built on TRIPS, a dialogue-based collaborative problem-solving system, and show how natural language understanding can be used for effective one-shot task learning.
Hyuckchul Jung, James F. Allen, Nathanael Chambers