Most previous logical accounts of goals do not deal with prioritized goals and goal dynamics properly. Many are restricted to achievement goals. In this paper, we develop a logical account of goal change that addresses these deficiencies. In our account, we do not drop lower priority goals permanently when they become inconsistent with other goals and the agent's knowledge; rather, we make such goals inactive. We ensure that the agent's chosen goals/intentions are consistent with each other and with the agent's knowledge. When the world changes, the agent recomputes her chosen goals, and some inactive goals may become active again. This ensures that our agent maximizes her utility. We prove that the proposed account has desirable properties. We also discuss previous work on postulates for goal revision.

Categories and Subject Descriptors: I.2.11 [Artificial Intelligence]: Distributed Artificial Intelligence--Intelligent agents, Multiagent systems; I.2.4 [Artificial Intelligence]...
Shakil M. Khan, Yves Lespérance
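
To make the abstract's core idea concrete, the following is a minimal Python sketch, not the authors' formal logical account: it illustrates how an agent's chosen goals could be recomputed from a priority-ordered goal list, skipping (rather than dropping) any goal that conflicts with the agent's knowledge or higher-priority chosen goals, so that inactive goals can become active again after the world changes. The names `recompute_chosen` and `compatible`, the toy consistency check over literals, and the example goals are all hypothetical illustrations.

```python
from typing import Callable, List, Set


def recompute_chosen(
    prioritized_goals: List[str],
    knowledge: Set[str],
    compatible: Callable[[Set[str], str], bool],
) -> Set[str]:
    """Activate goals greedily in priority order; inconsistent goals are
    skipped (left inactive) rather than deleted from the goal list."""
    chosen: Set[str] = set()
    for goal in prioritized_goals:            # highest priority first
        if compatible(knowledge | chosen, goal):
            chosen.add(goal)                  # goal becomes active/chosen
        # otherwise the goal remains in prioritized_goals as an inactive goal
    return chosen


def compatible(context: Set[str], goal: str) -> bool:
    """Toy consistency check: a literal conflicts with a set iff its
    negation (marked with a leading '~') is already in the set."""
    negation = goal[1:] if goal.startswith("~") else "~" + goal
    return negation not in context


goals = ["submit_paper", "~travel", "travel"]      # in decreasing priority
print(recompute_chosen(goals, set(), compatible))
# -> {'submit_paper', '~travel'}: 'travel' is merely inactive, not dropped

# If the higher-priority '~travel' goal is later dropped (the world changed),
# recomputation reactivates the previously inactive 'travel' goal.
print(recompute_chosen(["submit_paper", "travel"], set(), compatible))
# -> {'submit_paper', 'travel'}
```

This greedy recomputation mirrors the abstract's informal description only at a high level; the paper's actual treatment is a logic of prioritized goals with consistency conditions on chosen goals, not an executable procedure.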