This paper investigates how an agent’s intentions can be used to guide the revision of its beliefs. To this end, we develop a collection of belief revision operators that use the effect of a revision on the agent’s intentions as the selection criterion. These operators are then assessed for rationality against the traditional AGM postulates. A large body of work on classical belief revision is concerned with mitigating the uncertainty inherent in the environments where revision is required. Traditional approaches resolve this uncertainty by assessing the explanatory power of candidate beliefs and using it as a heuristic. We argue that for practical reasoning systems, whose primary concern is the maintenance of behavior rather than information, an agent’s intentions provide a better guide.
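
For reference, the rationality yardstick mentioned above is the standard set of basic AGM revision postulates. A common statement of them, where $K$ is a deductively closed belief set, $\varphi$ and $\psi$ are formulas, $K + \varphi = \mathrm{Cn}(K \cup \{\varphi\})$ denotes expansion, and $K_\bot$ is the inconsistent belief set, is sketched below:

\begin{align*}
&(K{*}1)\ \ K * \varphi = \mathrm{Cn}(K * \varphi) && \text{(closure)}\\
&(K{*}2)\ \ \varphi \in K * \varphi && \text{(success)}\\
&(K{*}3)\ \ K * \varphi \subseteq K + \varphi && \text{(inclusion)}\\
&(K{*}4)\ \ \text{if } \neg\varphi \notin K, \text{ then } K + \varphi \subseteq K * \varphi && \text{(vacuity)}\\
&(K{*}5)\ \ K * \varphi = K_\bot \text{ only if } \vdash \neg\varphi && \text{(consistency)}\\
&(K{*}6)\ \ \text{if } \vdash \varphi \leftrightarrow \psi, \text{ then } K * \varphi = K * \psi && \text{(extensionality)}
\end{align*}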