Existing cognitive agent programming languages based on the BDI model employ logical representation and reasoning to implement the beliefs of agents. In these programming languages, beliefs are assumed to be certain, i.e., an implemented agent either believes a proposition or does not. These languages therefore fail to capture the uncertainty underlying an agent's beliefs, which is essential for many real-world agent applications. We introduce Dempster-Shafer theory as a convenient method for modelling uncertainty in an agent's beliefs. We show that the computational complexity of Dempster's Rule of Combination can be controlled. In particular, the certainty value of a proposition can be deduced in linear time from the beliefs of agents, without having to compute the combination of Dempster-Shafer mass functions.
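For reference, the standard Dempster's Rule of Combination (given here as background; the notation $m_1$, $m_2$, $K$ is not taken from this work) combines two mass functions over the same frame of discernment as
\[
(m_1 \oplus m_2)(A) \;=\; \frac{1}{1 - K}\sum_{B \cap C = A} m_1(B)\, m_2(C),
\qquad
K = \sum_{B \cap C = \emptyset} m_1(B)\, m_2(C),
\]
for all non-empty sets $A$, with $(m_1 \oplus m_2)(\emptyset) = 0$. Evaluating this combination directly is, in general, exponential in the size of the frame of discernment, which is the cost the linear-time deduction mentioned above avoids.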