Trust is a crucial basis for interactions among parties in large, open systems. Yet, the scale and dynamism of such systems make it infeasible for each party to have a direct basis for trusting every other party. For this reason, the participants in an open system must share information about trust. However, they should not automatically trust such shared information. This paper studies the problem of propagating trust in multiagent systems. It describes a new algebraic approach, establishes some of its theoretical properties, and empirically evaluates it on two social network datasets. This evaluation incorporates a new methodology for treating opinions in an evidential setting.
Chung-Wei Hang, Yonghong Wang, Munindar P. Singh