In the area of privacy-preserving data mining, a differentially private mechanism intuitively encourages people to share their data truthfully because they are at little risk of revealing their own information. However, we argue that this interpretation is incomplete: external incentives are necessary for people to participate in databases, so data release mechanisms should be not only differentially private but also compatible with those incentives; otherwise the data collected may be false. We apply the notion of truthfulness from game theory. In certain settings, existing differentially private mechanisms turn out not to encourage participants to report their information truthfully. On the positive side, we exhibit a transformation that converts truthful mechanisms into differentially private mechanisms that remain truthful. Our transformation applies to games where the type space is small and the goal is to optimize an insensitive quantity ...
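For reference, the two standard properties the abstract combines can be stated as follows, in generic notation that is not taken from the paper itself. A mechanism $M$ is $\varepsilon$-differentially private if, for all databases $x, x'$ differing in a single participant's entry and all sets $S$ of outputs,

    \Pr[M(x) \in S] \;\le\; e^{\varepsilon} \cdot \Pr[M(x') \in S],

and a mechanism is truthful if, for every player $i$ with true type $t_i$, every possible misreport $t_i'$, and all reports $t_{-i}$ of the other players,

    u_i\bigl(t_i,\, M(t_i, t_{-i})\bigr) \;\ge\; u_i\bigl(t_i,\, M(t_i', t_{-i})\bigr),

where $u_i$ denotes player $i$'s utility. The tension the abstract raises is that the noise needed for the first condition can break the second, since a player's misreport may now improve their expected utility without being detected by the mechanism's output.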