We study general state-space Markov chains that depend on a parameter, say θ. Sufficient conditions are established for the stationary performance of such a Markov chain to be differentiable with respect to θ. Specifically, we study the case of unbounded performance functions and thereby extend the result on weak differentiability of stationary distributions of Markov chains to unbounded mappings. The two main ingredients of our approach are (a) that we work within the framework of measure-valued differentiation (MVD) in order to study derivatives of unbounded performance functions, and (b) that we elaborate on normed ergodicity of Markov chains in order to establish the existence of the overall derivative expression. Our approach is not restricted to a particular estimation method; in fact, MVD expressions can be translated into various unbiased estimators. We illustrate our results with examples from queuing theory.
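The claim that MVD expressions translate into unbiased estimators can be made concrete with a small sketch. The following Python snippet is our own illustration, not taken from the paper: it uses the standard MVD triple of the exponential distribution with rate θ, namely (1/θ, Exp(θ), Erlang(2, θ)), so that d/dθ E[g(X)] = (1/θ)(E[g(X⁺)] − E[g(X⁻)]) with X⁺ ~ Exp(θ) and X⁻ ~ Erlang(2, θ); the function and sample sizes are illustrative choices.

```python
# Minimal sketch (illustration only, not the paper's method or examples):
# for X ~ Exp(theta) with density theta*exp(-theta*x), the MVD triple is
# (1/theta, Exp(theta), Erlang(2, theta)), giving the unbiased estimator
#   d/dtheta E[g(X)] ~= (1/theta) * ( mean g(X_plus) - mean g(X_minus) ).

import numpy as np

rng = np.random.default_rng(0)

def mvd_derivative_estimate(g, theta, n_samples=100_000):
    """Unbiased MVD estimator of d/dtheta E[g(X)] for X ~ Exp(theta)."""
    x_plus = rng.exponential(scale=1.0 / theta, size=n_samples)        # X+ ~ Exp(theta)
    x_minus = rng.gamma(shape=2.0, scale=1.0 / theta, size=n_samples)  # X- ~ Erlang(2, theta)
    return (g(x_plus).mean() - g(x_minus).mean()) / theta

if __name__ == "__main__":
    theta = 2.0
    # For g(x) = x we have E[X] = 1/theta, so the true derivative is -1/theta**2.
    est = mvd_derivative_estimate(lambda x: x, theta)
    print(f"MVD estimate: {est:.4f}, analytic value: {-1.0 / theta**2:.4f}")
```

The same two-measure structure is what the paper exploits for stationary Markov chains; the point of the sketch is only that the derivative is estimated by simulating from the two probability measures of the MVD triple, without differentiating sample paths.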