Several mathematical distances between probabilistic languages have been investigated in the literature, motivated by applications in language modeling, computational biology, syntactic pattern matching and machine learning. In most cases, only pairs of probabilistic regular languages were considered. In this paper we extend previous results to pairs of languages generated by a probabilistic context-free grammar and a probabilistic finite automaton.

Key words: Probabilistic Context-Free Languages, Probabilistic Finite Automata, Probabilistic Language Distances, Language Entropy, Kullback-Leibler Divergence