A popular way to measure the degree of dependence between two random objects is their mutual information, defined as the divergence between the joint distribution and the product of the marginal distributions. We investigate an alternative measure of dependence: the lautum information, defined as the divergence between the product-of-marginal and joint distributions, i.e., with the arguments of the divergence swapped relative to the definition of mutual information. Several operational characterizations and properties are provided for this alternative measure of information.
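The swap of arguments can be illustrated numerically for finite alphabets. The sketch below computes both divergences for a small joint distribution; the specific probability values are illustrative only and do not come from the paper.

```python
import numpy as np

def kl(p, q):
    # KL divergence D(p || q) in nats; assumes q > 0 wherever p > 0
    mask = p > 0
    return float(np.sum(p[mask] * np.log(p[mask] / q[mask])))

# Toy joint distribution P_XY over a 2x2 alphabet (illustrative values)
P = np.array([[0.4, 0.1],
              [0.1, 0.4]])
Px = P.sum(axis=1)    # marginal of X
Py = P.sum(axis=0)    # marginal of Y
Q = np.outer(Px, Py)  # product of marginals P_X * P_Y

# Mutual information: I(X;Y) = D(P_XY || P_X P_Y)
mutual = kl(P.ravel(), Q.ravel())
# Lautum information: L(X;Y) = D(P_X P_Y || P_XY), arguments swapped
lautum = kl(Q.ravel(), P.ravel())
print(mutual, lautum)
```

Both quantities are nonnegative and vanish exactly when X and Y are independent, but since the divergence is asymmetric they generally differ; for this joint distribution the lautum information exceeds the mutual information.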