We consider the problem of decision-making with side information and unbounded loss functions. Inspired by the probably approximately correct (PAC) learning model, we use a slightly different model that incorporates side information in a more general form, making it applicable to a broader class of applications, including parameter estimation and system identification. We provide sufficient conditions for consistent decision-making with exponential convergence behavior. Specifically, besides a certain condition on the growth function of the class of loss functions, it suffices that the class of loss functions be dominated by a measurable function whose exponential Orlicz expectation is uniformly bounded over the probabilistic model. The decay exponent, the decay constant, and the sample complexity are discussed. Example applications to the method of moments, maximum likelihood estimation, and system identification are also illustrated.
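As a purely illustrative sketch of the kind of moment condition referred to above (the paper's exact formulation may differ; the dominating function $F$, the model class $\mathcal{P}$, the loss class $\mathcal{L}$, and the scale parameter $\sigma$ are notational assumptions introduced here), a uniform exponential Orlicz bound typically reads
\[
\exists\, \sigma > 0 :\qquad \sup_{P \in \mathcal{P}} \, \mathbb{E}_P\!\left[\exp\!\left(\frac{|F(Z)|}{\sigma}\right)\right] < \infty,
\]
where $F$ dominates the loss class, i.e., $|\ell(z)| \le F(z)$ for every $\ell \in \mathcal{L}$ and every $z$.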