We study conditional computational entropy: the amount of randomness a distribution appears to have to a computationally bounded observer who is given some correlated information. By considering conditional versions of HILL entropy (based on indistinguishability from truly random distributions) and Yao entropy (based on incompressibility), we obtain:

• a separation between conditional HILL and Yao entropies (which can be viewed as a separation between the traditional HILL and Yao entropies in the shared random string model, improving on Wee’s 2004 separation in the random oracle model);

• the first demonstration of a distribution from which extraction techniques based on Yao entropy produce more pseudorandom bits than appears possible with the traditional HILL-entropy-based techniques;

• a new, natural notion of unpredictability entropy, which implies conditional Yao entropy and thus allows known extraction and hardcore-bit results to be stated and used more generally.