People accessing documents via the Internet typically experience latencies in retrieving content. We discuss continual-computation policies that dictate strategies for prefetching into a cache portions of documents that a user may wish to review later. These utility-directed prefetching strategies maximize the expected utility of idle network resources that are available frequently, yet sporadically, during the review of documents accessed from the World Wide Web. We present policies based on alternate utility models for assigning value to having immediate access to content, and we discuss means for coupling the methods with probabilistic models that predict a user's interests and access behavior.

Keywords: prefetching, caching, network bandwidth, cost-benefit analysis, decision theory, continual computation.
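As a rough illustration of the idea described above, and not the paper's actual policies, the sketch below ranks prefetch candidates by expected utility gained per unit of idle transfer time and greedily fills an anticipated idle-bandwidth budget. All names here (Candidate, expected_value_rate, plan_prefetch) and the specific utility form (probability times value, divided by transfer cost) are illustrative assumptions.

```python
# Illustrative sketch only: greedy utility-directed prefetching under an
# assumed idle-bandwidth budget. Not the exact policy from the paper.
from dataclasses import dataclass


@dataclass
class Candidate:
    url: str                 # document portion the user may request later
    access_prob: float       # predicted probability the user will request it
    value_if_cached: float   # utility of having it immediately available
    transfer_cost: float     # idle transfer time (e.g., seconds) needed to fetch it


def expected_value_rate(c: Candidate) -> float:
    """Expected utility gained per unit of idle transfer time spent on c."""
    return (c.access_prob * c.value_if_cached) / c.transfer_cost


def plan_prefetch(candidates: list[Candidate], idle_budget: float) -> list[Candidate]:
    """Schedule prefetches in decreasing order of expected value rate,
    stopping when the anticipated idle-time budget is exhausted."""
    plan = []
    for c in sorted(candidates, key=expected_value_rate, reverse=True):
        if c.transfer_cost <= idle_budget:
            plan.append(c)
            idle_budget -= c.transfer_cost
    return plan


if __name__ == "__main__":
    # Hypothetical candidates with predicted access probabilities.
    candidates = [
        Candidate("page2.html", access_prob=0.7, value_if_cached=1.0, transfer_cost=2.0),
        Candidate("appendix.pdf", access_prob=0.2, value_if_cached=1.0, transfer_cost=5.0),
        Candidate("figure3.png", access_prob=0.5, value_if_cached=0.8, transfer_cost=1.0),
    ]
    for c in plan_prefetch(candidates, idle_budget=6.0):
        print("prefetch:", c.url)
```

In practice, the access probabilities in such a scheme would come from the kind of probabilistic models of user interests and access behavior that the paper couples with its prefetching policies; the greedy ordering here is just one simple way to spend sporadic idle bandwidth.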