As the use of virtualization and partitioning grows, it becomes possible to deploy a multi-tier web-based application with a variable amount of computing power. This introduces the possibility of provisioning only for a minimum workload, with the intention of renting more resources as necessary, but it also creates the problem of quickly and accurately identifying when more resources are needed or when unneeded resources are being paid for. This paper presents a machine-learning-based approach to this problem. An autonomous adaptive agent learns to predict the gain (or loss) that would result from more (or fewer) resources; this agent uses only low-level system statistics, rather than relying on custom instrumentation of the operating system or middleware. Our agent is fully implemented and evaluated on a publicly available multi-machine, multi-process distributed system (the online transaction processing benchmark TPC-W). We show that our adaptive agent is competitive with any sta...
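
To make the approach concrete, the following is a minimal sketch of the kind of learner the abstract describes: a model trained on low-level system statistics to predict the performance gain (or loss) from a resource change. It is not the paper's actual agent; the feature names, training data, cost threshold, and choice of a least-squares linear model are all illustrative assumptions.

```python
# A minimal sketch (not the paper's learner): a linear model mapping
# low-level OS statistics to the predicted throughput change from adding
# one unit of a resource. All names and values are hypothetical.
import numpy as np

# Each row: [cpu_utilization, run_queue_length, page_faults_per_sec, free_mem_mb]
# sampled from standard OS counters, with no custom instrumentation.
stats = np.array([
    [0.95, 8.0, 120.0,  256.0],
    [0.40, 1.0,  10.0, 2048.0],
    [0.85, 5.0,  90.0,  512.0],
    [0.30, 0.5,   5.0, 3072.0],
])

# Observed throughput change (requests/sec) after granting one extra CPU
# in each state above: hypothetical training labels.
gain = np.array([42.0, 1.5, 30.0, 0.5])

# Fit a least-squares linear predictor with a bias term.
X = np.hstack([stats, np.ones((stats.shape[0], 1))])
weights, *_ = np.linalg.lstsq(X, gain, rcond=None)

def predict_gain(current_stats: np.ndarray) -> float:
    """Predicted throughput gain from one more CPU in this state."""
    return float(np.append(current_stats, 1.0) @ weights)

# A simple provisioning rule: rent the resource only if the predicted
# gain outweighs its cost (the threshold is an assumed cost model).
COST_THRESHOLD = 10.0
state = np.array([0.90, 6.0, 100.0, 384.0])
if predict_gain(state) > COST_THRESHOLD:
    print("request more resources")
else:
    print("keep current allocation")
```

The symmetric case, deciding when to release unneeded resources, would follow the same pattern with a predictor for the loss incurred by removing a unit of the resource.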