In this paper, we study the interdependence between leakage energy and chip temperature in real-time systems. We observe that on-chip temperature variation has a large impact on the system's leakage energy. By incorporating temperature information, we propose an online temperature-aware leakage-minimization algorithm for real-time systems. The basic idea is to run tasks when the system is cool and the workload is heavy, and to put the system to sleep when it is hot and the workload is light. This online algorithm has low run-time complexity and improves leakage energy savings over traditional dynamic voltage scaling (DVS) approaches by 34% on average, on both real-life and synthetic benchmarks. Finally, our algorithm can be combined with existing DVS methods to further improve total energy efficiency.
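The cool/hot, heavy/light decision rule stated above can be sketched as a simple predicate; note that the function name, threshold values, and parameters below are hypothetical placeholders for illustration, not values or interfaces from the paper:

```python
def schedule_decision(temp_c, utilization, hot_threshold_c=70.0, light_load=0.3):
    """Illustrative run/sleep heuristic (hypothetical thresholds).

    temp_c      -- current chip temperature in degrees Celsius
    utilization -- fraction of processor demand in the current window (0..1)

    Sleep when the chip is hot AND the workload is light, so the chip
    cools down and leakage (which grows with temperature) is reduced;
    otherwise run pending tasks while the chip is still cool or the
    workload is too heavy to defer.
    """
    if temp_c >= hot_threshold_c and utilization <= light_load:
        return "SLEEP"  # hot chip, light workload: idle to cut leakage
    return "RUN"        # cool chip or heavy workload: execute tasks now
```

A scheduler invoking this rule at each decision point would combine it with deadline checks so that deferring work never causes a real-time task to miss its deadline.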