While some leakage power reduction techniques require modifications to the process technology and achieve their savings at the fabrication stage, others rely on circuit-level optimizations and are applied at run-time. We focus our study on the latter kind and compare three techniques: Input Vector Control, Body Bias Control, and Power Supply Gating. We determine their limits and benefits in terms of potential leakage reduction, performance penalty, and area and power overhead. We emphasize the importance of the `minimum idle time' parameter as an additional evaluation tool, as well as the feasibility of applying Power Supply Gating at low levels of granularity. The obtained data support the formulation of a comprehensive leakage reduction scheme in which each technique is targeted at specific types of functional units and a given level of granularity, depending on the incurred overhead cost and the obtainable savings.
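As a rough illustration of the `minimum idle time' criterion, a break-even formulation of the kind commonly used for run-time leakage control is sketched below; the symbols (transition energies $E_{\mathrm{enter}}$, $E_{\mathrm{exit}}$ and leakage powers $P_{\mathrm{leak}}$, $P_{\mathrm{reduced}}$) are illustrative names and not taken from this paper:

\[
T_{\mathrm{idle,min}} \;\approx\; \frac{E_{\mathrm{enter}} + E_{\mathrm{exit}}}{P_{\mathrm{leak}} - P_{\mathrm{reduced}}}
\]

That is, a unit must remain idle at least long enough for the leakage energy saved in the low-leakage state to outweigh the energy spent entering and leaving that state; otherwise applying the technique costs more power than it saves.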