ICCAD 2006 (IEEE)

Thermal-induced leakage power optimization by redundant resource allocation

Traditionally, at early design stages, leakage power is associated with the number of transistors in a design. Hence, intuitively, an implementation with minimum resource usage would be best for low leakage. Such an allocation would generally be followed by switching-optimal resource binding to achieve a low-power design. This treatment of leakage power ignores operating conditions such as temperature. In this paper, we propose a technique to reduce the total leakage power of a design by identifying the optimal number of resources during allocation and binding. We demonstrate that, contrary to the general tendency to minimize the number of resources, the best solution can actually be achieved if a certain degree of redundancy is allowed. This is because leakage is strongly dependent on the on-chip temperature profile. Distributing activity over a larger number of resources can reduce power density, remove potential hotspots, and consequently minimize thermal-induced ...
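
The trade-off described in the abstract can be illustrated with a toy model; this is only a sketch, not the paper's algorithm. It assumes per-resource leakage grows exponentially with temperature and that temperature rises linearly with per-resource power density, so spreading a fixed amount of switching activity over more units cools each unit at the cost of having more leaking units. All constants, names, and the leakage model below are illustrative assumptions.

    import math

    T_AMBIENT = 45.0            # baseline die temperature in C (assumed)
    LEAK_AT_AMBIENT = 1.0       # normalized leakage of one resource at T_AMBIENT (assumed)
    LEAK_TEMP_FACTOR = 0.05     # leakage roughly doubles every ~14 C: exp(0.05 * dT) (assumed)
    TOTAL_DYNAMIC_POWER = 80.0  # total switching activity to distribute (normalized, assumed)
    THERMAL_RESISTANCE = 1.0    # temperature rise per unit of per-resource power (assumed)

    def total_leakage(num_resources: int) -> float:
        """Total leakage when activity is spread evenly over num_resources units."""
        power_per_resource = TOTAL_DYNAMIC_POWER / num_resources
        temperature = T_AMBIENT + THERMAL_RESISTANCE * power_per_resource
        leak_per_resource = LEAK_AT_AMBIENT * math.exp(
            LEAK_TEMP_FACTOR * (temperature - T_AMBIENT))
        return num_resources * leak_per_resource

    if __name__ == "__main__":
        leakage_by_count = {n: total_leakage(n) for n in range(1, 13)}
        for n, leak in leakage_by_count.items():
            print(f"{n:2d} resources -> total leakage {leak:7.2f}")
        best = min(leakage_by_count, key=leakage_by_count.get)
        print(f"Lowest total leakage with {best} resources, not with the minimum count of 1")

Under these assumed constants the minimum falls at four resources rather than one, mirroring the abstract's claim that a certain degree of redundancy can reduce total thermal-induced leakage.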
Min Ni, Seda Ogrenci Memik
Type: Conference
Year: 2006
Where: ICCAD
Authors: Min Ni, Seda Ogrenci Memik