Reducing leakage power and preserving the reliability of data stored in memory cells are both becoming more challenging as technology scales down. While smaller threshold voltages cause increased leakage, smaller supply voltages and node capacitances make the cells more susceptible to soft errors. This work compares the soft error rates of several recently proposed SRAM leakage optimization approaches. Our results, using designs in a 70nm technology, show that many of these approaches may increase the soft error rate compared to a standard 6T SRAM. Further, we demonstrate that there is a tradeoff between optimizing the leakage power and improving the immunity to soft errors.
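To make the tradeoff concrete, a widely used empirical model due to Hazucha and Svensson (introduced here only for illustration; it is not part of this abstract) relates the soft error rate to the critical charge of a storage node, which scales roughly as the node capacitance times the supply voltage:

% Hazucha-Svensson empirical SER model (illustrative; symbols below are
% assumptions, not taken from this work). SER falls exponentially with
% the critical charge Q_crit, so leakage techniques that lower V_dd or
% shrink C_node raise the SER exponentially.
\begin{equation}
  \mathrm{SER} \;=\; F \cdot A \cdot K \cdot
    \exp\!\left(-\frac{Q_{\mathrm{crit}}}{Q_{s}}\right),
  \qquad
  Q_{\mathrm{crit}} \approx C_{\mathrm{node}} \cdot V_{dd}
\end{equation}

Here $F$ is the particle flux, $A$ the sensitive area of the node, $K$ a technology-dependent fitting constant, and $Q_{s}$ the charge-collection efficiency. Under this model, leakage optimizations that reduce $V_{dd}$ or node capacitance lower $Q_{\mathrm{crit}}$ and thereby increase the soft error rate, which is consistent with the tradeoff reported above.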