Faster-than-at-speed testing has been proposed to detect small delay defects. While these techniques increase the test frequency to reduce the positive slack of the targeted paths, they exacerbate the already well-known problem of IR-drop during test. As a result, good chips may be falsely identified as faulty because of IR-drop rather than small delay defects. We present a case study of IR-drop effects caused by faster-than-at-speed test. We propose a novel framework for pattern generation and application, usable with any commercial non-timing-aware ATPG tool, to screen small delay defects, together with a technique for determining the optimal test frequency that accounts for both the positive slack and the performance degradation caused by IR-drop.
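
The core tradeoff stated above, that raising the test frequency shrinks the positive slack that masks small delay defects but must still absorb IR-drop-induced slowdown, can be illustrated with a back-of-the-envelope calculation. The sketch below is not the paper's technique; all function names, parameters, and numbers are assumptions chosen only to make the tradeoff concrete.

```python
# Illustrative sketch (hypothetical values, not from the paper): choosing a
# faster-than-at-speed test frequency so that a fault-free path still meets
# timing once IR-drop-induced delay and a guard band are added.

def max_test_frequency(path_delay_ns: float,
                       ir_drop_degradation_ns: float,
                       guard_band_ns: float = 0.05) -> float:
    """Largest test frequency (GHz) at which a fault-free path still passes,
    given its nominal delay plus IR-drop-induced slowdown and a guard band."""
    effective_delay = path_delay_ns + ir_drop_degradation_ns + guard_band_ns
    return 1.0 / effective_delay  # GHz, since all delays are in ns


def smallest_detectable_defect(path_delay_ns: float, test_period_ns: float) -> float:
    """Smallest extra delay (ns) that causes a capture failure at the chosen
    test period, i.e. the positive slack remaining on the path."""
    return max(test_period_ns - path_delay_ns, 0.0)


if __name__ == "__main__":
    # Hypothetical path: 8 ns nominal delay, 0.4 ns extra delay under test IR-drop.
    f_test = max_test_frequency(path_delay_ns=8.0, ir_drop_degradation_ns=0.4)
    t_test = 1.0 / f_test
    print(f"test frequency ~ {f_test * 1000:.0f} MHz (period {t_test:.2f} ns)")
    print(f"smallest detectable delay defect ~ "
          f"{smallest_detectable_defect(8.0, t_test):.2f} ns")
```

In this toy view, pushing the frequency higher reduces the smallest detectable defect size, but ignoring the IR-drop term would overclock fault-free paths and flag good chips as faulty, which is the failure mode the abstract describes.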