Energy dissipation in cache memories is becoming a major design issue in embedded microprocessors. A predictive filter-cache-based instruction cache hierarchy can reduce access energy substantially, at the cost of some performance degradation. In such a hierarchy, the reduction in energy-delay product depends heavily on the accuracy of the predictor. In this paper, a simplified pattern prediction algorithm is proposed that maximizes power savings in the novel filter cache prediction hierarchy. The prediction mechanism exploits the largely static hit/miss pattern of the instruction access stream over past filter cache line accesses. These static patterns are stored in a 32-entry, single-bit Pattern Table (PT). The entries can be changed dynamically at run time, allowing real-time adaptation in complex applications. The proposed prediction algorithm achieves better prediction accuracy for all the benchmarks simulated. Energy-delay product reduction...
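To make the mechanism concrete, the sketch below shows one plausible form of a single-bit pattern predictor in C. It is only an illustration of the idea described above, not the paper's implementation: the indexing of the Pattern Table by low-order cache-line address bits, the 32-byte line size, and all identifiers (`pt_index`, `predict_filter_hit`, `update_pattern`) are assumptions; only the 32-entry, single-bit table comes from the text.

```c
/*
 * Minimal sketch of a single-bit filter cache pattern predictor.
 * ASSUMPTIONS: the PT is indexed by low-order line-address bits and
 * filter cache lines are 32 bytes; the abstract does not specify either.
 */
#include <stdbool.h>
#include <stdint.h>

#define PT_SIZE    32            /* 32-entry Pattern Table (PT)             */
#define LINE_SHIFT 5             /* assumed 32-byte filter cache lines      */

static bool pattern_table[PT_SIZE];  /* 1 bit per entry: predict hit/miss   */

/* Map an instruction address to a PT entry (assumed indexing scheme). */
static unsigned pt_index(uint32_t pc)
{
    return (pc >> LINE_SHIFT) & (PT_SIZE - 1);
}

/* Predict whether the next fetch will hit in the filter cache.
 * A predicted miss lets the fetch bypass the filter cache and go
 * directly to the L1 instruction cache, avoiding the miss penalty. */
bool predict_filter_hit(uint32_t pc)
{
    return pattern_table[pt_index(pc)];
}

/* After the access resolves, record the actual outcome so the stored
 * entry tracks the (largely static) hit/miss pattern of the stream,
 * and can adapt at run time when the pattern changes. */
void update_pattern(uint32_t pc, bool did_hit)
{
    pattern_table[pt_index(pc)] = did_hit;
}
```

In this reading, a correct hit prediction saves an L1 access, a correct miss prediction saves the filter cache lookup, and the single bit per entry keeps the predictor's own energy and area overhead small.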