In this paper, we propose a novel supernode caching scheme to reduce IP lookup latency and energy consumption in network processors. Instead of using an expensive TCAM-based scheme, we implement a set-associative SRAM-based cache. We organize the IP routing table as a supernode tree (a tree bitmap structure) [5] and place a small supernode cache between the processor and the low-level memory that holds the routing tree. The supernode cache stores recently visited supernodes along the longest matched prefixes in the IP routing tree. A supernode hit in the cache reduces the number of accesses to the low-level memory, leading to faster IP lookup. According to our simulations, a 128KB supernode cache avoids up to 72% of memory accesses for the three selected trace files, and the average supernode cache miss ratio is as low as 4%. Compared to a TCAM of the same size, energy consumption is reduced by 77%.

Keywords-Network processor; Tree bitmap; IP lookup
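
To make the cached lookup path concrete, the listing below sketches a set-associative supernode cache probe in C. It is a minimal illustration under assumptions introduced for the example, not the paper's implementation: the cache geometry (256 sets x 4 ways x 128-byte lines, which happens to total 128KB), the supernode line size, the placeholder replacement policy, and the read_supernode_from_memory helper are all hypothetical.

    #include <stdint.h>
    #include <stdbool.h>
    #include <string.h>

    #define NUM_SETS  256   /* assumed geometry: 256 sets */
    #define WAYS      4     /* ... x 4 ways              */
    #define LINE_SIZE 128   /* ... x 128-byte supernodes = 128KB */

    typedef struct {
        bool     valid;
        uint32_t tag;              /* identifies the cached supernode        */
        uint8_t  data[LINE_SIZE];  /* supernode (tree bitmap node) contents  */
    } CacheLine;

    static CacheLine cache[NUM_SETS][WAYS];

    /* Stub for the slow access to the low-level routing-table memory
     * (placeholder for illustration only). */
    static void read_supernode_from_memory(uint32_t addr, uint8_t *buf)
    {
        (void)addr;
        memset(buf, 0, LINE_SIZE);
    }

    /* Probe the supernode cache for the node at node_addr.
     * On a hit, the node is served from SRAM and the slow memory access
     * is skipped; on a miss, the node is fetched from memory and
     * installed (simplistic policy: overwrite way 0). */
    bool supernode_lookup(uint32_t node_addr, uint8_t *out)
    {
        uint32_t set = (node_addr >> 7) & (NUM_SETS - 1); /* index bits after 128B offset */
        uint32_t tag = node_addr >> 15;                   /* remaining high bits          */

        for (int w = 0; w < WAYS; w++) {
            if (cache[set][w].valid && cache[set][w].tag == tag) {
                memcpy(out, cache[set][w].data, LINE_SIZE);
                return true;   /* hit: no low-level memory access needed */
            }
        }

        /* Miss: fetch from memory, then install so that later lookups
         * along the same prefix path hit in the cache. */
        read_supernode_from_memory(node_addr, out);
        cache[set][0].valid = true;
        cache[set][0].tag   = tag;
        memcpy(cache[set][0].data, out, LINE_SIZE);
        return false;
    }

During a tree-bitmap walk, each level of the supernode tree would call such a probe before touching the low-level memory, which is how repeated lookups along popular prefixes avoid most memory accesses.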