In the past decade, there has been much literature describing various cache organizations that exploit general programming idiosyncrasies to obtain maximum hit rate (the probability that a requested datum is now resident in the cache). Little, if any, has been presented to exploit: (1) the inherent dual-input nature of the cache and (2) the many-datum reference type central processor instructions. No matter how high the cache hit rate is, a cache miss may impose a penalty on subsequent cache references. This penalty is the necessity of waiting until the missed requested datum is received from central memory and, possibly, of waiting for the cache update. For the two cases above, the cache references following a miss do not require the information of the datum not resident in the cache, and are therefore penalized in this fashion needlessly. In this paper, a cache organization is presented that essentially eliminates this penalty. This cache organizational feature has been incorporated in a cache/memory interface...
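
The organization itself is not detailed in this abstract. As a rough, illustrative sketch only (in Python, with hypothetical latencies, reference trace, and function names), the following contrasts a blocking cache, in which every reference issued after a miss stalls until central memory responds, with a simple non-blocking (lockup-free) organization that records the outstanding miss and lets subsequent references that hit proceed without penalty.

    # Illustrative sketch, not the paper's design. All constants and names are assumed.
    MEM_LATENCY = 10  # assumed cycles to fetch a missed line from central memory

    def blocking_cache_cycles(refs, resident):
        """Every reference issued after a miss waits until the missed line returns."""
        resident = set(resident)
        cycles = 0
        for addr in refs:
            cycles += 1                   # issue the reference
            if addr not in resident:
                cycles += MEM_LATENCY     # stall: nothing else may proceed
                resident.add(addr)
        return cycles

    def lockup_free_cache_cycles(refs, resident):
        """A miss is noted as pending; later references that hit proceed unpenalized."""
        resident = set(resident)
        cycles = 0
        pending = {}                      # addr -> cycle at which its fill completes
        for addr in refs:
            cycles += 1                   # issue the reference
            for a, done in list(pending.items()):
                if done <= cycles:        # retire fills that have completed by now
                    resident.add(a)
                    del pending[a]
            if addr in resident:
                continue                  # hit: no penalty for this reference
            if addr in pending:
                # this reference actually needs the in-flight line: wait for it
                cycles = max(cycles, pending.pop(addr))
                resident.add(addr)
            else:
                # record the miss and keep going; subsequent hits are not stalled
                pending[addr] = cycles + MEM_LATENCY
        return cycles

    if __name__ == "__main__":
        trace = ["A", "B", "C", "B", "A", "D", "C"]   # hypothetical reference trace
        warm = {"B", "C"}                              # lines already resident
        print("blocking   :", blocking_cache_cycles(trace, warm), "cycles")
        print("lockup-free:", lockup_free_cache_cycles(trace, warm), "cycles")

In this toy model the hits that follow each miss account for the difference in cycle counts: the blocking variant charges them the full memory latency, while the non-blocking variant charges only the reference that ultimately consumes the missed datum.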