In this paper we present a scene analysis technique with subpixel filtering based on dense coded light fields. Our technique computes alignment and optically projects analysis filters onto local surfaces within the extent of a camera pixel. The resolution gain depends on the local light field density, not on the point spread function of the camera optics. An initial structured light sequence establishes each camera pixel's footprint in the projector-generated light field. A sequence of basis functions embedded in the light field, restricted to each camera pixel's support, then modulates the local surface texture and is integrated by the camera sensor to produce a localized response at the subpixel scale. We address optical modeling and aliasing issues that arise because the dense light field is undersampled by the camera pixels. Results are provided for objects with planar and non-planar geometry.
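The core idea above can be illustrated with a minimal one-dimensional sketch. This is an assumed toy model, not the paper's implementation: a single camera pixel integrates several subpixel texture samples, a sequence of projected basis patterns modulates those samples, and solving the resulting linear system localizes texture below the pixel scale. All names and the Hadamard-style pattern choice here are illustrative assumptions.

```python
import numpy as np

# Hypothetical 1D sketch (assumed setup, not the paper's pipeline):
# one camera pixel integrates S subpixel texture samples under a
# sequence of projected patterns.
S = 8
rng = np.random.default_rng(0)
texture = rng.uniform(0.2, 1.0, S)          # unknown subpixel reflectance

# Build non-negative Hadamard-derived patterns (a projector cannot
# emit negative light), one row per projected basis function.
H = np.array([[1.0]])
while H.shape[0] < S:
    H = np.kron(H, np.array([[1.0, 1.0], [1.0, -1.0]]))
patterns = (H + 1.0) / 2.0                  # entries in {0, 1}

# Each camera reading is the sensor's integral of pattern * texture
# over the pixel footprint; footprint alignment is assumed known
# from an initial structured light step.
readings = patterns @ texture

# Solving the linear system recovers texture at the subpixel scale.
recovered = np.linalg.solve(patterns, readings)
print(np.allclose(recovered, texture))      # True
```

In practice the footprint is not perfectly aligned and the light field is undersampled, which is why the alignment and aliasing treatment described above is needed; the sketch only shows why invertible pixel-supported basis patterns yield subpixel information.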