We present a novel technique for calibrating display-camera systems from reflections in the user's eyes. Display-camera systems enable a range of vision applications that require controlled illumination, including 3D object reconstruction, facial modeling, and human-computer interaction. One important issue, however, is the geometric calibration of the display, which typically requires additional hardware and tedious user interaction. The proposed approach eliminates this requirement by analyzing patterns that are reflected in the cornea, a mirroring surface that naturally exists in any display-camera system. As a by-product of this strategy, we also obtain a continuous estimate of eye pose, which facilitates further applications. We experimentally investigate the effect of display size, camera-eye distance, and individual eye anatomy using only off-the-shelf components. Results are promising and show the general feasibility of the approach.
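To illustrate the geometry behind corneal reflection analysis, the following minimal sketch (not the authors' implementation) back-projects an observed reflection pixel, intersects the camera ray with a spherical cornea model, and reflects it off the corneal surface; the reflected display point must lie along the resulting ray. The pinhole intrinsics, corneal center, and corneal radius used here are illustrative assumptions.

```python
import numpy as np

def backproject(pixel, K):
    """Unit viewing ray through an image pixel for pinhole intrinsics K."""
    p = np.linalg.inv(K) @ np.array([pixel[0], pixel[1], 1.0])
    return p / np.linalg.norm(p)

def reflect_off_cornea(ray, center, radius):
    """Intersect a camera ray (from the origin) with the corneal sphere and
    reflect it about the surface normal; returns the reflection point on the
    cornea and the reflected ray direction toward the display point."""
    # Solve |t*ray - center|^2 = radius^2 for the nearer intersection t.
    b = -2.0 * ray @ center
    disc = b * b - 4.0 * (center @ center - radius * radius)
    if disc < 0:
        raise ValueError("ray misses the corneal sphere")
    t = (-b - np.sqrt(disc)) / 2.0
    s = t * ray                                   # reflection point on cornea
    n = (s - center) / np.linalg.norm(s - center) # outward surface normal
    d = ray - 2.0 * (ray @ n) * n                 # law of reflection
    return s, d / np.linalg.norm(d)

if __name__ == "__main__":
    # Assumed camera intrinsics, eye position (metres), and corneal radius
    # (anatomical average ~7.8 mm); values are placeholders for illustration.
    K = np.array([[1200.0, 0.0, 640.0],
                  [0.0, 1200.0, 480.0],
                  [0.0, 0.0, 1.0]])
    cornea_center = np.array([0.01, 0.0, 0.5])
    s, d = reflect_off_cornea(backproject((655, 470), K), cornea_center, 0.0078)
    print("reflection point:", s, "reflected ray:", d)
```

Repeating this back-projection for many detected pattern features, and combining the resulting rays across observations, is one way such reflections can constrain the display geometry; the full calibration additionally requires estimating the eye pose itself, as discussed in the paper.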