The next generation of computers may be literally wearable. Our vision of such a wearable computing device is an intelligent assistant that is always with you and helps you solve your everyday tasks. Besides size and power, an important challenge is how to interact with wearable computers. An important aspect and unique opportunity of a wearable device is that it can perceive the world from a first-person perspective: a wearable camera can see what you see in order to analyze, model, and recognize the objects and people around you. In this paper we argue that a promising direction for interaction is to make computers more aware of the situation the user is in and to model the user's context. Wearable cameras, mounted on the user's glasses, can recognize what the user is looking at, estimate the user's location, and model what the user is doing.