In this paper the term implicit human-computer interaction is defined. It is discussed how the availability of processing power and advanced sensing technology can enable a shift in HCI from explicit interaction, such as direct-manipulation GUIs, towards a more implicit interaction based on situational context. An algorithm, based on a number of questions, is given for identifying applications that can facilitate implicit interaction. An XML-based language to describe implicit HCI is proposed. The language uses contextual variables that can be grouped using different types of semantics, as well as actions that are called by triggers. The notion of perception is discussed, and four basic approaches that are useful when building context-aware applications are identified. Using two examples, a wearable context-awareness component and a sensor board, it is shown how sensor-based perception can be implemented. It is also discussed how situational context can be exploited to i...
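
As a rough illustration of the kind of markup the proposed language enables, the sketch below shows how contextual variables, a grouping semantic, and a trigger-bound action might be expressed. The element and attribute names here are hypothetical stand-ins chosen for readability, not the paper's actual tag set.

    <context name="in_meeting">
      <!-- contextual variables, grouped under a chosen semantic (here: all must hold) -->
      <group semantics="and">
        <variable name="location" value="meeting_room"/>
        <variable name="motion" value="stationary"/>
      </group>
      <!-- action invoked by a trigger when the context becomes active -->
      <trigger on="enter">
        <action name="set_phone_profile" mode="silent"/>
      </trigger>
    </context>

In this reading, the grouping semantic determines how the variables combine into a situational context, and the trigger binds entry into that context to an implicit action, so no explicit user input is required.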