Smart homes follow a user-centered design in which human activity is the most important type of context for adapting the environment to people's needs. Sensor systems comprising a variety of ambient, vision-based, and wearable sensors collect and transmit data to reasoning algorithms that recognize human activities at different levels of abstraction. Although various types of action primitives are extracted from sensor data and used with state-of-the-art classification algorithms, there is little understanding of how these action primitives affect high-level activity recognition. In this paper, we use action primitives extracted from data collected by sensors worn on the human body and embedded in different objects and environments to identify how various types of action primitives influence the performance of high-level activity recognition systems. Our experiments showed that wearable sensors in combination with object sensors clearly play a crucial role in re...