In this paper, we demonstrate that simple interactions with objects in the environment lead to a manifestation of the perceptual properties of the objects. This is achieved by deriving a condensed representation of the effects of actions (called effect prototypes in this paper) and investigating the relevance between the perceptual features extracted from objects and the actions that can be applied to them. With this at hand, we show that the agent can categorize (i.e., partition) the raw sensory perceptual feature vector extracted from the environment, an important step toward the development of concepts and language. Moreover, after learning to predict the effect prototypes of objects, the agent can categorize objects based on the predicted effects of the actions that can be applied to them.
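As a minimal sketch of the idea (not the paper's actual pipeline), one way to derive effect prototypes is to cluster the observed effect vectors of an action and then learn a mapping from an object's initial perceptual features to the prototype it is expected to produce; objects are then categorized by their predicted prototype. The synthetic features (size, roundness), the "push" effects, the k-means clustering, and the nearest-class-mean predictor below are all illustrative assumptions:

```python
import numpy as np

def kmeans(X, k, iters=50):
    """Plain Lloyd's k-means with deterministic farthest-point initialisation."""
    centers = [X[0]]
    for _ in range(k - 1):
        d = np.min(((X[:, None] - np.array(centers)[None]) ** 2).sum(-1), axis=1)
        centers.append(X[np.argmax(d)])
    centers = np.array(centers)
    for _ in range(iters):
        labels = np.argmin(((X[:, None] - centers[None]) ** 2).sum(-1), axis=1)
        for j in range(k):
            if np.any(labels == j):
                centers[j] = X[labels == j].mean(axis=0)
    return centers, labels

# Hypothetical data: each object's feature vector is [size, roundness]; the
# effect of a "push" action is the observed [displacement, rotation].
rng = np.random.default_rng(1)
round_objs = rng.normal([0.2, 0.9], 0.05, size=(20, 2))   # round: roll far
boxy_objs  = rng.normal([0.8, 0.1], 0.05, size=(20, 2))   # boxy: barely move
features = np.vstack([round_objs, boxy_objs])
effects = np.vstack([
    rng.normal([0.9, 0.0], 0.05, size=(20, 2)),  # large displacement
    rng.normal([0.1, 0.0], 0.05, size=(20, 2)),  # small displacement
])

# Effect prototypes: cluster centres of the observed effects (k assumed known).
prototypes, proto_ids = kmeans(effects, k=2)

# Predictor: nearest-class-mean map from initial features to a prototype id.
class_means = np.array([features[proto_ids == j].mean(axis=0) for j in range(2)])
def predict_prototype(f):
    return int(np.argmin(((class_means - f) ** 2).sum(-1)))

# Categorize objects by the effect prototype they are predicted to produce.
categories = [predict_prototype(f) for f in features]
```

In this toy setting, round and boxy objects end up in different categories purely because pushing them yields different effect prototypes, mirroring the effect-based categorization described above.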