While the objects of our attention (“what we are looking at”) and the accompanying affective responses to those objects are part of our daily experience, little research has investigated the relation between attention and positive affective evaluation. The purpose of our research is to process users’ emotion and attention in real time, with the goal of designing systems that can recognize a user’s affective response to a particular visually presented stimulus in the presence of other stimuli, and respond accordingly. In this paper, we introduce the AutoSelect system, which automatically detects a user’s preference based on eye movement data and physiological signals in a two-alternative forced choice task. In an exploratory study involving the selection of neckties, the system correctly classified subjects’ choices in 81% of cases. In this instance of AutoSelect, the gaze ‘cascade effect’ played a dominant role, whereas pupil size could not be shown as a reliable...