This paper describes an extensive analysis of the comfort level data of 7 subjects with respect to 12 robot behaviours as part of a human-robot interaction trial. These behaviours include robot action, proximity, and motion relative to the subjects. Two researchers coded the video material, identifying visible states of discomfort displayed by subjects in relation to the robot's behaviour. Agreement between the coders ranged from moderate to high, except in more ambiguous situations involving robot approach directions. The detected visible states of discomfort were correlated with the situations in which the comfort level device (CLD) indicated states of discomfort. Results show that the uncomfortable states identified by both coders, and by either coder, corresponded with 31% and 64%, respectively, of the uncomfortable states identified by the subjects' CLD data (N=58). Conversely, there was 72% agreement between subjects' CLD data and the uncomfortable states identified by both coders.
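
To make the correspondence figures concrete, the sketch below shows one way such overlap percentages could be computed from per-episode annotations. It is only an illustration: the episode identifiers, the example sets, and the `correspondence` function are assumptions for demonstration, not the paper's actual analysis code or data.

```python
# Illustrative sketch (not the paper's analysis code): given the set of
# discomfort episodes flagged by the CLD and the sets flagged by each
# human coder, compute what fraction of CLD episodes was matched by
# both coders and by either coder.

def correspondence(cld_episodes, coder_a, coder_b):
    """Return (% matched by both coders, % matched by either coder)."""
    cld = set(cld_episodes)
    both = cld & set(coder_a) & set(coder_b)      # flagged by both coders
    either = cld & (set(coder_a) | set(coder_b))  # flagged by at least one coder
    n = len(cld)
    return 100 * len(both) / n, 100 * len(either) / n

# Hypothetical episode IDs, purely for demonstration.
cld_flags = range(1, 59)                          # N = 58 CLD-identified states
coder_a_flags = {2, 5, 7, 11, 13, 20, 21, 30}
coder_b_flags = {2, 7, 9, 13, 21, 33, 40}

pct_both, pct_either = correspondence(cld_flags, coder_a_flags, coder_b_flags)
print(f"both coders: {pct_both:.0f}%, either coder: {pct_either:.0f}%")
```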