For humanoid robots that are able to assist humans in their daily life, the capability for adequate interaction with human operators is a key feature. Considering that more than 60% of human communication is conducted non-verbally (through facial expressions and gestures), an important research topic is how interfaces for this non-verbal communication can be developed. To achieve this goal, several robotic heads have been designed. However, it remains unclear what exactly such a head should look like and what skills it needs to interact properly with humans. This paper describes an approach that aims at answering some of these design questions. A behavior-based control for realizing facial expressions, which is a basic ability needed for interaction with humans, is presented. Furthermore, the results of a poll in which the generated facial expressions were to be identified are presented. Additionally, the mechatronic design of the head and the accompanying neck joint is described.