In this paper we suggest the use of tangible user interfaces (TUIs) for human-robot interaction (HRI) applications. We discuss the potential benefits of this approach while focusing on tasks with a low level of autonomy. We present an experimental robotic interaction test bed to support our investigation. We use the test bed to explore two HRI-related task-sets: robotic navigation control and robotic posture control. We discuss the implementation of these two task-sets using an AIBO™ robot dog. Both tasks were mapped to two different robotic control interfaces: a keypad interface, which resembles the interaction approach currently common in HRI, and a gesture input mechanism based on Nintendo Wii™ game controllers. We discuss the implementation of these interfaces and conclude with a detailed user study evaluating the two HRI techniques on the two robotic task-sets.

ACM Classification Keywords
H5.2 [Information interfaces and presentation]: User Interfaces – Interaction Styles

Author Keywo...
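To make the gesture-based mapping concrete, the sketch below shows one plausible way a tilt reading from a Wii-style accelerometer could be translated into discrete navigation commands for the robot. The function name, axis conventions, and thresholds are all illustrative assumptions, not the paper's actual implementation.

```python
def tilt_to_command(ax, ay, threshold=0.35):
    """Map an accelerometer reading (ax, ay, in units of g) to a
    discrete navigation command.

    Assumptions (hypothetical, for illustration only):
    - tilting the controller forward gives a negative ay,
    - tilting it left gives a negative ax,
    - readings below `threshold` in magnitude mean "hold still".
    """
    if ay < -threshold:
        return "forward"
    if ay > threshold:
        return "backward"
    if ax < -threshold:
        return "turn_left"
    if ax > threshold:
        return "turn_right"
    return "stop"

# Example: a forward tilt maps to the "forward" command.
print(tilt_to_command(0.0, -0.6))
```

A keypad interface, by contrast, would map each button press directly to one of these same discrete commands, which is part of what the user study compares.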