Mixed Reality (MR) opens a new dimension for Human-Computer Interaction (HCI). Combined with computer vision (CV) techniques, it becomes possible to create advanced input devices. This paper describes a novel form of HCI for the MR environment that combines CV with MR to let an MR user interact with a floating virtual touch screen using their bare hands. The system allows the visualisation of the virtual interfaces and touch screen through a Head-Mounted Display (HMD). Visual tracking and interpretation of the user's hand and finger motion allows the detection of key presses on the virtual touch screen. We describe an implementation of this type of interface and demonstrate the results through a virtual keypad application.
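
To make the interaction concrete, the core detection step can be pictured as a hit test: the virtual keypad is a plane fixed in space, and a key press is registered when the tracked fingertip crosses that plane inside a key's bounds. The following is a minimal hypothetical sketch of that idea only, not the paper's implementation; the class, coordinates, and threshold convention are all assumptions for illustration.

```python
# Hypothetical sketch: detect a press on a floating virtual keypad by
# testing whether a tracked fingertip has crossed the keypad's plane
# inside one of its keys. Not the paper's actual method.

from dataclasses import dataclass


@dataclass
class VirtualKeypad:
    origin: tuple     # top-left corner of the keypad plane (x, y, z), metres
    key_size: float   # side length of each square key, metres
    rows: int
    cols: int

    def hit_test(self, fingertip):
        """Return (row, col) of the pressed key, or None.

        A press is assumed to occur when the fingertip's z coordinate
        reaches the keypad plane (fz <= plane z) within the keypad bounds.
        """
        fx, fy, fz = fingertip
        ox, oy, oz = self.origin
        if fz > oz:  # fingertip is still in front of the plane: no press
            return None
        # Map the fingertip's in-plane offset to a key cell.
        col = int((fx - ox) // self.key_size)
        row = int((fy - oy) // self.key_size)
        if 0 <= row < self.rows and 0 <= col < self.cols:
            return (row, col)
        return None


# Example: a 4x3 keypad half a metre from the user, with 3 cm keys.
pad = VirtualKeypad(origin=(0.0, 0.0, 0.5), key_size=0.03, rows=4, cols=3)
print(pad.hit_test((0.04, 0.07, 0.49)))  # crossed the plane -> (2, 1)
print(pad.hit_test((0.04, 0.07, 0.60)))  # still in front -> None
```

In a full system the fingertip position would come from the CV tracking pipeline rather than being passed in directly, and a real implementation would also debounce presses across frames so a single crossing registers one key event.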