We integrated touch menus into a cohesive smartphone-based VR controller. Smartphone touch surfaces offer new interaction styles and also support VR interaction when tracking is absent or imprecise, or when users have limited arm mobility or experience fatigue. In Handymenu, the touch surface is split into two areas: one for menu interaction and one for spatial interactions such as VR object selection, manipulation, navigation, or parameter adjustment. In our studies, users transitioned between the two areas and performed nested, repeated selections. A formal experiment compared VR object selection modes (ray and touch), menu selection modes (ray and touch), menu layouts (pie and grid), and, in some conditions, touch and visual feedback sizes (two levels each).
Nicholas G. Lipari, Christoph W. Borst