Mixed reality (MR) has changed the way we see and interact with our world. While the current generation of MR head-mounted displays (HMDs) is capable of generating high-quality visual content, interaction in most MR applications still relies on in-air hand gestures, gaze, or voice. Although these interfaces are intuitive to learn, they can easily lead to inaccurate operations due to user fatigue or environmental constraints. In this work, we present Dual-MR, a novel MR interaction system that i) synchronizes the MR viewpoints of the HMD and a handheld smartphone, and ii) enables precise, tactile, immersive, and user-friendly object-level manipulation through the smartphone's multi-touch input.
In addition, Dual-MR allows multiple users to join the same MR coordinate system, facilitating collaboration in a shared physical space and further broadening its usability.
A preliminary user study shows that our system clearly outperforms the conventional interface, which combines in-air hand gestures and gaze, in completion time for a series of 3D object manipulation tasks in MR.