Many extended reality systems rely on controllers, such as near-infrared motion trackers or magnetic coil-based hand-tracking devices, for users to interact with virtual objects. These interfaces lack tangible sensation, especially during walking, running, crawling, and manipulating an object. Special devices such as the Tesla suit and omnidirectional treadmills can improve tangible interaction; however, they are bulky, expensive, and not flexible enough for broader applications. In this study, we developed a configurable multi-modal sensor fusion interface for extended reality applications. The system includes wearable IMU motion sensors, gait classification, gesture tracking, and data-streaming interfaces to AR/VR systems. This system has several advantages. First, it is reconfigurable for multiple dynamic tangible interactions, such as walking, running, crawling, and operating an actual physical object, without any controllers. Second, it fuses multi-modal data from the IMUs with data from sensors on the AR/VR headset, such as floor detection. Third, it is more affordable than many existing solutions. We have prototyped tangible extended reality in several applications, including medical helicopter preflight walk-around checkups, firefighter search-and-rescue training, and tool tracking for airway intubation training that includes haptic interaction with a physical mannequin.
Yang Cai and Mel Siegel, "Tangible extended reality with sensor fusion," in Electronic Imaging, 2023, pp. 214-1–214-5, https://doi.org/10.2352/EI.2023.35.12.ERVR-214
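To make the data path described in the abstract more concrete (wearable IMU samples, a gait classifier, and a stream to an AR/VR client), the following is a minimal Python sketch. The sample rate, variance and cadence thresholds, gait labels, UDP endpoint, and the synthesized IMU signal are all illustrative assumptions, not details taken from the paper.

```python
# Sketch of an IMU -> gait classification -> AR/VR streaming pipeline.
# All specifics (sample rate, thresholds, UDP endpoint, gait labels) are
# assumptions for illustration, not values from the paper.
import json
import math
import socket
import time
from collections import deque

SAMPLE_RATE_HZ = 50                   # assumed IMU sample rate
WINDOW_SEC = 1.0                      # sliding window for gait features
HEADSET_ADDR = ("127.0.0.1", 9000)    # hypothetical AR/VR client endpoint


def read_imu_sample(t):
    """Placeholder for a wearable IMU driver: returns acceleration magnitude in g.
    Here we synthesize a walking-like signal so the sketch runs without hardware."""
    return 1.0 + 0.35 * math.sin(2 * math.pi * 1.8 * t)  # ~1.8 steps per second


def classify_gait(window):
    """Crude gait classifier using signal variance and a mean-crossing cadence
    estimate; a real system would use learned features. Thresholds are guesses."""
    samples = list(window)
    mean = sum(samples) / len(samples)
    var = sum((x - mean) ** 2 for x in samples) / len(samples)
    crossings = sum(
        1 for a, b in zip(samples, samples[1:]) if (a - mean) * (b - mean) < 0
    )
    cadence_hz = crossings / (2 * WINDOW_SEC)  # two crossings per step cycle
    if var < 0.005:
        return "idle"
    if cadence_hz > 2.5:
        return "running"
    if var < 0.02:
        return "crawling"
    return "walking"


def main():
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    window = deque(maxlen=int(SAMPLE_RATE_HZ * WINDOW_SEC))
    t0 = time.time()
    while True:
        t = time.time() - t0
        window.append(read_imu_sample(t))
        if len(window) == window.maxlen:
            gait = classify_gait(window)
            # Stream the current state; a headset-side client could combine it
            # with its own sensing (e.g., floor detection) before driving locomotion.
            packet = json.dumps({"t": round(t, 3), "gait": gait}).encode()
            sock.sendto(packet, HEADSET_ADDR)
        time.sleep(1.0 / SAMPLE_RATE_HZ)


if __name__ == "__main__":
    main()
```

On the receiving side, an AR/VR application would listen on the assumed UDP port, parse the JSON packets, and map the gait label to locomotion or animation state; that fusion step on the headset is where its own sensor data, such as detected floor height, would be combined with the wearable stream.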