Extended Reality (XR) technologies are revolutionising our interactions with digital content, transforming how we perceive reality, and enhancing our problem-solving capabilities. However, many XR applications remain technology-driven, often disregarding the broader context of their use and failing to address fundamental human needs. In this paper, we present a teaching-led design project that asks postgraduate design students to explore the future of XR through low-fidelity, screen-free prototypes with a focus on observed human needs derived from six specific locations in central London, UK. By looking at the city and built environment as lenses for exploring everyday scenarios, the project encourages design provocations rooted in real-world challenges. Through this exploration, we aim to inspire new perspectives on the future states of XR, advocating for human-centred, inclusive, and accessible solutions. By bridging the gap between technological innovation and lived experience, this project outlines a pathway toward XR technologies that prioritise societal benefit and address real human needs.
Many extended reality systems rely on controllers, e.g. near-infrared motion trackers or magnetic coil-based hand-tracking devices, for users to interact with virtual objects. These interfaces lack tangible sensation, especially when walking, running, crawling, or manipulating an object. Specialised devices such as the Teslasuit and omnidirectional treadmills can improve tangible interaction, but they are bulky, expensive, and not flexible enough for broader applications. In this study, we developed a configurable multi-modal sensor fusion interface for extended reality applications. The system includes wearable IMU motion sensors, gait classification, gesture tracking, and data streaming interfaces to AR/VR systems. This system has several advantages. First, it is reconfigurable for multiple dynamic tangible interactions such as walking, running, crawling, and manipulating an actual physical object without any controllers. Second, it fuses multi-modal sensor data from the IMU with sensors on the AR/VR headset, such as floor detection. Third, it is more affordable than many existing solutions. We have prototyped tangible extended reality in several applications, including pre-flight walk-around checks for medical helicopters, firefighter search and rescue training, and tool tracking for airway intubation training, with haptic interaction provided by a physical mannequin.
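To make the fusion pipeline concrete, the following is a minimal illustrative sketch, not the authors' implementation: it assumes hypothetical names (ImuSample, classify_gait, fuse, stream_to_headset) and a toy threshold-based gait classifier standing in for whatever trained model the system actually uses. It shows how IMU-derived gait labels could be combined with a headset floor-detection flag and streamed to an AR/VR client.

```python
import json
import socket
from dataclasses import dataclass

@dataclass
class ImuSample:
    """One wearable IMU reading (hypothetical fields)."""
    accel: tuple  # (ax, ay, az) in m/s^2
    gyro: tuple   # (gx, gy, gz) in rad/s

def classify_gait(window):
    """Toy gait classifier: thresholds the variance of vertical acceleration.
    A real system would use a trained model; this only illustrates the interface."""
    az = [s.accel[2] for s in window]
    mean = sum(az) / len(az)
    var = sum((a - mean) ** 2 for a in az) / len(az)
    if var > 4.0:
        return "running"
    if var > 0.5:
        return "walking"
    return "idle"

def fuse(gait_label, floor_detected):
    """Fuse the IMU-derived gait label with the headset's floor detection,
    e.g. suppress locomotion states when no floor plane is reported."""
    if gait_label in ("walking", "running") and not floor_detected:
        return "idle"
    return gait_label

def stream_to_headset(sock, addr, gait_label):
    """Send the fused interaction state to the AR/VR application over UDP."""
    msg = json.dumps({"gait": gait_label}).encode("utf-8")
    sock.sendto(msg, addr)

# Example usage with synthetic data (assumed message format and port).
if __name__ == "__main__":
    window = [ImuSample((0.1, 0.0, 9.8 + 0.1 * i), (0.0, 0.0, 0.0)) for i in range(20)]
    label = fuse(classify_gait(window), floor_detected=True)
    stream_to_headset(socket.socket(socket.AF_INET, socket.SOCK_DGRAM),
                      ("127.0.0.1", 9000), label)
```

The design point being illustrated is the reconfigurability claim: locomotion classification, headset-side cues, and the streaming endpoint are separate stages, so individual stages can be swapped (e.g. a crawling classifier or a tool-tracking source) without changing the rest of the pipeline.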