In 2024, we brought London closer to the ocean: Nearly 700 design students at the Royal College of Art participated in the Grand Challenge 2023/24. In teams of five, the students were tasked with tackling challenges around London and the Ocean from a design perspective. The challenge drew on multiple methodologies, including design engineering, speculative design, service design, and materials- and fashion-related approaches. Each team had one month to develop a compelling proposal. Fifty of the participating students joined the Extended Reality (XR) Stream: Ten teams of five students each developed different design solutions using Unreal Engine 5. This paper presents how Unreal Engine 5 was introduced to the students through lectures and hands-on sessions, how XR technologies were employed to creatively interpret the original brief of the Grand Challenge, and how this inspired our students to come up with unique design propositions. In particular, we discuss two case studies in detail, XRiver and SuDScape, which were among the top 13 teams exhibited at the final Grand Challenge show, and we offer insights and recommendations for incorporating XR into design education.
Many extended reality systems use controllers, e.g., near-infrared motion trackers or magnetic coil-based hand-tracking devices, for users to interact with virtual objects. These interfaces lack tangible sensation, especially during walking, running, crawling, and manipulating an object. Special devices such as the Teslasuit and omnidirectional treadmills can improve tangible interaction; however, they are bulky, expensive, and not flexible enough for broader applications. In this study, we developed a configurable multi-modal sensor fusion interface for extended reality applications. The system includes wearable IMU motion sensors, gait classification, gesture tracking, and data streaming interfaces to AR/VR systems. This system has several advantages. First, it is reconfigurable for multiple dynamic tangible interactions, such as walking, running, crawling, and manipulating an actual physical object, without any controllers. Second, it fuses multi-modal data from the wearable IMUs with sensors on the AR/VR headset, such as floor detection. Third, it is more affordable than many existing solutions. We have prototyped tangible extended reality in several applications, including medical helicopter pre-flight walk-around checks, firefighter search-and-rescue training, and tool tracking for airway intubation training with haptic interaction with a physical mannequin.
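To make the described fusion pipeline concrete, the sketch below combines a window of wearable IMU samples with a floor-detection cue from the headset to produce a gait label and stream it to an XR client. The threshold-based classifier, the JSON-over-UDP transport, and the port number are illustrative assumptions, not the system's actual implementation.

```python
"""Minimal sketch of an IMU + headset fusion loop for gait classification.

Synthetic IMU samples stand in for the wearable sensor stream; the thresholds,
message format, and UDP endpoint are assumptions made for illustration.
"""
import json
import math
import random
import socket
from dataclasses import dataclass


@dataclass
class ImuSample:
    accel: tuple  # (ax, ay, az) in m/s^2, body frame
    gyro: tuple   # (gx, gy, gz) in rad/s, body frame


def classify_gait(window, floor_detected):
    """Rough gait label from mean acceleration magnitude plus the headset floor cue."""
    if not floor_detected:
        return "airborne_or_unknown"
    energy = sum(math.sqrt(sum(a * a for a in s.accel)) for s in window) / len(window)
    if energy < 10.5:
        return "standing"
    if energy < 13.0:
        return "walking"
    if energy < 18.0:
        return "running"
    return "crawling_or_vigorous"


def stream_to_xr(sock, addr, label):
    """Send the fused state as JSON over UDP to the XR application."""
    sock.sendto(json.dumps({"gait": label}).encode("utf-8"), addr)


if __name__ == "__main__":
    xr_addr = ("127.0.0.1", 9000)  # hypothetical XR listener address
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    # Synthetic 50-sample IMU window: gravity on z plus noise.
    window = [
        ImuSample(
            accel=(random.gauss(0, 1), random.gauss(0, 1), 9.81 + random.gauss(0, 2)),
            gyro=(random.gauss(0, 0.1), random.gauss(0, 0.1), random.gauss(0, 0.1)),
        )
        for _ in range(50)
    ]
    label = classify_gait(window, floor_detected=True)
    stream_to_xr(sock, xr_addr, label)
    print("streamed gait label:", label)
```

In a real deployment, the same loop would run per wearer, with the classifier swapped for whatever gait model the interface is configured with.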
Incident Command Dashboards (ICDs) play an essential role in Emergency Support Functions (ESFs). They are centralized and handle massive amounts of live data. In this project, we explore a decentralized mobile incident commanding dashboard (MIC-D) with an improved mobile augmented reality (AR) user interface (UI) that can access and display multimodal live IoT data streams on phones, tablets, and inexpensive head-up displays (HUDs) on first responders' helmets. The new platform is designed to work in the field and to share live data streams among team members. It also enables users to view 3D LiDAR scans of the location, live thermal video, and vital-sign data on a 3D map. We have built a virtual medical helicopter communication center and tested launchpad-on-fire and remote fire-extinguishing scenarios. We have also tested the wildfire prevention scenario "Cold Trailing" in an outdoor environment.
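As an illustration of how such a decentralized dashboard might merge multimodal live streams onto a shared 3D map, the sketch below folds simulated thermal, vital-sign, and LiDAR messages into geolocated map anchors. The topic names, message schema, coordinates, and in-process queue are assumptions made for this example, not the MIC-D implementation.

```python
"""Illustrative merge of multimodal live streams into geolocated map anchors.

The transport, topics, and payloads are stand-ins (assumptions) for whatever
streaming stack a fielded MIC-D deployment would use.
"""
import queue
import time
from collections import defaultdict

# A single in-process queue keeps the sketch small; field devices would publish remotely.
incoming = queue.Queue()

# Map anchors keyed by (lat, lon); each anchor holds the latest data per modality.
map_anchors = defaultdict(dict)


def publish(topic, lat, lon, payload):
    """Simulate a field device (helmet HUD, thermal camera, vitals monitor) publishing data."""
    incoming.put({"topic": topic, "lat": lat, "lon": lon, "ts": time.time(), "payload": payload})


def drain_and_merge():
    """Fold queued messages into the shared map state that every team member renders."""
    while not incoming.empty():
        msg = incoming.get()
        key = (round(msg["lat"], 5), round(msg["lon"], 5))
        map_anchors[key][msg["topic"]] = {"ts": msg["ts"], **msg["payload"]}


if __name__ == "__main__":
    # Hypothetical readings from an outdoor "Cold Trailing" exercise.
    publish("thermal", 44.05210, -123.08680, {"max_temp_c": 61.4})
    publish("vitals", 44.05210, -123.08680, {"heart_rate_bpm": 128, "spo2_pct": 96})
    publish("lidar", 44.05230, -123.08700, {"scan_uri": "scan_0042.laz"})
    drain_and_merge()
    for anchor, layers in map_anchors.items():
        print(anchor, sorted(layers))
```

Keying every modality to a location anchor is what lets the AR UI render thermal, vitals, and LiDAR layers together on the 3D map without a centralized server.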