
In 2024, we brought London closer to the ocean: Nearly 700 design students at the Royal College of Art participated in the Grand Challenge 2023/24. In teams of five, the students were tasked with tackling challenges around London and the Ocean from a design perspective. The challenge involved multiple methodologies, including design engineering, speculative design, service design, and materials- and fashion-related approaches. Each team had one month to develop a compelling proposal. Fifty students participating in the Grand Challenge were able to join the Extended Reality (XR) Stream: ten teams of five students each came up with different design solutions using Unreal Engine 5. This paper presents how Unreal Engine 5 was introduced to students through lectures and hands-on sessions, how XR technologies were employed to creatively interpret the original brief of the Grand Challenge, and how they inspired our students to develop unique design propositions. In particular, we discuss two case studies in detail: XRiver and SuDScape. These two student projects were among the top 13 teams exhibited at the final Grand Challenge show; drawing on them, we offer insights and recommendations for incorporating XR into design education.

The NEMO project (New Economic Model for the Oceans) tackles ocean-related issues from a design perspective. The initial NEMO project is based on experimental data gathering using 4K cameras to capture objects along a ship voyage of 6,070 nautical miles from Kangerlussuaq (Greenland) to Poole (UK) via Sardinia (Italy). This data was combined with recorded GPS coordinates to visualize the journey in the Cesium.js environment. Several objects were identified and mapped to the corresponding locations along the ship's trajectory. In this work, we use this initial web visualization as a starting point to create a first prototype of an Unreal Engine-based immersive environment that can be explored within the new RCA SNAP Visualization Lab facility. Based on a previous co-design workshop run locally at the RNLI headquarters in Poole, we started to explore opportunities to enable similar experiences in an immersive 3D environment. The ultimate aim is to enable collaborative data visualization and exploration in the context of co-design workshops. In this way, we combine Macro-Meso-Micro perspectives to provide, on the one hand, a holistic overview of the journey and, on the other hand, place-based exploration of ocean- and coast-related scenarios.
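
To illustrate the kind of web visualization described above, the following minimal TypeScript sketch plots a ship trajectory from recorded GPS coordinates as a CesiumJS polyline and marks one identified object along the route. The container id, waypoint values, and object label are hypothetical placeholders, not the project's actual data or route.

```typescript
import * as Cesium from "cesium";

// Hypothetical GPS waypoints (longitude, latitude) sampled along the voyage.
const waypoints: [number, number][] = [
  [-50.7, 67.0],  // near Kangerlussuaq (approximate, illustrative)
  [-20.0, 62.0],  // placeholder mid-route sample
  [-2.0, 50.7],   // near Poole (approximate, illustrative)
];

// Create a Cesium viewer in a page element with id "cesiumContainer" (assumed).
const viewer = new Cesium.Viewer("cesiumContainer");

// Draw the ship trajectory as a polyline on the globe.
viewer.entities.add({
  name: "Ship trajectory",
  polyline: {
    positions: Cesium.Cartesian3.fromDegreesArray(waypoints.flat()),
    width: 3,
    material: Cesium.Color.CYAN,
  },
});

// Map one identified object (hypothetical example) to its location on the route.
viewer.entities.add({
  name: "Observed object (example)",
  position: Cesium.Cartesian3.fromDegrees(-20.0, 62.0),
  point: { pixelSize: 8, color: Cesium.Color.ORANGE },
});

// Frame the camera on the whole journey.
viewer.zoomTo(viewer.entities);
```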

Many extended reality systems use controllers, e.g., near-infrared motion trackers or magnetic coil-based hand-tracking devices, for users to interact with virtual objects. These interfaces lack tangible sensation, especially when walking, running, crawling, or manipulating an object. Special devices such as the Teslasuit and omnidirectional treadmills can improve tangible interaction. However, they are bulky, expensive, and not flexible enough for broader applications. In this study, we developed a configurable multi-modal sensor fusion interface for extended reality applications. The system includes wearable IMU motion sensors, gait classification, gesture tracking, and data streaming interfaces to AR/VR systems. This system has several advantages: First, it is reconfigurable for multiple dynamic tangible interactions such as walking, running, crawling, and operating with an actual physical object, all without controllers. Second, it fuses multi-modal sensor data from the IMU with sensors on the AR/VR headset, such as floor detection. Third, it is more affordable than many existing solutions. We have prototyped tangible extended reality in several applications, including preflight walk-around checkups of a medical helicopter, firefighter search and rescue training, and tool tracking for airway intubation training, with haptic interaction provided by a physical mannequin.
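
The abstract does not specify the classification method or the transport layer; purely as a rough sketch, the snippet below assumes a simple variance-threshold heuristic over a window of IMU acceleration samples and streams the resulting gait label to an AR/VR host over a WebSocket. The thresholds, field names, and endpoint URL are illustrative assumptions, not the system's actual design.

```typescript
// Minimal sketch: classify gait from a window of wearable IMU samples and
// stream the label to an AR/VR host. Thresholds and endpoint are assumptions.
interface ImuSample {
  t: number;                            // timestamp in ms
  ax: number; ay: number; az: number;   // acceleration in g
}

type Gait = "idle" | "walking" | "running";

// Crude heuristic: use the variance of acceleration magnitude over the window.
function classifyGait(window: ImuSample[]): Gait {
  const mags = window.map(s => Math.hypot(s.ax, s.ay, s.az));
  const mean = mags.reduce((a, b) => a + b, 0) / mags.length;
  const variance = mags.reduce((a, b) => a + (b - mean) ** 2, 0) / mags.length;
  if (variance < 0.02) return "idle";      // illustrative threshold
  if (variance < 0.25) return "walking";   // illustrative threshold
  return "running";
}

// Stream classified gait to the headset application (hypothetical endpoint).
const socket = new WebSocket("ws://headset.local:9000/gait");

function onImuWindow(window: ImuSample[]): void {
  const label = classifyGait(window);
  if (socket.readyState === WebSocket.OPEN) {
    socket.send(JSON.stringify({ type: "gait", label, t: Date.now() }));
  }
}
```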

Incident Command Dashboards (ICDs) play an essential role in Emergency Support Functions (ESFs). They are centralized and handle a massive amount of live data. In this project, we explore a decentralized mobile incident commanding dashboard (MIC-D) with an improved mobile augmented reality (AR) user interface (UI) that can access and display multimodal live IoT data streams on phones, tablets, and inexpensive HUDs mounted on first responders’ helmets. The new platform is designed to work in the field and to share live data streams among team members. It also enables users to view 3D LiDAR scans of the location, live thermal video, and vital sign data on a 3D map. We have built a virtual medical helicopter communication center and tested launchpad-on-fire and remote fire-extinguishing scenarios. We have also tested the wildfire prevention scenario “Cold Trailing” in an outdoor environment.
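
The transport and message format used by MIC-D are not given in the abstract; purely as an illustration of merging multimodal live streams for a shared 3D map view, the sketch below assumes per-responder MQTT topics and folds vital-sign, thermal-frame, LiDAR, and GPS messages into a per-responder record. The broker URL, topic layout, and payload fields are hypothetical.

```typescript
import mqtt from "mqtt";

// Hypothetical per-responder record merged from multiple live streams.
interface ResponderState {
  id: string;
  position?: { lat: number; lon: number; alt: number };
  vitals?: { heartRate: number; spo2: number };
  thermalFrameUrl?: string;   // reference to the latest thermal video frame
  lidarScanUrl?: string;      // reference to a LiDAR scan of the location
}

const responders = new Map<string, ResponderState>();

// Assumed broker and topic layout: micd/<responderId>/<stream>
const client = mqtt.connect("mqtt://broker.local:1883");
client.subscribe("micd/+/+");

client.on("message", (topic, payload) => {
  const [, id, stream] = topic.split("/");
  const state = responders.get(id) ?? { id };
  const data = JSON.parse(payload.toString());

  if (stream === "gps") state.position = data;
  else if (stream === "vitals") state.vitals = data;
  else if (stream === "thermal") state.thermalFrameUrl = data.frameUrl;
  else if (stream === "lidar") state.lidarScanUrl = data.scanUrl;

  responders.set(id, state);
  // A 3D map view (e.g., in the AR UI) would re-render markers from `responders`.
});
```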