Work Presented at Electronic Imaging 2024
Volume: 67 | Article ID: 060401
Spatial Analysis and Visual Communication of Emergency Information through Augmented Reality
DOI: 10.2352/J.ImagingSci.Technol.2023.67.6.060401 | Published Online: November 2023
Abstract

During emergencies such as fire and smoke or active shooter events, there is a need to assess vulnerabilities and evacuation plans. With recent improvements in smartphone technology, there is an opportunity for geo-visual environments that offer experiential learning by providing spatial analysis and visual communication of emergency-related information to the user. This paper presents the development and evaluation of a mobile augmented reality application (MARA) designed for spatial analysis, situational awareness, and visual communication. The MARA uses existing permanent features of the building, such as room numbers and signage, as markers to display the building's floor plan and show navigational directions to the exit. Through visualization of integrated geographic information systems and real-time data analysis, MARA provides the person's current location, the number of exits, and user-specific personalized evacuation routes. The paper also describes a limited user study conducted to assess the usability and effectiveness of MARA using the widely recognized System Usability Scale (SUS) framework. The results show the effectiveness of our situational awareness-based MARA in multilevel buildings for evacuation, educational, and navigational purposes.

  Cite this article 

Sharad Sharma, Rishitha Reddy Pesaladinne, "Spatial Analysis and Visual Communication of Emergency Information through Augmented Reality," Journal of Imaging Science and Technology, 2023, pp. 1-9, https://doi.org/10.2352/J.ImagingSci.Technol.2023.67.6.060401

  Copyright statement 
Copyright © Society for Imaging Science and Technology 2023
 Open access
  Article timeline 
  • Received: July 2023
  • Accepted: October 2023
  • Published: November 2023
1. Introduction
In recent years, augmented reality (AR) has emerged as a powerful tool for enhancing spatial analysis and visual communication in various fields. One area where AR holds significant potential is in the domain of emergency management and response. The ability to effectively convey critical emergency-related information in a spatially accurate and contextually relevant manner is crucial for ensuring the safety and well-being of individuals during high-stress situations. During emergencies, accurate and timely dissemination of information plays a critical role in saving lives and minimizing damage. The traditional methods of conveying emergency-related information, such as paper maps, charts, and textual instructions, often fall short of providing individuals with the necessary spatial awareness and situational understanding during high-stress situations. As technology continues to advance, there is an increasing need to explore innovative approaches that enhance situational awareness and facilitate efficient decision-making in emergency scenarios. In complex built environments, understanding evacuation routes, locating emergency exits, and identifying potential hazards can be challenging, especially during moments of panic or confusion.
This study aims to explore the application of AR in spatial analysis and visual communication of emergency-related information. Spatial analysis plays a crucial role in real-world emergencies by providing valuable insights and aiding decision-making processes. It allows emergency responders to develop a comprehensive understanding of the emergency, identify hotspots, and make informed decisions regarding evacuation routes. By analyzing historical data, topography, critical infrastructure, and vulnerable areas, authorities can identify high-risk zones, develop emergency response plans, and implement preventive measures to mitigate risks and enhance preparedness. Spatial analysis also supports effective communication and dissemination of information during emergencies by visualizing data on maps or using geographic information systems (GIS). This enables authorities to convey critical information to the public, including evacuation orders, shelter locations, hazard zones, and real-time updates. Overall, spatial analysis provides valuable insights into the geographic context of emergencies, enabling authorities and responders to make evidence-based decisions, allocate resources efficiently, plan for risks, and enhance overall emergency response and recovery efforts. Sharma et al. [17] have developed mobile AR applications for building evacuation using preexisting features in the building as markers. The application gives a visual representation of a building in 3D space, allows people to identify the exits in the building, and creates alerts for emergency responses such as fire and smoke. It also gives the paths to the various exits from the current location, the shortest path to an exit, and directions to a safe zone.
Visual communication provides a quick and intuitive way to understand information, especially in high-stress situations. Visuals such as symbols, icons, maps, and diagrams can convey crucial details at a glance, enabling individuals to grasp the situation and take appropriate action swiftly. By using standardized symbols, colors, and visual elements, emergency authorities can ensure that information is uniform and recognizable, reducing confusion and improving response coordination. In situations where verbal communication is challenging, such as in noisy or chaotic environments, visual communication becomes indispensable. Signage, pictograms, and visual cues enable effective communication even when verbal interaction is limited or when individuals are experiencing heightened stress or panic. By leveraging visual elements effectively, emergency authorities can improve information dissemination, enhance public safety, and enable individuals to make informed decisions during critical situations. In a mobile augmented reality application (MARA), spatial analysis and visual communication are essential because they help users navigate and visualize complex situations by overlaying digital data, such as floor blueprints and maps derived from sensor data, directly onto the physical environment.
By harnessing the capabilities of AR, we can revolutionize the way emergency-related information is communicated, ultimately improving public safety and response effectiveness. This research opens new possibilities for spatial analysis and visual communication in emergency management, paving the way for innovative solutions that empower individuals to make informed decisions and take appropriate actions during critical situations. During critical situations like fire and smoke or active shooter events, providing basic information to first responders, building occupants, and decision-makers becomes critical. Next-generation systems need to include a visual analytics environment that supports planning, detection, prediction, and decision-making for emergency evacuation. Such environments will require the integration of spatial analysis and situational awareness, together with improved strategies for visual communication and user-specific relevant information, as a basis for an actionable decision-making strategy. To this end, we have developed a new MARA that incorporates the principles of spatial analysis and visual communication into a command-and-control environment for enhanced situational awareness, as shown in Figure 1.
Figure 1. Display of the floor plan in MARA using room numbers as a marker.
The rest of the paper is organized as follows: Section 2 discusses related work; Section 3 details the system framework of MARA, addressing the hardware and software specifications used; Section 4 describes the spatial analysis implementation of the proposed mobile augmented reality system; Section 5 describes the visual communication implementation of the proposed mobile augmented reality system; Section 6 addresses the limited user study for evaluating MARA as well as its results; and Section 7 concludes the paper and gives ideas for future work.
2. Related Work
During emergencies, it is important to communicate evacuation information to building occupants promptly. It sometimes becomes challenging for people to create a visual and mental representation of 3D space from a 2D floor plan displayed in a building. Therefore, it is imperative to convey the evacuation information to the user based on the current location and taking into account the architectural complexities of the building. Chen et al. [8] have developed an AR-based real-time mobile system for assistive indoor navigation with target segmentation (ARMSAINTS), which provides personalized turn-by-turn navigation instructions to the user. Their system utilizes an automatic graph construction method to generate a graph from a 2D floorplan for providing location sensing.
Marker-based indoor positioning systems do not require the installation of expensive infrastructure, in contrast to Bluetooth-based and Wi-Fi-based indoor positioning systems. Marker-based systems estimate the device's current location based on predefined markers. Commute Booster [9] is a mobile application that provides navigation support for people with blindness and low vision. Their system provides real-time feedback to users regarding the presence or absence of relevant navigation signs. Similarly, an indoor navigation system [10] has been developed that uses QR codes to provide localization and navigation services to attender robots using contour detection techniques. AR markers are more accurate than QR code markers. Sato [11] has utilized AR markers with sensors for user localization with a power-saving mode. However, marker-based positioning systems still require the installation and maintenance of the markers. Instead of installing new markers, however, existing features in the building can act as markers for the AR system. Sharma et al. [12, 13] have explored the use of HoloLens for building evacuation by incorporating existing permanent features in the building as markers to trigger the floor plan and the subsequent location of the person in the building.
Lately, there has been a rise in the development of mobile applications for emergency response and decision-making. MARAs have been used for heritage tourism [14], to assist people with visual impairments [15], and to help blind people with indoor navigation [16]. Cognitive or mental maps are important representations of space stored in a user's mind. Taylor et al. [17] suggested that the displays organizations use should meet users' dynamic navigational goals and that navigation systems should be adaptable to users' spatial information preferences. Evacuation maps, in contrast, are designed to show the evacuation plan to the user. There are three types of evacuation map styles, namely planning, evacuation, and crisis [18]. Evacuation maps in buildings are located along corridors and within rooms to highlight the path to the nearest exit [19]. Chen et al. [20] have suggested that a successful evacuation map should be easily understood by the user and should follow cartographic principles. Evacuation maps are useful in an emergency evacuation as they assist building occupants. However, evacuation maps are not readily available in every corridor and room. As a result, there is a need to find alternative ways to communicate evacuation routes to the user. MARA provides spatial analysis and visually communicates the evacuation routes to the user. MARA also provides the user with both cognitive maps and evacuation maps for knowledge discovery and aids in evacuation.
Computer simulations and multi-agent systems are also powerful tools for communicating emergency information to the user during crises. Sharma et al. [21-24] have argued for the use of computer simulations and multi-agent systems for emergency evacuations. They conducted a study modeling both individual behavior and group behavior during emergency evacuations. These simulations have also been incorporated into MARA and Google Glass as participatory agent-based simulations for indoor evacuation [25]. The participatory multi-agent simulation combines scenario-guided maps for users equipped with Google Glass. Sanchez et al. [25] have explored the use of Google Glass for evacuation to improve the visual communication of personalized evacuation routes for indoor positioning and tracking. Smartphones [26, 27], tablets [28], and wearable devices [29, 30] have been used to provide evacuation instructions to the user. Smartphones and tablets aid in spatial knowledge acquisition using automatic navigation systems [31].
The utilization of virtual reality (VR) and AR technology has potential benefits for indoor emergency management. Evacuation during emergencies is an important topic for emergency response and decision-making. Evacuation inside a building could be the result of fire and smoke. Many studies have used VR/AR experiments for fire emergencies, creating different what-if scenarios to study emergency response and conduct evacuation training safely. VR and AR have also been employed for fire safety education and training [32], wayfinding [33], evacuation experiments [34], evaluating fire exit signs [35], recreating past disasters [36], and search and rescue during disasters [37].
3. System Framework of MARA
The system framework diagram provides a comprehensive overview of MARA, highlighting the essential components and their interactions, and demonstrates how the user, devices, camera, and application functionalities are integrated. This framework serves as a fundamental structure for comprehending the infrastructure and processes underlying the utilization of AR technology on mobile devices. The user interacts with the application installed on either a phone or a tablet running Android or iOS. The application opens a camera view that scans the marker, i.e., a room number in the building, and displays the building's floor plan with a pin marking the user's current location projected and superimposed on top of the floor plan, forming the display that the user sees.
As shown in Figure 2, the user interface consists of buttons and legends that help the user acquire the respective information about the building. When the user clicks any button (A Wing, B Wing, D Wing, E Wing, F Wing, G Wing, H Wing, J Wing, K Wing, M Wing, shelter areas, AED (Automated External Defibrillator), stairs to the shelter areas, or exits), the respective area is highlighted on the floor plan via color mapping. The individual's location on the floor plan is also displayed along with the selected legend information, with which the user can identify the nearest shelter areas, stairs, AEDs, and exits from the location of the room. When a user scans a marker with the AR camera, a floor map is projected and a button for emergency navigation is located on the display GUI. When the emergency navigation button is clicked, the navigation path from the current location in the building to the nearest exits is displayed dynamically, which helps the user navigate during an emergency. Some evacuation information may be only for privileged personnel. For example, certain zones in a building need key-card access to enter the area. An exit located in a key-card access area might not be available to all building occupants. In our proposed MARA, we show the navigation path to the nearest exit from the current location for the public without incorporating key-card access areas. The alternative path shows the next nearest exit, which incorporates the navigation path through the key-card access zones. As shown in Fig. 1, visual communication includes the emergency navigation button and arrows toward the nearest exit.
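As one illustration of how a legend button could drive this color mapping in Unity, the minimal sketch below toggles a highlight overlay while keeping the user's location pin visible. It is a hypothetical example rather than the paper's code, and the component and object names (FloorPlanLegendController, wingHighlight, locationPin) are ours.

using UnityEngine;
using UnityEngine.UI;

public class FloorPlanLegendController : MonoBehaviour
{
    [SerializeField] private Button wingButton;          // e.g., the "A Wing" legend button
    [SerializeField] private GameObject wingHighlight;   // colored overlay mesh for that wing
    [SerializeField] private GameObject locationPin;     // pin marking the user's current location

    private void Awake()
    {
        // Keep the location pin visible and toggle the wing overlay on each button press.
        locationPin.SetActive(true);
        wingButton.onClick.AddListener(() =>
            wingHighlight.SetActive(!wingHighlight.activeSelf));
    }
}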
Figure 2. System framework diagram for MARA.
3.1 Hardware Device Specifications
MARA is developed for two classes of devices, phones and tablets, and for both the Android and iOS versions of these devices. The system framework diagram in Fig. 2 shows the multiple devices on which the AR application is deployed. The iOS mobile devices used to test this application are the iPhone 12 Mini and the iPhone 14: the iPhone 12 Mini features a 12-megapixel wide-angle lens with optical image stabilization and a 12-megapixel ultra-wide-angle secondary camera, while the iPhone 14 Pro has a 48 MP main wide-angle lens with optical image stabilization and a 12-megapixel ultra-wide-angle secondary camera. The Android mobile devices used for this application are the Samsung Galaxy S23 Ultra, which features a triple-lens camera with a 50 MP main lens and up to 30X digital zoom, runs the Android 10 operating system, has a 1080 × 2340 pixel resolution, and contains 32 gigabytes of RAM, and the Samsung Galaxy S22, which uses a Dynamic AMOLED 2X display, an octa-core CPU, a resolution of 1080 × 2340 pixels, and a 50 MP, f/1.8, 23 mm (wide) rear camera. The application was also tested on a Samsung S8 tablet running Android 12.0 and on an iPad Pro.
3.2 Software Specifications
MARA is developed on Unity (version 2021.3.21), a powerful game development engine that offers a wide range of features. Unity includes a built-in physics engine that allows developers to simulate realistic physics interactions in their games and applications; it supports rigid body dynamics, collisions, joints, and other physics simulations. It also provides a visual editor that allows developers to design game scenes, create 3D models, import assets, set up animations, configure lighting and camera settings, and manage game objects and components. Multiple assets, such as XR, Vuforia, ARKit, and Lean Touch, are added to MARA. All the markers are added to the Vuforia database, and these markers project the floor plans of the building highlighting the current location. Multiple C# scripts handle highlighting the respective area of the floor map on a button click.
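As a rough illustration of how marker recognition can drive the floor-plan overlay, the sketch below assumes Vuforia Engine 10's observer API in Unity; it is not taken from the paper's implementation, and the field names are hypothetical.

using UnityEngine;
using Vuforia;

public class RoomMarkerHandler : MonoBehaviour
{
    [SerializeField] private ObserverBehaviour marker;     // image target for a room-number sign
    [SerializeField] private GameObject floorPlanOverlay;  // 3D floor plan to superimpose
    [SerializeField] private GameObject locationPin;       // pin placed at the marker's room

    private void OnEnable()  => marker.OnTargetStatusChanged += HandleStatusChanged;
    private void OnDisable() => marker.OnTargetStatusChanged -= HandleStatusChanged;

    private void HandleStatusChanged(ObserverBehaviour behaviour, TargetStatus status)
    {
        // Show the overlay and pin only while the marker is being tracked.
        bool tracked = status.Status == Status.TRACKED ||
                       status.Status == Status.EXTENDED_TRACKED;
        floorPlanOverlay.SetActive(tracked);
        locationPin.SetActive(tracked);
    }
}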
4. Spatial Analysis Implementation
Implementing spatial analysis in an emergency navigation application can greatly enhance its effectiveness and provide critical support during crises. By integrating spatial analysis capabilities, MARA provides real-time navigation guidance, the current location of the person on the displayed map, the number of exits, and user-specific personalized evacuation routes, and it improves overall situational awareness for users.
When users scan the marker shown in Figure 3 with their device, the application superimposes the appropriate floor plan above the marker, with a pin indicating where the individual is located, while also providing them with the different legends displayed in Fig. 3. The user interface allows users to engage with the system: when the user clicks a wing button in the right corner of the floor map, the respective wing area on the floor plan is highlighted, as shown in Figure 4.
Figure 3. The user interface of the MARA.
Figure 4. The projected floor plan with wings highlighted when the marker (room number) is scanned.
When the user clicks the shelter area button on the floor map, the shelter areas on the map are highlighted for user awareness. With a click of the exit button, all the exits in the building are highlighted on the floor plan, and with a click of the stair and AED buttons, the stairs and AEDs in the building are highlighted. The users can adjust the zoom level of the floor map according to their requirements. They can zoom in to get a closer view of specific areas or details, and they can also zoom out to have a broader perspective of the entire floor map. This zooming functionality allows users to customize their view and navigate the floor map as needed, enhancing their understanding of and interaction with the space. This visualization helps users orient themselves within the building, providing a clear understanding of the layout and locations of various rooms, corridors, and amenities. This technology enhances safety, improves navigation efficiency, and ensures individuals can quickly and confidently respond to emergencies within buildings.
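One simple way such pinch-to-zoom behavior can be realized in Unity is by scaling the floor-plan object from two-finger touch input, as in the hypothetical sketch below; the paper mentions the Lean Touch asset, but this example uses Unity's built-in touch input, and the parameter names are ours.

using UnityEngine;

public class FloorPlanZoom : MonoBehaviour
{
    [SerializeField] private float zoomSpeed = 0.005f;
    [SerializeField] private float minScale = 0.5f;
    [SerializeField] private float maxScale = 3.0f;

    private void Update()
    {
        if (Input.touchCount != 2) return;

        Touch t0 = Input.GetTouch(0);
        Touch t1 = Input.GetTouch(1);

        // Change in distance between the two touches since the previous frame.
        float prevDist = ((t0.position - t0.deltaPosition) -
                          (t1.position - t1.deltaPosition)).magnitude;
        float currDist = (t0.position - t1.position).magnitude;
        float delta = (currDist - prevDist) * zoomSpeed;

        // Scale the floor plan uniformly between the configured limits.
        float scale = Mathf.Clamp(transform.localScale.x + delta, minScale, maxScale);
        transform.localScale = new Vector3(scale, scale, scale);
    }
}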
As our world becomes increasingly complex, the integration of AR and data-driven solutions is paving the way for safer and more efficient experiences in indoor environments. MARA integrates geospatial data, such as building blueprints, occupancy data, and emergency response infrastructure, to provide comprehensive information for effective emergency navigation. MARA's advanced features also include the ability to zoom in on images, providing users with detailed and magnified views of floor plans, markers, and critical information. By analyzing the spatial data, the application can identify optimal evacuation routes, identify potential hazards, and locate critical resources within the building. In the future, MARA can be used as a training and simulation tool for emergency preparedness. By simulating different emergency scenarios, users can practice navigating the building, following evacuation routes, and understanding emergency protocols, enhancing their readiness and confidence during real emergencies. The current location of the user is determined in MARA through the process of feature extraction using markers. Feature extraction through markers provides a reliable method for tracking and determining the user's location in AR applications. It enables precise positioning and facilitates the overlaying of virtual content onto the real-world environment, enhancing the overall augmented reality experience. In emergencies where GPS and Wi-Fi capabilities may fail or be unreliable, having a MARA that can function without these technologies is essential. The proposed MARA is designed to operate independently of GPS and Wi-Fi, ensuring that it can still provide valuable assistance and support during emergencies.
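Because localization is marker-based rather than GPS- or Wi-Fi-based, the user's position can in principle be resolved with a simple lookup from the scanned marker (room number) to a tagged location on the stored floor plan. The sketch below illustrates this idea only; the registry contents and type names are hypothetical and are not data from the paper.

using System.Collections.Generic;
using UnityEngine;

public struct MarkerLocation
{
    public string Floor;        // e.g., "Level 2"
    public string Wing;         // e.g., "B Wing"
    public Vector2 PlanCoords;  // position of the room on the 2D floor plan
}

public static class MarkerRegistry
{
    // Illustrative entries only; a deployed registry would cover every tagged marker.
    private static readonly Dictionary<string, MarkerLocation> Locations =
        new Dictionary<string, MarkerLocation>
        {
            { "Room 2.203", new MarkerLocation { Floor = "Level 2", Wing = "B Wing",
                                                 PlanCoords = new Vector2(14.5f, 6.2f) } },
        };

    // Returns the location tagged to a scanned marker, or null if the marker is unknown.
    public static MarkerLocation? Resolve(string markerName) =>
        Locations.TryGetValue(markerName, out var loc) ? loc : (MarkerLocation?)null;
}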
5. Visual Communication Implementation
The mobile AR application presented in this paper shows great potential in supporting effective decision-making during emergencies, benefiting both building occupants and emergency responders. Effective navigation during emergencies plays a critical role in ensuring the safety and well-being of individuals. Visual communication methods have the potential to significantly improve emergency navigation by providing clear and intuitive directions. This paper explores the use of visual communication techniques in emergency navigation and discusses their benefits, challenges, and potential applications. By leveraging visual cues and technologies such as AR, interactive maps, and signage systems, visual communication can enhance situational awareness, reduce response time, and facilitate efficient decision-making in emergency scenarios.
Visual communication techniques, such as interactive maps and AR overlays, provide real-time information about escape routes, obstacles, and emergency services, improving situational awareness for individuals navigating through emergency environments. Navigation routes for emergency evacuation are superimposed on the floor plan and the two nearest exit navigation paths are highlighted; this helps users in effective decision-making in case of emergency. When the user scans the room number using their device, the mobile augmented reality application displays the corresponding floor map. Upon clicking the Emergency Navigation button, the application shows an updated floor plan that includes recommended and alternative navigation paths. These paths guide the user from their current location to the nearest exit in case of an emergency.
Figure 5 illustrates this floor plan with the navigation paths. Whenever the user scans a marker, the navigation path from the location where the marker was scanned to the nearest exit is updated dynamically on the floor plan. Users can exit the navigation flow by clicking the Back button at the top left of the floor map, as displayed in Fig. 5.
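One plausible way to compute such a route (the specific routing method is not spelled out here) is a breadth-first search over a graph of rooms and corridor junctions derived from the floor plan, returning the path to the closest reachable exit; the alternative route can then be obtained by excluding that exit and searching again. A minimal sketch with hypothetical node names:

using System.Collections.Generic;

public static class EvacuationRouting
{
    // Returns the node sequence from `start` to the closest node in `exits`,
    // or an empty list if no exit is reachable (unweighted breadth-first search).
    public static List<string> NearestExitPath(
        Dictionary<string, List<string>> graph, string start, HashSet<string> exits)
    {
        var previous = new Dictionary<string, string> { [start] = null };
        var queue = new Queue<string>();
        queue.Enqueue(start);

        while (queue.Count > 0)
        {
            string node = queue.Dequeue();
            if (exits.Contains(node))
            {
                // Walk the predecessor chain back to the start to recover the path.
                var path = new List<string>();
                for (string n = node; n != null; n = previous[n]) path.Add(n);
                path.Reverse();
                return path;
            }
            if (!graph.TryGetValue(node, out var neighbors)) continue;
            foreach (string next in neighbors)
            {
                if (previous.ContainsKey(next)) continue;
                previous[next] = node;
                queue.Enqueue(next);
            }
        }
        return new List<string>();
    }
}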
Figure 5. Floor map with navigation path for evacuation.
6. Evaluation and Results
The MARA presented in this paper shows great potential in supporting effective decision-making during emergencies.
6.1 Study Overview
A limited user study with 10 graduate students was conducted to assess the usability and effectiveness of MARA using the System Usability Scale (SUS) framework. All users, whether or not they had prior experience with game engines, were instructed to install the MARA application and use it.
6.2 Method
After informed consent, a demographics questionnaire was administered to understand participants' experience with mobile applications. MARA was installed on multiple devices: iPhone 12 Mini, iPhone 14, Samsung Galaxy S23 Ultra, Samsung Galaxy S22, and a Samsung S8 tablet running Android 12.0. Participants were given a short orientation and step-by-step instructions on how to use the system. After completing the activity, participants were asked to fill out a questionnaire about their experience of using the application.
6.3 Experimental Setup
At the beginning of each session, participants were shown a demonstration of how to use MARA on an iPhone, an Android tablet, an Android phone, and an iPad. Then, the users were given MARA and asked to use the application to evacuate the building from the same initial location in the campus building. The experimental setup used the building's existing permanent features, such as room numbers and signage, as markers, as seen in Figure 6. These markers were solely used to interact with the system. When users position their mobile phones or tablets in front of these markers, the camera detects the markers, and the application generates and overlays the building's floor plan on the device's screen. All the participants first held a 2D paper floor plan (the traditional method), then a mobile phone (Android and iOS), and then a tablet (Android and iOS) with MARA installed on it. When either the tablet or the mobile phone detected the markers, the participants were able to see the corresponding floor plans and all the toggle buttons for interacting with the floor plan.
Figure 6. View of the floor plan in a tablet triggered through a marker (room number) in the building.
6.4 User Activity
The users' assigned task was to scan markers with the device camera so that the floor map was projected on the screen. Users could then visually explore and navigate the building's layout, enhancing their understanding of the space and facilitating efficient wayfinding. Users could click the buttons on the map to find the exits, shelter areas, and navigation routes in the building from the current location, as shown in Fig. 3. As part of this activity, users tried to leave the building using the emergency navigation path shown in Fig. 5 and judged whether it was the best path. They were then given a satisfaction questionnaire about the overall experience.
6.5 Data Collection and Analysis
The questionnaire used in the study consisted of standard five-point Likert-scale items with an interval from 1 to 5. For usability and satisfaction with the systems, we used the System Usability Scale (SUS) framework. SUS was created by John Brooke [38] in 1986; it allows for the evaluation of a wide variety of products and services, including hardware, software, mobile devices, and websites. SUS has generally been seen as providing a high-level subjective view of usability and is thus often used for comparing usability between systems. The questions on user experience with MARA are displayed in Table I. For performance analysis, we measured and annotated the time spent performing a wide variety of tasks using each device (e.g., iPhone, Android phone, tablet, iPad) and the experience of participants with the application.
Table I. The questions used in the SUS user study.
Question | Average
(1) I would use MARA frequently | 4.45
(2) I found MARA unnecessarily complex | 2.09
(3) I thought MARA was easy to use | 4.27
(4) I would need technical support to use MARA | 2.72
(5) Expected outcome would occur on button click in MARA | 3.81
(6) The arrows guided me to the appropriate exit from my current location | 4.36
(7) The floor plan was generated when the marker is scanned | 4.36
(8) I was able to zoom in and out when viewing a floor plan | 4.36
(9) Most people would learn to use MARA very quickly | 4.54
(10) Felt confident using MARA for instructional, educational, and navigational purposes | 4.54
(11) Device effectiveness of MARA compared with a 2D map | (choose one)
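For reference, per-item responses such as those in Table I are combined into an overall SUS score using Brooke's standard rule: each odd-numbered item contributes (response - 1), each even-numbered item contributes (5 - response), and the sum is multiplied by 2.5 to give a score from 0 to 100. The sketch below shows that standard computation; it assumes ten 1-5 responses ordered as in Brooke's original scale and is not a claim about how the averages in Table I were aggregated.

public static class SusScore
{
    // Standard SUS scoring (Brooke, 1996): odd items contribute (response - 1),
    // even items contribute (5 - response); the sum is multiplied by 2.5 (range 0-100).
    public static double Compute(int[] responses)  // exactly ten responses, each 1-5
    {
        if (responses == null || responses.Length != 10)
            throw new System.ArgumentException("SUS requires exactly 10 responses.");

        int sum = 0;
        for (int i = 0; i < 10; i++)
        {
            // 0-based even indices are the odd-numbered (positively worded) items.
            sum += (i % 2 == 0) ? responses[i] - 1 : 5 - responses[i];
        }
        return sum * 2.5;
    }
}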
6.6 Results
In the user study, the participants were asked to complete a questionnaire about their perceptions of the usability and effectiveness of MARA on both tablet and phone at the end of the study. The study consisted of 40% male participants and 60% female participants. Participants were asked a series of questions about their engagement, familiarity, and functionality of MARA.
The results of the user study with error bars representing the standard deviation of the data can be seen in Figure 7. A higher standard deviation means that the data is more spread out, while a lower standard deviation means that the data is more concentrated. Participants were asked to rate their answers on a Likert scale ranging from 1 (“Strongly Disagree”) to 5 (“Strongly Agree”). It was found that 75% of users had above-average and extensive experience using mobile applications. Based on the results in Table I, the high average for the question “I would use MARA frequently” indicates that users are inclined to use MARA frequently, indicating that the system has successfully generated user interest and perceived value. The high average rating of the question “I thought MARA was easy to use” indicates that users generally found MARA to be easy to use. This is a positive finding, as ease of use is essential for a system’s usability. The average rating of the question “I would need technical support to use MARA” suggests that users may occasionally require technical support to use MARA. While this rating is not extremely low, it indicates room for improvement in terms of providing clearer instructions or better self-explanatory features to reduce the need for support.
Figure 7. Questionnaire results for user study.
Users provided a high average rating for “The floor plan was generated when the marker is scanned” indicating that the floor plan generation upon scanning the marker was successful. This implies that the system’s marker recognition and floor plan rendering features are functioning as expected. The high average rating for “Most people would learn to use MARA very quickly” suggests that users believe most people would be able to learn how to use MARA quickly. This indicates that the system has good learnability, making it accessible to a broad range of users. The high average rating for the question “Felt confident using MARA for instructional, educational, and navigational purposes” suggests that users had a strong sense of confidence when using MARA for instructional, educational, and navigational purposes.
Figure 8 shows that users generally found the arrow guidance in MARA effective in helping them navigate and find appropriate exits. This indicates that the arrow guidance within MARA was successful in assisting users with navigation in emergency situations and helping them find the appropriate exits from their current location. By successfully directing users to the appropriate exits, the arrow guidance feature in MARA demonstrates good wayfinding functionality. Wayfinding refers to the user's ability to navigate through a system or environment with ease and accuracy, and this indicates that the system instills trust and reliability in users. 90% of users felt confident using MARA for instructional, educational, and navigational purposes, with most people feeling confident or very confident. Fig. 8 shows that 92% of iPhone users felt confident using MARA on an iPhone, compared to 90% of those using MARA on an Android tablet, iPad, or Android phone. However, it is important to note that some people did not feel as confident because they found the interface difficult to use, did not find the information they were looking for in MARA, or felt that the information was not presented clearly and concisely.
Figure 8. User responses on using MARA for navigational purposes.
The results in Figure 9, the responses to question 11 in Table I, focus on how efficient MARA is for navigational purposes when compared to 2D maps (the traditional method of evacuation). Fig. 9 shows that the majority of users (60%) preferred phones (Android and iOS) over tablets (Android and iOS) and 2D maps. Only 10% of users preferred to use the traditional paper-based method for evacuation. The likely reason for the higher preference for smartphones is that they are more portable, more convenient, and offer a wider range of features than tablets or 2D maps. Overall, the analysis of the SUS ratings for MARA indicates positive usability scores, with users generally perceiving the system as easy to use, having good wayfinding functionality, and instilling confidence.
Figure 9. Device suitability and effectiveness of MARA (Android tablet, iPad, Android phone, iPhone).
7. Conclusions
This paper presents a location-aware MARA for multilevel spaces that incorporates situational awareness, spatial analysis, and visual communication of emergency information through AR. MARA was built for tablets and mobile phones. We have demonstrated how our proposed MARA can provide spatially contextualized AR visualizations that promote spatial knowledge acquisition and support cognitive mapping. We have introduced a series of AR visualizations that are designed to enhance spatial perception and situational awareness of multilevel spaces through situated AR evacuation displays. Our work demonstrates how AR tools can support improved emergency preparedness communication. These AR visualizations are developed to educate and prepare at-risk populations before the occurrence of a hazardous event. This study also underscores how sophisticated and important strategic planning for emergency evacuation is.
2D evacuation maps are often mounted throughout buildings. A 2D floor plan provides a building outline and is considered the traditional aid for evacuation. Moreover, 2D evacuation maps require users to memorize the evacuation path and recall it during evacuation. Even if the 2D evacuation maps are saved on smartphone devices, they are static floor plans and lack interactivity. Our proposed MARA enables building occupants to evacuate multilevel buildings during emergencies by enhancing situational awareness and promoting spatial knowledge and cognitive mapping to a larger degree than 2D visualizations. MARA is deployed on Android and iOS smartphones and tablets and includes image markers that are used to generate a 3D floor plan visualization when a device camera is pointed toward a relevant marker. The results from the user study indicated that MARA was useful in helping people evacuate the building and can be used as a substitute for traditional paper-based evacuation plans. The 2D evacuation plans displayed at key places in the building are also used as markers in our current MARA to make the plans more interactive.
The proposed MARA is currently created for one campus building. Every building has its own floor plan and room-number signage, so deploying MARA to other scenarios would be difficult: every building is different and requires a customized MARA. However, a generalized framework can be created in the future, in which a user takes a picture of a room-number sign (as a marker) and tags it on the floor plan. Future work can involve scaling up the project to include a learning phase for a new building. The learning phase might involve a user carrying a smartphone (or tablet or HoloLens), capturing the permanent features in the building (such as room numbers and signage) as markers, and tagging them to their locations on the floor plan. The next time the user scans the same marker, MARA will retrieve the floor plan along with the user's current location. We hope that our project serves as a starting point for the continued use of AR technology in emergency response and inspires more researchers worldwide to contribute to this work. Future work will involve conducting a detailed user study and evaluation after obtaining IRB approval. We will also explore the integration of MARA with HoloLens, incorporating touch and voice functionality.
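As a rough sketch of what such a learning phase could store, the hypothetical registry below tags each captured marker with its building, floor, and floor-plan coordinates and persists the result so that a later scan of the same marker can recover the floor plan and location; none of these type or field names come from the paper.

using System.Collections.Generic;
using System.IO;
using UnityEngine;

[System.Serializable]
public class TaggedMarker
{
    public string markerName;   // e.g., the room-number text on the sign
    public string building;
    public string floor;
    public Vector2 planCoords;  // where the sign sits on the 2D floor plan
}

[System.Serializable]
public class MarkerDatabase
{
    public List<TaggedMarker> markers = new List<TaggedMarker>();

    // Called during the learning phase when a user tags a newly captured marker.
    public void Tag(TaggedMarker m) => markers.Add(m);

    // Persist and reload the registry as JSON so later scans can resolve locations.
    public void Save(string path) => File.WriteAllText(path, JsonUtility.ToJson(this, true));

    public static MarkerDatabase Load(string path) =>
        File.Exists(path) ? JsonUtility.FromJson<MarkerDatabase>(File.ReadAllText(path))
                          : new MarkerDatabase();
}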
Acknowledgment
This work is funded in part by NSF Award 2321539. The authors would also like to acknowledge the support of NSF Award 2319752, NSF Award 2321574, and Sub Award No. NSF00123-08 for NSF Award 2118285. The authors would like to thank Mr. Praneeth Koppolu, who was involved in the initial development of MARA in Unity 3D.
References
1. Sharma, S., "Mobile augmented reality system for emergency response," Proc. 21st IEEE/ACIS Int'l. Conf. on Software Engineering, Management and Applications (SERA 2023) (IEEE, Piscataway, NJ, 2023); doi:10.1109/SERA57763.2023.10197820
2. Sharma, S. and Engel, D., "Mobile augmented reality system for object detection, alert, and safety," IS&T Electronic Imaging: The Engineering Reality of Virtual Reality 2023 (IS&T, Springfield, VA, 2023); doi:10.2352/EI.2023.35.12.ERVR-218
3. Mannuru, N. R., Kanumuru, M., and Sharma, S., "Mobile AR application for navigation and emergency response," Proc. IEEE Int'l. Conf. on Computational Science and Computational Intelligence (IEEE-CSCI-RTMC) (IEEE, Piscataway, NJ, 2022)
4. Sharma, S., Stigall, J., and Bodempudi, S. T., "Situational awareness-based augmented reality instructional (ARI) module for building evacuation," Proc. 27th IEEE Conf. on Virtual Reality and 3D User Interfaces, Training XR Workshop, pp. 70–78 (IEEE, Piscataway, NJ, 2020); doi:10.1109/VRW50115.2020.00020
5. Stigall, J. and Sharma, S., "Evaluation of mobile augmented reality application for building evacuation," Proc. ISCA 28th Int'l. Conf. on Software Engineering and Data Engineering (SEDE 2019), Vol. 64, pp. 109–118 (EasyChair, Windsor, CO, 2019); doi:10.29007/7jch
6. Stigall, J. and Sharma, S., "Mobile augmented reality application for building evacuation using intelligent signs," ISCA 26th Int'l. Conf. on Software Engineering and Data Engineering (ISCA, San Diego, CA, 2017)
7. Sharma, S. and Jerripothula, S., "An indoor augmented reality mobile application for simulation of building evacuation," Proc. SPIE 9392, 62–70 (2015); doi:10.1117/12.2086390
8. Chen, J., Ruci, A., Sturdivant, E., and Zhu, Z., "ARMSAINTS: an AR-based real-time mobile system for assistive indoor navigation with target segmentation," 2022 IEEE Int'l. Conf. on Advanced Robotics and Its Social Impacts (ARSO), pp. 1–6 (IEEE, Piscataway, NJ, 2022); doi:10.1109/ARSO54254.2022.9802970
9. Feng, J., Beheshti, M., Philipson, M., Ramsaywack, Y., Porfiri, M., and Rizzo, J.-R., "Commute booster: a mobile application for first/last mile and middle mile navigation support for people with blindness and low vision," IEEE J. Translational Eng. Health Med. 11, 523–535 (2023); doi:10.1109/JTEHM.2023.3293450
10. Sneha, A., Teja, V., Mishra, T. K., and Satya Chitra, K. N., "QR code based indoor navigation system for attender robot," EAI Endorsed Trans. Internet Things 6, e3 (2020); doi:10.4108/eai.13-7-2018.165519
11. Sato, F., "Indoor navigation system based on augmented reality markers," Int'l. Conf. Innovative Mobile and Internet Services in Ubiquitous Computing, pp. 266–274 (Springer, Cham, 2017); doi:10.1007/978-3-319-61542-4_25
12. Sharma, S., Bodempudi, S. T., Scribner, D., Grynovicki, J., and Grazaitis, P., "Emergency response using HoloLens for building evacuation," Lecture Notes in Computer Science, Vol. 11574, pp. 299–311 (Springer, Cham, 2019); doi:10.1007/978-3-030-21607-8_23
13. Stigall, J., Bodempudi, S. T., Sharma, S., Scribner, D., Grynovicki, J., and Grazaitis, P., "Use of Microsoft HoloLens in indoor evacuation," Int. J. Comput. Their Appl. 26, 3–12 (2019)
14. Dieck, M. C. T. and Jung, T., "A theoretical model of mobile augmented reality acceptance in urban heritage tourism," Curr. Issues Tour. 21, 1–21 (2015); doi:10.1080/13683500.2015.1070801
15. Katz, B. F., Kammoun, S., Parseihian, G., Gutierrez, O., Brilhault, A., Auvray, M., Truillet, P., Denis, M., Thorpe, S., and Jouffrais, C., "NAVIG: augmented reality guidance system for the visually impaired," Virtual Real. 16, 253–269 (2012); doi:10.1007/s10055-012-0213-6
16. Joseph, S. L., Zhang, X., Dryanovski, I., Xiao, J., Yi, C., and Tian, Y., "Semantic indoor navigation with a blind-user oriented augmented reality," Proc. 2013 IEEE Int'l. Conf. on Systems, Man, and Cybernetics (SMC), pp. 3585–3591 (IEEE, Piscataway, NJ, 2013); doi:10.1109/SMC.2013.611
17. Taylor, H. A., Brunyé, T. T., and Taylor, S. T., "Spatial mental representation: implications for navigation system design," Rev. Hum. Factors Ergon. 4, 1–40 (2008); doi:10.1518/155723408X342835
18. Dymon, U. J., "Mapping – the missing link in reducing risk under SARA III," Risk 5, 337 (1994)
19. Teknomo, K. and Fernandez, P., "Simulating optimum egress time," Saf. Sci. 50, 1228–1236 (2012); doi:10.1016/j.ssci.2011.12.025
20. Chen, Y. H., Zick, S. E., and Benjamin, A. R., "A comprehensive cartographic approach to evacuation map creation for Hurricane Ike in Galveston County, Texas," Cartogr. Geogr. Inf. Sci. 40, 1–18 (2015); doi:10.1080/15230406.2015.1014426
21. Sharma, S., "Avatarsim: a multi-agent system for emergency evacuation simulation," J. Comput. Meth. Sci. Eng. 9, S13–S22 (2009). ISSN 1472-7978
22. Sharma, S., Singh, H., and Prakash, A., "Multi-agent modeling and simulation of human behavior in aircraft evacuations," IEEE Trans. Aerosp. Electronic Syst. 44, 1477–1488 (2008); doi:10.1109/TAES.2008.4667723
23. Sharma, S., "Simulation and modeling of group behavior during evacuation," Proc. IEEE Symp. Series on Computational Intelligence, Intelligent Agents, pp. 122–127 (IEEE, Piscataway, NJ, 2009); doi:10.1109/IA.2009.4927509
24. Sharma, S., "Military route planning in battle field simulations for a multi-agent system," J. Comput. Meth. Sci. Eng. 10, S97–S105 (2010). ISSN 1472-7978
25. Sánchez, J. M., Carrera, A., Iglesias, C. A., and Serrano, E. A., "Participatory agent-based simulation for indoor evacuation supported by Google Glass," Sensors (Basel) 16, 1360 (2016); doi:10.3390/s16091360. PMID: 27563911; PMCID: PMC5038638
26. Nakajima, Y., Shiina, H., Yamane, S., Ishida, T., and Yamaki, H., "Disaster evacuation guide: using a massively multi-agent server and GPS mobile phones," Proc. Int'l. Symp. on Applications and the Internet (SAINT 2007) (IEEE, Piscataway, NJ, 2007); doi:10.1109/SAINT.2007.13
27. Ahn, J. and Han, R., "An indoor augmented-reality evacuation system for the smartphone using personalized pedometry," Hum. Centric Comput. Inf. Sci. 2, 1–23 (2012); doi:10.1186/2192-1962-2-18
28. Arai, K., "Wearable health monitoring sensor network and its application to evacuation and rescue information server system for disabled and elderly persons," Science 3, 1633 (2012)
29. Kim, H., Kwon, Y. T., Lim, H. R., Kim, J. H., Kim, Y. S., and Yeo, W. H., "Recent advances in wearable sensors and integrated functional devices for virtual and augmented reality applications," Adv. Funct. Mater. 31, 2005692 (2021); doi:10.1002/adfm.202005692
30. Bowen, S., Yang, J., Huang, Z., and Hui, P., "Offloading guidelines for augmented reality applications on wearable devices," Proc. 23rd ACM Int'l. Conf. on Multimedia (MM '15), pp. 1271–1274 (ACM, New York, NY, 2015); doi:10.1145/2733373.2806402
31. Parush, A., Ahuvia, S., and Erev, I., "Degradation in spatial knowledge acquisition when using automatic navigation systems," Spatial Inf. Theory, Vol. 8, pp. 238–254 (Springer, Cham, 2007); doi:10.1007/978-3-540-74788-8_15
32. Catal, C., Akbulut, A., Tunali, B., Ulug, E., and Ozturk, E., "Evaluation of augmented reality technology for the design of an evacuation training game," Virtual Real. 24, 359–368 (2019); doi:10.1007/s10055-019-00410-z
33. Meng, F. and Zhang, W., "Way-finding during a fire emergency: an experimental study in a virtual environment," Ergonomics 57, 816–827 (2014); doi:10.1080/00140139.2014.904006
34. Arias, S., La Mendola, S., Wahlqvist, J., Rios, O., Nilsson, D., and Ronchi, E., "Virtual reality evacuation experiments on way-finding systems for the future circular collider," Fire Technol. 55, 2319–2340 (2019); doi:10.1007/s10694-019-00868-y
35. Occhialini, M., Bernardini, G., Ferracuti, F., Iarlori, S., D'Orazio, M., and Longhi, S., "Fire exit signs: the use of neurological activity analysis for quantitative evaluations on their perceptiveness in a virtual environment," Fire Saf. J. 82, 63–75 (2016); doi:10.1016/j.firesaf.2016.03.003
36. Sharma, S., Frempong, I. A., Scribner, D., Grynovicki, J., and Grazaitis, P., "Collaborative virtual reality environment for a real-time emergency evacuation of a nightclub disaster," IS&T Electronic Imaging: The Engineering Reality of Virtual Reality 2019, pp. 181-1–181-10 (IS&T, Springfield, VA, 2019); doi:10.2352/ISSN.2470-1173.2019.2.ERVR-181
37. Park, S., Park, S. H., Park, L. W., Park, S., Lee, S., Lee, T., Lee, S. H., Jang, H., Kim, S. M., Chang, H., and Park, S., "Design and implementation of a smart IoT based building and town disaster management system in smart city infrastructure," Appl. Sci. 8, 2239 (2018); doi:10.3390/app8112239
38. Brooke, J., "SUS: a quick and dirty usability scale," in Usability Evaluation in Industry, edited by Jordan, P. W., Thomas, B., Weerdmeester, B. A., and McClelland, I. L., pp. 189–194 (Taylor & Francis, London, 1996)