Work Presented at Electronic Imaging 2024
Volume: 67 | Article ID: 060402
Development and Implementation of an Augmented Reality Thunderstorm Simulation for General Aviation Weather Theory Training
DOI: 10.2352/J.ImagingSci.Technol.2023.67.6.060402 | Published Online: November 2023
Abstract

In 2021, there were 1,157 general aviation (GA) accidents, 210 of them fatal, making GA the deadliest civil aviation category. Research shows that accidents are partially caused by ineffective weather theory training. Current weather training in classrooms relies on 2D materials that students often find difficult to map into a real 3D environment. To address these issues, Augmented Reality (AR) was utilized to provide 3D immersive content while running on commodity devices. However, mobile devices have limitations in rendering, camera tracking, and screen size. These limitations make the implementation of mobile-device-based AR especially challenging for complex visualization of weather phenomena. This paper presents research on how to address the technical challenges of developing and implementing a complex thunderstorm visualization in a marker-based mobile AR application. The development of the system and a technological evaluation of the application's rendering and tracking performance across different devices are presented.

  Cite this article 

Kexin Wang, Jack Miller, Philippe Meister, Michael C. Dorneich, Lori Brown, Geoff Whitehurst, and Eliot Winer, "Development and Implementation of an Augmented Reality Thunderstorm Simulation for General Aviation Weather Theory Training," in Journal of Imaging Science and Technology, 2023, pp. 1-14, https://doi.org/10.2352/J.ImagingSci.Technol.2023.67.6.060402

  Copyright statement 
Copyright © Society for Imaging Science and Technology 2023
 Open access
  Article timeline 
  • Received: July 2023
  • Accepted: October 2023
  • Published: November 2023
1.
Introduction
General Aviation (GA) refers to all civil aviation operations that do not transport passengers or goods for commercial or “for hire” purposes [1]. GA accounts for the most flight hours in civil aviation and is its deadliest segment (civil aviation is regulated by Title 14 Code of Federal Regulations Parts 121, 135, and 91) [2]. In 2021 alone, GA had more than 21.9 million flight hours compared to 15.9 million flight hours for part 121 air carriers (regularly scheduled air carriers such as U.S.-based airlines, regional air carriers, and cargo operators). For every 100,000 GA flight hours there were 5.26 accidents, 0.95 of them fatal, compared to part 121 air carriers’ accident rate of 0.49 with 0 being fatal [2]. Among those accidents, pilot decision making was a decisive factor, with 55%–85% of GA accidents caused, or contributed to, by pilot error [3].
A pilot’s decision-making is especially important in dangerous weather conditions like thunderstorms. During 1996–2014, 71% of thunderstorm-related accidents resulted in fatalities, which was significantly higher than non-thunderstorm-related accidents (23%) [4]. The high percentage of fatal thunderstorm-related accidents highlights the need to improve pilots’ decision-making skills when facing adverse weather conditions [5]. Efforts to help student pilots make correct decisions when facing severe weather conditions include required training programs on weather theory knowledge [6]. Weather theory knowledge provides pilots with a background of weather principles, how they affect flight safety, and how to make proper decisions [7]. Without sufficient knowledge of weather theory, pilots might not be able to make safe decisions when facing adverse weather conditions [8, 9].
Student or novice pilots can learn weather theory knowledge from a Federal Aviation Administration (FAA) certificated pilot school, either in-person or at home [6]. Different training programs focus on different aspects of weather, which leads to inconsistency in pilots’ knowledge levels [10, 11]. In addition to the inconsistency in training programs, the educational materials provided to student pilots are exclusively in 2D formats such as textbooks, images, and videos [12]. These 2D materials cannot effectively communicate all the complexities of weather phenomena and require students to create their own 3D mental models, leading to misunderstanding of important concepts [13, 14]. Though flight simulators and other expensive visualization equipment can simulate weather phenomena, most of them use cockpit views and focus on practical training instead of aeronautical knowledge instruction. Traditional materials also lack hypothetical scenarios that may be encountered in flight or clear instructions on different topics in aviation weather [9]. To overcome these issues and provide effective weather theory training, three criteria were proposed for educational materials: (1) allow students who do not have expensive equipment to easily visualize complex weather phenomena with hypothetical scenarios, (2) provide easily accessible equipment to teach 3D weather concepts, and (3) complement traditional classroom and 2D flight instruction materials. Due to the wide range of GA training programs, ensuring the availability of 3D visualization equipment to students poses a considerable challenge. As a result, it is important for students to be able to access the 3D training content through their existing devices.
Advanced flight simulators to visualize complex 3D weather phenomena are available but fail to address the proposed criteria. Most of the high-fidelity training simulators are aircraft specific and dedicated to commercial and military pilots [15–17]. In the past 10–15 years, extended reality (XR) has been used in aviation training to further reduce training costs and broaden training system availability [18, 19]. For the purposes of this work, XR will be used as an umbrella term for technologies that immerse users into different degrees of virtual environments (VE), including Virtual Reality (VR), AR, and Mixed Reality (MR) [20]. VR immerses users into a completely virtual environment [21], typically using a Head-Mounted Display (HMD) (also referred to as a Head Worn Display) or a CAVE™ system. For the purposes of this paper, the term HMD will be used when referring to a display on a user’s head. AR presents 3D virtual content overlaid onto real-world environments [22]. AR allows a user to view virtual content and 2D material using either mobile devices, such as smartphones and tablets (see Figure 1), or AR HMDs such as the Microsoft HoloLens (see Figure 2) [23–25]. HMD-based VR and AR training have limitations such as requiring an external computer to power the device or a limited area of use due to constraints on tracking [19].
Figure 1.
Video-based AR.
Figure 2.
See-through AR with Microsoft HoloLens [24].
A critical decision in this work was to determine a suitable platform for the 3D training content. Several potential platforms were considered. First, a VR system can use an HMD powered by a mobile device, by standalone computing in the headset, or by a tether to a more powerful external computer. Any VR HMD provides a fully immersive experience and can remove a user’s situational awareness of their physical environment. So, a student in a VR HMD may not be able to use a physical textbook while in a VE, requiring the simulation to include all teaching materials (i.e., no textbook). Alternatively, using any kind of VR HMD alongside traditional teaching materials would require a user to repeatedly, and awkwardly, put on and take off the system. A standalone computing HMD, or one powered by an external computer, can provide sufficient rendering power. This power comes at an additional cost of $300–$3,000 per user. A phone-based VR HMD reduces overall system cost but has significantly decreased rendering capabilities. AR can present virtual content using a device that students most likely already own (e.g., a smartphone) and meets the three criteria to improve weather theory training listed above. AR on a mobile device was a favorable choice for this work due to several advantages. First, AR does not require every screen pixel to show virtual content, so real-time rendering on a mobile device can be accomplished. Additionally, AR allows for seamless integration with traditional materials, making it easier to work with existing content. Furthermore, AR can deliver reliable tracking and spatial navigation by utilizing a simple printed image marker. Lastly, AR is widely used in many fields where training personnel on complex spatial relationships is necessary, such as learning 3D geometry [26]. After considering all these issues from the literature search, AR on a mobile device using a marker-based algorithm was chosen for this project. In this work, AR on a mobile device is referred to as mobile AR.
While marker-based mobile AR technology was chosen for this work, a mobile device does not have the same graphics capability as an HMD or high-fidelity flight simulator. These hardware limitations have reduced many XR-based training materials to simple content such as videos, static 3D models, and 2D images. In addition, most of the implementations do not have complex interaction to further enhance training [27–31]. Mobile devices can only render a limited number of polygons due to their processing power and battery capacity [32], and because of the relatively small screen size, weather models might not fully fit on the screen alongside a corresponding user interface (UI). In addition, touch-screen interfaces limit interactions for 3D VEs: a user cannot input depth information, as most touch screens only provide 2D position information. Finally, AR on mobile devices is further limited to one-handed use, since the other hand is usually occupied holding the device [33, 34].
2.
Objective
To use AR in weather theory training, a new solution is needed to render complex 3D weather phenomena with limited processing power, an effective UI, and the ability to compensate for non-ideal tracking conditions. The research presented in this paper addresses the technical challenges of using mobile AR in weather theory training for GA student pilots. The goal is to achieve high-quality rendering and real-time stable tracking of thunderstorm educational modules on a mobile device to ensure an immersive and effective user experience. This application serves as an easily accessible 3D training tool for initial student pilots, assisting their comprehension of abstract thunderstorm concepts from 2D materials.
To achieve this objective, a particle system was created to form a thunderstorm cell with a life-like cloud appearance. Specialized shaders, viewing perspectives, and an image target were developed to improve weather theory training, lower the rendering requirements for a mobile device, and maintain stable tracking of virtual content. The work also conducted a technological evaluation of the application’s rendering performance and tracking accuracy across all compatible devices.
Although this paper focuses on the technical achievements of this work, it is worth noting that a series of user studies was conducted to assess the usability and educational benefits of the system as well. While not the focus of this paper, they are briefly summarized here (for full details see Meister et al. [35, 36]). Evaluations determined that students improved their factual and visual knowledge in a statistically significant manner, with high levels of motivation. A preliminary evaluation of the application was conducted with three subject matter experts in weather and aviation, and three students, to assess whether the AR thunderstorm visualization could communicate weather theory and whether the interfaces were usable for learning and task completion (for full details, see Meister et al. [35]). Students’ knowledge increased after using the visualization to explore the dynamics of the thunderstorm. Subject matter experts felt the learning experiences appropriately communicated thunderstorm theory in ways that supported instruction. The AR interfaces were rated as usable for learning interactions and produced low levels of workload. In a follow-up study, 18 student pilots (17 male, 1 female), or pilots with fewer than 250 total flight hours, were asked to complete the AR weather-related learning activities on a smartphone (for full details, see Meister et al. [36]). The learning outcomes were measured by pre- and post-tests concerning factual and visual knowledge and participants’ completion time. There was a statistically significant increase in student factual knowledge (from 71% to 91%), even though students already had a high level of incoming knowledge. Visual knowledge also increased in a statistically significant manner, from 55% to 90%. To evaluate user experience with the AR application, post-trial surveys were also given to evaluate the application’s usability, user motivation, and overall experience. Participants reported positive learning experiences with high motivation, reasonable task load and completion time, and excellent usability, as rated by the system usability scale. Together these studies demonstrate that the AR thunderstorm model, and associated learning scenarios, resulted in an effective application that enhanced the learning outcomes of students and taught aviation weather in a way that met the expectations of aviation and weather instructors.
3.
Background
A literature review elucidated current research efforts on weather theory training for GA pilots and mobile AR in training and education.
3.1
Weather Training for GA Pilots
Pilots must acquire weather theory knowledge and pass an FAA-approved assessment [37]. However, training methods that provide weather theory knowledge are inconsistent, leaving pilots with a level of education that is not sufficient for safe decision making in the cockpit [5, 6, 38].
In a study performed by the FAA on how training affects GA pilots’ ability to make in-flight decisions, 57 GA pilots were put into a low-visibility visual flight rules scenario with an approaching thunderstorm traversing their flight path [38]. The pilots’ behaviors were categorized as: (1) “tactical”: did not maintain a safe distance (20 nautical miles) from the thunderstorm, and (2) “strategic”: maintained a safe distance from the thunderstorm. Half of each group received additional training on how to safely avoid thunderstorms and then all pilots were asked to go through the scenario again. For pilots who received training, the study showed that 66% of trained tactical pilots changed their unsafe behaviors in one training iteration to maintain a safe distance from the thunderstorm. The average distance trained tactical pilots maintained when avoiding the thunderstorm increased from 10.2 nautical miles (SD = 4.0) to 31.3 miles (SD = 18.2). Pilots who did not receive training did not show a statistically significant difference in their behavior. This study indicates the crucial impact of weather theory knowledge on pilots’ decision-making.
Assessments of weather training for GA pilots do not provide objective metrics that can be used to improve weather theory training methods. In one study, 95 weather knowledge questions were developed to evaluate GA pilots’ weather knowledge and their interpretation of weather phenomena [5]. These questions were presented to 204 GA and student pilots. The mean score of less than 60% for all participants highlights the need for improved weather training for current GA pilots and student pilots.
Research also found inconsistency in current aviation training programs. Guinn et al. [11] studied 22 Aviation Accreditation Board International (AABI) accredited professional flight baccalaureate degree programs. Results of the study revealed that most of the programs focused on the interpretation of weather reports, and only 60% of them mentioned teaching flight hazards. Even when flight hazards were provided in course descriptions, whether instruction included a theoretical understanding of weather hazards or simply focused on interpreting weather reports was unknown.
3.2
High Fidelity Flight Simulators versus Extended Reality Simulators
Aviation simulators have a long history of effective use in pilot training. Early aviation training tools included computer-generated images presented on a spherical screen to create an immersive flight simulation [39]. This early virtual reality system had many limitations, such as large physical size and high operational costs. Hexapod platforms were then developed in flight simulators for civilian aircraft and have been in use for over 50 years to simulate flight motion [40, 41]. At the beginning of the 20th century, ground simulators were developed to simulate movement, banking, and other flight movements [42]. In recent years, the Dynamic Flight Simulator (DFS) was developed to fully train military pilots with minimal to no supplemental training in a real aircraft [16]; it simulated motion and gravity changes during conditions that GA pilots would rarely encounter.
High-fidelity flight simulators are expensive to operate and rarely accessible for GA pilots. Thus, XR technologies have been used in the last 10–15 years to provide lower cost training platforms. In 2011, the Scalab Virtual Reality Simulator was developed using a VR HMD and data gloves for helicopter flight training [18]. Scalab reduced the cost of flight simulation but focused on helicopter flight operations in the cockpit as opposed to weather theory knowledge and was very complex for student pilots, or instructors, to download, compile, and run. In 2019, the U.S. Air Force created the Pilot Training Next (PTN) model to assess and engage in initial pilot training [14]. While assessing PTN, researchers encountered issues such as VR training devices not being widely available to participants.
Another limitation of using VR is an instructor’s reduced ability to maintain situational awareness. VR HMDs only support one person, so an instructor’s ability to monitor the progress of a trainee and provide useful feedback is severely limited. For example, the Virtual Instructor Pilot Exercise Referee (VIPER®) is another program to assist flight training [43]. To assess the effectiveness of this tool, a study was conducted on 52 student naval aviators (SNAs) training with VIPER®, 64 SNAs training with other VR conditions with guidance, 3,014 SNAs training with VR conditions without guidance, and 836 SNAs training with non-VR conditions. Results showed that students working with the guidance-free VR conditions had statistically significantly improved grades compared to those using the non-VR condition. In addition, students using the guidance VR condition had statistically significantly better grades than their peers using the guidance-free VR conditions. Lastly, students using VIPER® had statistically significantly better grades than those using the guidance VR conditions.
3.3
Marker-based Mobile Augmented Reality
Marker-based mobile AR refers to using mobile devices as both a computational and visualization device to present computer-generated non-physical information in a live, real-world environment with a physical marker as a reference point in the physical world [44–46]. When using a marker-based mobile AR application, users point the device’s camera at a pre-defined image target and the virtual content is placed relative to the target’s position in the virtual environment.
Studies in education have shown that relying on students’ mental models to learn complex 3D phenomena is not as effective as using 3D visual models [47–49]. AR can present 3D objects and animations visually and can also aid students in developing deeper spatial understanding, motivation, and long-term memory retention of learned skills [50–53]. In addition, AR on a mobile device is more accessible, lower in cost, and portable. Although marker-based mobile AR applications have been successfully developed and implemented in various training areas with positive outcomes, many challenges still exist, such as rendering resources, tracking performance, and screen space.
Numerous studies have been conducted to develop content for marker-based mobile AR with the aim of facilitating training and education. However, a common limitation observed in these studies is the insufficient complexity of the content. pARabola is a mobile AR system that utilizes markers to facilitate learning quadratic equations [29]. pARabola enables users to input various quadratic functions, with the particle system updating in real-time to reflect the changes made. In the evaluation, some participants commented on the size of text information and buttons, highlighting the significance of interface design in AR applications. In addition, AR applications such as MagicBook, created by Küçük, Kapakin, and Göktaş [30], as well as one for teaching molecular structures [49], improved students’ achievement and reduced cognitive load, but only used 3D video animation, static 3D models, images, and sounds.
The literature review highlighted several technical challenges related to marker-based mobile AR in training and education. One major challenge is the limited complexity of content due to rendering power constraints. Rendering power on mobile devices is limited by their battery capacity. A high-end desktop Graphics Processing Unit (GPU) consumes 500 watts of power, whereas the total battery capacity of a high-end mobile device such as an iPad Pro is only 28.65 watt-hours [54, 55]. As a result, GPUs on mobile devices consume less than 10 watts of power, resulting in lower rendering capacity compared to desktop GPUs [54]. For mobile AR specifically, since tracking algorithms run constantly and consume significant power, less is available for rendering [56]. Various approaches have been taken to reduce mobile applications’ rendering overhead, such as remote rendering, image rendering, pre-rendering, and optimization of graphics code [57–59]. FlashBack is an example of software that uses pre-rendering to reduce overhead in mobile VR [59]. However, FlashBack was implemented on an HP Pavilion Mini, which is not suitable for phone-based AR applications [60].
Additionally, tracking issues pose a significant challenge in marker-based mobile AR. Tracking is often done using feature points of a target image or model in the camera’s field of view to match pre-defined schemes and connect 2D locations in videos captured by the camera with 3D locations in virtual space [61]. Stable tracking requires the camera to pick up a sufficient number of feature points in an image target. Failure to do so results in virtual content shaking or not being rendered in its intended location. In low-light conditions, the Charge-Coupled Device (CCD) on a mobile device can introduce noise into the representation of an image target, further complicating tracking [56]. To improve tracking accuracy and robustness, an image target should contain a large number of feature points and have significant color contrast.
The UI of a mobile application is another critical element for an effective AR experience. Since users observe and interact with virtual content through a device’s screen, efficiently utilized screen space is critical to mobile AR application UI design. For example, in a research study on user experience with a mobile AR application, the UI was mentioned by 25 out of 90 respondents [62].
Overall, the literature review revealed inconsistent, and at times insufficient, weather theory training for pilots. The review demonstrated that marker-based mobile AR offers capabilities that will not only improve learning outcomes, but also make training more accessible to pilots of all abilities. However, the literature also showed that current marker-based mobile AR projects were limited to simple geometry with little or no interaction due to technical challenges in rendering power and tracking performance.
4.
Methodology
A thunderstorm model with learning activities and scenarios was developed using Unity and implemented in a mobile AR application for both iOS and Android devices [63]. Experienced flight instructors provided expertise throughout the development process including the thunderstorm characteristics and corresponding scenario activities [35]. This feedback was crucial to create educational materials that would be effective for GA student pilots. The following sections discuss the development and implementation process of creating the thunderstorms training content in a marker-based mobile AR application and how to overcome the technical challenges identified earlier.
4.1
Model Creation
The model simulated a single 60-minute thunderstorm cell cycle moving through the defined stages of development, maturation, and dissipation. The model contained different hazards and weather information, including temperature, icing, wind, and precipitation.
5.
Volumetric Appearance of Thunderstorm Clouds
Thunderstorms change dynamically, with clouds forming and dissipating throughout their lifecycle. To simulate such activity in a mobile application, where computational resources are limited, a particle system was used. Unity offers a particle system Application Programming Interface (API) that allows small images to be emitted to simulate fuzzy visual effects [60]. Particle systems are a common way to create “volumetric” visual effects with reduced rendering overhead [32, 60]. Volumetric refers to visual effects that have movement on the surface and within an object, such as fire, smoke, and clouds. A thunderstorm in real life transforms through its lifecycle stages (i.e., developing to dissipation) with a classic anvil shape (i.e., nonuniform in the direction of environmental winds). Controlling each particle within the model to achieve such a visual effect increases processing time and complexity and is not ideal for mobile AR. To address this issue, the thunderstorm was formed in segments, with each segment containing one particle system to form a part of the overall anvil cloud shape through its lifecycle (see Figure 3). Particles are emitted at the beginning of the simulation with random orientations and sizes. The particles’ properties (e.g., transparency and color) are controlled by a custom developed graphics shader. A graphics shader is computer code that outputs correct levels of opacity, shading, and color, and even geometry movement (i.e., vertex procedures), during the rendering of a 3D scene. The developed shader takes as inputs the height of different locations on the cloud, the presented weather information, and the current simulation time. Based on these factors, the shader assigns a specific color and transparency to each location on the cloud, enabling a gradual distribution. By combining this shader with the segment approach, the thunderstorm can achieve a nonuniform shape while still exhibiting overall color and transparency changes during the cycle. However, no new particles were created, nor were existing particles in the cell modified, at run-time, keeping rendering resource use low. The shader code and the segment approach significantly reduced the time to create unique cloud formations (e.g., duplicating cloud segments or the entire cloud cell) and ran in real-time on a mobile device.
Figure 3.
Cloud model developed by particle system.
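To make the segment approach concrete, the following is a minimal Unity C# sketch (not the authors' code) of how one cloud segment could be set up: all particles are emitted once at the start with random size and orientation, and a normalized simulation time is pushed to the custom shader, which handles the per-pixel color and transparency. The shader property names (_SimTime, _CloudBase, _CloudTop) and the particle count are illustrative assumptions.

```csharp
using UnityEngine;

// Sketch of one cloud "segment": particles are emitted once and never
// modified afterwards; the custom shader does the per-pixel work.
[RequireComponent(typeof(ParticleSystem))]
public class CloudSegment : MonoBehaviour
{
    [SerializeField] float cloudBase = 0.5f;  // local height where this segment starts (assumed)
    [SerializeField] float cloudTop = 3.0f;   // local height where this segment ends (assumed)

    ParticleSystemRenderer psRenderer;

    void Start()
    {
        var ps = GetComponent<ParticleSystem>();
        psRenderer = GetComponent<ParticleSystemRenderer>();

        // No continuous emission: everything is emitted once at start-up.
        var emission = ps.emission;
        emission.rateOverTime = 0f;

        var main = ps.main;
        main.loop = false;
        main.startLifetime = 1e6f;  // particles persist for the whole 60-minute cycle
        main.startSize = new ParticleSystem.MinMaxCurve(0.5f, 1.5f);
        main.startRotation = new ParticleSystem.MinMaxCurve(0f, 2f * Mathf.PI);
        ps.Emit(200);               // illustrative particle count per segment

        // Tell the shader where this segment sits so it can shade by height.
        psRenderer.material.SetFloat("_CloudBase", cloudBase);
        psRenderer.material.SetFloat("_CloudTop", cloudTop);
    }

    // Called by a simulation controller; normalizedTime runs 0..1 over the cell cycle.
    public void SetSimulationTime(float normalizedTime)
    {
        psRenderer.material.SetFloat("_SimTime", normalizedTime);
    }
}
```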
6.
Indication of Dynamic Weather Information
Based on the Pilot’s Handbook of Aeronautical Knowledge [12], wind, icing, and temperature conditions in a thunderstorm play a critical role in flight safety and are crucial for pilots to understand. Representing these various components of information required easy-to-understand visual cues that were distinguishable when viewed alone or all together.
Color is a powerful tool to display numeric information and was used in the model. The custom graphics shader developed allows gradual color changes throughout the cloud. This shader was used to present cloud density (see Figure 4), icing conditions (see Figure 5), and temperature information throughout the cloud (see Figure 6). Because the colors for cloud density, icing, and temperature would overlap with each other, icing also has labels on the cloud as secondary markings, and each piece of information can be viewed separately. It is important to note that the choice of colors was arbitrary and does not have special meaning in aviation education. Vibrant colors with high contrast to one another were used, with the approval of expert flight instructors, to allow easy distinction of the concepts they wanted conveyed to students.
Figure 4.
Thunderstorm cloud with different densities at different stages.
Figure 5.
Icing through cloud.
Figure 6.
Temperature through cloud.
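As a rough illustration of this kind of gradual color mapping, the sketch below uses a Unity Gradient to turn a normalized weather value (e.g., temperature or icing severity) into a color and hands it to the cloud material, approximating in C# what the custom shader does per pixel. The gradient keys and the "_TintColor" property name are assumptions, not values from the application.

```csharp
using UnityEngine;

// Illustrative mapping of a scalar weather quantity to a colour gradient.
public class WeatherColorMap : MonoBehaviour
{
    [SerializeField] Gradient temperatureGradient;  // e.g. blue (cold) to red (warm), set in the Inspector
    [SerializeField] Renderer cloudRenderer;        // renderer of a cloud segment

    // value01 is the weather quantity normalised to the 0..1 range.
    public void ApplyTemperature(float value01)
    {
        Color c = temperatureGradient.Evaluate(Mathf.Clamp01(value01));
        cloudRenderer.material.SetColor("_TintColor", c);
    }
}
```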
However, not all weather information could be presented with color. Unlike the gradually changing data inside the thunderstorm cloud, some corresponding weather phenomena required a more realistic appearance. Particle systems were again used to produce these phenomena. For precipitation, each particle was represented as a raindrop, or hail particle, with a gravity force attached to make it descend from the cloud. Rain and hail were differentiated with unique textures and densities (see Figure 7). A microburst, another characteristic of a thunderstorm critical to pilot training, also had to be developed. A microburst contains localized downdraft air, a dust ring, rolling clouds, and an area of intense rain called a “virga” [64]. To simulate microbursts, particles were set to a continuous cycle of movement and dissipation to represent the accompanying rolling clouds, virga, and dust ring (see Figure 8). A variety of materials were implemented to attain a realistic appearance for these phenomena. Since each particle system has its own animation, it was controlled through scripting to ensure synchronization with the overall thunderstorm simulation, which was under the user’s control.
Figure 7.
Thunderstorm simulated precipitation.
Figure 8.
Microburst wind simulation.
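A minimal sketch, assuming one ParticleSystem per phenomenon, of how precipitation could be configured and kept in sync with the user-controlled timeline. The material fields, emission rates, and the SetActivePhase entry point are illustrative placeholders rather than the application's actual values.

```csharp
using UnityEngine;

// Gravity-driven precipitation: rain and hail differ only in texture (material)
// and emission density. A simulation controller toggles the effect on and off
// so it stays synchronized with the overall thunderstorm animation.
[RequireComponent(typeof(ParticleSystem))]
public class PrecipitationEmitter : MonoBehaviour
{
    public enum Kind { Rain, Hail }

    [SerializeField] Kind kind = Kind.Rain;
    [SerializeField] Material rainMaterial;  // assumed raindrop texture/material
    [SerializeField] Material hailMaterial;  // assumed hail texture/material

    ParticleSystem ps;

    void Start()
    {
        ps = GetComponent<ParticleSystem>();

        var main = ps.main;
        main.startSpeed = 0f;        // gravity alone drives the descent
        main.gravityModifier = 1f;   // particles fall out of the cloud base

        var emission = ps.emission;
        emission.rateOverTime = (kind == Kind.Rain) ? 400f : 60f;  // denser rain, sparser hail

        GetComponent<ParticleSystemRenderer>().material =
            (kind == Kind.Rain) ? rainMaterial : hailMaterial;
    }

    // Called by the simulation controller as the cell enters or leaves a precipitating stage.
    public void SetActivePhase(bool precipitating)
    {
        if (precipitating && !ps.isPlaying) ps.Play();
        else if (!precipitating && ps.isPlaying) ps.Stop();
    }
}
```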
Finally, variation in wind direction and speed is one of the most crucial components for pilots to understand in real-time flight decisions. Unlike other information in and around a thunderstorm, which may be more contained to a specific region, wind constantly changes in and around the entire cloud. For example, when a microburst occurs, the wind moves toward the ground and then circles back to the thunderstorm after hitting the ground, as shown in Fig. 8. Additionally, since the wind movement cannot be represented with a lifelike appearance, a simplified 3D arrow geometry was utilized. The model presented in this paper used Unity’s physics engine to compute the movement of these 3D arrows according to a predefined wind pattern advised by expert flight instructors [65]. However, a limitation of Unity’s physics engine was the absence of a timeline-like animation playback feature. To address this limitation, a workaround was implemented whereby the arrow movement remained constant throughout the simulation, and the activation of the arrows was adjusted to indicate the progression of the wind. With this approach, the wind movement driven by the Unity physics engine could be integrated and controlled within the overall thunderstorm simulation.
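The activation workaround could look something like the sketch below: the arrows' physics-driven motion runs continuously, and a controller simply switches which groups of arrows are visible as the simulation time advances. The grouping by lifecycle stage and the stage thresholds are illustrative assumptions.

```csharp
using UnityEngine;

// Toggles wind-arrow visibility over the thunderstorm cycle instead of
// replaying physics, since the physics-driven motion itself never changes.
public class WindArrowController : MonoBehaviour
{
    [SerializeField] GameObject[] developingStageArrows;
    [SerializeField] GameObject[] matureStageArrows;      // e.g. microburst downdraft arrows
    [SerializeField] GameObject[] dissipatingStageArrows;

    // normalizedTime runs 0..1 over the 60-minute thunderstorm cycle.
    public void UpdateArrows(float normalizedTime)
    {
        SetGroupActive(developingStageArrows, normalizedTime < 0.33f);
        SetGroupActive(matureStageArrows, normalizedTime >= 0.33f && normalizedTime < 0.75f);
        SetGroupActive(dissipatingStageArrows, normalizedTime >= 0.75f);
    }

    static void SetGroupActive(GameObject[] arrows, bool active)
    {
        foreach (var arrow in arrows)
            if (arrow.activeSelf != active) arrow.SetActive(active);
    }
}
```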
6.1
Implementation of Different Viewing Perspectives
A mature thunderstorm cell can be up to 16 km wide, whereas some activities inside it only happen in a relatively small area such as a microburst, which is usually less than 4 km in diameter [64, 66]. To allow users to view the thunderstorm, and all accompanying information, two viewing positions were created with different distances from the virtual camera. The first viewing mode was cloud-centric, which allows users to view the model statically as the ground moves underneath (see Figure 9). This effectively puts the user’s viewpoint moving at the same speed as the thunderstorm so that the cloud can be set to a larger scale. The second mode was terrain-centric, where the user’s viewpoint is fixed, and the cloud can be viewed moving across the terrain (see Figure 10). The cloud is further from the virtual camera so that it effectively puts the user in an all-encompassing view to observe the movement of the cloud and still be able to see additional elements that may interact with the cloud model.
Figure 9.
Cloud-centric mode.
Figure 10.
Terrain-centric mode.
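One way to realize the two modes, sketched below under the assumption that the effect is achieved by moving either the terrain or the cloud transform: in cloud-centric mode the terrain scrolls beneath a stationary (larger-scale) cloud, while in terrain-centric mode the cloud translates across a fixed terrain. The speed value and field names are placeholders.

```csharp
using UnityEngine;

// Switches between cloud-centric and terrain-centric viewing by choosing
// which transform is animated relative to the image-target anchor.
public class ViewingModeController : MonoBehaviour
{
    public enum Mode { CloudCentric, TerrainCentric }

    [SerializeField] Transform cloud;
    [SerializeField] Transform terrain;
    [SerializeField] Vector3 stormVelocity = new Vector3(0.05f, 0f, 0f);  // illustrative speed

    public Mode mode = Mode.CloudCentric;

    void Update()
    {
        if (mode == Mode.CloudCentric)
            terrain.position -= stormVelocity * Time.deltaTime;  // ground moves under the cloud
        else
            cloud.position += stormVelocity * Time.deltaTime;    // cloud moves over the terrain
    }
}
```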
6.2
Implementation of Learning and Scenario-based Activities
2D training materials do not effectively show hypothetical scenarios that pilots might encounter in real-life operations or visualizations of accident scenarios [9]. Title 14 Code of Federal Regulations 61.105(b) also emphasizes that pilots need the ability to recognize critical weather situations and make appropriate decisions [6]. However, the developed model can only contain a limited amount of 3D content to keep computational resources optimal for a mobile AR application. To address this issue, four learning activities were developed using the thunderstorm model in different ways. These activities were developed with continual input from experienced flight instructors. The four activities include two learning activities: (1) takeoff under a microburst and (2) thunderstorm avoidance; and two scenario-based activities: (1) a takeoff scenario and (2) an approach scenario. Each activity is supplemented with questions, instructions, or different flight paths to show how the thunderstorm model, and accompanying weather phenomena, affect pilot decision-making.
The thunderstorm model contains a detailed simulation of a microburst during its three stages: (1) formation, (2) impact, and (3) dissipation. Trying to take off under a microburst is very dangerous and can easily cause an aircraft to lose control at low altitude and crash. An activity was specifically designed to emphasize a microburst’s deadly impact on an aircraft during takeoff. This activity contains an intended and an actual flight path to demonstrate the dramatic effect a microburst can have during takeoff (Figure 11). To maintain low rendering overhead, simple 3D geometries were implemented to present the situation, supplemented with textual descriptions as well as airspeed to emphasize the danger a microburst poses to an aircraft. The flight paths were designed based on expert flight instructors’ input.
Figure 11.
Takeoff under a microburst learning activity.
Three learning and scenario activities were developed with the thunderstorm model to emphasize the FAA’s regulation on staying 20 nautical miles away from a thunderstorm [3]. The take-off scenario (see Figure 12) and approach scenario (see Figure 13) simulated a real-world accident where an aircraft tried to approach and take off from an airport as a thunderstorm was nearby. The movement of the thunderstorm was based on the accident reports and flight instructors’ expertise. After reaching 20 nautical miles from the thunderstorm, the scenario animation pauses and asks students to decide whether to continue or divert to an alternate airport. Different consequences were given based on the decisions students made. In the take-off scenario, the thunderstorm was formed by multiple cells with lightning and precipitation. Developing each thunderstorm cell individually would increase the computational resources needed, so a custom shader was developed to scale the cloud to different sizes without changing the particle systems or segments.
Figure 12.
Take-off scenario.
Figure 13.
Approach scenario.
6.3
Implementation of Marker-based Mobile AR
To integrate the thunderstorm model into an easy-to-use mobile AR application, a user interface and a high-quality image target were needed for a complete experience.
Interfacing with an application in mobile AR presents unique challenges, from viewing content to interactions. The interface for the thunderstorm model only contains essential interactions and information, such as a play/pause button, a play bar, a drop-down for playback speed, and a label for the stage of the cloud (see Figure 14). A menu toggle button hides all the sub-buttons that toggle different features on and off, saving viewing space when they are not needed. All the informative labels were also set to appear only as needed, and the text was kept as concise as possible. Additionally, a grid was implemented around the terrain to provide users with an accurate way to gauge the actual sizes of different thunderstorm characteristics (see Figure 15).
Figure 14.
Interface menu.
Figure 15.
Grid and 3D labels.
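A minimal sketch of the space-saving menu toggle, assuming Unity's built-in UI: one button shows or hides the panel holding the feature sub-buttons (icing, temperature, wind arrows, grid, and so on). The field names are illustrative, not taken from the application.

```csharp
using UnityEngine;
using UnityEngine.UI;

// One toggle button reveals or hides all feature sub-buttons to preserve screen space.
public class FeatureMenuToggle : MonoBehaviour
{
    [SerializeField] Button menuToggleButton;
    [SerializeField] GameObject subButtonPanel;  // parent of the feature toggle buttons

    void Start()
    {
        subButtonPanel.SetActive(false);  // hidden until the user opens the menu
        menuToggleButton.onClick.AddListener(
            () => subButtonPanel.SetActive(!subButtonPanel.activeSelf));
    }
}
```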
Another technical challenge identified was tracking performance [33, 34]. Tracking quality is heavily reliant on the image target, the device’s camera position, and its capability of getting a clear image. Besides the different viewing perspectives, which ensure a user will not move away from the image target while viewing the AR content, an image target with a sufficient number of feature points, high contrast, and limited repetitive patterns was also designed to maximize tracking stability. As previously mentioned, the selection of mobile AR was driven by the goal of providing a cost-effective solution for visualizing a 3D thunderstorm simulation. As a result, an image marker was chosen as a suitable method to enable tracking for the AR application. Though newer HMDs offer advanced tracking capabilities, their cost constrains their usage.
The application was compatible with both iOS and Android systems across smartphones and tablets. To preserve rendering power when running the application on different mobile devices, the target framerate of the application was set to the device’s screen refresh rate, because a device’s screen refresh rate is the highest frame rate any application running on it can achieve, and a higher frame rate requires more rendering power.
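In Unity, this policy can be expressed in a few lines, as in the sketch below; Screen.currentResolution.refreshRate is the classic integer API (newer Unity versions expose refreshRateRatio instead), and whether the application set the value exactly this way is an assumption.

```csharp
using UnityEngine;

// Cap the render loop at the display's refresh rate so the app never spends
// battery rendering frames the screen cannot show.
public class FrameRateSetup : MonoBehaviour
{
    void Awake()
    {
        Application.targetFrameRate = Screen.currentResolution.refreshRate;
    }
}
```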
7.
Evaluation
7.1
Technical Evaluation
The presented thunderstorm model provides realistic and immersive visualization and is integrated into a mobile AR application, allowing it to work alongside traditional materials and be easily accessed by students. The mobile AR application was designed to run on both smartphones and tablets, so the performance of the application across these devices was assessed. The application was run on three testing devices: (1) a OnePlus 8 Pro with Android 13 [67] (referred to as Android in the following section), (2) an iPhone 13 with iOS 16 [68] (referred to as iPhone in the following section), and (3) a 2nd generation iPad Pro with iOS 16 [69] (referred to as iPad Pro in the following section). For the Android device, the screen refresh rate can be set to either 120 Hz or 60 Hz, the iPhone screen refresh rate can be set only to 60 Hz, and the iPad screen refresh rate can be set to either 120 Hz or 60 Hz. To control variables, the screen refresh rate for all devices, and the target frame rate for the mobile AR application, was set to 60 Hz.
The rendering performance was measured by the application’s frame rate. A 60-minute thunderstorm cell cycle was simulated at 180x speed in the terrain-centric mode with all the features turned on (i.e., precipitation, icing, temperature, wind arrows, labels, and grid) (see Figure 16). The device was held 4 ft from the image target. During the animation, the frame rate in frames per second (FPS) was recorded every 10 seconds, with ten trials conducted for each device. Table I shows the average frame rate for each trial and device. The Android device nearly met the target of 60 FPS with an average frame rate of 58.83 FPS. The iPhone met the target of 60 FPS with an average frame rate of 60.00 FPS, and the iPad Pro met the target of 60 FPS with an average frame rate of 60.00 FPS. These results confirm that the application provided a sufficient frame rate across all devices, and operating systems, when viewing the thunderstorm moving across the terrain in front of the image target.
Figure 16.
Render performance evaluation setup.
Table I.
Frame rate result of render performance evaluation in frames per second (FPS).
Device | Trial 1 | Trial 2 | Trial 3 | Trial 4 | Trial 5 | Trial 6 | Trial 7 | Trial 8 | Trial 9 | Trial 10 | Average
Android (60 fps) | 59.75 | 58.50 | 59.50 | 58.25 | 58.25 | 59.00 | 59.25 | 59.25 | 59.00 | 57.50 | 58.83
iPhone (60 fps) | 60.00 | 60.00 | 60.00 | 60.00 | 60.00 | 60.00 | 60.00 | 60.00 | 60.00 | 60.00 | 60.00
iPad Pro (60 fps) | 60.00 | 60.00 | 60.00 | 60.00 | 60.00 | 60.00 | 60.00 | 60.00 | 60.00 | 60.00 | 60.00
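A minimal sketch, assuming a Unity coroutine, of how frame rate could be sampled at 10-second intervals as in the evaluation above: frames are counted over each interval and the average is logged. This is an assumed re-implementation for illustration, not the authors' measurement code.

```csharp
using System.Collections;
using UnityEngine;

// Logs the average frame rate over successive 10-second windows.
public class FpsSampler : MonoBehaviour
{
    const float SampleInterval = 10f;

    IEnumerator Start()
    {
        while (true)
        {
            int startFrame = Time.frameCount;
            float startTime = Time.unscaledTime;
            yield return new WaitForSecondsRealtime(SampleInterval);
            float fps = (Time.frameCount - startFrame) / (Time.unscaledTime - startTime);
            Debug.Log($"Average FPS over the last {SampleInterval} s: {fps:F2}");
        }
    }
}
```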
While viewing the thunderstorm animation, a user will typically move around the image target to view the model from various angles. In this case, FPS may not be the main factor affecting user experience, as tracking quality plays a more important role. The AR model needs to maintain its relative position in the field of view when the mobile device is moved around. To assess the application’s tracking quality, a second evaluation was performed. The same devices were held and moved around the image target through four observation points, each 4 ft away (see Figure 17). Lighting conditions were controlled by evaluating at the same time of day in a controlled lab setting with limited windows and consistent ceiling-mounted, artificial LED lighting. The model’s animation was stopped at the mature stage and viewed in cloud-centric mode (see Figure 18). When entering each observation point, tracking quality was recorded based on the model’s behavior using the following measures: (1) Accurate Tracking, where the model was statically attached to the image target, (2) Imperfect Tracking, where the model was still attached to the image target but was not perfectly static or aligned, and (3) Loose Tracking, where the model was not attached to the image target and a user needed to point the camera close to the image target to re-calibrate. The assessment of tracking accuracy is qualitative, as it is obvious when inaccurate tracking has occurred. Since the purpose of this system is education, and not a physically accurate weather model, a qualitative assessment was deemed appropriate for tracking. As with the first evaluation, ten trials were conducted for each device. For the Android, iPhone, and iPad Pro, the model achieved Accurate Tracking all 40 times for each device (120 times total). The full results are shown in Table II.
Figure 17.
Tracking quality evaluation observation points.
Figure 18.
Tracking quality evaluation setup.
Table II.
Result of tracking performance evaluation.
Android | Trial 1 | Trial 2 | Trial 3 | Trial 4 | Trial 5 | Trial 6 | Trial 7 | Trial 8 | Trial 9 | Trial 10 | Total
Accurate Tracking | 4 | 4 | 4 | 4 | 4 | 4 | 4 | 4 | 4 | 4 | 40
Imperfect Tracking | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0
Loose Tracking | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0
iPhone | Trial 1 | Trial 2 | Trial 3 | Trial 4 | Trial 5 | Trial 6 | Trial 7 | Trial 8 | Trial 9 | Trial 10 | Total
Accurate Tracking | 4 | 4 | 4 | 4 | 4 | 4 | 4 | 4 | 4 | 4 | 40
Imperfect Tracking | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0
Loose Tracking | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0
iPad Pro | Trial 1 | Trial 2 | Trial 3 | Trial 4 | Trial 5 | Trial 6 | Trial 7 | Trial 8 | Trial 9 | Trial 10 | Total
Accurate Tracking | 4 | 4 | 4 | 4 | 4 | 4 | 4 | 4 | 4 | 4 | 40
Imperfect Tracking | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0
Loose Tracking | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0
It was also important to assess rendering performance as a user moved around, so FPS was again recorded during the tracking trials. The average frame rate for each device at each observation point is shown in Table III. Looking at rendering performance during the tracking experiments, a noticeable drop in FPS was observed when the camera entered the thunderstorm model (i.e., point 3 in the testing locations). An in-depth look at this viewpoint revealed that limitations on occlusion culling were the cause. Occlusion culling is a computer graphics algorithm that disables rendering of an object, or vertices therein, if it is not seen by the camera or is completely occluded by other objects [70]. At point 3, because of the offset between the thunderstorm cloud and the image target, the user’s viewpoint is inside the thunderstorm. All the particles making up the thunderstorm cell were within the view of the virtual camera defining the user’s viewpoint and were rendered, even though they might be behind other particles. This caused high rendering overhead and taxed the capability of the tested mobile devices. The app still ran at a sufficient FPS to allow a user to see the thunderstorm from inside without exceeding the devices’ limitations, but a noticeable visual difference was observed. Future work on this project will refine the rendering algorithm to address this issue.
Table III.
Result of render performance during tracking performance evaluation (FPS).
Device | Point 1 | Point 2 | Point 3 | Point 4
Android (60 fps) | 60.00 | 59.70 | 17.30 | 60.00
iPhone (60 fps) | 60.00 | 60.00 | 23.70 | 60.00
iPad Pro (60 fps) | 60.00 | 60.00 | 23.00 | 60.00
8.
Discussion
To assist student pilots in understanding 2D materials, a thunderstorm simulation was developed and integrated into a mobile AR application. This allowed students to experience 3D learning content on widely available devices. In the literature review, various technical challenges were identified in relevant research, including limitations in rendering power, tracking performance, and interaction difficulties. The development and implementation of the thunderstorm simulation overcame these challenges by using a custom particle system and multiple custom graphics shaders to reduce rendering overhead. Additionally, different viewing perspectives, a well-designed image target, and a simple yet useful interface were developed to ensure stable tracking and improve user interaction. The results from the technical evaluations showed that the application had stable and sufficient rendering and tracking performance on different mobile devices and operating systems. Additionally, prior studies also indicated that the application had the potential to enhance students’ learning outcomes with highly usable and relevant content [35, 36].
The evaluation results showed that a low-cost device could provide a complex visual simulation for education even though it has lower processing capability and power capacity than other simulation platforms. A detailed comparison between different simulation training platforms for GA is presented in Table IV. A mobile device is relatively cheap and has a similar level of resolution to other platforms. On the other hand, the processing hardware on a mobile device is not as powerful as that of other platforms. Compared to a standalone HMD, a mobile device’s processor is not dedicated to XR. Compared to desktop HMDs and high-fidelity flight simulators, a mobile device has a less powerful processor because of its limited power supply. A direct comparison of power consumption between a mobile device and a wall-powered device (e.g., a desktop HMD or high-fidelity flight simulator) is challenging because the source of power is different. A mobile device relies on its own battery to provide power and is measured by its battery capacity, whereas a wall-powered device gets power from a wall outlet and is measured by the power consumed per hour. However, the battery capacity of a mobile device is significantly smaller than an hour of power consumed by a desktop HMD or high-fidelity simulator. As a result, a mobile device possesses a lower level of power consumption and processing capability than a desktop HMD or high-fidelity flight simulator. To highlight the graphical differences between the thunderstorm implementation and other existing mobile AR training or educational software, a comparison was made between FenAR [31] (left in Figure 19), which utilizes 3D models and animations in mobile AR for teaching physics, and the thunderstorm simulation developed in this research (right in Fig. 19). The comparison shows that the thunderstorm content exhibits better visual quality and more complex visual effects. Combined with the evaluation results, the thunderstorm simulation was able to provide a stable 60 FPS thunderstorm simulation with stable AR tracking on low-cost, less powerful devices.
Figure 19.
FenAR [31] versus the thunderstorm model.
Table IV.
Comparison between different simulation platforms.
Platform | Example Devices | Battery Capacity / Power Consumption | Cost | Processing Hardware | Resolution
Standalone HMDs | Meta Quest 3 [71] | Up to 2.2 hours of usage on average (∼40 Wh) | $499.99 | Qualcomm® Snapdragon XR2 Gen 2 | 2064 × 2208 pixels per eye
Desktop HMDs | Vive Pro 2 [72] | Desktop power consumption 500–1200 W [73] | $799.00 | Intel® Core™ i5-4590 or AMD Ryzen 1500 equivalent or greater / NVIDIA® GeForce® GTX 1060 or AMD Radeon RX 480 equivalent or greater | 2448 × 2448 pixels per eye
High-fidelity Flight Simulators | TRC6000 FULL MOTION C172G [74] | 8000–9000 W | Millions [75] | N/A | Seven 4K LCD LED screens
Mobile Devices | iPhone 15 [76, 77] | Up to 20 hours of video playback (∼20 Wh) | $799 | A16 Bionic chip | 2556 × 1179 (460 pixels per inch)
9.
Conclusion
According to reports, over 29% of GA accidents are weather-related. One of the potential causes of those accidents is a lack of fundamental weather theory knowledge by pilots [78]. Traditional GA training on weather is delivered mainly through images, text, or video, which do not provide an immersive and compelling training environment. Mobile AR is an easily accessible visualization tool that can work in conjunction with traditional material, making it a better solution to assist GA weather theory training. However, mobile AR is limited by its rendering and processing power, unstable tracking, and small touch screens. The mobile AR application described in this paper provides a detailed solution that offers an immersive environment at a lower cost than high-fidelity equipment and is more easily accessible than a VR or AR environment requiring an HMD. The model described in this paper represents thunderstorms using a volumetric and resource-efficient model. The model provides various approaches to delivering necessary weather information to overcome the limitations of mobile AR technology. A technical evaluation was performed on the thunderstorm mobile AR application. The evaluation results showed outstanding rendering performance and tracking quality across different devices. User evaluations conducted on the thunderstorm model also reiterate that this application can enhance students’ learning outcomes.
9.1
Future Work
The evaluation found an FPS drop when moving into the thunderstorm cloud, and future work is planned to minimize this frame drop. To increase the amount of AR content and expand the content's type, size, and space, other tracking references such as area targets could be implemented to use the surrounding environment as registration points [79]. This feature could enable collaborative functionality between teacher and student to further assist pilots' learning.
Funding
The work described in this paper was funded by the PEGASAS Center of the Federal Aviation Administration Air Transportation Center of Excellence for General Aviation Research, Cooperative Agreement 12-C-GA-ISU.
Disclaimer
Statements and opinions expressed in this text do not necessarily reflect the position or the policy of the United States Government, and no official endorsement should be inferred.
References
1ICAO, “Review of the Classification and Definitions Used for Civil Aviation Activities,” 2009
2NTSB, “US Civil Aviation Accident Statistics.” 2022. Accessed:  Jan. 09, 2023.  [Online].  Available:  https://www.ntsb.gov/safety/Pages/research.aspx
3BoydD. D.In-flight decision-making by general aviation pilots operating in areas of extreme thunderstormsAerosp. Med. Hum. Perform.2017Vol. 88ASMAAlexandria, VA106610721066–7210.3357/AMHP.4932.2017
4BoydD. D.A review of general aviation safety (1984–2017)Aerosp Med Hum Perform2017Vol. 88ASMAAlexandria, VA657664657–6410.3357/AMHP.4862.2017
5BlickensderferB.LanicciJ.GuinnT. A.ThomasR.ThroppJ. E.KingJ.Cruit,J.DeFilippisN.BerendschotK.McSorleyJ.KleberJ.Combined Report: Aviation Weather Knowledge Assessment & General Aviation (GA) Pilots’ Interpretation of Weather ProductsGeneral Aviation Weather Display Interpretation2019Accessed: Oct. 31, 2022. [Online]. Available: https://commons.erau.edu/ga-wx-display-interpretation/13
6A.N.AdministrationR.Office of the Federal Register, “14 CFR 61.105 - Aeronautical knowledgeOffice of the Federal Register, National Archives and Records AdministrationJan. 01, 2012 [Online]. Available https://www.govinfo.gov/app/details/CFR-2012-title14-vol2/CFR-2012-title14-vol2-sec61-105/summary
7Federal Aviation Administration and U.S. Department of TransportationWeather theoryPilot’s Handbook of Aeronautical Knowledge20161261–26[Online]. Available https://www.faa.gov/regulations_policies/handbooks_manuals/aviation/phak/media/14_phak_ch12.pdf
8OrtizY.BlickensderferB.KingJ.Assessment of general aviation cognitive weather tasks: Recommendations for autonomous learning and training in aviation weatherProc. Human Factors and Ergonomics Society Annual Meeting2017Vol. 61Sage PublicationsLos Angeles, CA186118651861–510.1177/1541931213601946
9LanicciJ.GuinnT.KingJ.BlickensderferB.ThomasR.OrtizY.2020A proposed taxonomy for general aviation pilot weather education and trainingJ. Aviation/Aerospace Education Res.2910.15394/jaaer.2020.1815
10WiegmannD. A.TalleurD. A.JohnsonC. M.Redesigning weather training and testing of general aviation pilots by applying traditional curriculum evaluation and advanced simulation-based methodsHuman Factors Division Institute of Aviation2008FAAWashington, DC
11GuinnT.RaderK.Disparities in weather education across professional flight baccalaureate degree programsCollegiate Aviation Review Int.2012Vol. 30UAAMillington, TN112311–2310.22488/okstate.18.100432
12Federal Aviation Administration and U.S. Department of Transportation. Pilot’s Handbook of Aeronautical Knowledge (Federal Aviation Administration and U.S. Department of Transportation, 2016)
13RisukhinV. N.Integration of affordable information technology products into general aviation training and research16th AIAA Aviation Technology, Integration, and Operations Conf.2016AIAAReston, VA1111–1110.2514/6.2016-3917
14PenningtonE.HaferR.NistlerE.SeechT.TossellC.Integration of advanced technology in initial flight training2019 Systems and Information Engineering Design Symposium, SIEDS 20192019IEEEPiscataway, NJ151–510.1109/SIEDS.2019.8735628
15ReismanR.A brief introduction to the art of flight simulationVirtuelle Welten, Ars Electronica1990Linz, Austria159170159–70
16DouradoA. O.MartinC. A.New concept of dynamic flight simulator, Part IAerosp. Sci. Technol.2013Vol. 30ElsevierAmsterdam798279–8210.1016/j.ast.2013.07.005
17RudiD.KieferP.RaubalM.The instructor assistant system (iASSYST) – utilizing eye tracking for commercial aviation training purposesErgonomics2020Vol. 63Taylor & FrancisOxfordshire617961–7910.1080/00140139.2019.1685132
18YavrucukI.KubaliE.TarimciO.A low cost flight simulator using virtual reality toolsIEEE Aerospace and Electronic Systems Magazine2011Vol. 26IEEEPiscataway, NJ101410–410.1109/MAES.2011.5763338
19RossR.SlavinskasD.MazzaconeE.JungT.DaltonJ.US air force weather training platform: Use of virtual reality to reduce training and equipment maintenance costs whilst improving operational efficiency and retention of US air force personnelXR Case Studies: Using Augmented Reality and Virtual Reality Technology in Business2021SpringerCham9110291–10210.1007/978-3-030-72781-9_12
20AlizadehsalehiS.HadaviA.HuangJ. C.2020From BIM to extended reality in AEC industryAutom Constr11610.1016/J.AUTCON.2020.103254
21MilgramP.KishinoF.1994A taxonomy of mixed reality visual displaysIEICE Trans. Information SystemsE77-D132113291321–9
22RollandJ.CakmakciO.2009Head-worn displays: The future through new eyesOpt. Photon. News20202720–710.1364/OPN.20.4.000020
23AzumaR. T.A survey of augmented realityPresence: Teleoperators and Virtual Environments1997Vol. 6355385355–85Accessed: Apr. 27 2022. [Online]. Available http://www.cs.unc.edu/∼azumaW
24MillerJ.HooverM.WinerE.2020Mitigation of the Microsoft HoloLens’ hardware limitations for a controlled product assembly processInt. J. Adv. Manufact. Technol.109174117541741–5410.1007/s00170-020-05768-y
25Microsoft, “Buy HoloLens 2: Find Specs, Features, Capabilities & More – Microsoft Store,” Microsoft. Accessed: Dec. 13, 2022. [Online]. Available https://www.microsoft.com/en-us/d/hololens-2/91pnzzznzwcp
26. E. İbili, M. Çat, D. Resnyansky, S. Şahin, and M. Billinghurst, "An assessment of geometry teaching supported with augmented reality teaching materials to enhance students' 3D geometry thinking skills," Int. J. Math. Educ. Sci. Technol. 51, 224–246 (2020). DOI: 10.1080/0020739X.2019.1583382
27. D. Pérez-López, M. Contero, and M. Alcañiz, "Collaborative development of an augmented reality application for digestive and circulatory systems teaching," 2010 10th IEEE Int'l. Conf. on Advanced Learning Technologies (IEEE, Piscataway, NJ, 2010), pp. 173–175. DOI: 10.1109/ICALT.2010.54
28. J. Bacca, S. Baldiris, R. Fabregat, Kinshuk, and S. Graf, "Mobile augmented reality in vocational education and training," Procedia Comput. Sci. 75, 49–58 (Elsevier, Amsterdam, 2015). DOI: 10.1016/j.procs.2015.12.203
29. R. I. Barraza Castillo, V. G. Cruz Sánchez, and O. O. Vergara Villegas, "A pilot study on the use of mobile augmented reality for interactive experimentation in quadratic equations," Math. Probl. Eng. 2015 (2015). DOI: 10.1155/2015/946034
30. S. Küçük, S. Kapakin, and Y. Göktaş, "Learning anatomy via mobile augmented reality: Effects on achievement and cognitive load," Anat. Sci. Educ. 9, 411–421 (2016). DOI: 10.1002/ase.1603
31. M. Fidan and M. Tuncel, "Integrating augmented reality into problem based learning: The effects on learning achievement and attitude in physics education," Comput. Educ. 142, 103635 (2019). DOI: 10.1016/j.compedu.2019.103635
32. F. Cristina, S. Dapoto, P. Thomas, and P. Pesado, "Performance evaluation of a 3D engine for mobile devices," Communications in Computer and Information Science 790, 155–163 (2018). DOI: 10.1007/978-3-319-75214-3_15
33. N. I. A. M. Nazri and D. R. A. Rambli, "Current limitations and opportunities in mobile augmented reality applications," 2014 Int'l. Conf. on Computer and Information Sciences (ICCOINS 2014) – A Conference of World Engineering, Science and Technology Congress (ESTCON 2014) (IEEE, Piscataway, NJ, 2014), pp. 1–4. DOI: 10.1109/ICCOINS.2014.6868425
34. E. S. Goh, M. S. Sunar, and A. W. Ismail, "3D object manipulation techniques in handheld mobile augmented reality interface: A review," IEEE Access 7, 40581–40601 (2019). DOI: 10.1109/ACCESS.2019.2906394
35. P. Meister, J. Miller, K. Wang, M. C. Dorneich, E. Winer, L. J. Brown, and G. Whitehurst, "Designing three-dimensional augmented reality weather visualizations to enhance general aviation weather education," IEEE Trans. Prof. Commun. 65, 321–336 (2022). DOI: 10.1109/TPC.2022.3155920
36. P. Meister, K. Wang, M. C. Dorneich, E. Winer, L. Brown, and G. Whitehurst, "Augmented reality enhanced thunderstorm learning experiences for general aviation," J. Air Transportation 30, 113–124 (2022). DOI: 10.2514/1.D0308
37. FAA, "Become a Pilot | Federal Aviation Administration." Accessed: Nov. 21, 2022. [Online]. Available: https://www.faa.gov/pilots/become/rec_private
38. J. Ball, U.S. Department of Transportation, Federal Aviation Administration, Office of Aviation, Civil Aerospace Medical Institute, "The Impact of Training on General Aviation Pilots' Ability to Make Strategic Weather-Related Decisions," Feb. 2008.
39. J. M. Rolfe and K. J. Staples, Flight Simulation (Cambridge University Press, Cambridge, 1988).
40. Y. Huang, D. M. Pool, O. Stroosma, Q. P. Chu, and M. Mulder, "A review of control schemes for hydraulic Stewart platform flight simulator motion systems," AIAA Modeling and Simulation Technologies Conf. (AIAA, Reston, VA, 2016). DOI: 10.2514/6.2016-1436
41. D. Stewart, "A platform with six degrees of freedom," Aircraft Engineering and Aerospace Technology 38, 30–35 (1965). DOI: 10.1108/eb034141
42. R. L. Page, "Brief history of flight simulation," SimTecT 2000 Proc. (2000), pp. 1–11. Accessed: May 11, 2021. [Online]. Available: http://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.132.5428&rep=rep1&type=pdf
43. G. Severe-Valsaint, A. Mishler, M. Natali, R. Astwood, T. Seech, and C. McCoy-Fisher, "Training effectiveness evaluation of an adaptive virtual instructor for naval aviation training," 2022. Accessed: Dec. 10, 2022. [Online]. Available: https://apps.dtic.mil/sti/pdfs/AD1170195.pdf
44. T. Höllerer and S. Feiner, "Mobile augmented reality," Telegeoinformatics: Location-Based Computing and Services 21, 221–260 (2004).
45. C. Arth, L. Gruber, R. Grasset, T. Langlotz, A. Mulloni, D. Schmalstieg, and D. Wagner, "The history of mobile augmented reality: Developments in mobile AR over the last almost 50 years," 2015. Accessed: Dec. 11, 2022. [Online]. Available: http://studierstube.org/handheld_ar/
46. D. Chatzopoulos, C. Bermejo, Z. Huang, and P. Hui, "Mobile augmented reality survey: From where we are to where we go," IEEE Access 5, 6917–6950 (2017). DOI: 10.1109/ACCESS.2017.2698164
47. I. Radu, "Why should my students use AR? A comparative review of the educational impacts of augmented-reality," 2012 IEEE Int'l. Symposium on Mixed and Augmented Reality (ISMAR) (IEEE, Piscataway, NJ, 2012), pp. 313–314. DOI: 10.1109/ISMAR.2012.6402590
48. R. Webster, "Declarative knowledge acquisition in immersive virtual learning environments," Interactive Learning Environments 24, 1319–1333 (Taylor & Francis, Oxfordshire, 2015). DOI: 10.1080/10494820.2014.994533
49. F. S. Irwansyah, Y. M. Yusuf, I. Farida, and M. A. Ramdhani, "Augmented reality (AR) technology on the Android operating system in chemistry learning," IOP Conf. Ser.: Mater. Sci. Eng. 288, 012068 (2018). DOI: 10.1088/1757-899x/288/1/012068
50. N. D. Macchiarella and D. A. Vincenzi, "Augmented reality in a learning paradigm for flight and aerospace maintenance training," AIAA/IEEE Digital Avionics Systems Conf. – Proc., Vol. 1 (IEEE, Piscataway, NJ, 2004). DOI: 10.1109/DASC.2004.1391342
51. N. Broy, E. André, and A. Schmidt, "Is stereoscopic 3D a better choice for information representation in the car?," Proc. 4th Int'l. Conf. on Automotive User Interfaces and Interactive Vehicular Applications (AutomotiveUI '12) (Association for Computing Machinery, New York, NY, 2012), pp. 93–100. DOI: 10.1145/2390256.2390270
52. I. Radu, "Augmented reality in education: A meta-review and cross-media analysis," Pers. Ubiquitous Comput. 18, 1533–1543 (2014). DOI: 10.1007/s00779-013-0747-y
53. E. Solak and R. Cakir, "Investigating the role of augmented reality technology in the language classroom," Croatian J. Education: Hrvatski časopis za odgoj i obrazovanje 18, 1067–1085 (2016). DOI: 10.15516/cje.v18i4.1729
54. E. Cuervo, A. Wolman, L. P. Cox, K. Lebeck, A. Razeen, S. Saroiu, and M. Musuvathi, "Kahawai: High-quality mobile gaming using GPU offload," Proc. 13th Ann. Int'l. Conf. on Mobile Systems, Applications, and Services (MobiSys '15) (Association for Computing Machinery, New York, NY, 2015), pp. 121–135. DOI: 10.1145/2742647.2742657
55. "iPad Pro 11-inch (4th generation) – Technical Specifications." Accessed: Mar. 20, 2023. [Online]. Available: https://support.apple.com/kb/SP882?viewlocale=en_US&locale=en_US
56. C. Arth and D. Schmalstieg, "Challenges of large-scale augmented reality on smartphones," ISMAR 2011 Workshop on Enabling Large-Scale Outdoor Mixed Reality and Augmented Reality (2011). [Online]. Available: http://data.icg.tugraz.at/∼dieter/publications/Schmalstieg_224.pdf
57. T. Capin, K. Pulli, and T. Akenine-Möller, "The state of the art in mobile graphics research," IEEE Comput. Graph. Appl. 28, 74–84 (2008). DOI: 10.1109/MCG.2008.83
58. S. Shi, K. Nahrstedt, and R. Campbell, "A real-time remote rendering system for interactive mobile graphics," ACM Trans. Multimedia Comput. Commun. Appl. 8 (2012). DOI: 10.1145/2348816.2348825
59. K. Boos, D. Chu, and E. Cuervo, "FlashBack: Immersive virtual reality on mobile devices via rendering memoization," Proc. 14th Annual Int'l. Conf. on Mobile Systems, Applications, and Services (MobiSys '16) (Association for Computing Machinery, New York, NY, 2016), pp. 291–304. DOI: 10.1145/2906388.2906418
60. F. Nusrat, F. Hassan, H. Zhong, and X. Wang, "How developers optimize virtual reality applications: A study of optimization commits in open source Unity projects," 2021 IEEE/ACM 43rd Int'l. Conf. on Software Engineering (ICSE) (IEEE, Piscataway, NJ, 2021), pp. 473–485. DOI: 10.1109/ICSE43902.2021.00052
61. D. Wagner, G. Reitmayr, A. Mulloni, T. Drummond, and D. Schmalstieg, "Real-time detection and tracking for augmented reality on mobile phones," IEEE Trans. Vis. Comput. Graph. 16, 355–368 (2009). DOI: 10.1109/TVCG.2009.99
62. T. Olsson and M. Salo, "Online user survey on current mobile augmented reality applications," 2011 10th IEEE Int'l. Symposium on Mixed and Augmented Reality (IEEE, Piscataway, NJ, 2011), pp. 75–84. DOI: 10.1109/ISMAR.2011.6092372
63. Unity, "Unity Real-Time Development Platform | 3D, 2D VR & AR Engine." Accessed: May 18, 2021. [Online]. Available: https://unity.com/
64. U.S. Department of Commerce, NOAA National Weather Service, "What is a Microburst?"
65. Unity, "Unity – Manual: Physics." Accessed: Mar. 22, 2023. [Online]. Available: https://docs.unity3d.com/Manual/PhysicsSection.html
66. "Severe Weather 101: Thunderstorm Basics." Accessed: Mar. 21, 2023. [Online]. Available: https://www.nssl.noaa.gov/education/svrwx101/thunderstorms/
67. "OnePlus 8 Pro Specs – OnePlus (Global)." Accessed: Jul. 06, 2023. [Online]. Available: https://www.oneplus.com/global/8-pro/specs
68. "iPhone 13 – Technical Specifications." Accessed: Jul. 06, 2023. [Online]. Available: https://support.apple.com/kb/SP851?locale=en_US
69. "iPad Pro 11-inch (2nd generation) – Technical Specifications." Accessed: Jul. 09, 2023. [Online]. Available: https://support.apple.com/kb/SP814?locale=en_US
70. Unity, "Unity – Manual: Occlusion culling." Accessed: Jan. 16, 2023. [Online]. Available: https://docs.unity3d.com/Manual/OcclusionCulling.html
71. "Meta Quest 3: New Mixed Reality VR Headset – Shop Now | Meta Store." Accessed: Oct. 22, 2023. [Online]. Available: https://www.meta.com/quest/quest-3/
72. "VIVE Pro 2 Specs | VIVE United States." Accessed: Oct. 22, 2023. [Online]. Available: https://www.vive.com/us/product/vive-pro2/specs/
73. "PC Power Supplies | Newegg.com." Accessed: Oct. 22, 2023. [Online]. Available: https://www.newegg.com/Power-Supplies/SubCategory/ID-58
74. "TRC 6000 FULL MOTION C172G – TRC Simulators." Accessed: Oct. 22, 2023. [Online]. Available: https://www.trcsimulators.com/trc-472fgm-full-motion-system/
76. "iPhone 15 and iPhone 15 Plus – Apple." Accessed: Oct. 22, 2023. [Online]. Available: https://www.apple.com/iphone-15/
78. T. Long, "Analysis of weather-related accident and incident data associated with section 14 CFR Part 91 operations," Collegiate Aviation Review 40, 25–39 (2022).
79. Vuforia, "Getting Started with Vuforia Engine in Unity | VuforiaLibrary." Accessed: May 18, 2021. [Online]. Available: https://library.vuforia.com/articles/Training/getting-started-with-vuforia-in-unity.html