Keywords: Augmented reality; alarm system; building evacuation; Extended Reality; emergent augmented reality platforms; Gesture; Human perception; Haptic; immersive VR; mobile application development; object detection; Sampled imaging system; System MTF; Sensor Fusion; training simulation; Text quality; Tangible; virtual reality UI and UX; VR and AR in education, learning, gaming, art; virtual and augmented reality systems; Virtual Reality; XR

Pages A12-1 - A12-10, © 2023, Society for Imaging Science and Technology
Volume 35
Issue 12
Abstract

Virtual and augmented reality systems continue to evolve. In addition to research, the trend toward content building continues, and practitioners find that technologies and disciplines must be tailored and integrated for specific visualization and interactive applications. This conference serves as a forum where advances and practical advice toward both creative activity and scientific investigation are presented and discussed; research results can be presented and applications demonstrated.

Digital Library: EI
Published Online: January 2023

Pages 213-1 - 213-6, © 2023, Society for Imaging Science and Technology
Volume 35
Issue 12
Abstract

Pixels per degree (PPD) alone is not a reliable predictor of high-resolution experience in VR and AR, because "high-resolution experience" depends not only on PPD but also on display fill factor, pixel arrangement, graphics rendering, and other factors. This complicates architecture decisions and design comparisons. Is there a simple way to capture all the contributors and match them to user experience? In this paper, we present a system-level model, the system MTF, to predict perceptual quality across the key VR/AR dimensions: pixel shape (display), pixels per degree (display), fill factor (display), optical blur (optics), and image processing (graphics pipeline). The metric is defined in much the same way as the traditional MTF for imaging systems: by examining the image formed from a point source and then taking the Fourier transform of the response function, with some special mathematical treatment. One application is presented on perceived text quality, where two weight functions, depending on text orientation and frequency, are incorporated into the model. A perceptual study of text quality was performed to validate the system MTF model.
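
The abstract does not give the exact formulation or the special mathematical treatment, but the basic recipe it describes (form the point response of the display-plus-optics chain, then take its Fourier transform) can be sketched in a few lines. In the illustration below, the square pixel aperture, the Gaussian stand-in for optical blur, the parameter values, and the system_mtf helper are all assumptions made for the sketch, not the authors' model.

```python
# Minimal sketch (not the paper's exact model): estimate a 1-D "system MTF"
# by building a point response from a square pixel aperture (fill factor),
# blurring it with a Gaussian that stands in for optical blur, and taking
# the magnitude of its Fourier transform. Frequencies are in cycles/degree.
import numpy as np

def system_mtf(ppd=40.0, fill_factor=0.8, optical_blur_arcmin=1.0, n=4096):
    """Return (frequency in cycles/degree, normalized MTF)."""
    # Angular sampling grid (degrees), finely oversampled around one pixel.
    span_deg = 64.0 / ppd
    x = np.linspace(-span_deg / 2, span_deg / 2, n, endpoint=False)
    dx = x[1] - x[0]

    # Pixel aperture: a box whose width is sqrt(fill_factor) of the pixel pitch.
    pitch = 1.0 / ppd
    aperture = (np.abs(x) <= 0.5 * pitch * np.sqrt(fill_factor)).astype(float)

    # Optical blur: Gaussian PSF with the given sigma (arcminutes -> degrees).
    sigma = optical_blur_arcmin / 60.0
    gaussian = np.exp(-0.5 * (x / sigma) ** 2)

    # Point response of the whole chain = aperture convolved with optical blur.
    psf = np.convolve(aperture, gaussian, mode="same")
    psf /= psf.sum()

    # MTF = |Fourier transform| of the point response, normalized at DC.
    mtf = np.abs(np.fft.rfft(psf))
    mtf /= mtf[0]
    freq = np.fft.rfftfreq(n, d=dx)
    return freq, mtf

freq, mtf = system_mtf(ppd=40, fill_factor=0.8, optical_blur_arcmin=1.0)
print(f"MTF at 10 cycles/degree: {np.interp(10.0, freq, mtf):.2f}")
```

Varying fill_factor, ppd, or the blur width in this sketch shows how the same PPD can yield quite different system MTFs, which is the point the abstract makes about PPD alone being insufficient.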

Digital Library: EI
Published Online: January 2023

Pages 214-1 - 214-5, © 2023, Society for Imaging Science and Technology
Volume 35
Issue 12
Abstract

Many extended reality systems use controllers, e.g., near-infrared motion trackers or magnetic coil-based hand-tracking devices, for users to interact with virtual objects. These interfaces lack tangible sensation, especially during walking, running, crawling, and manipulating an object. Special devices such as the Tesla suit and omnidirectional treadmills can improve tangible interaction; however, they are bulky, expensive, and not flexible enough for broader applications. In this study, we developed a configurable multi-modal sensor fusion interface for extended reality applications. The system includes wearable IMU motion sensors, gait classification, gesture tracking, and data streaming interfaces to AR/VR systems. This system has several advantages. First, it is reconfigurable for multiple dynamic tangible interactions such as walking, running, crawling, and manipulating an actual physical object without any controllers. Second, it fuses multi-modal data from the IMU with sensors on the AR/VR headset, such as floor detection. Third, it is more affordable than many existing solutions. We have prototyped tangible extended reality in several applications, including medical helicopter preflight walk-around checks, firefighter search-and-rescue training, and tool tracking for airway intubation training with haptic interaction with a physical mannequin.
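
The abstract does not detail the fusion pipeline or the streaming protocol, so the sketch below only illustrates the general shape of such a loop: read wearable IMU samples, classify gait over a short sliding window, and stream the labels to an AR/VR host. The window length, thresholds, JSON fields, and UDP endpoint are invented for the example, not taken from the paper.

```python
# Hedged sketch of the kind of pipeline the abstract describes: take IMU
# samples, classify gait from a short sliding window, and stream the result
# as JSON over UDP to an AR/VR host. The window size, thresholds, message
# format, and port are illustrative assumptions, not the paper's design.
import json
import socket
import statistics
from collections import deque

VR_HOST = ("127.0.0.1", 9000)   # hypothetical AR/VR endpoint
WINDOW = 50                      # ~0.5 s at a 100 Hz IMU rate

def classify_gait(accel_magnitudes):
    """Very coarse gait label from acceleration-magnitude spread (in g)."""
    spread = statistics.pstdev(accel_magnitudes)
    if spread < 0.05:
        return "standing"
    if spread < 0.35:
        return "walking"
    return "running"

def stream_imu(samples):
    """samples: iterable of (ax, ay, az) tuples in g from the wearable IMU."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    window = deque(maxlen=WINDOW)
    for ax, ay, az in samples:
        window.append((ax * ax + ay * ay + az * az) ** 0.5)
        if len(window) == WINDOW:
            event = {"type": "gait", "label": classify_gait(window)}
            sock.sendto(json.dumps(event).encode(), VR_HOST)

# Example: a stationary user followed by a walking burst (synthetic data).
still = [(0.0, 0.0, 1.0)] * 100
walking = [(0.0, 0.0, 1.0 + 0.25 * (-1) ** i) for i in range(100)]
stream_imu(still + walking)
```

A real system would replace the threshold rule with a trained gait classifier and merge headset-side signals (e.g., floor detection) into the same event stream, as the abstract describes.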

Digital Library: EI
Published Online: January 2023

Pages 217-1 - 217-5, © 2023, Society for Imaging Science and Technology
Volume 35
Issue 12
Abstract

There is a need to prepare for emergencies such as active shooter events. Emergency response training drills and exercises are necessary to train for such events, since we cannot predict when emergencies will occur. There has been progress in understanding human behavior, unpredictability, human motion synthesis, crowd dynamics, and their relationships with active shooter events, but challenges remain. This paper presents an immersive security personnel training module for active shooter events in an indoor building. We have created an experimental platform for conducting active shooter drills that gives a fully immersive feel of the situation and allows one to perform virtual evacuation drills. The security personnel training module incorporates four sub-modules: 1) a situational assessment module, 2) an individual officer intervention module, 3) a team response module, and 4) a rescue task force module. We have developed an immersive virtual reality training module for active shooter events using an Oculus headset for course-of-action planning, visualization, and situational awareness, as shown in Fig. 1. The immersive security personnel training module aims to gather information about the emergency situation inside the building. The dispatched officer verifies the active shooter situation in the building. The security personnel should find a safe zone in the building and secure the people in that area. They should also determine the number and location of persons in possible jeopardy. Upon completion of the initial assessment, the first security personnel shall advise communications and request resources as deemed necessary. This allows them to determine whether to take immediate action, alone or with another officer, or to wait until additional resources are available. After successfully gathering the information, the personnel need to relay it to their officer through a communication device.
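
The abstract walks through an initial-assessment workflow: verify the threat, secure a safe zone, estimate the number of persons at risk, report to communications, and then decide between immediate intervention and waiting for additional resources. As a rough, hypothetical illustration only (the class, thresholds, and action labels below are not taken from the paper's implementation), that decision flow could be encoded like this:

```python
# Hypothetical sketch of the initial-assessment decision flow the abstract
# walks through; the fields, thresholds, and action labels are assumptions
# for illustration, not the authors' implementation.
from dataclasses import dataclass

@dataclass
class Assessment:
    shooter_confirmed: bool      # dispatched officer verified the threat
    safe_zone_secured: bool      # a safe area for civilians was established
    persons_at_risk: int         # estimated people in possible jeopardy
    backup_minutes_away: float   # ETA of additional resources

def recommended_action(a: Assessment) -> str:
    """Return the next step after the initial assessment is reported."""
    if not a.shooter_confirmed:
        return "continue situational assessment and report to communications"
    if not a.safe_zone_secured:
        return "establish a safe zone and secure civilians"
    # With the threat confirmed and civilians secured, weigh immediate
    # intervention against waiting for the team / rescue task force.
    if a.persons_at_risk > 0 and a.backup_minutes_away > 2.0:
        return "individual officer intervention"
    return "hold for team response / rescue task force"

print(recommended_action(Assessment(True, True, persons_at_risk=5,
                                    backup_minutes_away=6.0)))
```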

Digital Library: EI
Published Online: January 2023

Pages 218-1 - 218-5, © 2023, Society for Imaging Science and Technology
Volume 35
Issue 12
Abstract

The usefulness of mobile devices has increased greatly in recent years, allowing users to perform more tasks in daily life. Mobile devices and applications provide many benefits for users; perhaps the most significant is increased access to point-of-use tools, navigation, and alert systems. This paper presents a prototype of a cross-platform mobile augmented reality (AR) system whose core purpose is to keep the campus community secure and connected. The mobile AR system consists of four core functionalities: an events system, a policing system, a directory system, and a notification system. The events system keeps the community up to date on current and upcoming events on campus. The policing system keeps the campus resources that help the community stay secure within arm's reach. The directory system serves as a one-stop shop for campus resources, ensuring that staff, faculty, and students have a convenient and efficient means of accessing pertinent information on campus departments. The mobile AR system also includes an integrated guided navigation feature that gives users directions to various destinations on campus, such as buildings and departments. The application assists students and visitors in navigating the campus efficiently and can send alerts and notifications in case of emergencies, allowing campus police to respond in a quick and timely manner. The mobile AR system was built with the Unity game engine and the Vuforia Engine for object detection and classification, and the Google Maps API was integrated to provide GPS-based, location-aware services. Our contribution lies in our approach to creating a user-specific, customizable navigation and alert system to improve the safety of users at their workplace. Specifically, the paper describes the design and implementation of the proposed mobile AR system and reports the results of a pilot study conducted to evaluate its perceived ease of use and usability.
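
The system itself is built with Unity, Vuforia, and the Google Maps API, so no code from it is reproduced here. As a language-neutral sketch of the location-based piece of such a design (matching a GPS fix to the nearest campus destination and packaging an emergency alert), the example below uses invented building names, coordinates, and payload fields that are not part of the paper's implementation.

```python
# Language-neutral sketch of a location-based alert: given the user's GPS
# fix, find the nearest campus destination and build an alert payload that
# could be pushed to campus police. The building list, coordinates, and
# payload fields are made-up examples, not the system described in the
# paper (which uses Unity, Vuforia, and the Google Maps API).
import json
import math

CAMPUS_DESTINATIONS = {               # hypothetical example coordinates
    "Library": (35.7846, -78.6821),
    "Engineering Hall": (35.7852, -78.6799),
    "Student Center": (35.7838, -78.6835),
}

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in meters between two lat/lon points."""
    r = 6371000.0
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def nearest_destination(lat, lon):
    """Return (name, coords) of the closest known campus destination."""
    return min(CAMPUS_DESTINATIONS.items(),
               key=lambda kv: haversine_m(lat, lon, *kv[1]))

def emergency_alert(lat, lon, message):
    """Package an alert with the user's location and nearest building."""
    name, coords = nearest_destination(lat, lon)
    return json.dumps({
        "type": "emergency",
        "message": message,
        "user_location": [lat, lon],
        "nearest_building": name,
        "distance_m": round(haversine_m(lat, lon, *coords), 1),
    })

print(emergency_alert(35.7843, -78.6818, "Medical assistance needed"))
```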

Digital Library: EI
Published Online: January 2023
