© Society for Imaging Science and Technology 2022
Volume 34
Issue 12
Abstract

Virtual and augmented reality systems are evolving. Beyond research, the trend toward content building continues, and practitioners find that technologies and disciplines must be tailored and integrated for specific visualization and interactive applications. This conference serves as a forum where advances and practical advice for both creative activity and scientific investigation are presented and discussed. Research results can be presented and applications demonstrated.

Digital Library: EI
Published Online: January 2022
Pages 271-1 - 271-6, 2022. This work is licensed under the CC BY-NC-SA 4.0 (Attribution-NonCommercial-ShareAlike 4.0 International License). To view a copy of this license, visit https://creativecommons.org/licenses/by-nc-sa/4.0/.
Volume 34
Issue 12
Abstract

This paper discusses the Erasmus XR project, which responds to the urgent need to enrich existing educational programs not only for cultural and media managers but also for artists aspiring to connect with their audiences in the digital space. The project’s overall goal is to develop an educational offering for these groups in the field of immersive media (XR) and ways of using these media to engage audiences. More specifically, the project aims to increase the skills and competences of participants in designing and evaluating immersive experiences in order to effectively manage, disseminate, and produce culture in the digital sphere.

Digital Library: EI
Published Online: January 2022
Pages 187-1 - 187-5, © Society for Imaging Science and Technology 2022
Volume 34
Issue 12
Abstract

Visualization explores the quantitative content of data through human intuition and plays an integral part in the data mining process. When the data is big, different analysis methods and approaches are used to find inherent patterns and relationships. However, sometimes human-in-the-loop intervention is needed to find new connections and relationships that existing algorithms cannot provide. Immersive virtual reality (VR) provides a “sense of presence” and the ability to discover new connections and relationships by visual inspection. The goal of this work is to investigate the merging of immersive VR and data science for advanced visualization. VR and immersive visualization involve an interplay between novel technology and human perception to generate insight. We propose to use immersive VR for exploring the higher dimensionality and abstraction associated with big data. VR can use an abstract representation of high-dimensional data in support of advanced scientific visualization. This paper demonstrates a data visualization tool with a real-time feed of Baltimore crime data in immersive, non-immersive, and mobile environments. We have combined virtual reality interaction techniques and 3D geographical information representation to enhance the visualization of situational impacts, as shown in Figure 1. The data visualization tool is developed using the Unity game engine. We present bar graphs operated with the Oculus Touch controllers, combining the bar chart visualization with a zooming feature that allows users to view details more effectively. The Oculus headset allows users to navigate and experience the environment with full immersion, and the Touch controllers give haptic feedback when using tools such as the laser pointer.
We are interested in extending our VR data visualization tools to enable collaborative, multi-user data exploration and to explore the impact of VR on collaborative analytics tasks in comparison with traditional 2D visualizations. The benefits of our proposed work include providing a data visualization tool for immersive visualization and visual analysis. We also highlight key features that immersive analytics can provide for situational awareness and human-in-the-loop intervention in decision making.
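The abstract above describes mapping crime data onto 3D bar charts in the scene. As a minimal sketch of that mapping step, the following Python function (name, parameters, and linear scaling are illustrative assumptions, not the paper's code, which is written in Unity) normalizes per-district counts to bar heights:

```python
def bar_heights(crime_counts, max_height=2.0):
    """Map per-district crime counts to 3D bar heights for a VR scene.

    Linear normalization against the largest count; `max_height` is the
    tallest bar in scene units. Guard against an all-zero feed.
    """
    peak = max(crime_counts.values()) or 1
    return {district: max_height * count / peak
            for district, count in crime_counts.items()}
```

A district with half the peak count would get a bar half the maximum height, which keeps relative magnitudes readable at a glance in the immersive view.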

Digital Library: EI
Published Online: January 2022
Pages 269-1 - 269-6, © Society for Imaging Science and Technology 2022
Volume 34
Issue 12
Abstract

Virtual reality and free-viewpoint navigation require high-quality rendered images to be realistic. Current hardware-assisted ray-tracing methods cannot reach the expected quality in real time and are also limited by the quality of the 3D mesh. An alternative is Depth Image Based Rendering (DIBR), where the input consists only of images and their associated depth maps, from which virtual views are synthesized for the Head Mounted Display (HMD). The MPEG Immersive Video (MIV) standard uses such a DIBR algorithm, called the Reference View Synthesizer (RVS). We first implemented a GPU version, called the Realtime accelerated View Synthesizer (RaViS), that synthesizes two virtual views in real time for the HMD. In the present paper, we explore the differences between desktop and embedded GPU platforms, porting RaViS to an embedded HMD without the need for a separate, discrete desktop GPU. The proposed solution gives a first insight into DIBR view synthesis techniques on embedded HMDs using OpenGL and Vulkan, cross-platform 3D rendering APIs with support for embedded devices.
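The core DIBR step the abstract refers to is a depth-driven warp: each source pixel is back-projected to 3D using its depth value, then re-projected into the virtual camera. A minimal NumPy sketch of that reprojection (this is an illustration of the general technique, not RVS or RaViS code, which runs on the GPU; the pinhole-camera parameter names are assumptions):

```python
import numpy as np

def reproject(depth, K, R, t, K_v, R_v, t_v):
    """Warp source pixels into a virtual view using per-pixel depth.

    (K, R, t) are the source camera's intrinsics and world-to-camera pose;
    (K_v, R_v, t_v) are the virtual camera's. Returns the (u, v) position
    of each source pixel in the virtual image, shape (2, h, w).
    """
    h, w = depth.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))
    pix = np.stack([u, v, np.ones_like(u)], axis=-1).reshape(-1, 3).T  # 3 x N homogeneous
    cam = (np.linalg.inv(K) @ pix) * depth.reshape(1, -1)   # back-project to camera space
    world = R.T @ (cam - t.reshape(3, 1))                   # camera -> world
    cam_v = R_v @ world + t_v.reshape(3, 1)                 # world -> virtual camera
    proj = K_v @ cam_v
    uv = proj[:2] / proj[2:3]                               # perspective divide
    return uv.reshape(2, h, w)
```

In a full synthesizer, the warped positions are then rasterized with depth-based occlusion handling and blending, which is what the GPU implementations accelerate.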

Digital Library: EI
Published Online: January 2022
Pages 270-1 - 270-6, © Society for Imaging Science and Technology 2022
Volume 34
Issue 12
Abstract

This project presents a virtual reality (VR) interactive narrative aiming to leave users reflecting on the perspectives one chooses to view life through. The narrative is driven by interactions designed using the concept of procedural rhetoric, which explores how rules and mechanics in games can persuade people about an idea, and Shin’s cognitive model, which presents a dynamic view of immersion in VR. The persuasive nature of procedural rhetoric, combined with immersion techniques such as tangible interfaces and the first-person elements of VR, can effectively immerse users in a compelling narrative experience with an intended emotional response. The narrative is experienced through a young woman in a state between life and death, who wakes up as her subconscious self in a limbo-like world consisting of core memories from her life; the user is tasked with taking photos of the protagonist’s memories for her to come back to life. Users primarily interact with and are integrated into the narrative through a photography mechanic, as they have the agency to select “perspective” filters to apply to the woman’s camera through which to view a core memory, ultimately choosing which perspectives of her memories become permanent when she comes back to life.

Digital Library: EI
Published Online: January 2022
Pages 297-1 - 297-5, © Society for Imaging Science and Technology 2022
Volume 34
Issue 12
Abstract

Realistically writing and sketching on virtual surfaces in virtual reality (VR) enhance the potential uses of VR in production and art creation. To achieve a writing experience similar to the real world, the VR system needs to be sensitive to the pressure of the user’s stylus on the virtual plane to create the effect of different pen strokes and improve the user experience. Typical 6 degree-of-freedom (DoF) VR controllers do not measure force or pressure on surfaces because they are held in mid-air. We propose a new method, VirtualForce, to calculate the force based on the difference between the user’s physical hand position and their hand avatar in VR. Our method does not require any specialized hardware. Furthermore, we explore the potential of our method to improve the creation of VR art, and we present several ways in which VirtualForce can greatly enhance the accuracy of drawing and writing on virtual surfaces in virtual reality.
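The abstract's core idea, deriving pressure from the gap between the physical hand and its surface-clamped avatar, can be sketched as a spring model: how far the real hand has pushed past the virtual plane becomes the pen pressure. The function below is a minimal illustration of that idea; the spring model, stiffness constant, and names are our assumptions, not the paper's formulation:

```python
import numpy as np

def virtual_force(physical_hand, surface_point, surface_normal, stiffness=250.0):
    """Estimate pen pressure from penetration past a virtual surface.

    The hand avatar is assumed clamped to the surface, so the signed distance
    of the physical hand behind the plane is the penetration depth; a spring
    constant (illustrative) converts depth to a force value.
    """
    n = surface_normal / np.linalg.norm(surface_normal)
    signed = np.dot(physical_hand - surface_point, n)  # negative = behind the surface
    penetration = max(0.0, -signed)
    return stiffness * penetration
```

A hand hovering in front of the plane yields zero force, while pushing 2 cm behind it yields a proportional pressure that a stroke renderer could map to line width or opacity.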

Digital Library: EI
Published Online: January 2022
Pages 298-1 - 298-19, © Society for Imaging Science and Technology 2022
Volume 34
Issue 12
Abstract

We present a state-of-the-art and scoping review of the literature to examine embodied information behaviors, as reflected in shared gaze interactions, within co-present extended reality (XR) experiences. The recent proliferation of consumer-grade head-mounted XR displays, situated at multiple points along the Reality-Virtuality Continuum, has increased their application in social, collaborative, and analytical scenarios that utilize data and information at multiple scales. Shared gaze represents a modality for synchronous interaction in these scenarios, yet there is a lack of understanding of the implementation of shared eye gaze within co-present extended reality contexts. We use gaze behaviors as a proxy to examine embodied information behaviors. This review examines the application of eye tracking technology to facilitate interaction in multiuser XR by sharing a user’s gaze, identifies salient themes within existing research in this context since 2013, and identifies patterns within these themes relevant to embodied information behavior in XR. We review a corpus of 50 research papers investigating the application of shared gaze and gaze tracking in XR, generated using the SALSA framework and searches in multiple databases. The publications were reviewed for study characteristics, technology types, use scenarios, and task types. We construct a state of the field and highlight opportunities for innovation and challenges for future research directions.

Digital Library: EI
Published Online: January 2022
Pages 299-1 - 299-6, © Society for Imaging Science and Technology 2022
Volume 34
Issue 12
Abstract

Active shooter events are not emergencies that can be reasonably anticipated, yet they occur more often than we think, and there is a dire need for effective evacuation plans that can increase the likelihood of saving lives and reducing casualties in the event of an active shooting incident. This has raised a major concern about the lack of tools that would allow robust predictions of realistic human movements and the lack of understanding about interaction in designated environments. Clearly, it is impractical to carry out live experiments in which thousands of people are evacuated from buildings designed for every possible emergency condition. There has been progress in understanding human movement, human motion synthesis, crowd dynamics, indoor environments, and their relationships with active shooter events, but challenges remain. This paper presents an immersive virtual reality (VR) experimental setup for conducting evacuation experiments and virtual evacuation drills in response to extreme events that impact the actions of occupants. We present two ways of controlling crowd behavior: first, by defining rules for agents, or NPCs (non-player characters); and second, by giving users control of avatars, or PCs (player characters), which they navigate through the VR environment in real time with a keyboard or joystick along with an immersive VR headset. The results will enable scientists and engineers to develop more realistic models of the systems they are designing and to obtain greater insights into their eventual behavior without having to build costly prototypes.

Digital Library: EI
Published Online: January 2022
