CHEMOTHERAPY
EMERGENT AUGMENTED REALITY PLATFORMS
FARM SIMULATION
GAME DESIGN AND MECHANICS
PAIN DISTRACTION
PEDIATRIC ONCOLOGY PATIENTS
SERIOUS GAME
VIRTUAL AND AUGMENTED REALITY SYSTEMS
VIRTUAL AND AUGMENTED REALITY IN EDUCATION, LEARNING, GAMING, ART
VIRTUAL REALITY UI AND UX
VIRTUAL REALITY
 
Pages 553-1 - 553-6, © Society for Imaging Science and Technology 2018
Digital Library: EI
Published Online: January 2018
Pages 432-1 - 432-4, © Society for Imaging Science and Technology 2018
Volume 30
Issue 3

For nearly two decades, immersive Virtual Reality (VR) has been used as a form of pain distraction and management for acute, chronic and cancer pain, and numerous studies have shown VR games and virtual environments (VEs) to be effective in reducing pain and anxiety during chemotherapy treatment. However, few of these studies have focused on pediatric patients undergoing chemotherapy, and most used commercial video games that were tailored neither for immersive VR environments nor for pediatric patients. To understand pediatric oncology patients' preferences for specific VR games during their treatment, we therefore developed Farmooo, a VR farm simulation game built with an Oculus Rift DK2 and a Leap Motion sensor. In this paper, we introduce the design inspirations, rationale and procedures, as well as the game mechanics of Farmooo. Our goal was to design and test a VR pain distraction game that enables participants to immerse themselves in the game world and thereby distracts them from pain and discomfort during and after chemotherapy. Results from a pilot focus group study with pediatric outpatients showed the strong potential of Farmooo as a distraction tool.
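
The abstract above does not detail Farmooo's internals, so the following Python sketch is purely illustrative: it models the kind of gentle, failure-free plant/water/harvest loop a farm-simulation distraction game typically builds on. All names (Plot, Stage, the 5-second growth delay) are hypothetical and are not taken from the paper.

    from enum import Enum, auto

    class Stage(Enum):
        EMPTY = auto()
        SEEDED = auto()
        WATERED = auto()
        GROWN = auto()

    class Plot:
        """One farm plot; its state advances only through simple, failure-free actions."""
        def __init__(self):
            self.stage = Stage.EMPTY

        def plant(self):
            if self.stage is Stage.EMPTY:
                self.stage = Stage.SEEDED

        def water(self):
            if self.stage is Stage.SEEDED:
                self.stage = Stage.WATERED

        def grow(self, dt_seconds):
            # Growth is time-based, so slow input is never punished, which matters
            # for patients interacting during treatment.
            if self.stage is Stage.WATERED and dt_seconds >= 5.0:
                self.stage = Stage.GROWN

        def harvest(self):
            if self.stage is Stage.GROWN:
                self.stage = Stage.EMPTY
                return 1  # one harvested crop
            return 0

    # Example round-trip through the mechanic.
    plot = Plot()
    plot.plant()
    plot.water()
    plot.grow(6.0)
    print(plot.harvest())  # -> 1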

Digital Library: EI
Published Online: January 2018
Pages 433-1 - 433-5, © Society for Imaging Science and Technology 2018
Volume 30
Issue 3

The concepts of immersion and presence are complementary in Virtual Reality research: immersion is a necessary but not sufficient condition for the sensation of presence, which is also linked to psychological and contextual factors. In our experimental approach, we focus mainly on spatial presence (the sensation of being "there" in the virtual environment), which depends on cognitive and sensorimotor aspects of navigation and manipulation within virtual environments. More precisely, we focus on behavioral presence, the fact that participants in virtual reality experiences behave in a manner similar to the way they behave in a real environment. In this sense, presence appears to be a relevant concept for evaluating the psychological and behavioral validity of human behavior within virtual environments with respect to reality. We present and discuss our behavioral approach to presence through several experimental studies of human spatial behavior using different VR setups. An application case is also presented, suggesting that presence is related to visuo-proprioceptive coherency as well.
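
As a hedged illustration of what a behavioral measure of presence might look like (this is not the authors' protocol), the Python sketch below compares a walking trajectory recorded in a real room with one recorded in its virtual counterpart by resampling both paths and averaging the point-wise distance; smaller values indicate behavior closer to the real-world baseline. All data and names are invented for illustration.

    import numpy as np

    def resample(path, n=50):
        """Resample a 2D trajectory (list of (x, y)) to n points by arc length."""
        p = np.asarray(path, dtype=float)
        seg = np.linalg.norm(np.diff(p, axis=0), axis=1)
        s = np.concatenate([[0.0], np.cumsum(seg)])
        t = np.linspace(0.0, s[-1], n)
        return np.column_stack([np.interp(t, s, p[:, 0]), np.interp(t, s, p[:, 1])])

    def behavioral_divergence(real_path, virtual_path):
        """Mean point-wise distance between resampled real and virtual trajectories."""
        a, b = resample(real_path), resample(virtual_path)
        return float(np.mean(np.linalg.norm(a - b, axis=1)))

    # Hypothetical trajectories (in meters) walked around the same obstacle.
    real = [(0, 0), (1, 0.4), (2, 0.5), (3, 0.1), (4, 0)]
    virtual = [(0, 0), (1, 0.7), (2, 0.9), (3, 0.3), (4, 0)]
    print(round(behavioral_divergence(real, virtual), 3))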

Digital Library: EI
Published Online: January 2018
Pages 434-1 - 434-9, © Society for Imaging Science and Technology 2018
Volume 30
Issue 3

The emergence of VR as a broadly available consumer technology is driving a renewed need for knowledge on how to enhance presence and design virtual experiences for the broader range of users and use cases that comes with VR ubiquity. In prior work, we established an integrative framework for presence that expands the definition from a "sense of being there" to a "sense of feeling real," encompassing multiple dimensions of presence and the interactions among them. Here we investigate the role of three variables in the experience of presence in virtual reality: expertise in real-world activities, interaction ability, and the virtual hand ownership illusion. By immersing users in a commercially available VR environment that simulates rock climbing and using a mixed-methods approach, we provide insight into how individual differences in felt presence arise between users. This new work supports our integrative framework and provides methods by which a broader research and design community can extend it, assess differences in users, and design immersive experiences that address the components and underlying determinants of individual differences in felt presence in VR. Our results indicate relationships between expertise in real-world tasks, the corresponding activity within virtual environments, and underlying components of presence, including interaction ability and the virtual hand ownership illusion.
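
The quantitative side of such a study can be pictured with a toy analysis that correlates a self-reported expertise score with presence-related subscale scores (here labeled interaction ability and hand ownership). The data, scales and variable names below are invented for illustration and do not reproduce the paper's results.

    import numpy as np

    # Hypothetical per-participant scores (not the paper's data):
    # climbing expertise (1-7) and two presence-related subscales (1-7).
    expertise = np.array([1, 2, 2, 3, 4, 5, 6, 6, 7, 7])
    interaction = np.array([2, 3, 2, 4, 4, 5, 5, 6, 6, 7])
    hand_ownership = np.array([3, 2, 4, 3, 5, 4, 6, 5, 6, 7])

    def pearson_r(x, y):
        """Pearson correlation coefficient between two score vectors."""
        x = (x - x.mean()) / x.std()
        y = (y - y.mean()) / y.std()
        return float(np.mean(x * y))

    print("expertise vs. interaction ability:", round(pearson_r(expertise, interaction), 2))
    print("expertise vs. hand ownership:", round(pearson_r(expertise, hand_ownership), 2))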

Digital Library: EI
Published Online: January 2018
Pages 435-1 - 435-6, © Society for Imaging Science and Technology 2018
Volume 30
Issue 3

Virtual reality presents a new set of challenges and opportunities for both engineers and neuroscientists. Here we provide an overview of a programme designed by a group of psychologists, neuroscientists and VR specialists to address some of the most outstanding issues in the field, ranging from the very low level (for example, how the brain processes motion-in-depth signals generated by stereoscopic display devices) to the very high level (how virtual environments can lead to a sense of immersion and emotional engagement). We present data from psychophysical, electrophysiological and neuroimaging experiments and explain how different research methodologies can be applied to different problems in the field of VR/AR. We end by describing an open-source, extensible software package for studying issues in VR that can interface with common laboratory measurement equipment, and by discussing future directions and challenges facing the neuroscience and VR engineering communities.
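
The abstract mentions an open-source, extensible software package that interfaces with common laboratory measurement equipment but does not describe its API. The Python sketch below shows one plausible, entirely hypothetical architecture for that kind of extensibility: a thin device-interface layer that lets photodiode, EEG or eye-tracking drivers be swapped without touching the experiment logic. It is not the package described in the paper.

    from abc import ABC, abstractmethod
    import time

    class Device(ABC):
        """Minimal interface a measurement-device driver would implement."""
        @abstractmethod
        def start(self): ...
        @abstractmethod
        def read(self):
            """Return (timestamp, sample) for the most recent measurement."""
        @abstractmethod
        def stop(self): ...

    class DummyPhotodiode(Device):
        """Stand-in driver used when no hardware is attached."""
        def start(self):
            self._t0 = time.time()
        def read(self):
            return (time.time() - self._t0, 0.0)
        def stop(self):
            pass

    def run_trial(device: Device, duration_s: float = 0.05):
        """Run one trial, logging samples from whichever device was registered."""
        device.start()
        samples = []
        t_end = time.time() + duration_s
        while time.time() < t_end:
            samples.append(device.read())
        device.stop()
        return samples

    print(len(run_trial(DummyPhotodiode())) > 0)  # True: the trial produced samples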

Digital Library: EI
Published Online: January 2018
Pages 449-1 - 449-10, © Society for Imaging Science and Technology 2018
Volume 30
Issue 3

We are creating INSTRUMENT: One Antarctic Night, a performative, multi-participant, and reconfigurable virtual reality artwork. We describe the development, within a game-engine-based virtual environment, of a large-scale immersive star field of 817,373 astronomical objects in the Large Magellanic Cloud, built from data collected by the AST3 (Antarctic Survey Telescope) robotic telescopes at Dome A. Real-time database queries, selections, and filtering operations enable participants to collaboratively interact with the star field to create dataremixes from astronomical data. These operations also support the collaborative creation of a soundscape via ambisonic audio spatialization and interactive sonification of the data using data-driven granular synthesis. We evaluate the scalability of our approach and demonstrate that it maintains interactive frame rates on datasets with millions of astronomical objects, with each object individually selectable and manipulable, both on its own and within subsets. Our user interface/interaction prototypes include a controller-attached UI and a wave/ripple-based interaction in which users grab hold of the star field and propagate waves and ripples throughout the virtual world. Our work arises from the art-science practice of dataremix: the appropriation and recombination (remixing) of data in any and all states along the continuum of its transformation from raw to processed, in order to create new meaning.
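
To make the scalability claim concrete, here is a self-contained sketch (not the artwork's code) of the kind of vectorized query that keeps selection over millions of catalog objects at interactive rates: a boolean mask over NumPy arrays of positions and magnitudes returns a subset in a few milliseconds on typical hardware. The catalog values are synthetic placeholders, not AST3 data.

    import numpy as np
    import time

    rng = np.random.default_rng(0)
    N = 1_000_000                   # synthetic stand-in for the LMC catalog
    ra = rng.uniform(70, 90, N)     # right ascension, degrees (illustrative range)
    dec = rng.uniform(-72, -64, N)  # declination, degrees
    mag = rng.uniform(10, 20, N)    # apparent magnitude

    def select(ra_min, ra_max, dec_min, dec_max, mag_max):
        """Return indices of objects inside a sky window and brighter than mag_max."""
        mask = ((ra >= ra_min) & (ra <= ra_max)
                & (dec >= dec_min) & (dec <= dec_max) & (mag <= mag_max))
        return np.flatnonzero(mask)

    t0 = time.perf_counter()
    picked = select(78, 82, -70, -68, 16)
    dt = (time.perf_counter() - t0) * 1000
    print(f"{picked.size} objects selected in {dt:.1f} ms")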

Digital Library: EI
Published Online: January 2018
Pages 450-1 - 450-6, © Society for Imaging Science and Technology 2018
Volume 30
Issue 3

Text input in virtual reality currently has no widely accepted standard method, yet as VR headsets have become more commonplace, text input has become more important. Using a physical keyboard is not possible with a head-mounted display that blocks the user's visual field. The two most popular solutions today, a virtual keyboard operated with VR controllers and voice recognition, require either a handheld controller or a quiet environment. 3D-tracked controllers with a virtual keyboard can simulate a real keyboard to an extent, but the lack of tactile feedback makes typing slow and unintuitive. A more intuitive solution is a Swype- or SwiftKey-like algorithm, in which the path the user's finger travels is used as input rather than individual key presses. We implemented a prototype for the Oculus Rift with a mounted Leap Motion controller that combines a novel continuous-motion text input method with hand gestures to demonstrate an all-purpose, intuitive method of text input. We compare it to state-of-the-art VR input with a virtual keyboard, as well as to a head-directed input method.
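
The continuous-motion idea can be sketched as a toy matcher (illustrative only, not the prototype's algorithm): the traced fingertip path is resampled and compared against the ideal key-to-key path of each dictionary word on a rough QWERTY layout, and the closest word wins. The layout coordinates and vocabulary below are invented.

    import numpy as np

    ROWS = ["qwertyuiop", "asdfghjkl", "zxcvbnm"]
    KEY_POS = {c: (x + 0.5 * y, -y)  # crude QWERTY key coordinates
               for y, row in enumerate(ROWS) for x, c in enumerate(row)}

    def resample(points, n=32):
        p = np.asarray(points, dtype=float)
        seg = np.linalg.norm(np.diff(p, axis=0), axis=1)
        s = np.concatenate([[0.0], np.cumsum(seg)])
        if s[-1] == 0:
            return np.repeat(p[:1], n, axis=0)
        t = np.linspace(0, s[-1], n)
        return np.column_stack([np.interp(t, s, p[:, 0]), np.interp(t, s, p[:, 1])])

    def word_path(word):
        return resample([KEY_POS[c] for c in word])

    def best_match(finger_path, vocabulary):
        """Pick the vocabulary word whose ideal key path is closest to the traced path."""
        traced = resample(finger_path)
        return min(vocabulary,
                   key=lambda w: np.mean(np.linalg.norm(traced - word_path(w), axis=1)))

    # Hypothetical trace passing over the keys c, a, t.
    trace = [KEY_POS["c"], KEY_POS["a"], KEY_POS["t"]]
    print(best_match(trace, ["cat", "cot", "bat", "hello"]))  # -> cat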

Digital Library: EI
Published Online: January 2018
Pages 451-1 - 451-8, © Society for Imaging Science and Technology 2018
Volume 30
Issue 3

The visualization and analysis of sensor data is an important aspect of geovisualization, and Virtual Reality-related technology is increasingly used in this context. Here we present the results of a student project that supports users in recording data (such as environmental temperature, humidity and light intensity) with a Texas Instruments SensorTag® and subsequently visualizing the data on a smartphone using mobile Virtual Reality technology. The project resulted in two applications, both of which employ a 2D component for data localization and recording and a 3D component for exploring the data on a smartphone. We describe our results and summarize our experiences regarding the creation and use of such software and mobile VR environments.
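
As a hedged illustration of the data side, independent of the two student applications, each recording can be stored as a georeferenced sample and mapped to simple visual attributes for the 3D view, for example temperature to a blue-to-red color ramp and humidity to bar height. All field names and mapping ranges below are assumptions, not details from the paper.

    from dataclasses import dataclass

    @dataclass
    class SensorSample:
        lat: float           # recording location
        lon: float
        temperature_c: float
        humidity_pct: float
        light_lux: float

    def to_visual(sample, t_min=-10.0, t_max=40.0):
        """Map a sample to illustrative 3D attributes: a blue-to-red color and a bar height."""
        t = min(max((sample.temperature_c - t_min) / (t_max - t_min), 0.0), 1.0)
        color = (t, 0.0, 1.0 - t)                  # RGB in [0, 1]
        height = sample.humidity_pct / 100.0 * 2   # meters in the virtual scene
        return color, height

    s = SensorSample(lat=50.94, lon=6.96, temperature_c=21.5, humidity_pct=55.0, light_lux=300.0)
    print(to_visual(s))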

Digital Library: EI
Published Online: January 2018
Pages 452-1 - 452-4, © Society for Imaging Science and Technology 2018
Volume 30
Issue 3

Augmented reality is widely used to increase the attractiveness of cultural heritage objects for sightseeing. Many locations are superimposed with additional computer-generated content such as monuments, buildings, virtual characters, fauna and flora, using dedicated systems that must be compatible with existing preservation policy. In this paper we present an example of an application used in the King's Chinese Cabinet (Museum of King Jan III's Palace at Wilanów, Warsaw, Poland), which was scanned twice, before and after restoration works. This set of 3D data allows us to create a unique application that visualizes the real past state of preservation instead of a computer-generated artificial one. The interior is under very strict protection, which significantly increases the requirements for an augmented reality system.
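
One generic way to exploit two scans of the same interior, before and after restoration, is to compute for every point of the earlier scan its distance to the nearest point of the later one, which highlights restored regions. The sketch below uses SciPy's cKDTree for the nearest-neighbour query and assumes both scans already share a coordinate frame; it is an illustration, not the museum application itself.

    import numpy as np
    from scipy.spatial import cKDTree

    def restoration_deviation(scan_before, scan_after):
        """Per-point distance (in the scans' units) from the pre-restoration point
        cloud to the nearest post-restoration point, assuming a common coordinate frame."""
        tree = cKDTree(np.asarray(scan_after, dtype=float))
        dist, _ = tree.query(np.asarray(scan_before, dtype=float))
        return dist

    # Tiny synthetic example: one point displaced by 2 mm "during restoration".
    before = [(0, 0, 0), (1, 0, 0), (0, 1, 0)]
    after = [(0, 0, 0), (1, 0, 0.002), (0, 1, 0)]
    print(restoration_deviation(before, after).round(4))  # [0. 0.002 0.]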

Digital Library: EI
Published Online: January 2018
Pages 468-1 - 468-6, © Society for Imaging Science and Technology 2018
Volume 30
Issue 3

Assembling specialized manufactured equipment, such as aircraft, requires advanced production skills that can take years of training and experience to master, and training new workers is often labor-intensive and expensive for specialized manufacturing companies. Traditionally, product assembly training in the manufacturing industry has focused predominantly on methods such as textbook learning and, more recently, video guidance. Recent advances in Virtual Reality (VR) devices, however, have introduced technology with the potential to improve the current training system: studies show that VR training can decrease assembly errors, production cost, and time. In the past, VR devices were too expensive and required extensive programming knowledge to create a training application. The release of commercial VR head-mounted displays (HMDs) and easy-to-use game engines such as Unity 3D has taken steps toward solving this issue. However, because virtual reality has only recently become commercially available, research on training interfaces in manufacturing environments is limited. This paper develops a prototype training system to test the viability of using a VR HMD as an assembly training tool. The hope is that, as this technology matures, these tools and the lessons learned can be used to improve the training process.
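
The training logic behind such a prototype can be pictured as an ordered-step checker that records wrong part selections as errors; the sketch below is illustrative, with assumed part names, and is not the paper's implementation.

    class AssemblyTrainer:
        """Tracks progress through an ordered assembly sequence and counts errors."""
        def __init__(self, steps):
            self.steps = list(steps)  # required part order
            self.current = 0
            self.errors = 0

        def attempt(self, part):
            """Return True if 'part' is the correct next part, otherwise log an error."""
            if self.current < len(self.steps) and part == self.steps[self.current]:
                self.current += 1
                return True
            self.errors += 1
            return False

        @property
        def finished(self):
            return self.current == len(self.steps)

    # Hypothetical bracket sequence.
    trainer = AssemblyTrainer(["bracket", "bolt", "washer", "nut"])
    for picked in ["bracket", "washer", "bolt", "washer", "nut"]:
        trainer.attempt(picked)
    print(trainer.finished, trainer.errors)  # True 1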

Digital Library: EI
Published Online: January 2018
