EI Digital Library, Vol. 28, Issue 4, pp. 1–5. Published online: February 2016. © Society for Imaging Science and Technology 2016.

Immersive Virtual Reality (VR) has been shown to work as a non-pharmacological analgesic by inducing cognitive distraction in acute pain patients. Researchers have shown that VR games have the potential to distract patients cognitively and function as a form of pain management therapy. In this paper, we introduce the gameplay and design metaphors of Mobius Floe (MF), an immersive VR pain distraction game for acute and chronic pain patients. MF takes an experimental approach, using more engaging game interactivity to improve cognitive distraction for pain relief. In MF, we designed game mechanics around specific pain metaphors and therapeutic elements in immersive VR. We analyze and explain the overall gameplay design principles and each pain metaphor element implemented in the game. We believe the design procedures and the way we implemented the pain metaphors will inspire design ideas for VR health games and provide useful references for other researchers and game designers.

EI Digital Library, Vol. 28, Issue 4, pp. 1–6. Published online: February 2016. © Society for Imaging Science and Technology 2016.

As soon as “serious” conclusions (with respect to reality) have to be drawn from virtual reality experiences (training, virtual prototyping…), it is increasingly acknowledged that, besides display calibration and computer graphics issues, some attention has to be given to perceptual calibration on the human side. This paper presents results from recent experiments that extend previous data on speed perception during driving simulation. They show 1) that manipulating the position of the rendering (virtual) camera strongly influences drivers' speed perception by transforming the optical flow pattern, and 2) that this manipulation remains unnoticed by the driver and does not affect his/her attitude toward the simulation. These results suggest that the position of the driver's viewpoint with respect to the simulation screen is of critical importance for the calibration of ecologically valid simulation systems. More generally, they emphasize that perceptual calibration is fundamental in “serious” virtual reality applications.
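A rough illustration of the underlying geometry (not taken from the paper itself): for level driving at speed $v$ with the virtual camera at height $h$ above a textured ground plane, a ground point a distance $d$ ahead sweeps through the visual field at an angular rate

\[
\dot{\theta} = \frac{v\,h}{d^{2} + h^{2}},
\qquad
\text{while the global optic flow rate in eye heights per second is } \frac{v}{h}.
\]

Repositioning the rendering camera therefore rescales the whole flow field; lowering it (smaller $h$), for example, speeds up the flow and can make the same simulated speed feel faster.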

EI Digital Library, Vol. 28, Issue 4, pp. 1–6. Published online: February 2016. © Society for Imaging Science and Technology 2016.

Many VR applications require picking up and moving virtual objects. Gesture-based solutions often do not work reliably, whereas controller-based methods are not as natural as hand pose recognition, and tethered controllers add the problem of a cable getting in the user’s way. We developed a gesture interface using a Leap Motion finger tracker attached to an Oculus Rift DK2 and implemented three ways of interacting with objects: innate pinching, magnetic force, and a physical button attached to the index finger. We built a virtual reality test scenario in which the user needs to move virtual objects between shelves to sort them. Initial testing shows that grabbing with the button works better than the two more natural methods. Besides the user interaction techniques, we also report on our practical experiences using the Oculus Rift, the Leap Motion, and the button with the Unity 3D development platform.
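As a minimal illustration of the “innate pinching” idea (a sketch only, not the authors' implementation, and deliberately independent of the real Leap Motion API), a grab can be triggered when the tracked thumb and index fingertips come within a small distance of each other, with a larger release threshold to avoid flickering between states; all names and threshold values below are hypothetical:

import math

PINCH_ON = 0.025   # metres: start grabbing below this thumb-index distance (assumed value)
PINCH_OFF = 0.045  # metres: release above this distance (hysteresis avoids flicker)

def distance(a, b):
    """Euclidean distance between two 3-D points given as (x, y, z) tuples."""
    return math.sqrt(sum((ai - bi) ** 2 for ai, bi in zip(a, b)))

class PinchGrabber:
    """Tracks pinch state from per-frame fingertip positions (hypothetical tracker output)."""

    def __init__(self):
        self.grabbing = False

    def update(self, thumb_tip, index_tip):
        d = distance(thumb_tip, index_tip)
        if not self.grabbing and d < PINCH_ON:
            self.grabbing = True       # fingers closed: pick up the nearest object
        elif self.grabbing and d > PINCH_OFF:
            self.grabbing = False      # fingers opened: release the object
        return self.grabbing

grabber = PinchGrabber()
print(grabber.update((0.00, 0.00, 0.30), (0.01, 0.01, 0.30)))  # True: pinched
print(grabber.update((0.00, 0.00, 0.30), (0.05, 0.03, 0.30)))  # False: released

The hysteresis between the two thresholds is one simple way to mitigate the reliability problems the abstract notes for purely gesture-based grabbing.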

EI Digital Library, Vol. 28, Issue 4, pp. 1–6. Published online: February 2016. © Society for Imaging Science and Technology 2016.

With the boom in mobile technologies, handheld Augmented Reality is drawing increasing attention. One crucial process is estimating the position and orientation of the handheld device relative to a reference coordinate system so that virtual content can be displayed accurately on the real world. An emerging application is to integrate multimedia elements (e.g., text, images, sound, video) into physical books to enhance the learning process or provide amusement and interaction. Square markers are widely used because of their ease of use and high accuracy; however, these markers cause visual discomfort. This paper introduces a novel pose tracking approach for mobile devices that combines a tricolored strip, the device's internal camera, and its embedded inertial sensors. Furthermore, the application to books is studied. The proposed approach is less intrusive than square markers, owing to its reduced size, and provides accurate six-degree-of-freedom pose estimation in real time.
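The abstract does not say how the camera and inertial measurements are combined. One common, generic way to fuse such data (shown purely as an illustrative sketch under that assumption, not as the paper's method) is a complementary filter that trusts the gyroscope over short intervals while letting the marker-based camera estimate correct its drift; the sketch below handles only a single yaw angle:

class ComplementaryYawFilter:
    """Fuse gyroscope yaw rate with a camera-derived yaw estimate (illustrative only)."""

    def __init__(self, alpha=0.98):
        self.alpha = alpha  # close to 1.0: favour smooth but drift-prone gyro integration
        self.yaw = 0.0      # radians

    def update(self, gyro_yaw_rate, camera_yaw, dt):
        predicted = self.yaw + gyro_yaw_rate * dt                             # short term: integrate the gyro
        self.yaw = self.alpha * predicted + (1.0 - self.alpha) * camera_yaw   # long term: camera correction
        return self.yaw

# Example: 100 updates at 100 Hz with a biased gyro and a steady camera reading of 0.
f = ComplementaryYawFilter()
for _ in range(100):
    f.update(gyro_yaw_rate=0.01, camera_yaw=0.0, dt=0.01)
print(round(f.yaw, 4))  # stays close to 0 despite the gyro bias

A full six-degree-of-freedom tracker would apply the same principle to position and to all three rotation axes, typically with a Kalman-style filter rather than this scalar version.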

EI Digital Library, Vol. 28, Issue 4, pp. 1–6. Published online: February 2016. © Society for Imaging Science and Technology 2016.

Negative emotions can lead to emotional stress, and people who remain in a negative state for a long time are prone to physiological and psychological illness. Virtual reality technology can generate a lifelike, highly immersive simulated environment that may relieve emotional stress. The main purpose of this paper is to verify the effectiveness of a virtual reality system in inducing emotional change. A multi-channel VR system was built to create fear scenes and elicit subjects’ sense of fear. Four types of fear scene were tested in the experiment: the Imagine fear scene, the Unknown fear scene, the Threatened fear scene, and the Height fear scene. Subjects’ emotional changes were recorded through subjective and physiological measures, such as subjective feelings, heart rate, and behavioral performance. Fifteen subjects participated in the experiment. The results showed that the VR system can induce changes in subjects’ emotions and that, among the four fear scenes, the Threatened scene induces a responsive change most easily. In addition, the correlations between the visual, tactile, and auditory stimuli of the VR system and the induced emotion were also analyzed; the results showed that the correlation was stronger for visual stimulation than for tactile or auditory stimulation.

EI Digital Library, Vol. 28, Issue 4, pp. 1–6. Published online: February 2016. © Society for Imaging Science and Technology 2016.

This paper brings together two terms originating from disciplines that at first glance may seem unrelated to artistic activity in Virtual Environments or Virtual Worlds: ‘Storyworld’, which is largely grounded in the field of Narratology, and ‘Gesamtkunstwerk’, from the field of Aesthetics. These terms are used as the theoretical framework that explicates the creation of virtual, three-dimensional ‘art ecologies’ for narrative purposes in virtual worlds. One such art ecology, created by the author, is used as an example of how such a narrative space has been built.

EI Digital Library, Vol. 28, Issue 4, pp. 1–7. Published online: February 2016. © Society for Imaging Science and Technology 2016.

Multi-camera vision systems have been adopted as versatile tools for continuously monitoring dynamic environments. Over time, several authors have proposed approaches that offer high-level interpretations of those environments. However, each of these approaches rests on its own theoretical foundation, and this diversity can produce heterogeneous, and therefore incomparable, results. To make such knowledge usable, approaches are needed to unify and query these data. This work proposes an approach for unifying heterogeneous data and establishes several criteria for visual querying aided by augmented reality. The proposal comprises three principal stages: a) capturing movement dynamics with several approaches; b) a similarity criterion for querying spatio-temporal information by example; and c) overlaying the data repository with the aid of augmented reality. The experimental model presents a new way to query information in heterogeneous databases produced by different approaches, finding similar trajectories through a visual query-by-example approach and displaying the results on an augmented reality infrastructure.
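As a concrete but deliberately simplified illustration of a query-by-example similarity criterion (the paper's actual metric is not specified in this abstract, and the repository below is hypothetical), two 2-D trajectories can be resampled to the same number of points and ranked by mean point-to-point distance:

import math

def resample(traj, n=20):
    """Pick n points spread evenly over the trajectory's point list (simple index-based resampling)."""
    if len(traj) == 1:
        return traj * n
    step = (len(traj) - 1) / (n - 1)
    return [traj[round(i * step)] for i in range(n)]

def mean_distance(a, b, n=20):
    """Mean Euclidean distance between two trajectories after resampling (lower means more similar)."""
    ra, rb = resample(a, n), resample(b, n)
    return sum(math.dist(p, q) for p, q in zip(ra, rb)) / n

def query_by_example(example, repository, k=3):
    """Return the k stored trajectories most similar to the example trajectory."""
    ranked = sorted(repository.items(), key=lambda item: mean_distance(example, item[1]))
    return ranked[:k]

# Hypothetical trajectories gathered by different cameras/tracking approaches.
repository = {
    "cam1_track7": [(0, 0), (1, 0), (2, 0), (3, 0)],
    "cam2_track3": [(0, 1), (1, 2), (2, 3)],
    "cam3_track9": [(0, 0), (1, 0.1), (2, -0.1), (3, 0)],
}
example = [(0, 0), (1.5, 0), (3, 0)]
print(query_by_example(example, repository, k=2))  # the two near-straight tracks rank first

In a real system the matching trajectories would then be overlaid on the augmented reality view rather than printed.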

EI Digital Library, Vol. 28, Issue 4, pp. 1–9. Published online: February 2016. © Society for Imaging Science and Technology 2016.

Our work explores the potential in turning the concept of presence “inside-out.” We theorize a relationship between current multidimensional conceptualizations of presence, theories which utilize predictive processing and comparative models to explain the underlying cognitive structure of presence, and evidence which suggests that past experience with both the content of media and the media providing the content are influential factors in overall felt presence. We propose a conceptualization of presence not as a unified “sense of being there”, but rather as a “sense of feeling real” as the result of an automatic perceptual process. We consider an emergent metanarrative for presence, or lack thereof, as an alignment of external stimuli with an internal set of schemata. Lastly, we explore possible new research questions and discuss the implications of our proposed model for both the design of virtual experiences and the measurement of presence.

EI Digital Library, Vol. 28, Issue 4, pp. 1–11. Published online: February 2016. © Society for Imaging Science and Technology 2016.

While previous research in academia points to the ability of Natural User Interfaces (NUIs) and low-cost display devices to help users better understand a design, there is little research on how these devices can be integrated into the existing legacy code used by engineering and design firms. The lack of commercial engineering software that integrates NUIs and low-cost display devices, like the Oculus Rift, can be attributed to the fast-changing device market and the limited awareness many engineering software makers show of emerging interaction paradigms. This lack of work on integrating low-cost immersion devices into commercial software creates a barrier to the adoption of these new devices and interaction paradigms. The work presented in this paper details a proof-of-concept system integrating the Leap Motion and the Oculus Rift into a commercial engineering visualization and analysis package, Siemens’ Teamcenter® Lifecycle Visualization Mockup (Mockup). Based on the recorded performance data, hooking up both the Leap and the Oculus results in a frame rate of around 30 frames per second, indicating that these two devices together can provide real-time, fluid interaction in a commercial engineering platform.
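For context, 30 frames per second leaves roughly 33 ms of budget per frame for tracking, rendering, and the application's own scene updates. A generic way to log such a rate from an application loop (a sketch only, unrelated to Teamcenter's or the devices' actual APIs; fake_frame is a stand-in workload) is:

import time

def measure_fps(render_frame, seconds=5.0):
    """Call render_frame repeatedly for `seconds` and return the average frames per second."""
    frames = 0
    start = time.perf_counter()
    while time.perf_counter() - start < seconds:
        render_frame()
        frames += 1
    return frames / (time.perf_counter() - start)

def fake_frame():
    time.sleep(0.033)  # pretend one tracked-and-rendered frame takes about 33 ms

print(f"{measure_fps(fake_frame, seconds=2.0):.1f} fps")  # prints roughly 30 fps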
