Sample records for immersive interactive virtual

  1. Social Interaction Development through Immersive Virtual Environments

    ERIC Educational Resources Information Center

    Beach, Jason; Wendt, Jeremy

    2014-01-01

    The purpose of this pilot study was to determine if participants could improve their social interaction skills by participating in a virtual immersive environment. The participants used a virtual reality head-mounted display still in development to immerse themselves in a fully immersive environment. While in the environment, participants had an opportunity…

  2. Evaluating the Effects of Immersive Embodied Interaction on Cognition in Virtual Reality

    NASA Astrophysics Data System (ADS)

    Parmar, Dhaval

    Virtual reality is on the verge of becoming mainstream household technology, as head-mounted displays, trackers, and interaction devices become affordable and easily available. Virtual reality (VR) has immense potential to enhance education and training, and its power can be used to spark interest and enthusiasm among learners. It is, therefore, imperative to evaluate the risks and benefits that immersive virtual reality poses to the field of education. Research suggests that learning is an embodied process: learning depends on grounded aspects of the body, including action, perception, and interactions with the environment. This research aims to study whether immersive embodiment by means of virtual reality facilitates embodied cognition. A pedagogical VR solution that takes advantage of embodied cognition can lead to enhanced learning benefits. Toward this goal, this research presents a linear continuum for immersive embodied interaction within virtual reality and evaluates the effects of three levels of immersive embodied interaction on cognitive thinking, presence, usability, and satisfaction among users in science, technology, engineering, and mathematics (STEM) education. Results from the presented experiments show that immersive virtual reality is highly effective for knowledge acquisition and retention, and markedly enhances user satisfaction, interest, and enthusiasm. Users experience high levels of presence and are deeply engaged in the learning activities within the immersive virtual environments. The studies presented in this research evaluate pedagogical VR software to train and motivate students in STEM education, and provide an empirical analysis comparing desktop VR (DVR), immersive VR (IVR), and immersive embodied VR (IEVR) conditions for learning.
    This research also proposes a fully immersive embodied interaction metaphor (IEIVR) for learning computational concepts as a future direction, and presents the challenges faced in implementing the IEIVR metaphor due to extended periods of immersion. Results from the conducted studies help formulate guidelines for virtual reality and education researchers working in STEM education and training, and for educators and curriculum developers seeking to improve student engagement in the STEM fields.

  3. Digital Immersive Virtual Environments and Instructional Computing

    ERIC Educational Resources Information Center

    Blascovich, Jim; Beall, Andrew C.

    2010-01-01

    This article reviews theory and research relevant to the development of digital immersive virtual environment-based instructional computing systems. The review is organized within the context of a multidimensional model of social influence and interaction within virtual environments that models the interaction of four theoretical factors: theory…

  4. Coercive Narratives, Motivation and Role Playing in Virtual Worlds

    DTIC Science & Technology

    2002-01-01

    resource for making immersive virtual environments highly engaging. Interaction also appeals to our natural desire to discover. Reading a book contains...participation in an open-ended Virtual Environment (VE). I intend to take advantage of participants' natural tendency to prefer interaction when possible...I hope this work will expand the potential of experience within virtual worlds. Keywords: Immersive Environments, Virtual Environments

  5. Using Virtual Reality to Help Students with Social Interaction Skills

    ERIC Educational Resources Information Center

    Beach, Jason; Wendt, Jeremy

    2015-01-01

    The purpose of this study was to determine if participants could improve their social interaction skills by participating in a virtual immersive environment. The participants used a virtual reality head-mounted display still in development to immerse themselves in a fully immersive environment. While in the environment, participants had an opportunity to…

  6. LVC interaction within a mixed-reality training system

    NASA Astrophysics Data System (ADS)

    Pollock, Brice; Winer, Eliot; Gilbert, Stephen; de la Cruz, Julio

    2012-03-01

    The United States military is increasingly pursuing advanced live, virtual, and constructive (LVC) training systems for reduced cost, greater training flexibility, and decreased training times. Combining the advantages of realistic training environments and virtual worlds, mixed reality LVC training systems can enable live and virtual trainees to interact as if co-located. However, LVC interaction in these systems often requires constructing immersive environments, developing hardware for live-virtual interaction, tracking in occluded environments, and an architecture that supports real-time transfer of entity information across many systems. This paper discusses a system that overcomes these challenges to enable LVC interaction in a reconfigurable, mixed reality environment. This system was developed and tested in an immersive, reconfigurable, mixed reality LVC training system for the dismounted warfighter at ISU, known as the Veldt, both to overcome LVC interaction challenges and as a test bed for cutting-edge technology to meet future U.S. Army battlefield requirements. Trainees interact physically in the Veldt and virtually through commercial and custom-developed game engines. Evaluation involving military-trained personnel found this system to be effective, immersive, and useful for developing the critical decision-making skills necessary for the battlefield. Procedural terrain modeling, model-matching database techniques, and a central communication server process all live and virtual entity data from system components to create a cohesive virtual world across all distributed simulators and game engines in real time. This system achieves rare LVC interaction within multiple physical and virtual immersive environments for real-time training across many distributed systems.
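The server-centered architecture this abstract describes, a central process merging live and virtual entity data into one cohesive world state, can be sketched in miniature as follows. This is a hypothetical illustration only; the actual Veldt system streams entity state across networked simulators in real time, and all names here are invented.

```python
# Minimal sketch: a central server merges entity updates from distributed
# live and virtual simulators into one shared world state, keyed by entity
# id, so every connected client can render a cohesive scene.
class EntityServer:
    def __init__(self):
        self.world = {}  # entity_id -> {"state": ..., "timestamp": ...}

    def update(self, entity_id, state, timestamp):
        current = self.world.get(entity_id)
        # Keep only the newest report for each entity (stale packets ignored).
        if current is None or timestamp > current["timestamp"]:
            self.world[entity_id] = {"state": state, "timestamp": timestamp}

    def snapshot(self):
        """The cohesive world view distributed to every simulator."""
        return {eid: e["state"] for eid, e in self.world.items()}

server = EntityServer()
server.update("trainee-1", {"pos": (0, 0)}, timestamp=1.0)
server.update("trainee-1", {"pos": (1, 0)}, timestamp=2.0)
server.update("trainee-1", {"pos": (9, 9)}, timestamp=1.5)  # stale, ignored
```

In a real LVC system the timestamp-based merge would be replaced by proper dead reckoning and interest management, but the core idea, one authoritative state per entity, is the same.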

  7. Declarative Knowledge Acquisition in Immersive Virtual Learning Environments

    ERIC Educational Resources Information Center

    Webster, Rustin

    2016-01-01

    The author investigated the interaction effect of immersive virtual reality (VR) in the classroom. The objective of the project was to develop and provide a low-cost, scalable, and portable VR system containing purposely designed and developed immersive virtual learning environments for the US Army. The purpose of the mixed design experiment was…

  8. Active Learning through the Use of Virtual Environments

    ERIC Educational Resources Information Center

    Mayrose, James

    2012-01-01

    Immersive Virtual Reality (VR) has seen explosive growth over the last decade. Immersive VR attempts to give users the sensation of being fully immersed in a synthetic environment by providing them with 3D hardware and allowing them to interact with objects in virtual worlds. The technology is extremely effective for learning and exploration, and…

  9. The Effects of Instructor-Avatar Immediacy in Second Life, an Immersive and Interactive Three-Dimensional Virtual Environment

    ERIC Educational Resources Information Center

    Lawless-Reljic, Sabine Karine

    2010-01-01

    Growing interest of educational institutions in desktop 3D graphic virtual environments for hybrid and distance education prompts questions about the efficacy of such tools. Virtual worlds, such as Second Life[R], enable computer-mediated immersion and interactions encompassing multimodal communication channels including audio, video, and text.…

  10. 'Putting it on the table': direct-manipulative interaction and multi-user display technologies for semi-immersive environments and augmented reality applications.

    PubMed

    Encarnação, L Miguel; Bimber, Oliver

    2002-01-01

    Collaborative virtual environments for diagnosis and treatment planning are increasingly gaining importance in our global society. Virtual and Augmented Reality approaches have promised to provide valuable means for the involved interactive data analysis, but the underlying technologies still create a cumbersome work environment that is inadequate for clinical use. This paper addresses two of the shortcomings of such technology: intuitive interaction with multi-dimensional data in immersive and semi-immersive environments, and stereoscopic multi-user displays combining the advantages of Virtual and Augmented Reality technology.

  11. Immersive realities: articulating the shift from VR to mobile AR through artistic practice

    NASA Astrophysics Data System (ADS)

    Margolis, Todd; Cornish, Tracy; Berry, Rodney; DeFanti, Thomas A.

    2012-03-01

    Our contemporary imaginings of technological engagement with digital environments have transitioned from flying through Virtual Reality to mobile interactions with the physical world through personal media devices. Experiences technologically mediated through social interactivity within physical environments are now preferred over isolated environments such as CAVEs or HMDs. Examples of this trend can be seen in early tele-collaborative artworks that strove to use advanced networking to join multiple participants in shared virtual environments. Recent developments in mobile AR allow untethered access to such shared realities in places far removed from labs and home entertainment environments, and without the bulky and expensive technologies attached to our bodies that accompany most VR. This paper addresses the emerging trend favoring socially immersive artworks via mobile Augmented Reality over sensorially immersive Virtual Reality installations. With particular focus on AR as a mobile, locative technology, we discuss how concepts of immersion and interactivity are evolving with this new medium. Immersion in the context of mobile AR can be redefined to describe socially interactive experiences. Having distinctly different sensory, spatial, and situational properties, mobile AR offers a new form for remixing elements of traditional virtual reality with physically based social experiences. This type of immersion offers a wide array of potential for mobile AR art forms. We are beginning to see examples of how artists can use mobile AR to create socially immersive and interactive experiences.

  12. Virtually There.

    ERIC Educational Resources Information Center

    Lanier, Jaron

    2001-01-01

    Describes tele-immersion, a new medium for human interaction enabled by digital technologies. It combines the display and interaction techniques of virtual reality with new vision technologies that transcend the traditional limitations of a camera. Tele-immersion stations observe people as moving sculptures without favoring a single point of view.…

  13. Comparative study on collaborative interaction in non-immersive and immersive systems

    NASA Astrophysics Data System (ADS)

    Shahab, Qonita M.; Kwon, Yong-Moo; Ko, Heedong; Mayangsari, Maria N.; Yamasaki, Shoko; Nishino, Hiroaki

    2007-09-01

    This research studies Virtual Reality simulation for collaborative interaction, so that different people in different places can interact with one object concurrently. Our focus is the real-time handling of inputs from multiple users, where an object's behavior is determined by the combination of the multiple inputs. Issues addressed in this research are: 1) the effects of using haptics on collaborative interaction, and 2) the possibilities of collaboration between users in different environments. We conducted user tests on our system in several cases: 1) comparison between non-haptic and haptic collaborative interaction over a LAN, 2) comparison between non-haptic and haptic collaborative interaction over the Internet, and 3) analysis of collaborative interaction between non-immersive and immersive display environments. The case studies are the interaction of users in two scenarios: collaborative authoring of a 3D model by two users, and collaborative haptic interaction by multiple users. In Virtual Dollhouse, users can observe the laws of physics while constructing a dollhouse from existing building blocks under gravity. In Virtual Stretcher, multiple users can collaborate on moving a stretcher together while feeling each other's haptic motions.
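The core mechanism the abstract describes, an object's behavior determined by the combination of concurrent inputs from multiple users, can be sketched as follows. This is a hypothetical, simplified model (summing force vectors); the actual system also handles haptic feedback and network transport.

```python
# Minimal sketch: a shared object's motion is determined by the
# combination (here, the vector sum) of concurrent force inputs
# from multiple users -- inputs are merged, not taken in turns.
from dataclasses import dataclass, field

@dataclass
class SharedObject:
    position: list = field(default_factory=lambda: [0.0, 0.0, 0.0])

    def apply_inputs(self, user_forces, dt=0.1):
        """Combine all users' 3D force vectors and move the object."""
        combined = [sum(f[i] for f in user_forces) for i in range(3)]
        self.position = [p + c * dt for p, c in zip(self.position, combined)]
        return self.position

stretcher = SharedObject()
# Two users push along +x at the same instant; the object responds to both.
stretcher.apply_inputs([[1.0, 0.0, 0.0], [1.0, 0.0, 0.0]])
```

A real collaborative VE would run this combination step on every simulation tick, with each user's latest input sampled from the network.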

  14. Interactive Immersive Virtualmuseum: Digital Documentation for Virtual Interaction

    NASA Astrophysics Data System (ADS)

    Clini, P.; Ruggeri, L.; Angeloni, R.; Sasso, M.

    2018-05-01

    Thanks to their playful and educational approach, Virtual Museum systems are very effective for communicating Cultural Heritage. Among the latest technologies, Immersive Virtual Reality is probably the most appealing and potentially effective for this purpose; nevertheless, due to poor user-system interaction, caused by the incomplete maturity of museum-specific technology, immersive installations are still quite uncommon in museums. This paper explores the possibilities offered by this technology and presents a workflow that, starting from digital documentation, enables interaction with archaeological finds or any other cultural heritage inside different kinds of immersive virtual reality spaces. Two case studies are presented: the National Archaeological Museum of Marche in Ancona and the 3D reconstruction of the Roman Forum of Fanum Fortunae. The two approaches differ not only conceptually but also in content: while the Archaeological Museum is represented in the application simply using spherical panoramas to give the perception of the third dimension, the Roman Forum is a 3D model that allows visitors to move through the virtual space as in the real one. In both cases, the acquisition phase of the artefacts is central; artefacts are digitized with the photogrammetric Structure from Motion technique and then integrated into the immersive virtual space using a PC with an HTC Vive system, which allows the user to interact with the 3D models, turning the manipulation of objects into a fun and exciting experience. The challenge, taking advantage of the latest opportunities made available by photogrammetry and ICT, is to enrich visitors' experience in the real museum by making possible interaction with perishable, damaged, or lost objects and public access to inaccessible or no-longer-existing places, thereby promoting the preservation of fragile sites.

  15. Immersive virtual reality for visualization of abdominal CT

    NASA Astrophysics Data System (ADS)

    Lin, Qiufeng; Xu, Zhoubing; Li, Bo; Baucom, Rebeccah; Poulose, Benjamin; Landman, Bennett A.; Bodenheimer, Robert E.

    2013-03-01

    Immersive virtual environments use a stereoscopic head-mounted display and data glove to create high fidelity virtual experiences in which users can interact with three-dimensional models and perceive relationships at their true scale. This stands in stark contrast to traditional PACS-based infrastructure in which images are viewed as stacks of two-dimensional slices, or, at best, disembodied renderings. Although there has been substantial innovation in immersive virtual environments for entertainment and consumer media, these technologies have not been widely applied in clinical applications. Here, we consider potential applications of immersive virtual environments for ventral hernia patients with abdominal computed tomography imaging data. Nearly a half million ventral hernias occur in the United States each year, and hernia repair is the most commonly performed general surgery operation worldwide. A significant problem in these conditions is communicating the urgency, degree of severity, and impact of a hernia (and potential repair) on patient quality of life. Hernias are defined by ruptures in the abdominal wall (i.e., the absence of healthy tissues) rather than a growth (e.g., cancer); therefore, understanding a hernia necessitates understanding the entire abdomen. Our environment allows surgeons and patients to view body scans at scale and interact with these virtual models using a data glove. This visualization and interaction allows users to perceive the relationship between physical structures and medical imaging data. The system provides close integration of PACS-based CT data with immersive virtual environments and creates opportunities to study and optimize interfaces for patient communication, operative planning, and medical education.
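Rendering CT data "at their true scale" hinges on converting voxel indices into physical coordinates using the scan's per-axis voxel spacing. A hypothetical sketch of that conversion follows; real DICOM handling also involves the image orientation matrix, which is omitted here for brevity.

```python
# Minimal sketch: map CT voxel indices to physical millimetre coordinates
# so a reconstructed model appears at the patient's true scale in VR.
def voxel_to_physical(index, spacing_mm, origin_mm=(0.0, 0.0, 0.0)):
    """index: (i, j, k) voxel index; spacing_mm: per-axis voxel size in mm."""
    return tuple(o + i * s for i, s, o in zip(index, spacing_mm, origin_mm))

# An abdominal CT might have ~0.8 mm in-plane spacing and 3 mm slices
# (illustrative values, not from the paper).
point = voxel_to_physical((100, 50, 10), spacing_mm=(0.8, 0.8, 3.0))
# point is approximately (80.0, 40.0, 30.0) mm from the volume origin.
```

Once every vertex of the surface mesh is in millimetres, a 1:1 mapping from scene units to metres is what makes the anatomy appear life-size in the head-mounted display.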

  16. Immersive Virtual Reality for Visualization of Abdominal CT.

    PubMed

    Lin, Qiufeng; Xu, Zhoubing; Li, Bo; Baucom, Rebeccah; Poulose, Benjamin; Landman, Bennett A; Bodenheimer, Robert E

    2013-03-28

    Immersive virtual environments use a stereoscopic head-mounted display and data glove to create high fidelity virtual experiences in which users can interact with three-dimensional models and perceive relationships at their true scale. This stands in stark contrast to traditional PACS-based infrastructure in which images are viewed as stacks of two-dimensional slices, or, at best, disembodied renderings. Although there has been substantial innovation in immersive virtual environments for entertainment and consumer media, these technologies have not been widely applied in clinical applications. Here, we consider potential applications of immersive virtual environments for ventral hernia patients with abdominal computed tomography imaging data. Nearly a half million ventral hernias occur in the United States each year, and hernia repair is the most commonly performed general surgery operation worldwide. A significant problem in these conditions is communicating the urgency, degree of severity, and impact of a hernia (and potential repair) on patient quality of life. Hernias are defined by ruptures in the abdominal wall (i.e., the absence of healthy tissues) rather than a growth (e.g., cancer); therefore, understanding a hernia necessitates understanding the entire abdomen. Our environment allows surgeons and patients to view body scans at scale and interact with these virtual models using a data glove. This visualization and interaction allows users to perceive the relationship between physical structures and medical imaging data. The system provides close integration of PACS-based CT data with immersive virtual environments and creates opportunities to study and optimize interfaces for patient communication, operative planning, and medical education.

  17. Measurement Tools for the Immersive Visualization Environment: Steps Toward the Virtual Laboratory.

    PubMed

    Hagedorn, John G; Dunkers, Joy P; Satterfield, Steven G; Peskin, Adele P; Kelso, John T; Terrill, Judith E

    2007-01-01

    This paper describes a set of tools for performing measurements of objects in a virtual reality based immersive visualization environment. These tools enable the use of the immersive environment as an instrument for extracting quantitative information from data representations that had hitherto been used solely for qualitative examination. We provide, within the virtual environment, ways for the user to analyze and interact with the quantitative data generated. We describe results generated by these methods to obtain dimensional descriptors of tissue engineered medical products. We regard this toolbox as our first step in the implementation of a virtual measurement laboratory within an immersive visualization environment.
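The simplest instrument in such a toolbox, extracting a dimensional descriptor from user-placed probe points in the scene, reduces to Euclidean distance between 3D positions. The sketch below is hypothetical and not the paper's actual implementation.

```python
import math

# Minimal sketch: a linear "virtual ruler" -- the user places probe points
# on an object in the immersive scene and reads off distances in scene units.
def measure_distance(p1, p2):
    """Straight-line distance between two probe points."""
    return math.dist(p1, p2)

def polyline_length(points):
    """Length of a chained measurement along several probe points."""
    return sum(math.dist(a, b) for a, b in zip(points, points[1:]))

# e.g. probing the two ends of a fibre whose endpoints differ by (3, 4, 0):
d = measure_distance((0.0, 0.0, 0.0), (3.0, 4.0, 0.0))  # 5.0
```

Because the immersive scene is calibrated to the dataset's physical units, the same arithmetic yields real-world dimensions of the tissue-engineered samples.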

  18. Correcting Distance Estimates by Interacting With Immersive Virtual Environments: Effects of Task and Available Sensory Information

    ERIC Educational Resources Information Center

    Waller, David; Richardson, Adam R.

    2008-01-01

    The tendency to underestimate egocentric distances in immersive virtual environments (VEs) is not well understood. However, previous research (A. R. Richardson & D. Waller, 2007) has demonstrated that a brief period of interaction with the VE prior to making distance judgments can effectively eliminate subsequent underestimation. Here the authors…

  19. Computer-Assisted Culture Learning in an Online Augmented Reality Environment Based on Free-Hand Gesture Interaction

    ERIC Educational Resources Information Center

    Yang, Mau-Tsuen; Liao, Wan-Che

    2014-01-01

    The physical-virtual immersion and real-time interaction play an essential role in cultural and language learning. Augmented reality (AR) technology can be used to seamlessly merge virtual objects with real-world images to realize immersions. Additionally, computer vision (CV) technology can recognize free-hand gestures from live images to enable…

  20. The ALIVE Project: Astronomy Learning in Immersive Virtual Environments

    NASA Astrophysics Data System (ADS)

    Yu, K. C.; Sahami, K.; Denn, G.

    2008-06-01

    The Astronomy Learning in Immersive Virtual Environments (ALIVE) project seeks to discover learning modes and optimal teaching strategies using immersive virtual environments (VEs). VEs are computer-generated, three-dimensional environments that can be navigated to provide multiple perspectives. Immersive VEs provide the additional benefit of surrounding the viewer with the simulated reality. ALIVE evaluates the incorporation of an interactive, real-time ``virtual universe'' into formal college astronomy education. In the experiment, pre-course, post-course, and curriculum tests will be used to determine the efficacy of immersive visualizations presented in a digital planetarium versus the same visual simulations in the non-immersive setting of a normal classroom, as well as a control case using traditional classroom multimedia. To normalize for inter-instructor variability, each ALIVE instructor will teach at least one class in each of the three test groups.
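Pre/post comparisons across groups that start at different score levels are commonly summarized with a normalized gain; the abstract does not specify ALIVE's actual metric, so the following is only an illustrative sketch of that standard approach.

```python
# Minimal sketch: normalized learning gain g = (post - pre) / (max - pre),
# a common way to compare pre/post test improvement across groups whose
# starting scores differ.
def normalized_gain(pre, post, max_score=100.0):
    if pre >= max_score:
        return 0.0  # no headroom left to improve
    return (post - pre) / (max_score - pre)

# A group averaging 40 -> 70 out of 100 recovers half the available gain.
g = normalized_gain(40.0, 70.0)  # 0.5
```

Comparing g across the planetarium, classroom-simulation, and traditional-multimedia groups would then isolate the effect of immersion from each group's starting knowledge.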

  21. Virtual reality interface devices in the reorganization of neural networks in the brain of patients with neurological diseases.

    PubMed

    Gatica-Rojas, Valeska; Méndez-Rebolledo, Guillermo

    2014-04-15

    Two key characteristics of all virtual reality applications are interaction and immersion. Systemic interaction is achieved through a variety of multisensory channels (hearing, sight, touch, and smell), permitting the user to interact with the virtual world in real time. Immersion is the degree to which a person can feel wrapped in the virtual world through a defined interface. Virtual reality interface devices such as the Nintendo® Wii with its peripherals (the Nunchuk and Balance Board), head-mounted displays, and joysticks allow interaction and immersion in unreal environments created by computer software. Virtual environments are highly interactive, generating great activation of the visual, vestibular, and proprioceptive systems during the execution of a video game. In addition, they are entertaining and safe for the user. Recently, the incorporation of therapeutic purposes into virtual reality interface devices has allowed them to be used for the rehabilitation of neurological patients, e.g., balance training in older adults and dynamic stability in healthy participants. The improvements observed in neurological diseases (chronic stroke and cerebral palsy) have been shown by changes in the reorganization of neural networks in patients' brains, along with better hand function and other skills, contributing to their quality of life. The data generated by such studies could substantially contribute to physical rehabilitation strategies.

  22. Virtual reality interface devices in the reorganization of neural networks in the brain of patients with neurological diseases

    PubMed Central

    Gatica-Rojas, Valeska; Méndez-Rebolledo, Guillermo

    2014-01-01

    Two key characteristics of all virtual reality applications are interaction and immersion. Systemic interaction is achieved through a variety of multisensory channels (hearing, sight, touch, and smell), permitting the user to interact with the virtual world in real time. Immersion is the degree to which a person can feel wrapped in the virtual world through a defined interface. Virtual reality interface devices such as the Nintendo® Wii with its peripherals (the Nunchuk and Balance Board), head-mounted displays, and joysticks allow interaction and immersion in unreal environments created by computer software. Virtual environments are highly interactive, generating great activation of the visual, vestibular, and proprioceptive systems during the execution of a video game. In addition, they are entertaining and safe for the user. Recently, the incorporation of therapeutic purposes into virtual reality interface devices has allowed them to be used for the rehabilitation of neurological patients, e.g., balance training in older adults and dynamic stability in healthy participants. The improvements observed in neurological diseases (chronic stroke and cerebral palsy) have been shown by changes in the reorganization of neural networks in patients' brains, along with better hand function and other skills, contributing to their quality of life. The data generated by such studies could substantially contribute to physical rehabilitation strategies. PMID:25206907

  23. Immersive Virtual Moon Scene System Based on Panoramic Camera Data of Chang'E-3

    NASA Astrophysics Data System (ADS)

    Gao, X.; Liu, J.; Mu, L.; Yan, W.; Zeng, X.; Zhang, X.; Li, C.

    2014-12-01

    The "Immersive Virtual Moon Scene" system shows a virtual environment of the lunar surface in an immersive setting. Utilizing stereo 360-degree imagery from the panoramic camera of the Yutu rover, the system enables the operator to visualize the terrain and the celestial background from the rover's point of view in 3D. To avoid image distortion, a stereo 360-degree panorama stitched from 112 images is projected onto the inside surface of a sphere, according to the panorama orientation coordinates and camera parameters, to build the virtual scene. Stars can be seen from the Moon at any time, so we render the Sun, planets, and stars according to the time and the rover's location, based on the Hipparcos catalogue, as the background on the sphere. Immersed in the stereo virtual environment created by this image-based rendering technique, the operator can zoom and pan to interact with the virtual Moon scene and mark interesting objects. The hardware of the immersive virtual Moon system comprises four high-lumen projectors and a huge curved screen 31 meters long and 5.5 meters high. This system, which takes all available panoramic camera data and uses it to create an immersive environment in which the operator can interact with the scene and mark interesting objects, contributed heavily to the establishment of science mission goals in the Chang'E-3 mission. After the Chang'E-3 mission, the lab housing this system will be open to the public. Besides this application, stereo animations of lunar terrain based on Chang'E-1 and Chang'E-2 data will be shown to the public on the huge screen in the lab. Based on the data from lunar exploration, we will make more immersive virtual Moon scenes and animations to help the public understand more about the Moon in the future.
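Projecting a stitched 360-degree panorama onto the inside of a sphere amounts to mapping each image coordinate to a viewing direction. The sketch below assumes an equirectangular panorama layout; the actual Chang'E-3 pipeline additionally applies the camera's orientation parameters, which are omitted here.

```python
import math

# Minimal sketch: map normalized equirectangular panorama coordinates
# (u, v in [0, 1]) to a direction on the unit sphere the imagery is
# projected onto.  u spans longitude; v spans latitude (0 = zenith row).
def panorama_to_sphere(u, v):
    lon = (u - 0.5) * 2.0 * math.pi   # -pi .. +pi around the horizon
    lat = (0.5 - v) * math.pi         # +pi/2 (up) .. -pi/2 (down)
    x = math.cos(lat) * math.sin(lon)
    y = math.sin(lat)
    z = math.cos(lat) * math.cos(lon)
    return (x, y, z)

# The centre of the panorama looks straight ahead along +z:
direction = panorama_to_sphere(0.5, 0.5)
```

Texturing the sphere's interior with the panorama via this mapping is what lets the operator look around the landing site distortion-free from the rover's viewpoint.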

  24. Virtual hydrology observatory: an immersive visualization of hydrology modeling

    NASA Astrophysics Data System (ADS)

    Su, Simon; Cruz-Neira, Carolina; Habib, Emad; Gerndt, Andreas

    2009-02-01

    The Virtual Hydrology Observatory provides students with the ability to observe an integrated hydrology simulation through an instructional interface, using either a desktop-based or an immersive virtual reality setup. The goal of the virtual hydrology observatory application is to facilitate the introduction of field experience and observational skills into hydrology courses through innovative virtual techniques that mimic activities during actual field visits. The simulation part of the application is built from the integrated atmospheric forecast model Weather Research and Forecasting (WRF) and the hydrology model Gridded Surface/Subsurface Hydrologic Analysis (GSSHA). The outputs from the WRF and GSSHA models are then used to generate the final visualization components of the Virtual Hydrology Observatory. The visualization data processing techniques provided by VTK include 2D Delaunay triangulation and data optimization. Once all the visualization components are generated, they are integrated with the simulation data using the VRFlowVis and VR Juggler software toolkits. VR Juggler is used primarily to give the Virtual Hydrology Observatory application a fully immersive, real-time 3D interaction experience, while VRFlowVis provides the integration framework for the hydrologic simulation data, graphical objects, and user interaction. A six-sided CAVE™-like system is used to run the Virtual Hydrology Observatory, providing students with a fully immersive experience.
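Before filters like VTK's Delaunay triangulation can render model output, the gridded simulation fields have to become geometry. The following is a deliberately simplified stand-in (not the paper's VTK pipeline): it splits a regular grid of samples, such as a water-surface field, into triangles directly.

```python
# Minimal sketch: turn a regular grid of simulation samples (e.g. water
# surface heights on a rows x cols lattice) into a triangle mesh -- each
# grid cell becomes two triangles.  A simplified stand-in for the
# triangulation filters a toolkit like VTK provides.
def grid_to_triangles(rows, cols):
    """Return triangles as index triples into a row-major vertex list."""
    tris = []
    for r in range(rows - 1):
        for c in range(cols - 1):
            i = r * cols + c
            tris.append((i, i + 1, i + cols))             # upper-left half
            tris.append((i + 1, i + cols + 1, i + cols))  # lower-right half
    return tris

# A 3x3 grid of samples has 2x2 cells, hence 8 triangles.
mesh = grid_to_triangles(3, 3)
```

True Delaunay triangulation is needed when the sample points are scattered rather than gridded; for a regular lattice the direct split above produces an equivalent watertight surface.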

  25. Studying social interactions through immersive virtual environment technology: virtues, pitfalls, and future challenges

    PubMed Central

    Bombari, Dario; Schmid Mast, Marianne; Canadas, Elena; Bachmann, Manuel

    2015-01-01

    The goal of the present review is to explain how immersive virtual environment technology (IVET) can be used for the study of social interactions and how the use of virtual humans in immersive virtual environments can advance research and application in many different fields. Researchers studying individual differences in social interactions are typically interested in keeping the behavior and the appearance of the interaction partner constant across participants. With IVET, researchers have full control over the interaction partners and can standardize them while still keeping the simulation realistic. Virtual simulations are valid: growing evidence shows that studies conducted with IVET can indeed replicate some well-known findings of social psychology. Moreover, IVET allows researchers to subtly manipulate characteristics of the environment (e.g., visual cues to prime participants) or of the social partner (e.g., his/her race) to investigate their influence on participants' behavior and cognition. Furthermore, manipulations that would be difficult or impossible in real life (e.g., changing participants' height) can easily be achieved with IVET. Besides the advantages for theoretical research, we explore the most recent training and clinical applications of IVET, its integration with other technologies (e.g., social sensing), and future challenges for researchers (e.g., making the communication between virtual humans and participants smoother). PMID:26157414

  26. Studying social interactions through immersive virtual environment technology: virtues, pitfalls, and future challenges.

    PubMed

    Bombari, Dario; Schmid Mast, Marianne; Canadas, Elena; Bachmann, Manuel

    2015-01-01

    The goal of the present review is to explain how immersive virtual environment technology (IVET) can be used for the study of social interactions and how the use of virtual humans in immersive virtual environments can advance research and application in many different fields. Researchers studying individual differences in social interactions are typically interested in keeping the behavior and the appearance of the interaction partner constant across participants. With IVET, researchers have full control over the interaction partners and can standardize them while still keeping the simulation realistic. Virtual simulations are valid: growing evidence shows that studies conducted with IVET can indeed replicate some well-known findings of social psychology. Moreover, IVET allows researchers to subtly manipulate characteristics of the environment (e.g., visual cues to prime participants) or of the social partner (e.g., his/her race) to investigate their influence on participants' behavior and cognition. Furthermore, manipulations that would be difficult or impossible in real life (e.g., changing participants' height) can easily be achieved with IVET. Besides the advantages for theoretical research, we explore the most recent training and clinical applications of IVET, its integration with other technologies (e.g., social sensing), and future challenges for researchers (e.g., making the communication between virtual humans and participants smoother).

  7. A Fully Immersive Set-Up for Remote Interaction and Neurorehabilitation Based on Virtual Body Ownership

    PubMed Central

    Perez-Marcos, Daniel; Solazzi, Massimiliano; Steptoe, William; Oyekoya, Oyewole; Frisoli, Antonio; Weyrich, Tim; Steed, Anthony; Tecchia, Franco; Slater, Mel; Sanchez-Vives, Maria V.

    2012-01-01

    Although telerehabilitation systems represent one of the most technologically appealing clinical solutions for the immediate future, they still present limitations that prevent their standardization. Here we propose an integrated approach that includes three key and novel factors: (a) fully immersive virtual environments, including virtual body representation and ownership; (b) multimodal interaction with remote people and virtual objects, including haptic interaction; and (c) a physical representation of the patient at the hospital through embodiment agents (e.g., a physical robot). The importance of secure and rapid communication between the nodes is also stressed, and an example implementation is described. Finally, we discuss the proposed approach with reference to the existing literature and systems. PMID:22787454

  8. Study on Collaborative Object Manipulation in Virtual Environment

    NASA Astrophysics Data System (ADS)

    Mayangsari, Maria Niken; Yong-Moo, Kwon

    This paper presents a comparative study of networked collaboration performance under different degrees of immersion. In particular, the relationship between user collaboration performance and the degree of immersion provided by the system is addressed and compared across several experiments. The user tests on our system cover several cases: 1) comparison between non-haptic and haptic collaborative interaction over a LAN, 2) comparison between non-haptic and haptic collaborative interaction over the Internet, and 3) analysis of collaborative interaction between non-immersive and immersive display environments.

  9. The Selimiye Mosque of Edirne, Turkey - AN Immersive and Interactive Virtual Reality Experience Using Htc Vive

    NASA Astrophysics Data System (ADS)

    Kersten, T. P.; Büyüksalih, G.; Tschirschwitz, F.; Kan, T.; Deggim, S.; Kaya, Y.; Baskaraca, A. P.

    2017-05-01

    Recent advances in contemporary Virtual Reality (VR) technologies are going to have a significant impact on everyday life. Through VR it is possible to virtually explore a computer-generated environment as a different reality, and to immerse oneself in the past or in a virtual museum without leaving the current real-life situation. For the ultimate VR experience, the user should see only the virtual world. Currently, the user must wear a VR headset which fits around the head and over the eyes to visually separate themselves from the physical world. Via the headset, images are fed to the eyes through two small lenses. Cultural heritage monuments are ideally suited both for thorough multi-dimensional geometric documentation and for realistic interactive visualisation in immersive VR applications. Additionally, the game industry offers tools for interactive visualisation of objects that motivate users to virtually visit objects and places. In this paper the generation of a virtual 3D model of the Selimiye mosque in the city of Edirne, Turkey, and its processing for data integration into the game engine Unity are presented. The project has been carried out as a co-operation between BİMTAŞ, a company of the Greater Municipality of Istanbul, Turkey, and the Photogrammetry & Laser Scanning Lab of the HafenCity University Hamburg, Germany, to demonstrate an immersive and interactive visualisation using the new VR system HTC Vive. The workflow from data acquisition to VR visualisation, including the necessary programming for navigation, is described. Furthermore, the possible use (including simultaneous multi-user environments) of such a VR visualisation for a CH monument is discussed in this contribution.

  10. Measuring Flow Experience in an Immersive Virtual Environment for Collaborative Learning

    ERIC Educational Resources Information Center

    van Schaik, P.; Martin, S.; Vallance, M.

    2012-01-01

    In contexts other than immersive virtual environments, theoretical and empirical work has identified flow experience as a major factor in learning and human-computer interaction. Flow is defined as a "holistic sensation that people feel when they act with total involvement". We applied the concept of flow to modeling the experience of…

  11. A succinct overview of virtual reality technology use in Alzheimer's disease.

    PubMed

    García-Betances, Rebeca I; Arredondo Waldmeyer, María Teresa; Fico, Giuseppe; Cabrera-Umpiérrez, María Fernanda

    2015-01-01

    We provide a brief review and appraisal of recent and current virtual reality (VR) technology for Alzheimer's disease (AD) applications. We categorize them according to their intended purpose (e.g., diagnosis, patient cognitive training, caregivers' education, etc.), focus feature (e.g., spatial impairment, memory deficit, etc.), methodology employed (e.g., tasks, games, etc.), immersion level, and passive or active interaction. Critical assessment indicates that most of them do not yet take full advantage of virtual environments with high levels of immersion and interaction. Many still rely on conventional 2D graphic displays to create non-immersive or semi-immersive VR scenarios. Important improvements are needed to make VR a better and more versatile assessment and training tool for AD. The use of the latest display technologies available, such as emerging head-mounted displays and 3D smart TV technologies, together with realistic multi-sensorial interaction devices, and neuro-physiological feedback capacity, are some of the most beneficial improvements this mini-review suggests. Additionally, it would be desirable that such VR applications for AD be easily and affordably transferable to in-home and nursing home environments.

  12. Get immersed in the Soil Sciences: the first community of avatars in the EGU Assembly 2015!

    NASA Astrophysics Data System (ADS)

    Castillo, Sebastian; Alarcón, Purificación; Beato, Mamen; Emilio Guerrero, José; José Martínez, Juan; Pérez, Cristina; Ortiz, Leovigilda; Taguas, Encarnación V.

    2015-04-01

    Virtual reality and immersive worlds refer to artificial computer-generated environments with which users act and interact, as in a familiar environment, through figurative virtual individuals (avatars). Virtual environments will be the technology of the early twenty-first century that will most dramatically change the way we live, particularly in the areas of training and education, product development and entertainment (Schmorrow, 2009). The usefulness of immersive worlds has been proved in different fields. They reduce geographic and social barriers between different stakeholders and create virtual social spaces which can positively impact learning and discussion outcomes (Lorenzo et al. 2012). In this work we present a series of interactive meetings in a virtual building to celebrate the International Year of Soils and to promote the importance of soil functions and soil conservation. In a virtual room, the avatars of senior researchers will meet young scientist avatars to talk about: 1) what remains to be done in Soil Sciences; 2) the main current limitations and difficulties; and 3) the future hot research lines. Interactive participation does not require physical attendance at the EGU Assembly 2015. In addition, this virtual building inspired by the Soil Sciences can be completed with different teaching resources from different locations around the world, and it will be used to improve the learning of Soil Sciences in a multicultural context. REFERENCES: Lorenzo C.M., Sicilia M.A., Sánchez S. 2012. Studying the effectiveness of multi-user immersive environments for collaborative evaluation tasks. Computers & Education 59: 1361-1376. Schmorrow D.D. 2009. "Why virtual?" Theoretical Issues in Ergonomics Science 10(3): 279-282.

  13. A Study on Immersion and Presence of a Portable Hand Haptic System for Immersive Virtual Reality

    PubMed Central

    Kim, Mingyu; Jeon, Changyu; Kim, Jinmo

    2017-01-01

    This paper proposes a portable hand haptic system using Leap Motion as a haptic interface that can be used in various virtual reality (VR) applications. The proposed hand haptic system was designed as an Arduino-based sensor architecture to enable a variety of tactile sensations at low cost, and is also equipped with a portable wristband. As a haptic system designed for tactile feedback, the proposed system first identifies the left and right hands and then sends tactile sensations (vibration and heat) to each fingertip (thumb and index finger). It is incorporated into a wearable band-type system, making its use easy and convenient. Next, hand motion is accurately captured using the sensor of the hand tracking system and is used for virtual object control, thus achieving interaction that enhances immersion. A VR application was designed with the purpose of testing the immersion and presence aspects of the proposed system. Lastly, technical and statistical tests were carried out to assess whether the proposed haptic system can provide a new immersive presence to users. According to the results of the presence questionnaire and the simulator sickness questionnaire, we confirmed that the proposed hand haptic system, in comparison to the existing interaction that uses only the hand tracking system, provided greater presence and a more immersive environment in virtual reality. PMID:28513545
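The feedback loop described above (hand tracking driving per-fingertip vibration and heat on an Arduino wristband) can be sketched as a small command encoder. This is an illustrative assumption, not the authors' protocol: the 4-byte packet layout, the hand/finger IDs, and the force-to-intensity mapping are all hypothetical.

```python
# Sketch of a per-fingertip haptic command encoder for an Arduino wristband.
# The packet format (hand id, finger id, vibration, heat) is a hypothetical
# stand-in for the paper's actual serial protocol.

FINGERS = {"thumb": 0, "index": 1}
HANDS = {"left": 0, "right": 1}

def encode_haptic_command(hand, finger, vibration, heat):
    """Pack one fingertip actuation into 4 bytes.

    Intensities are clamped to the 0-255 range an 8-bit PWM output expects.
    """
    clamp = lambda v: max(0, min(255, int(v)))
    return bytes([HANDS[hand], FINGERS[finger], clamp(vibration), clamp(heat)])

def on_virtual_contact(hand, finger, contact_force):
    # Map a tracked contact force (0.0-1.0) from the hand tracker onto
    # vibration intensity; heat stays off for brief contacts.
    return encode_haptic_command(hand, finger, vibration=contact_force * 255, heat=0)
```

In a real system the returned bytes would be written to the wristband's serial port each time the hand tracker reports a fingertip touching a virtual object.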

  14. A Study on Immersion and Presence of a Portable Hand Haptic System for Immersive Virtual Reality.

    PubMed

    Kim, Mingyu; Jeon, Changyu; Kim, Jinmo

    2017-05-17

    This paper proposes a portable hand haptic system using Leap Motion as a haptic interface that can be used in various virtual reality (VR) applications. The proposed hand haptic system was designed as an Arduino-based sensor architecture to enable a variety of tactile sensations at low cost, and is also equipped with a portable wristband. As a haptic system designed for tactile feedback, the proposed system first identifies the left and right hands and then sends tactile sensations (vibration and heat) to each fingertip (thumb and index finger). It is incorporated into a wearable band-type system, making its use easy and convenient. Next, hand motion is accurately captured using the sensor of the hand tracking system and is used for virtual object control, thus achieving interaction that enhances immersion. A VR application was designed with the purpose of testing the immersion and presence aspects of the proposed system. Lastly, technical and statistical tests were carried out to assess whether the proposed haptic system can provide a new immersive presence to users. According to the results of the presence questionnaire and the simulator sickness questionnaire, we confirmed that the proposed hand haptic system, in comparison to the existing interaction that uses only the hand tracking system, provided greater presence and a more immersive environment in virtual reality.

  15. Hybrid Reality Lab Capabilities - Video 2

    NASA Technical Reports Server (NTRS)

    Delgado, Francisco J.; Noyes, Matthew

    2016-01-01

    Our Hybrid Reality and Advanced Operations Lab is developing highly realistic and immersive systems that could be used to provide training, support engineering analysis, and augment data collection for various human performance metrics at NASA. To get a better understanding of what Hybrid Reality is, let's go through the two most commonly known types of immersive realities: Virtual Reality and Augmented Reality. Virtual Reality creates immersive scenes that are completely made up of digital information. This technology has been used to train astronauts at NASA, during teleoperation of remote assets (arms, rovers, robots, etc.) and in other activities. One challenge with Virtual Reality is that if you are using it for real-time applications (like landing an airplane), the information used to create the virtual scenes can be old (i.e., visualized long after physical objects moved in the scene) and not accurate enough to land the airplane safely. This is where Augmented Reality comes in. Augmented Reality takes real-time environment information (from a camera or a see-through window) and places digitally created information into the scene so that it matches the live view. In other words, Augmented Reality enhances real environment information collected with a live sensor or viewport (e.g., camera, window, etc.) with the information-rich visualization provided by Virtual Reality. Hybrid Reality takes Augmented Reality even further, creating a higher level of immersion where interactivity can take place. Hybrid Reality takes Virtual Reality objects and a trackable, physical representation of those objects, places them in the same coordinate system, and allows people to interact with both objects' representations (virtual and physical) simultaneously. After a short period of adjustment, individuals begin to interact with all the objects in the scene as if they were real-life objects.
The ability to physically touch and interact with digitally created objects that have the same shape, size, and location as their physical counterparts in a virtual reality environment can be a game changer when it comes to training, planning, engineering analysis, science, entertainment, etc. Our project is developing such capabilities for various types of environments. The video accompanying this abstract is a representation of an ISS Hybrid Reality experience. In the video you can see various Hybrid Reality elements that provide immersion beyond standard Virtual Reality or Augmented Reality.
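Placing a virtual object and its tracked physical counterpart in the same coordinate system, as described above, amounts to applying the tracker's rigid transform to the virtual model every frame. The sketch below assumes a simplified pose format (position plus yaw only); a real tracker reports full 3-DoF rotation.

```python
import math

def pose_to_matrix(x, y, z, yaw):
    """Build a 4x4 rigid transform from a tracked position and yaw (radians).

    Yaw-only rotation keeps the sketch short; this is not a full tracker pose.
    """
    c, s = math.cos(yaw), math.sin(yaw)
    return [
        [c,  -s,  0.0, x],
        [s,   c,  0.0, y],
        [0.0, 0.0, 1.0, z],
        [0.0, 0.0, 0.0, 1.0],
    ]

def transform_point(m, p):
    """Apply a 4x4 rigid transform to a 3D point, returning the moved point."""
    px, py, pz = p
    return tuple(m[i][0] * px + m[i][1] * py + m[i][2] * pz + m[i][3] for i in range(3))
```

Pushing every vertex of the virtual model through the physical object's tracked pose each frame is what makes touching the physical object and seeing its virtual twin coincide.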

  16. Children's Perception of Gap Affordances: Bicycling Across Traffic-Filled Intersections in an Immersive Virtual Environment

    ERIC Educational Resources Information Center

    Plumert, Jodie M.; Kearney, Joseph K.; Cremer, James F.

    2004-01-01

    This study examined gap choices and crossing behavior in children and adults using an immersive, interactive bicycling simulator. Ten- and 12-year-olds and adults rode a bicycle mounted on a stationary trainer through a virtual environment consisting of a street with 6 intersections. Participants faced continuous cross traffic traveling at 25mph…
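The gap-crossing judgment this simulator studies reduces to comparing the temporal size of a traffic gap with the time the rider needs to clear the intersection. A back-of-the-envelope version follows; the 25 mph cross-traffic speed comes from the abstract, while the lane width, rider speed, and safety margin are illustrative assumptions.

```python
def gap_is_crossable(gap_m, traffic_speed_mps, crossing_distance_m,
                     rider_speed_mps, margin_s=0.5):
    """Return True if the temporal gap exceeds crossing time plus a safety margin."""
    temporal_gap_s = gap_m / traffic_speed_mps        # time until the next car arrives
    crossing_time_s = crossing_distance_m / rider_speed_mps
    return temporal_gap_s >= crossing_time_s + margin_s

MPH_TO_MPS = 0.44704
# 25 mph cross traffic (from the study), a hypothetical 6 m lane and a 3 m/s rider:
# a 50 m gap gives ~4.5 s of headway versus 2 s to cross plus the margin.
```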

  17. A Succinct Overview of Virtual Reality Technology Use in Alzheimer’s Disease

    PubMed Central

    García-Betances, Rebeca I.; Arredondo Waldmeyer, María Teresa; Fico, Giuseppe; Cabrera-Umpiérrez, María Fernanda

    2015-01-01

    We provide a brief review and appraisal of recent and current virtual reality (VR) technology for Alzheimer’s disease (AD) applications. We categorize them according to their intended purpose (e.g., diagnosis, patient cognitive training, caregivers’ education, etc.), focus feature (e.g., spatial impairment, memory deficit, etc.), methodology employed (e.g., tasks, games, etc.), immersion level, and passive or active interaction. Critical assessment indicates that most of them do not yet take full advantage of virtual environments with high levels of immersion and interaction. Many still rely on conventional 2D graphic displays to create non-immersive or semi-immersive VR scenarios. Important improvements are needed to make VR a better and more versatile assessment and training tool for AD. The use of the latest display technologies available, such as emerging head-mounted displays and 3D smart TV technologies, together with realistic multi-sensorial interaction devices, and neuro-physiological feedback capacity, are some of the most beneficial improvements this mini-review suggests. Additionally, it would be desirable that such VR applications for AD be easily and affordably transferable to in-home and nursing home environments. PMID:26029101

  18. Recent developments in virtual experience design and production

    NASA Astrophysics Data System (ADS)

    Fisher, Scott S.

    1995-03-01

    Today, the media of VR and Telepresence are in their infancy and the emphasis is still on technology and engineering. But it is not the hardware people might use that will determine whether VR becomes a powerful medium; instead, it will be the experiences that they are able to have that will drive its acceptance and impact. A critical challenge in the elaboration of these telepresence capabilities will be the development of environments that are as unpredictable and rich in interconnected processes as an actual location or experience. This paper will describe the recent development of several Virtual Experiences including: `Menagerie', an immersive Virtual Environment inhabited by virtual characters designed to respond to and interact with its users; and `The Virtual Brewery', an immersive public VR installation that provides multiple levels of interaction in an artistic interpretation of the brewing process.

  19. Sexual self-regulation and cognitive absorption as factors of sexual response toward virtual characters.

    PubMed

    Renaud, Patrice; Trottier, Dominique; Nolet, Kevin; Rouleau, Joanne L; Goyette, Mathieu; Bouchard, Stéphane

    2014-04-01

    The eye movements and penile responses of 20 male participants were recorded while they were immersed with virtual sexual stimuli. These participants were divided into two groups according to their capacity to focus their attention in immersion (high and low focus). In order to understand sexual self-regulation better, we subjected participants to three experimental conditions: (a) immersion with a preferred sexual stimulus, without sexual inhibition; (b) immersion with a preferred sexual stimulus, with sexual inhibition; and (c) immersion with a neutral stimulus. A significant difference was observed between the effects of each condition on erectile response and scanpath. The groups differed on self-regulation of their erectile responses and on their scanpath patterns. High focus participants had more difficulties than low focus participants with inhibiting their sexual responses and displayed less scattered eye movement trajectories over the critical areas of the virtual sexual stimuli. Results are interpreted in terms of sexual self-regulation and cognitive absorption in virtual immersion. In addition, the use of validated virtual sexual stimuli is presented as a methodological improvement over static and moving pictures, since it paves the way for the study of the role of social interaction in an ecologically valid and well-controlled way.

  20. Multisensory Integration in the Virtual Hand Illusion with Active Movement

    PubMed Central

    Satoh, Satoru; Hachimura, Kozaburo

    2016-01-01

    Improving the sense of immersion is one of the core issues in virtual reality. Perceptual illusions of ownership can be perceived over a virtual body in a multisensory virtual reality environment. Rubber Hand and Virtual Hand Illusions showed that body ownership can be manipulated by applying suitable visual and tactile stimulation. In this study, we investigate the effects of multisensory integration in the Virtual Hand Illusion with active movement. A virtual xylophone playing system which can interactively provide synchronous visual, tactile, and auditory stimulation was constructed. We conducted two experiments regarding different movement conditions and different sensory stimulations. Our results demonstrate that multisensory integration with free active movement can improve the sense of immersion in virtual reality. PMID:27847822

  1. KinImmerse: Macromolecular VR for NMR ensembles

    PubMed Central

    Block, Jeremy N; Zielinski, David J; Chen, Vincent B; Davis, Ian W; Vinson, E Claire; Brady, Rachael; Richardson, Jane S; Richardson, David C

    2009-01-01

    Background: In molecular applications, virtual reality (VR) and immersive virtual environments have generally been used and valued for the visual and interactive experience, to enhance intuition and communicate excitement, rather than as part of the actual research process. In contrast, this work develops a software infrastructure for research use and illustrates such use on a specific case. Methods: The Syzygy open-source toolkit for VR software was used to write the KinImmerse program, which translates the molecular capabilities of the kinemage graphics format into software for display and manipulation in the DiVE (Duke immersive Virtual Environment) or other VR systems. KinImmerse is supported by the flexible display construction and editing features in the KiNG kinemage viewer and it implements new forms of user interaction in the DiVE. Results: In addition to molecular visualizations and navigation, KinImmerse provides a set of research tools for manipulation, identification, co-centering of multiple models, free-form 3D annotation, and output of results. The molecular research test case analyzes the local neighborhood around an individual atom within an ensemble of nuclear magnetic resonance (NMR) models, enabling immersive visual comparison of the local conformation with the local NMR experimental data, including target curves for residual dipolar couplings (RDCs). Conclusion: The promise of KinImmerse for production-level molecular research in the DiVE is shown by the locally co-centered RDC visualization developed there, which gave new insights now being pursued in wider data analysis. PMID:19222844
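The co-centering tool mentioned in the Results, which translates each NMR model so a chosen atom sits at a common origin for local comparison, can be sketched in a few lines. The coordinate handling is simplified (plain coordinate lists rather than kinemage structures); this is not the KinImmerse implementation.

```python
def co_center_models(models, atom_index):
    """Translate each model (a list of (x, y, z) atom coordinates) so that the
    chosen atom lands at the origin, aligning local neighborhoods across an
    NMR ensemble for side-by-side comparison."""
    centered = []
    for coords in models:
        ax, ay, az = coords[atom_index]  # the atom every model is centered on
        centered.append([(x - ax, y - ay, z - az) for (x, y, z) in coords])
    return centered
```

After co-centering, the spread of the remaining atoms around the origin directly shows how much the local conformation varies across the ensemble.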

  2. Exploring Learning through Audience Interaction in Virtual Reality Dome Theaters

    NASA Astrophysics Data System (ADS)

    Apostolellis, Panagiotis; Daradoumis, Thanasis

    Informal learning in public spaces like museums, science centers and planetariums has become increasingly popular in recent years. Recent advancements in large-scale displays have allowed contemporary technology-enhanced museums to be equipped with digital domes, some with real-time capabilities like Virtual Reality systems. By conducting an extensive literature review we have come to the conclusion that little to no research has been carried out on the learning outcomes that the combination of VR and audience interaction can provide in the immersive environments of dome theaters. Thus, we propose that audience collaboration in immersive virtual reality environments presents a promising approach to support effective learning in groups of school-aged children.

  3. The Adaptive Effects Of Virtual Interfaces: Vestibulo-Ocular Reflex and Simulator Sickness.

    DTIC Science & Technology

    1998-08-07

    rearrangement: a pattern of stimulation differing from that existing as a result of normal interactions with the real world. Stimulus rearrangements can... is immersive and interactive. virtual interface: a system of transducers, signal processors, computer hardware and software that create an... interactive medium through which: 1) information is transmitted to the senses in the form of two- and three-dimensional virtual images and 2) psychomotor

  4. IMMERSE: Interactive Mentoring for Multimodal Experiences in Realistic Social Encounters

    DTIC Science & Technology

    2015-08-28

    undergraduates funded by your agreement who graduated during this period and will receive scholarships or fellowships for further studies in science... Player Locomotion 9.2 Interacting with Real and Virtual Objects 9.3 Animation Combinations and Stage Management 10. Recommendations on the Way Ahead... Interaction with Virtual Characters 9.1 Player Locomotion

  5. Harnessing Neuroplasticity to Promote Rehabilitation: CI Therapy for TBI

    DTIC Science & Technology

    2016-10-01

    scheduled plus 33 to be enrolled, because we assume that the proportion of withdrawals will be the same as experienced to date, i.e., 24%. This plan will... period? Victor Mark, Investigator Interactive Immersive Virtual Reality Walking for SCI Neuropathic Pain (Trost) 0.24 calendar months Kim Cerise... Direct Costs: $149,999 This project designs and tests an immersive virtual reality treatment method to control neuropathic pain following traumatic spinal

  6. SciEthics Interactive: Science and Ethics Learning in a Virtual Environment

    ERIC Educational Resources Information Center

    Nadolny, Larysa; Woolfrey, Joan; Pierlott, Matthew; Kahn, Seth

    2013-01-01

    Learning in immersive 3D environments allows students to collaborate, build, and interact with difficult course concepts. This case study examines the design and development of the TransGen Island within the SciEthics Interactive project, a National Science Foundation-funded, 3D virtual world emphasizing learning science content in the context of…

  7. An Immersive VR System for Sports Education

    NASA Astrophysics Data System (ADS)

    Song, Peng; Xu, Shuhong; Fong, Wee Teck; Chin, Ching Ling; Chua, Gim Guan; Huang, Zhiyong

    The development of new technologies has undoubtedly promoted the advances of modern education, among which Virtual Reality (VR) technologies have made education more visually accessible for students. However, classroom education has been the focus of VR applications, whereas not much research has been done on promoting sports education using VR technologies. In this paper, an immersive VR system is designed and implemented to create a more intuitive and visual way of teaching tennis. A scalable system architecture is proposed in addition to the hardware setup layout, which can be used for various immersive interactive applications such as architecture walkthroughs, military training simulations, other sports game simulations, interactive theaters, and telepresent exhibitions. A realistic interaction experience is achieved through accurate and robust hybrid tracking technology, while the virtual human opponent is animated in real time using shader-based skin deformation. Potential future extensions are also discussed to improve the teaching/learning experience.

  8. Scenario-Based Spoken Interaction with Virtual Agents

    ERIC Educational Resources Information Center

    Morton, Hazel; Jack, Mervyn A.

    2005-01-01

    This paper describes a CALL approach which integrates software for speaker independent continuous speech recognition with embodied virtual agents and virtual worlds to create an immersive environment in which learners can converse in the target language in contextualised scenarios. The result is a self-access learning package: SPELL (Spoken…

  9. Virtually compliant: Immersive video gaming increases conformity to false computer judgments.

    PubMed

    Weger, Ulrich W; Loughnan, Stephen; Sharma, Dinkar; Gonidis, Lazaros

    2015-08-01

    Real-life encounters with face-to-face contact are on the decline in a world in which many routine tasks are delegated to virtual characters, a development that bears both opportunities and risks. Interacting with such virtual-reality beings is particularly common during role-playing videogames, in which we inhabit the virtual reality of an avatar. Video gaming is known to lead to the training and development of real-life skills and behaviors; hence, in the present study we sought to explore whether role-playing video gaming primes individuals' identification with a computer enough to increase computer-related social conformity. Following immersive video gaming, individuals were indeed more likely to give up their own best judgment and to follow the vote of computers, especially when the stimulus context was ambiguous. Implications for human-computer interactions and for our understanding of the formation of identity and self-concept are discussed.

  10. Agency and Gender Influence Older Adults' Presence-Related Experiences in an Interactive Virtual Environment.

    PubMed

    Kothgassner, Oswald D; Goreis, Andreas; Kafka, Johanna X; Hlavacs, Helmut; Beutl, Leon; Kryspin-Exner, Ilse; Felnhofer, Anna

    2018-05-01

    While virtual humans are increasingly used to benefit the elderly, comparatively little is known about older adults' virtual experiences. However, due to age-related changes, older adults' perceptions of virtual environments (VEs) may be unique. Hence, our objective was to examine possible gender differences in immersion, flow, and emotional states, as well as physical and social presence, in elderly males and females interacting either with a computer-controlled agent or a human-controlled avatar. Seventy-eight German-speaking older adults were randomly assigned to an avatar or an agent condition and were exposed to a brief social encounter in a virtual café. Results indicate no overall gender differences, but a significant effect of agency on social presence, physical presence, immersion, and flow. Participants in the avatar condition reported higher levels on all measures, except for involvement. Furthermore, significant gender × agency interactions were found, with females showing more social presence, spatial presence, and flow when interacting with a human-controlled avatar and more realism when conversing with an agent. Also, all participants showed significant changes in their affect post-exposure. In sum, older adults' virtual experiences seem to follow unique patterns; yet, these do not preclude the elderly from successfully participating in VEs.

  11. Novel Web-based Education Platforms for Information Communication utilizing Gamification, Virtual and Immersive Reality

    NASA Astrophysics Data System (ADS)

    Demir, I.

    2015-12-01

    Recent developments in internet technologies make it possible to manage and visualize large data on the web. Novel visualization techniques and interactive user interfaces allow users to create realistic environments and interact with data to gain insight from simulations and environmental observations. This presentation showcases information communication interfaces, games, and virtual and immersive reality applications for supporting teaching and learning of concepts in the atmospheric and hydrological sciences. The information communication platforms utilize the latest web technologies and allow accessing and visualizing large-scale data on the web. The simulation system is a web-based 3D interactive learning environment for teaching hydrological and atmospheric processes and concepts. It provides a visually striking platform with realistic terrain and weather information, and water simulation. The web-based simulation system gives students an environment in which to learn about earth science processes and the effects of development and human activity on the terrain. Users can access the system in three visualization modes: virtual reality, augmented reality, and immersive reality using a heads-up display. The system provides various scenarios customized to fit the age and education level of various users.

  12. CaveCAD: a tool for architectural design in immersive virtual environments

    NASA Astrophysics Data System (ADS)

    Schulze, Jürgen P.; Hughes, Cathleen E.; Zhang, Lelin; Edelstein, Eve; Macagno, Eduardo

    2014-02-01

    Existing 3D modeling tools were designed to run on desktop computers with monitor, keyboard and mouse. To make 3D modeling possible with mouse and keyboard, many 3D interactions, such as point placement or translations of geometry, had to be mapped to the 2D parameter space of the mouse, possibly supported by mouse buttons or keyboard keys. We hypothesize that, had the designers of these existing systems been able to assume immersive virtual reality systems as their target platforms, they would have been able to design 3D interactions much more intuitively. In collaboration with professional architects, we created a simple but complete 3D modeling tool for virtual environments from the ground up, using direct 3D interaction wherever possible and adequate. In this publication, we present our approaches to interaction for typical 3D modeling functions, such as geometry creation, modification of existing geometry, and assignment of surface materials. We also discuss preliminary user experiences with this system.
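The difference between mouse-mapped and direct 3D interaction can be made concrete with point placement: instead of unprojecting a 2D cursor onto the scene, a tracked wand drops a vertex directly in 3D space. The pose format and fixed reach below are illustrative assumptions, not CaveCAD's actual interface.

```python
def place_vertex(wand_pos, wand_dir, reach=0.3):
    """Direct 3D point placement: drop a vertex at a fixed reach (meters) along
    the tracked wand's pointing direction, instead of mapping a 2D mouse
    position into the 3D scene."""
    norm = sum(d * d for d in wand_dir) ** 0.5  # normalize the pointing direction
    return tuple(p + reach * d / norm for p, d in zip(wand_pos, wand_dir))
```

Because the wand itself lives in the scene's coordinate system, no 2D-to-3D mapping (and none of its mode switches or modifier keys) is needed.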

  13. Immersive Virtual Reality Therapy with Myoelectric Control for Treatment-resistant Phantom Limb Pain: Case Report.

    PubMed

    Chau, Brian; Phelan, Ivan; Ta, Phillip; Humbert, Sarah; Hata, Justin; Tran, Duc

    2017-01-01

    Objective: Phantom limb pain is a condition frequently experienced after amputation. One treatment for phantom limb pain is traditional mirror therapy, yet some patients do not respond to this intervention, and immersive virtual reality mirror therapy offers some potential advantages. We report the case of a patient with severe phantom limb pain following an upper limb amputation and successful treatment with therapy in a custom virtual reality environment. Methods: An interactive 3-D kitchen environment was developed based on the principles of mirror therapy to allow for control of virtual hands while wearing a motion-tracked, head-mounted virtual reality display. The patient used myoelectric control of a virtual hand as well as motion-tracking control in this setting for five therapy sessions. Pain scale measurements and subjective feedback were elicited at each session. Results: Analysis of the measured pain scales showed statistically significant decreases per session [Visual Analog Scale, Short Form McGill Pain Questionnaire, and Wong-Baker FACES pain scores decreased by 55 percent (p=0.0143), 60 percent (p=0.023), and 90 percent (p=0.0024), respectively]. Significant subjective pain relief persisting between sessions was also reported, as well as marked immersion within the virtual environments. At six-week follow-up, the patient noted a continued decrease in phantom limb pain symptoms. Conclusions: Currently available immersive virtual reality technology with myoelectric and motion-tracking control may represent a possible therapy option for treatment-resistant phantom limb pain.
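    The per-scale reductions reported above are simple baseline-to-final percent decreases. A minimal sketch of that arithmetic (the scores used here are hypothetical, since the abstract does not give the raw values):

```python
def percent_decrease(baseline, final):
    """Percent decrease from a baseline score to a final score."""
    if baseline == 0:
        raise ValueError("baseline must be nonzero")
    return (baseline - final) / baseline * 100.0

# Hypothetical VAS scores on a 0-100 scale: 80 at baseline, 36 at the last session
vas_drop = percent_decrease(80, 36)  # a 55 percent decrease
```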

  15. A System for Governmental Virtual Institutions Based on Ontologies and Interaction Protocols

    ERIC Educational Resources Information Center

    de Araujo, Claudia J. Abrao; da Silva, Flavio S. Correa

    2012-01-01

    The authors believe that the adoption of virtual worlds is suitable for electronic government applications as it can increase the capillarity of public services, facilitate the access to government services and provide citizens with a natural and immersive experience. They present a Government Virtual Institution Model (GVI) for the provision of…

  16. Exploring "Magic Cottage": A Virtual Reality Environment for Stimulating Children's Imaginative Writing

    ERIC Educational Resources Information Center

    Patera, Marianne; Draper, Steve; Naef, Martin

    2008-01-01

    This paper presents an exploratory study that created a virtual reality environment (VRE) to stimulate motivation and creativity in imaginative writing at primary school level. The main aim of the study was to investigate if an interactive, semi-immersive virtual reality world could increase motivation and stimulate pupils' imagination in the…

  17. Advanced Maintenance Simulation by Means of Hand-Based Haptic Interfaces

    NASA Astrophysics Data System (ADS)

    Nappi, Michele; Paolino, Luca; Ricciardi, Stefano; Sebillo, Monica; Vitiello, Giuliana

    The aerospace industry has been involved in virtual simulation for design and testing since the birth of virtual reality. Today this industry is showing a growing interest in the development of haptic-based maintenance training applications, which represent the most advanced way to simulate maintenance and repair tasks within a virtual environment by means of a visual-haptic approach. The goal is to allow the trainee to experience the service procedures not only as a workflow reproduced at a visual level but also in terms of the kinaesthetic feedback involved in the manipulation of tools and components. This study, conducted in collaboration with aerospace industry specialists, is aimed at the development of an immersive virtual environment in which mechanics and technicians can perform maintenance simulation or training tasks by directly manipulating 3D virtual models of aircraft parts while perceiving force feedback through the haptic interface. The proposed system is based on ViRstperson, a virtual reality engine under development at the Italian Center for Aerospace Research (CIRA) to support engineering and technical activities such as design-time maintenance procedure validation and maintenance training. This engine has been extended to support haptic-based interaction, enabling a more complete level of interaction, also in terms of impedance control, and thus fostering the development of haptic knowledge in the user. The user’s “sense of touch” within the immersive virtual environment is simulated through an Immersion CyberForce® hand-based force-feedback device. Preliminary testing of the proposed system seems encouraging.
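    Force feedback of the kind described above is commonly rendered with a spring-damper (impedance) contact model: the deeper the device penetrates a virtual surface, the harder it is pushed back. A minimal sketch of the idea, not the CIRA engine's actual implementation, with illustrative stiffness and damping gains:

```python
def haptic_force(penetration, velocity, k=500.0, b=5.0):
    """Spring-damper (impedance) contact force in newtons.

    penetration: depth into the virtual surface in metres (<= 0 means no contact)
    velocity: approach velocity in m/s
    k, b: illustrative stiffness (N/m) and damping (N*s/m) gains
    """
    if penetration <= 0.0:
        return 0.0  # not touching the surface: no force
    return k * penetration + b * velocity

# 1 cm static penetration with the gains above -> 5 N restoring force
f = haptic_force(0.01, 0.0)
```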

  18. Highly immersive virtual reality laparoscopy simulation: development and future aspects.

    PubMed

    Huber, Tobias; Wunderling, Tom; Paschold, Markus; Lang, Hauke; Kneist, Werner; Hansen, Christian

    2018-02-01

    Virtual reality (VR) applications with head-mounted displays (HMDs) have had an impact on information and multimedia technologies. The current work aimed to describe the process of developing a highly immersive VR simulation for laparoscopic surgery. We combined a VR laparoscopy simulator (LapSim) and a VR-HMD to create a user-friendly VR simulation scenario. Continuous clinical feedback was an essential aspect of the development process. We created an artificial VR (AVR) scenario by integrating the simulator video output with VR game components of figures and equipment in an operating room. We also created a highly immersive VR surrounding (IVR) by integrating the simulator video output with a [Formula: see text] video of a standard laparoscopy scenario in the department's operating room. Clinical feedback led to optimization of the visualization, synchronization, and resolution of the virtual operating rooms (in both the IVR and the AVR). Preliminary testing results revealed that individuals experienced a high degree of exhilaration and presence, with rare events of motion sickness. The technical performance showed no significant difference compared to that achieved with the standard LapSim. Our results provide a proof of concept for the technical feasibility of a custom highly immersive VR-HMD setup. Future technical research is needed to improve the visualization, immersion, and capability of interacting within the virtual scenario.

  19. iVFTs - immersive virtual field trips for interactive learning about Earth's environment.

    NASA Astrophysics Data System (ADS)

    Bruce, G.; Anbar, A. D.; Semken, S. C.; Summons, R. E.; Oliver, C.; Buxner, S.

    2014-12-01

    Innovations in immersive interactive technologies are changing the way students explore Earth and its environment. State-of-the-art hardware has given developers the tools needed to capture high-resolution spherical content, 360° panoramic video, giga-pixel imagery, and unique viewpoints via unmanned aerial vehicles as they explore remote and physically challenging regions of our planet. Advanced software enables integration of these data into seamless, dynamic, immersive, interactive, content-rich, and learner-driven virtual field explorations, experienced online via HTML5. These surpass conventional online exercises that use 2-D static imagery and enable the student to engage in these virtual environments that are more like games than like lectures. Grounded in the active learning of exploration, inquiry, and application of knowledge as it is acquired, users interact non-linearly in conjunction with an intelligent tutoring system (ITS). The integration of this system allows the educational experience to be adapted to each individual student as they interact within the program. Such explorations, which we term "immersive virtual field trips" (iVFTs), are being integrated into cyber-learning allowing science teachers to take students to scientifically significant but inaccessible environments. Our team and collaborators are producing a diverse suite of freely accessible, iVFTs to teach key concepts in geology, astrobiology, ecology, and anthropology. Topics include Early Life, Biodiversity, Impact craters, Photosynthesis, Geologic Time, Stratigraphy, Tectonics, Volcanism, Surface Processes, The Rise of Oxygen, Origin of Water, Early Civilizations, Early Multicellular Organisms, and Bioarcheology. 
These diverse topics allow students to experience field sites all over the world, including Grand Canyon (USA), Flinders Ranges (Australia), Shark Bay (Australia), rainforests (Panama), Teotihuacan (Mexico), Upheaval Dome (USA), the Pilbara (Australia), the Mid-Atlantic Ridge (Iceland), and Mauna Kea (Hawaii). iVFTs are being beta-tested and used at ASU in several large-enrollment courses to assess their usability and effectiveness in meeting specific learning objectives. We invite geoscience educators to partake of this resource and find new applications in their own teaching.

  20. "Immersive Education" Submerges Students in Online Worlds Made for Learning

    ERIC Educational Resources Information Center

    Foster, Andrea L.

    2007-01-01

    Immersive Education is a multimillion-dollar project devoted to building virtual-reality software exclusively for education within commercial and nonprofit fantasy spaces like Second Life. The project combines interactive three-dimensional graphics, Web cameras, Internet-based telephony, and other digital media. Some critics have complained that…

  1. Constraint, Intelligence, and Control Hierarchy in Virtual Environments. Chapter 1

    NASA Technical Reports Server (NTRS)

    Sheridan, Thomas B.

    2007-01-01

    This paper seeks to deal directly with the question of what makes virtual actors and objects that are experienced in virtual environments seem real. (The term virtual reality, while more common in public usage, is an oxymoron; therefore virtual environment is the preferred term in this paper.) Reality is a difficult topic, treated for centuries in the sub-fields of philosophy called ontology ("of or relating to being or existence") and epistemology ("the study of the method and grounds of knowledge, especially with reference to its limits and validity") (both from Webster's, 1965). Advances in recent decades in the technologies of computers, sensors and graphics software have permitted human users to feel present or experience immersion in computer-generated virtual environments. This has motivated a keen interest in probing this phenomenon of presence and immersion not only philosophically but also psychologically and physiologically, in terms of the parameters of the senses and sensory stimulation that correlate with the experience (Ellis, 1991). The pages of the journal Presence: Teleoperators and Virtual Environments have seen much discussion of what makes virtual environments seem real (see, e.g., Slater, 1999; Slater et al., 1994; Sheridan, 1992, 2000). Stephen Ellis, when organizing the meeting that motivated this paper, suggested to invited authors that "We may adopt as an organizing principle for the meeting that the genesis of apparently intelligent interaction arises from an upwelling of constraints determined by a hierarchy of lower levels of behavioral interaction." My first reaction was "huh?" and my second was "yeah, that seems to make sense." Accordingly, the paper seeks to explain, from the author's viewpoint, why Ellis's hypothesis makes sense. What is the connection of "presence" or "immersion" of an observer in a virtual environment to "constraints," and what types of constraints? 
What of "intelligent interaction," and is it the intelligence of the observer or the intelligence of the environment (whatever the latter may mean) that is salient? And finally, what might be relevant about "upwelling" of constraints as determined by a hierarchy of levels of interaction?

  2. Enhance Learning on Software Project Management through a Role-Play Game in a Virtual World

    ERIC Educational Resources Information Center

    Maratou, Vicky; Chatzidaki, Eleni; Xenos, Michalis

    2016-01-01

    This article presents a role-play game for software project management (SPM) in a three-dimensional online multiuser virtual world. The Opensimulator platform is used for the creation of an immersive virtual environment that facilitates students' collaboration and realistic interaction, in order to manage unexpected events occurring during the…

  3. Teaching Physics to Deaf College Students in a 3-D Virtual Lab

    ERIC Educational Resources Information Center

    Robinson, Vicki

    2013-01-01

    Virtual worlds are used in many educational and business applications. At the National Technical Institute for the Deaf at Rochester Institute of Technology (NTID/RIT), deaf college students are introduced to the virtual world of Second Life, which is a 3-D immersive, interactive environment, accessed through computer software. NTID students use…

  4. L2 Immersion in 3D Virtual Worlds: The Next Thing to Being There?

    ERIC Educational Resources Information Center

    Paillat, Edith

    2014-01-01

    Second Life is one of the many three-dimensional virtual environments accessible through a computer and a fast broadband connection. Thousands of participants connect to this platform to interact virtually with the world, join international communities of practice and, for some, role play groups. Unlike online role play games however, Second Life…

  5. The Use of Virtual Reality for Creating Unusual Environmental Stimulation to Motivate Students to Explore Creative Ideas

    ERIC Educational Resources Information Center

    Lau, Kung Wong; Lee, Pui Yuen

    2015-01-01

    This paper discusses the roles of simulation in creativity education and how to apply immersive virtual environments to enhance students' learning experiences in university, through the provision of interactive simulations. An empirical study of a simulated virtual reality was carried out in order to investigate the effectiveness of providing…

  6. Interactive and Stereoscopic Hybrid 3D Viewer of Radar Data with Gesture Recognition

    NASA Astrophysics Data System (ADS)

    Goenetxea, Jon; Moreno, Aitor; Unzueta, Luis; Galdós, Andoni; Segura, Álvaro

    This work presents an interactive and stereoscopic 3D viewer of weather information coming from a Doppler radar. The hybrid system shows a GIS model of the regional zone where the radar is located and the corresponding reconstructed 3D volume of weather data. To enhance the immersiveness of the navigation, stereoscopic visualization has been added to the viewer, using a polarized-glasses-based system. The user can interact with the 3D virtual world using a Nintendo Wiimote for navigating through it and a Nintendo Wii Nunchuk for giving commands by means of hand gestures. We also present a dynamic gesture recognition procedure that measures the temporal advance of the performed gesture postures. Experimental results show that dynamic gestures are effectively recognized, so that more natural interaction and immersive navigation in the virtual world are achieved.
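    Measuring the "temporal advance" of a gesture can be sketched as a tracker that walks through an ordered template of posture labels, reporting how far the user has progressed. The posture names and the class itself are illustrative, not the authors' implementation:

```python
class GestureTracker:
    """Tracks how far a stream of recognized postures has advanced
    through a gesture defined as an ordered posture sequence."""

    def __init__(self, template):
        self.template = template
        self.index = 0  # number of template postures matched so far

    def update(self, posture):
        """Feed one recognized posture; return the fraction of the gesture completed."""
        if self.index < len(self.template) and posture == self.template[self.index]:
            self.index += 1
        return self.index / len(self.template)

    @property
    def complete(self):
        return self.index == len(self.template)


# A hypothetical three-posture "grab" gesture; repeated postures do not advance it
grab = GestureTracker(["open", "tilt_left", "closed"])
for posture in ["open", "open", "tilt_left", "closed"]:
    progress = grab.update(posture)
```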

  7. Headphone and Head-Mounted Visual Displays for Virtual Environments

    NASA Technical Reports Server (NTRS)

    Begault, Duran R.; Ellis, Stephen R.; Wenzel, Elizabeth M.; Trejo, Leonard J. (Technical Monitor)

    1998-01-01

    A realistic auditory environment can contribute to both the overall subjective sense of presence in a virtual display, and to a quantitative metric predicting human performance. Here, the role of audio in a virtual display and the importance of auditory-visual interaction are examined. Conjectures are proposed regarding the effectiveness of audio compared to visual information for creating a sensation of immersion, the frame of reference within a virtual display, and the compensation of visual fidelity by supplying auditory information. Future areas of research are outlined for improving simulations of virtual visual and acoustic spaces. This paper will describe some of the intersensory phenomena that arise during operator interaction within combined visual and auditory virtual environments. Conjectures regarding audio-visual interaction will be proposed.

  8. Coupled auralization and virtual video for immersive multimedia displays

    NASA Astrophysics Data System (ADS)

    Henderson, Paul D.; Torres, Rendell R.; Shimizu, Yasushi; Radke, Richard; Lonsway, Brian

    2003-04-01

    The implementation of maximally-immersive interactive multimedia in exhibit spaces requires not only the presentation of realistic visual imagery but also the creation of a perceptually accurate aural experience. While conventional implementations treat audio and video problems as essentially independent, this research seeks to couple the visual sensory information with dynamic auralization in order to enhance perceptual accuracy. An implemented system has been developed for integrating accurate auralizations with virtual video techniques for both interactive presentation and multi-way communication. The current system utilizes a multi-channel loudspeaker array and real-time signal processing techniques for synthesizing the direct sound, early reflections, and reverberant field excited by a moving sound source whose path may be interactively defined in real-time or derived from coupled video tracking data. In this implementation, any virtual acoustic environment may be synthesized and presented in a perceptually-accurate fashion to many participants over a large listening and viewing area. Subject tests support the hypothesis that the cross-modal coupling of aural and visual displays significantly affects perceptual localization accuracy.
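    At its core, synthesizing a direct sound plus early reflections is a tapped delay line: the dry signal is mixed with delayed, attenuated copies of itself. A toy sketch of that idea (delay times and gains are illustrative, and the real system also renders a reverberant field and uses a loudspeaker array):

```python
def auralize(dry, fs, reflections):
    """Mix a dry mono signal with delayed, attenuated copies of itself
    (direct sound plus early reflections), as a simplified tapped-delay model.

    dry: list of samples
    fs: sample rate in Hz
    reflections: list of (delay_seconds, gain) pairs
    """
    max_n = max(int(fs * d) for d, _ in reflections)
    out = [0.0] * (len(dry) + max_n)
    for i, s in enumerate(dry):       # direct path, no delay
        out[i] += s
    for delay, gain in reflections:   # each early reflection
        n = int(fs * delay)
        for i, s in enumerate(dry):
            out[n + i] += gain * s
    return out

# Impulse response of two illustrative reflections at 10 ms and 20 ms
fs = 8000
dry = [0.0] * fs
dry[0] = 1.0  # unit impulse
wet = auralize(dry, fs, [(0.01, 0.5), (0.02, 0.25)])
```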

  9. Learning Relative Motion Concepts in Immersive and Non-Immersive Virtual Environments

    ERIC Educational Resources Information Center

    Kozhevnikov, Michael; Gurlitt, Johannes; Kozhevnikov, Maria

    2013-01-01

    The focus of the current study is to understand which unique features of an immersive virtual reality environment have the potential to improve learning relative motion concepts. Thirty-seven undergraduate students learned relative motion concepts using computer simulation either in immersive virtual environment (IVE) or non-immersive desktop…

  10. Foreign language learning in immersive virtual environments

    NASA Astrophysics Data System (ADS)

    Chang, Benjamin; Sheldon, Lee; Si, Mei; Hand, Anton

    2012-03-01

    Virtual reality has long been used for training simulations in fields from medicine to welding to vehicular operation, but simulations involving more complex cognitive skills present new design challenges. Foreign language learning, for example, is increasingly vital in the global economy, but computer-assisted education is still in its early stages. Immersive virtual reality is a promising avenue for language learning as a way of dynamically creating believable scenes for conversational training and role-play simulation. Visual immersion alone, however, only provides a starting point. We suggest that the addition of social interactions and motivated engagement through narrative gameplay can lead to truly effective language learning in virtual environments. In this paper, we describe the development of a novel application for teaching Mandarin using CAVE-like VR, physical props, human actors and intelligent virtual agents, all within a semester-long multiplayer mystery game. Students travel (virtually) to China on a class field trip, which soon becomes complicated with intrigue and mystery surrounding the lost manuscript of an early Chinese literary classic. Virtual reality environments such as the Forbidden City and a Beijing teahouse provide the setting for learning language, cultural traditions, and social customs, as well as the discovery of clues through conversation in Mandarin with characters in the game.

  11. PC-Based Virtual Reality for CAD Model Viewing

    ERIC Educational Resources Information Center

    Seth, Abhishek; Smith, Shana S.-F.

    2004-01-01

    Virtual reality (VR), as an emerging visualization technology, has introduced an unprecedented communication method for collaborative design. VR refers to an immersive, interactive, multisensory, viewer-centered, 3D computer-generated environment and the combination of technologies required to build such an environment. This article introduces the…

  12. Local and Remote Cooperation With Virtual and Robotic Agents: A P300 BCI Study in Healthy and People Living With Spinal Cord Injury.

    PubMed

    Tidoni, Emmanuele; Abu-Alqumsan, Mohammad; Leonardis, Daniele; Kapeller, Christoph; Fusco, Gabriele; Guger, Cristoph; Hintermuller, Cristoph; Peer, Angelika; Frisoli, Antonio; Tecchia, Franco; Bergamasco, Massimo; Aglioti, Salvatore Maria

    2017-09-01

    The development of technological applications that allow people to control and embody external devices within social interaction settings represents a major goal for current and future brain-computer interface (BCI) systems. Prior research has suggested that embodied systems may ameliorate BCI end-users' experience and accuracy in controlling external devices. Along these lines, we developed an immersive P300-based BCI application with a head-mounted display for virtual-local and robotic-remote social interactions and explored, in a group of healthy participants, the role of proprioceptive feedback in the control of a virtual surrogate (Study 1). Moreover, we compared the performance of a small group of people with spinal cord injury (SCI) to a control group of healthy subjects during virtual and robotic social interactions (Study 2), where both groups received proprioceptive stimulation. Our attempt to combine immersive environments, BCI technologies and the neuroscience of body ownership suggests that providing realistic multisensory feedback still represents a challenge. Results have shown that healthy participants and people living with SCI used the BCI within the immersive scenarios with good levels of performance (as indexed by task accuracy, optimization calls and Information Transfer Rate) and perceived control of the surrogates. Proprioceptive feedback did not alter performance measures or body ownership sensations. Further studies are necessary to test whether sensorimotor experience represents an opportunity to improve the use of future embodied BCI applications.
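    The Information Transfer Rate used above as a performance index is commonly computed with the Wolpaw formula, which converts the number of selectable targets, classification accuracy, and selection time into bits per minute. A sketch (the parameter values in the usage line are illustrative, not taken from the study):

```python
from math import log2

def wolpaw_itr(n_choices, accuracy, trial_seconds):
    """Information Transfer Rate (Wolpaw formula) in bits/min for a BCI
    with n_choices targets, given classification accuracy in [0, 1]
    and the time in seconds needed for one selection."""
    p = accuracy
    if p >= 1.0:
        bits = log2(n_choices)  # perfect accuracy: full symbol entropy
    else:
        bits = (log2(n_choices) + p * log2(p)
                + (1 - p) * log2((1 - p) / (n_choices - 1)))
    return bits * 60.0 / trial_seconds

# Illustrative numbers: a 6-target P300 speller, 90% accuracy, 10 s per selection
itr = wolpaw_itr(n_choices=6, accuracy=0.9, trial_seconds=10.0)
```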

  13. The Use of Immersive Virtual Reality in the Learning Sciences: Digital Transformations of Teachers, Students, and Social Context

    ERIC Educational Resources Information Center

    Bailenson, Jeremy N.; Yee, Nick; Blascovich, Jim; Beall, Andrew C.; Lundblad, Nicole; Jin, Michael

    2008-01-01

    This article illustrates the utility of using virtual environments to transform social interaction via behavior and context, with the goal of improving learning in digital environments. We first describe the technology and theories behind virtual environments and then report data from 4 empirical studies. In Experiment 1, we demonstrated that…

  14. Academic Library Services in Virtual Worlds: An Examination of the Potential for Library Services in Immersive Environments

    ERIC Educational Resources Information Center

    Ryan, Jenna; Porter, Marjorie; Miller, Rebecca

    2010-01-01

    Current literature on libraries is abundant with articles about the uses and the potential of new interactive communication technology, including Web 2.0 tools. Recently, the advent and use of virtual worlds have received top billing in these works. Many library institutions are exploring these virtual environments; this exploration and the…

  15. Emerging Conceptual Understanding of Complex Astronomical Phenomena by Using a Virtual Solar System

    ERIC Educational Resources Information Center

    Gazit, Elhanan; Yair, Yoav; Chen, David

    2005-01-01

    This study describes high school students' conceptual development of the basic astronomical phenomena during real-time interactions with a Virtual Solar System (VSS). The VSS is a non-immersive virtual environment which has a dynamic frame of reference that can be altered by the user. Ten 10th grade students were given tasks containing a set of…

  16. Along the Virtuality Continuum - Two Showcases on how xR Technologies Transform Geoscience Research and Education

    NASA Astrophysics Data System (ADS)

    Klippel, A.; Zhao, J.; Masrur, A.; Wallgruen, J. O.; La Femina, P. C.

    2017-12-01

    We present work along the virtuality continuum, showcasing both AR and VR environments for geoscience applications and research. The AR/VR project focuses on one of the most prominent landmarks on the Penn State campus, which is at the same time a representation of the geology of Pennsylvania. The Penn State Obelisk is a 32-foot-high, 51-ton monument composed of 281 rocks collected from across Pennsylvania. While information about its origins and composition is scattered across articles and some web databases, we compiled all the available data from the web and archives and curated them as a basis for an immersive xR experience. Tabular data was augmented with xR data such as 360° photos, videos, and 3D models (e.g., of the Obelisk). Our xR (both AR and VR) prototype provides an immersive analytical environment that supports interactive data visualization and virtual navigation in a natural environment (a campus model of today and of 1896, the year of the Obelisk's installation). This work-in-progress project can provide an interactive immersive learning platform (specifically for K-12 and introductory-level geoscience students) in which the learning process is enhanced through seamless navigation between 3D data space and physical space. The second, VR-focused, application creates and empirically evaluates virtual reality (VR) experiences for geoscience research, specifically an interactive volcano experience based on LiDAR and image data of Iceland's Thrihnukar volcano. The prototype addresses the lack of content and tools for immersive virtual reality (iVR) in geoscience education and research, and how to make it easier to integrate iVR into research and classroom experiences. It makes use of environmentally sensed data such that interaction and linked content can be integrated into a single experience. We discuss our workflows as well as methods and authoring tools for iVR analysis and the creation of virtual experiences. 
These methods and tools aim to enhance the utility of geospatial data from repositories such as OpenTopography.org by unlocking treasure troves of geospatial data for VR applications. Their enhanced accessibility in education and research, for the geosciences and beyond, will benefit geoscientists and educators who cannot be expected to be VR and 3D application experts.

  17. Force Sensitive Handles and Capacitive Touch Sensor for Driving a Flexible Haptic-Based Immersive System

    PubMed Central

    Covarrubias, Mario; Bordegoni, Monica; Cugini, Umberto

    2013-01-01

    In this article, we present an approach that uses two force sensitive handles (FSH) and a flexible capacitive touch sensor (FCTS) to drive a haptic-based immersive system. The immersive system has been developed as part of a multimodal interface for product design. The haptic interface consists of a strip that can be used by product designers to evaluate the quality of a 3D virtual shape using touch, vision and hearing, and also to interactively change the shape of the virtual object. Specifically, the user interacts with the FSH to move the virtual object and to appropriately position the haptic interface for retrieving the six degrees of freedom required for both manipulation and modification modalities. The FCTS allows the system to track the movement and position of the user's fingers on the strip, which is used for rendering visual and sound feedback. Two evaluation experiments are described, which involve both the evaluation and the modification of a 3D shape. Results show that the use of the haptic strip for the evaluation of aesthetic shapes is effective and supports product designers in the appreciation of the aesthetic qualities of the shape. PMID:24113680

  19. Virtual Reality to Train Diagnostic Skills in Eating Disorders. Comparison of two Low Cost Systems.

    PubMed

    Gutiérrez-Maldonado, José; Ferrer-García, Marta; Plasanjuanelo, Joana; Andrés-Pueyo, Antonio; Talarn-Caparrós, Antoni

    2015-01-01

    Enhancing the ability to perform differential diagnosis and psychopathological exploration is important for students who wish to work in the clinical field, as well as for professionals already working in this area. Virtual reality (VR) simulations can immerse students totally in educational experiences in a way that is not possible using other methods. Learning in a VR environment can also be more effective and motivating than usual classroom practices. Traditionally, immersion has been considered central to the quality of a VR system; immersive VR is considered a special and unique experience that cannot be achieved by three-dimensional (3D) interactions on desktop PCs. However, some authors have suggested that if the content design is emotionally engaging, immersive systems are not always necessary. The main purpose of this study is to compare the efficacy and usability of two low-cost VR systems, offering different levels of immersion, in order to develop the ability to perform diagnostic interviews in eating disorders by means of simulations of psychopathological explorations.

  20. Radiological tele-immersion for next generation networks.

    PubMed

    Ai, Z; Dech, F; Rasmussen, M; Silverstein, J C

    2000-01-01

    Since the acquisition of high-resolution three-dimensional patient images has become widespread, medical volumetric datasets (CT or MR) larger than 100 MB and encompassing more than 250 slices are common. It is important to make this patient-specific data quickly available and usable to many specialists at different geographical sites. Web-based systems have been developed to provide volume or surface rendering of medical data over networks with low fidelity, but these cannot adequately handle stereoscopic visualization or huge datasets. State-of-the-art virtual reality techniques and high-speed networks have made it possible to create an environment in which geographically distributed clinicians can immersively share these massive datasets in real time. An object-oriented method for instantaneously importing medical volumetric data into Tele-Immersive environments has been developed at the Virtual Reality in Medicine Laboratory (VRMedLab) at the University of Illinois at Chicago (UIC). This networked-VR setup is based on LIMBO, an application framework or template that provides the basic capabilities of Tele-Immersion. We have developed a modular, general-purpose Tele-Immersion program that automatically combines 3D medical data with the methods for handling the data. For this purpose, a DICOM loader for IRIS Performer has been developed. The loader was designed for SGI machines as a shared object, which is executed at LIMBO's runtime. The loader loads not only the selected DICOM dataset, but also methods for rendering, handling, and interacting with the data, bringing networked, real-time, stereoscopic interaction with radiological data to reality. Collaborative, interactive methods currently implemented in the loader include cutting planes and windowing. The Tele-Immersive environment has been tested on the UIC campus over an ATM network.
We tested the environment with 3 nodes: an ImmersaDesk at the VRMedLab, a CAVE at the Electronic Visualization Laboratory (EVL) on the east campus, and a CT scanner in the UIC Hospital. CT data was pulled directly from the scanner to the Tele-Immersion server in our laboratory, and the data was then synchronously distributed by our Onyx2 Rack server to all the VR setups. Rather than confining medical volume visualization to a single VR device, the Tele-Immersive environment combines teleconferencing, tele-presence, and virtual reality to enable geographically distributed clinicians to intuitively interact with the same medical volumetric models and to point, gesture, converse, and see each other. This environment will bring together clinicians at different geographic locations to participate in Tele-Immersive consultation and collaboration.
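The windowing interaction mentioned in this record maps raw CT intensities onto a display range given a window center and width. A minimal pure-Python sketch of that linear mapping (the function, its parameter names, and the soft-tissue preset are illustrative, not taken from the VRMedLab loader):

```python
def apply_window(pixels, center, width, out_min=0, out_max=255):
    """Linear windowing: map raw CT values to a display range.

    Values below the window are clamped to out_min, values above it
    to out_max; values inside the window are scaled linearly.
    """
    lo = center - width / 2.0
    hi = center + width / 2.0
    result = []
    for x in pixels:
        if x <= lo:
            result.append(out_min)
        elif x >= hi:
            result.append(out_max)
        else:
            result.append((x - lo) / (hi - lo) * (out_max - out_min) + out_min)
    return result

# A soft-tissue window (center 40 HU, width 400 HU) on a few raw CT values:
print(apply_window([-1000, 40, 1000], center=40, width=400))  # [0, 127.5, 255]
```

In an interactive viewer, the instructors' window controls would simply re-run a mapping like this over the visible slice.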

  1. Virtual reality enhanced mannequin (VREM) that is well received by resuscitation experts.

    PubMed

    Semeraro, Federico; Frisoli, Antonio; Bergamasco, Massimo; Cerchiari, Erga L

    2009-04-01

    The objective of this study was to test acceptance of, and interest in, a newly developed prototype of a virtual reality enhanced mannequin (VREM) on a sample of congress attendees who volunteered to participate in the evaluation session and to respond to a specifically designed questionnaire. A commercial Laerdal HeartSim 4000 mannequin was augmented with virtual reality (VR) technologies and specially developed VR software to increase the immersive perception of emergency scenarios. To evaluate the acceptance of the VREM, we presented it to a sample of 39 possible users. Each evaluation session involved one trainee and two instructors with a standardized procedure and scenario: the operator was invited by the instructor to wear the data gloves and the head-mounted display and was briefly introduced to the scope of the simulation. The instructor helped the operator familiarize himself with the environment. After the patient's collapse, the operator was asked to check the patient's clinical condition and start CPR. Finally, the patient started to recover signs of circulation and the evaluation session was concluded. Each participant was then asked to respond to a questionnaire designed to explore the trainee's perception in the areas of user-friendliness, realism, and interaction/immersion. Overall, the evaluation of the system was very positive, as was the feeling of immersion and the realism of the environment and simulation. In all, 84.6% of the participants judged the virtual reality experience to be interesting and believed that its development could be very useful for healthcare training. The prototype virtual reality enhanced mannequin was well liked, without interference from the interaction devices, and deserves full technological development and validation in emergency medical training.

  2. Virtual Reality as a Story Telling Platform for Geoscience Communication

    NASA Astrophysics Data System (ADS)

    Lazar, K.; Moysey, S. M.

    2017-12-01

    Capturing the attention of students and the public is a critical step for increasing societal interest and literacy in earth science issues. Virtual reality (VR) provides a means for geoscience engagement that is well suited to place-based learning through exciting and immersive experiences. One approach is to create fully-immersive virtual gaming environments where players interact with physical objects, such as rock samples and outcrops, to pursue geoscience learning goals. Developing an experience like this, however, can require substantial programming expertise and resources. At the other end of the development spectrum, it is possible for anyone to create immersive virtual experiences with 360-degree imagery, which can be made interactive using easy to use VR editing software to embed videos, audio, images, and other content within the 360-degree image. Accessible editing tools like these make the creation of VR experiences something that anyone can tackle. Using the VR editor ThingLink and imagery from Google Maps, for example, we were able to create an interactive tour of the Grand Canyon, complete with embedded assessments, in a matter of hours. The true power of such platforms, however, comes from the potential to engage students as content authors to create and share stories of place that explore geoscience issues from their personal perspective. For example, we have used combinations of 360-degree images with interactive mapping and web platforms to enable students with no programming experience to create complex web apps as highly engaging story telling platforms. We highlight here examples of how we have implemented such story telling approaches with students to assess learning in courses, to share geoscience research outcomes, and to communicate issues of societal importance.

  3. Controlled interaction: strategies for using virtual reality to study perception.

    PubMed

    Durgin, Frank H; Li, Zhi

    2010-05-01

    Immersive virtual reality systems employing head-mounted displays offer great promise for the investigation of perception and action, but there are well-documented limitations to most virtual reality systems. In the present article, we suggest strategies for studying perception/action interactions that depend on scale-invariant metrics (such as power function exponents) and on careful consideration of the requirements of the interactions under investigation. New data concerning the effect of pincushion distortion on the perception of surface orientation are presented, as well as data documenting the perception of dynamic distortions associated with head movements with uncorrected optics. A review of several successful uses of virtual reality to study the interaction of perception and action emphasizes scale-free analysis strategies that can achieve theoretical goals while minimizing assumptions about the accuracy of virtual simulations.
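Scale-invariant metrics of the kind this abstract mentions, power function exponents, are conventionally estimated by linear regression in log-log coordinates, since a power law R = a·S^b becomes a line with slope b after taking logarithms. A minimal illustrative sketch (the synthetic data are hypothetical, not the study's):

```python
import math

def power_law_exponent(stimuli, responses):
    """Estimate the exponent b of R = a * S**b by least-squares
    regression of log(R) on log(S); the slope is the exponent."""
    xs = [math.log(s) for s in stimuli]
    ys = [math.log(r) for r in responses]
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    num = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    den = sum((x - mx) ** 2 for x in xs)
    return num / den

# Synthetic responses generated with exponent 0.67 recover that exponent:
stim = [1, 2, 4, 8, 16]
resp = [2.0 * s ** 0.67 for s in stim]
print(round(power_law_exponent(stim, resp), 2))  # 0.67
```

Because the exponent is unchanged by rescaling either axis, it is robust to the calibration errors (e.g. optical distortion) that plague absolute judgments in head-mounted displays.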

  4. Using Interactive Visualization to Analyze Solid Earth Data and Geodynamics Models

    NASA Astrophysics Data System (ADS)

    Kellogg, L. H.; Kreylos, O.; Billen, M. I.; Hamann, B.; Jadamec, M. A.; Rundle, J. B.; van Aalsburg, J.; Yikilmaz, M. B.

    2008-12-01

    The geological sciences are challenged to manage and interpret increasing volumes of data as observations and simulations increase in size and complexity. Major projects such as EarthScope and GeoEarthScope are producing the data needed to characterize the structure and kinematics of Earth's surface and interior at unprecedented resolution. At the same time, high-performance computing enables high-precision and fine-detail simulation of geodynamics processes, complementing the observational data. To facilitate interpretation and analysis of these datasets, to evaluate models, and to drive future calculations, we have developed methods of interactive visualization with a special focus on using immersive virtual reality (VR) environments to interact with models of Earth's surface and interior. VR has traditionally been used primarily as a presentation tool allowing active navigation through data. Reaping the full intellectual benefits of immersive VR as a tool for accelerated scientific analysis requires building on the method's strengths, that is, using both 3D perception and interaction with observed or simulated data. Our approach to VR takes advantage of the specialized skills of geoscientists who are trained to interpret geological and geophysical data generated from field observations. Interactive tools allow the scientist to explore and interpret geodynamic models, tomographic models, and topographic observations, while feature extraction tools support quantitative measurement of structures that emerge from numerical simulations or field observations. The use of VR technology enables us to improve our interpretation of crust and mantle structure and of geodynamical processes. Mapping tools based on computer visualization allow virtual "field studies" in inaccessible regions, and an interactive tool allows us to construct digital fault models for use in numerical models.
Using the interactive tools on a high-end platform, such as an immersive virtual reality room known as a Cave Automatic Virtual Environment (CAVE), enables the scientist to stand within a three-dimensional dataset while taking measurements. The CAVE involves three or more projection surfaces arranged as walls in a room. Stereo projectors combined with a motion tracking system create an immersive experience that recreates the feeling of carrying out research in the field. This high-end system provides significant advantages for scientists working with complex volumetric data. The interactive tools also work on low-cost platforms that provide stereo views and the potential for interactivity, such as a Geowall or a 3D-enabled TV. The Geowall is also a well-established tool for education and, in combination with the tools we have developed, enables the rapid transfer of research data and new knowledge to the classroom. The interactive visualization tools can also be used on a desktop or laptop with or without stereo capability. Further information about the Virtual Reality User Interface (VRUI), the 3DVisualizer, the virtual mapping tools, and the LIDAR viewer can be found on the KeckCAVES website, www.keckcaves.org.

  5. From Cognitive Capability to Social Reform? Shifting Perceptions of Learning in Immersive Virtual Worlds

    ERIC Educational Resources Information Center

    Savin-Baden, Maggi

    2008-01-01

    Learning in immersive virtual worlds (simulations and virtual worlds such as Second Life) could become a central learning approach in many curricula, but the socio-political impact of virtual world learning on higher education remains under-researched. Much of the recent research into learning in immersive virtual worlds centres around games and…

  6. A Virtual Walk through London: Culture Learning through a Cultural Immersion Experience

    ERIC Educational Resources Information Center

    Shih, Ya-Chun

    2015-01-01

    Integrating Google Street View into a three-dimensional virtual environment in which users control personal avatars provides these said users with access to an innovative, interactive, and real-world context for communication and culture learning. We have selected London, a city famous for its rich historical, architectural, and artistic heritage,…

  7. Learning to Teach in Second Life: A Novice Adventure in Virtual Reality

    ERIC Educational Resources Information Center

    Ellis, Maureen; Anderson, Patricia

    2011-01-01

    Second Life (SL) is a social virtual world, which emphasizes the general use of immersive worlds for supporting a variety of human activities and interactions, presenting a plethora of new opportunities and challenges for enriching how we learn, work and play (Boulos, Hetherington & Wheeler, 2007; Prasolova-Førland, Sourin & Sourina,…

  8. Using CLIPS to represent knowledge in a VR simulation

    NASA Technical Reports Server (NTRS)

    Engelberg, Mark L.

    1994-01-01

    Virtual reality (VR) is an exciting use of advanced hardware and software technologies to achieve an immersive simulation. Until recently, the majority of virtual environments were merely 'fly-throughs' in which a user could freely explore a 3-dimensional world or a visualized dataset. Now that the underlying technologies are reaching a level of maturity, programmers are seeking ways to increase the complexity and interactivity of immersive simulations. In most cases, interactivity in a virtual environment can be specified in the form 'whenever such-and-such happens to object X, it reacts in the following manner.' CLIPS and COOL provide a simple and elegant framework for representing this knowledge base in an efficient manner that can be extended incrementally. The complexity of a detailed simulation becomes more manageable when the control flow is governed by CLIPS' rule-based inference engine rather than by traditional procedural mechanisms. Examples in this paper illustrate an effective way to represent VR information in CLIPS and to tie this knowledge base to the input and output C routines of a typical virtual environment.
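The "whenever such-and-such happens to object X, it reacts in the following manner" pattern described above is the essence of a condition-action rule. A rough Python analogue of that dispatch style (a sketch of the idea only, not the paper's CLIPS/COOL code; the event fields are hypothetical):

```python
class RuleEngine:
    """Tiny sketch of the condition-action pattern the paper attributes
    to CLIPS rules: each rule pairs a condition with a reaction."""

    def __init__(self):
        self.rules = []  # list of (condition, action) pairs

    def rule(self, condition):
        """Decorator that registers an action under a condition."""
        def register(action):
            self.rules.append((condition, action))
            return action
        return register

    def dispatch(self, event):
        """Fire every rule whose condition matches the event."""
        return [action(event) for condition, action in self.rules
                if condition(event)]

engine = RuleEngine()

# "Whenever the user grabs the door, the door opens."
@engine.rule(lambda e: e["type"] == "grab" and e["object"] == "door")
def open_door(e):
    return "door opens"

print(engine.dispatch({"type": "grab", "object": "door"}))  # ['door opens']
```

Adding behavior is then incremental, as the abstract emphasizes: each new rule is registered independently, with no change to the central control flow.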

  9. Immersive Environments - A Connectivist Approach

    NASA Astrophysics Data System (ADS)

    Loureiro, Ana; Bettencourt, Teresa

    We are conducting a research project with the aim of achieving better and more efficient ways to facilitate teaching and learning in higher education. We have chosen virtual environments, with particular emphasis on the Second Life® platform augmented by web 2.0 tools, to develop the study. The Second Life® environment has some interesting characteristics that captured our attention: it is immersive; it is a real-world simulator; it is a social network; it allows real-time communication, cooperation, collaboration and interaction; and it is a safe and controlled environment. We specifically chose tools from web 2.0 that enable sharing and a collaborative way of learning. Through understanding the characteristics of this learning environment, we believe that immersive learning, along with other virtual tools, can be integrated into today's pedagogical practices.

  10. Designing Virtual Museum Using Web3D Technology

    NASA Astrophysics Data System (ADS)

    Zhao, Jianghai

    Virtual reality technology (VRT) has inherent potential for constructing effective learning environments thanks to its '3I' characteristics: Interaction, Immersion and Imagination. As the technology matures, it is being applied to education in increasingly profound ways, and the Virtual Museum is one such application. The Virtual Museum is based on Web3D technology, with extensibility as the most important design factor. After weighing the advantages and disadvantages of the available Web3D technologies, VRML, Cult3D and Viewpoint were chosen. A web chatroom based on Flash and ASP technology has also been created to make the Virtual Museum an interactive learning environment.

  11. Immersive and interactive virtual reality to improve learning and retention of neuroanatomy in medical students: a randomized controlled study.

    PubMed

    Ekstrand, Chelsea; Jamal, Ali; Nguyen, Ron; Kudryk, Annalise; Mann, Jennifer; Mendez, Ivar

    2018-02-23

    Spatial 3-dimensional understanding of the brain is essential to learning neuroanatomy, and 3-dimensional learning techniques have been proposed as tools to enhance neuroanatomy training. The aim of this study was to examine the impact of immersive virtual-reality neuroanatomy training and compare it to traditional paper-based methods. In this randomized controlled study, participants consisted of first- or second-year medical students from the University of Saskatchewan recruited via email and posters displayed throughout the medical school. Participants were randomly assigned to the virtual-reality group or the paper-based group and studied the spatial relations between neural structures for 12 minutes after performing a neuroanatomy baseline test, with both test and control questions. A postintervention test was administered immediately after the study period and 5-9 days later. Satisfaction measures were obtained. Of the 66 participants randomly assigned to the study groups, 64 were included in the final analysis, 31 in the virtual-reality group and 33 in the paper-based group. The 2 groups performed comparably on the baseline questions and showed significant performance improvement on the test questions following study. There were no significant differences between groups for the control questions, the postintervention test questions or the 7-day postintervention test questions. Satisfaction survey results indicated that neurophobia was decreased. Results from this study provide evidence that training in neuroanatomy in an immersive and interactive virtual-reality environment may be an effective neuroanatomy learning tool that warrants further study. They also suggest that integration of virtual-reality into neuroanatomy training may improve knowledge retention, increase study motivation and decrease neurophobia. Copyright 2018, Joule Inc. or its licensors.

  12. Immersive and interactive virtual reality to improve learning and retention of neuroanatomy in medical students: a randomized controlled study

    PubMed Central

    Ekstrand, Chelsea; Jamal, Ali; Nguyen, Ron; Kudryk, Annalise; Mann, Jennifer; Mendez, Ivar

    2018-01-01

    Background: Spatial 3-dimensional understanding of the brain is essential to learning neuroanatomy, and 3-dimensional learning techniques have been proposed as tools to enhance neuroanatomy training. The aim of this study was to examine the impact of immersive virtual-reality neuroanatomy training and compare it to traditional paper-based methods. Methods: In this randomized controlled study, participants consisted of first- or second-year medical students from the University of Saskatchewan recruited via email and posters displayed throughout the medical school. Participants were randomly assigned to the virtual-reality group or the paper-based group and studied the spatial relations between neural structures for 12 minutes after performing a neuroanatomy baseline test, with both test and control questions. A postintervention test was administered immediately after the study period and 5-9 days later. Satisfaction measures were obtained. Results: Of the 66 participants randomly assigned to the study groups, 64 were included in the final analysis, 31 in the virtual-reality group and 33 in the paper-based group. The 2 groups performed comparably on the baseline questions and showed significant performance improvement on the test questions following study. There were no significant differences between groups for the control questions, the postintervention test questions or the 7-day postintervention test questions. Satisfaction survey results indicated that neurophobia was decreased. Interpretation: Results from this study provide evidence that training in neuroanatomy in an immersive and interactive virtual-reality environment may be an effective neuroanatomy learning tool that warrants further study. They also suggest that integration of virtual-reality into neuroanatomy training may improve knowledge retention, increase study motivation and decrease neurophobia. PMID:29510979

  13. Immersive participation: Smartphone-Apps and Virtual Reality - tools for knowledge transfer, citizen science and interactive collaboration

    NASA Astrophysics Data System (ADS)

    Dotterweich, Markus

    2017-04-01

    In the last few years, the use of smartphone apps has become part of our daily routine. However, only a few approaches have been undertaken to use apps for transferring scientific knowledge to the general public. The development of learning apps or serious games requires large efforts and several levels of simplification, which differs from traditional textbooks or learning webpages. Current approaches often lack a connection to real life and/or innovative gamification concepts. Another almost untapped potential is the use of Virtual Reality, a fast-growing technology that replicates a virtual environment in order to simulate physical experiences in artificial or real worlds. Hence, smartphone apps and VR provide new opportunities for capacity building, knowledge transfer, citizen science and interactive engagement in the realm of environmental sciences. This presentation will show some examples and discuss the advantages of these immersive approaches for improving the knowledge transfer between scientists and citizens and for stimulating actions in the real world.

  14. Discovering new methods of data fusion, visualization, and analysis in 3D immersive environments for hyperspectral and laser altimetry data

    NASA Astrophysics Data System (ADS)

    Moore, C. A.; Gertman, V.; Olsoy, P.; Mitchell, J.; Glenn, N. F.; Joshi, A.; Norpchen, D.; Shrestha, R.; Pernice, M.; Spaete, L.; Grover, S.; Whiting, E.; Lee, R.

    2011-12-01

    Immersive virtual reality environments such as the IQ-Station or CAVE (Cave Automated Virtual Environment) offer new and exciting ways to visualize and explore scientific data and are powerful research and educational tools. Combining remote sensing data from a range of sensor platforms in immersive 3D environments can enhance the spectral, textural, spatial, and temporal attributes of the data, which enables scientists to interact with and analyze the data in ways never before possible. Visualization and analysis of large remote sensing datasets in immersive environments requires software customization for integrating LiDAR point cloud data with hyperspectral raster imagery, the generation of quantitative tools for multidimensional analysis, and the development of methods to capture 3D visualizations for stereographic playback. This study uses hyperspectral and LiDAR data acquired over the China Hat geologic study area near Soda Springs, Idaho, USA. The data are fused into a 3D image cube for interactive data exploration and several methods of recording and playback are investigated that include: 1) creating and implementing a Virtual Reality User Interface (VRUI) patch configuration file to enable recording and playback of VRUI interactive sessions within the CAVE and 2) using the LiDAR and hyperspectral remote sensing data and GIS data to create an ArcScene 3D animated flyover, where left- and right-eye visuals are captured from two independent monitors for playback in a stereoscopic player. These visualizations can be used as outreach tools to demonstrate how integrated data and geotechnology techniques can help scientists see, explore, and more adequately comprehend scientific phenomena, both real and abstract.
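At its simplest, fusing LiDAR returns with a hyperspectral raster, as described above, reduces to indexing each point into the raster grid and attaching that cell's spectrum to the point. A hedged pure-Python sketch (the grid geometry, names, and toy data are assumptions for illustration, not the study's processing chain):

```python
def fuse_points_with_raster(points, raster, origin, cell_size):
    """Attach the spectrum of the enclosing raster cell to each LiDAR
    point (x, y, z). raster[row][col] holds a spectrum (list of band
    values); origin is the (x, y) of the raster's upper-left corner.
    Points falling outside the raster are dropped."""
    ox, oy = origin
    fused = []
    for x, y, z in points:
        col = int((x - ox) / cell_size)
        row = int((oy - y) / cell_size)  # raster rows grow downward
        if 0 <= row < len(raster) and 0 <= col < len(raster[0]):
            fused.append((x, y, z, raster[row][col]))
    return fused

# A 2x2 single-band raster with 1 m cells and upper-left corner (0, 2);
# the point (1.5, 0.5) lands in row 1, col 1:
raster = [[[0.1], [0.2]], [[0.3], [0.4]]]
print(fuse_points_with_raster([(1.5, 0.5, 10.0)], raster, (0, 2), 1.0))
# -> [(1.5, 0.5, 10.0, [0.4])]
```

A production pipeline would add reprojection, occlusion handling, and interpolation, but the core point-to-cell lookup is the same.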

  15. Investigating Learners' Attitudes toward Virtual Reality Learning Environments: Based on a Constructivist Approach

    ERIC Educational Resources Information Center

    Huang, Hsiu-Mei; Rauch, Ulrich; Liaw, Shu-Sheng

    2010-01-01

    The use of animation and multimedia for learning is now further extended by the provision of entire Virtual Reality Learning Environments (VRLE). This highlights a shift in Web-based learning from a conventional multimedia to a more immersive, interactive, intuitive and exciting VR learning environment. VRLEs simulate the real world through the…

  16. Safety in Construction Using Virtual Reality (SAVR): A Model for Labor Safety. Working Paper Series WP-022.

    ERIC Educational Resources Information Center

    Hadipriono, Fabian C.; And Others

    An interactive training model called SAVR (Safety in Construction Using Virtual Reality) was developed to train construction students, novice engineers, and construction workers to prevent falls from scaffolding. The model was implemented on a graphics supercomputer, the ONYX Reality Engine2. The SAVR model provides trainees with an immersive,…

  17. A Nationwide Experimental Multi-Gigabit Network

    DTIC Science & Technology

    2003-03-01

    television and cinema, and to real-time interactive teleconferencing. There is another variable which affects this happy growth in network bandwidth and...render large scientific data sets with interactive frame rates on the desktop or in an immersive virtual reality (VR) environment. In our design, we

  18. Crowd behaviour during high-stress evacuations in an immersive virtual environment

    PubMed Central

    Moussaïd, Mehdi; Kapadia, Mubbasir; Thrash, Tyler; Sumner, Robert W.; Gross, Markus; Helbing, Dirk; Hölscher, Christoph

    2016-01-01

    Understanding the collective dynamics of crowd movements during stressful emergency situations is central to reducing the risk of deadly crowd disasters. Yet, their systematic experimental study remains a challenging open problem due to ethical and methodological constraints. In this paper, we demonstrate the viability of shared three-dimensional virtual environments as an experimental platform for conducting crowd experiments with real people. In particular, we show that crowds of real human subjects moving and interacting in an immersive three-dimensional virtual environment exhibit typical patterns of real crowds as observed in real-life crowded situations. These include the manifestation of social conventions and the emergence of self-organized patterns during egress scenarios. High-stress evacuation experiments conducted in this virtual environment reveal movements characterized by mass herding and dangerous overcrowding as they occur in crowd disasters. We describe the behavioural mechanisms at play under such extreme conditions and identify critical zones where overcrowding may occur. Furthermore, we show that herding spontaneously emerges from a density effect without the need to assume an increase of the individual tendency to imitate peers. Our experiments reveal the promise of immersive virtual environments as an ethical, cost-efficient, yet accurate platform for exploring crowd behaviour in high-risk situations with real human subjects. PMID:27605166

  19. Crowd behaviour during high-stress evacuations in an immersive virtual environment.

    PubMed

    Moussaïd, Mehdi; Kapadia, Mubbasir; Thrash, Tyler; Sumner, Robert W; Gross, Markus; Helbing, Dirk; Hölscher, Christoph

    2016-09-01

    Understanding the collective dynamics of crowd movements during stressful emergency situations is central to reducing the risk of deadly crowd disasters. Yet, their systematic experimental study remains a challenging open problem due to ethical and methodological constraints. In this paper, we demonstrate the viability of shared three-dimensional virtual environments as an experimental platform for conducting crowd experiments with real people. In particular, we show that crowds of real human subjects moving and interacting in an immersive three-dimensional virtual environment exhibit typical patterns of real crowds as observed in real-life crowded situations. These include the manifestation of social conventions and the emergence of self-organized patterns during egress scenarios. High-stress evacuation experiments conducted in this virtual environment reveal movements characterized by mass herding and dangerous overcrowding as they occur in crowd disasters. We describe the behavioural mechanisms at play under such extreme conditions and identify critical zones where overcrowding may occur. Furthermore, we show that herding spontaneously emerges from a density effect without the need to assume an increase of the individual tendency to imitate peers. Our experiments reveal the promise of immersive virtual environments as an ethical, cost-efficient, yet accurate platform for exploring crowd behaviour in high-risk situations with real human subjects. © 2016 The Authors.
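One way to "identify critical zones where overcrowding may occur", as this abstract puts it, is a local-density estimate over the tracked positions of virtual pedestrians. The sketch below is pure Python; the 1 m neighbourhood radius and the danger threshold are illustrative assumptions, not the paper's values:

```python
import math

def local_density(positions, radius):
    """Crude local-density estimate: neighbours within `radius` of
    each person, divided by the disc area (persons per square metre)."""
    r2 = radius * radius
    densities = []
    for i, (xi, yi) in enumerate(positions):
        n = sum(1 for j, (xj, yj) in enumerate(positions)
                if i != j and (xi - xj) ** 2 + (yi - yj) ** 2 <= r2)
        densities.append(n / (math.pi * r2))
    return densities

def critical_zones(positions, radius=1.0, threshold=4.0):
    """Indices of people standing where local density exceeds a
    danger threshold (both parameter values are illustrative)."""
    return [i for i, d in enumerate(local_density(positions, radius))
            if d >= threshold]

# 15 people packed within ~0.15 m of each other, plus one far away:
crowd = [(0.01 * k, 0.0) for k in range(15)] + [(100.0, 100.0)]
print(critical_zones(crowd))  # flags all 15 clustered people
```

In an evacuation experiment, running such an estimate per frame over avatar positions would highlight the bottlenecks where herding and overcrowding concentrate.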

  20. The Immersive Virtual Reality Experience: A Typology of Users Revealed Through Multiple Correspondence Analysis Combined with Cluster Analysis Technique.

    PubMed

    Rosa, Pedro J; Morais, Diogo; Gamito, Pedro; Oliveira, Jorge; Saraiva, Tomaz

    2016-03-01

    Immersive virtual reality is thought to be advantageous by leading to higher levels of presence. However, and despite users getting actively involved in immersive three-dimensional virtual environments that incorporate sound and motion, there are individual factors, such as age, video game knowledge, and the predisposition to immersion, that may be associated with the quality of the virtual reality experience. Moreover, one particular concern for users engaged in immersive virtual reality environments (VREs) is the possibility of side effects, such as cybersickness. The literature suggests that at least 60% of virtual reality users report having felt symptoms of cybersickness, which reduces the quality of the virtual reality experience. The aim of this study was thus to profile the right user to be involved in a VRE through a head-mounted display. To examine which user characteristics are associated with the most effective virtual reality experience (lower cybersickness), a multiple correspondence analysis combined with a cluster analysis technique was performed. Results revealed three distinct profiles, showing that the PC gamer profile is associated with higher levels of virtual reality effectiveness, that is, a higher predisposition to be immersed and reduced cybersickness symptoms in the VRE, than the console gamer and nongamer profiles. These findings can be a useful orientation in clinical practice and future research, as they help identify which users are more predisposed to benefit from immersive VREs.
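The profiling pipeline this abstract describes (categorical survey answers, a dimensionality step, then clustering into user profiles) can be approximated in a few lines: indicator-code the answers and cluster the resulting vectors. The sketch below substitutes plain k-means for the paper's multiple correspondence analysis step, so it shows the shape of the workflow rather than the authors' method; the variables, levels, and toy records are invented:

```python
import random

def one_hot(records, schema):
    """Indicator (one-hot) coding of categorical answers -- the kind of
    binary matrix that multiple correspondence analysis operates on."""
    return [[1.0 if rec[var] == level else 0.0
             for var, levels in schema for level in levels]
            for rec in records]

def kmeans(vectors, k, iters=10, seed=0):
    """Plain k-means on the indicator matrix (illustrative stand-in
    for the paper's MCA + cluster analysis). Returns centers, groups."""
    rng = random.Random(seed)
    centers = rng.sample(vectors, k)
    groups = [[] for _ in range(k)]
    for _ in range(iters):
        groups = [[] for _ in range(k)]
        for v in vectors:
            d = [sum((a - b) ** 2 for a, b in zip(v, c)) for c in centers]
            groups[d.index(min(d))].append(v)
        centers = [[sum(col) / len(g) for col in zip(*g)] if g else c
                   for g, c in zip(groups, centers)]
    return centers, groups

schema = [("gamer", ["pc", "console", "none"]),
          ("sick", ["low", "high"])]
records = [{"gamer": "pc", "sick": "low"}] * 3 + \
          [{"gamer": "none", "sick": "high"}] * 3
centers, groups = kmeans(one_hot(records, schema), k=2)
print(sorted(len(g) for g in groups))  # [3, 3]
```

The two recovered clusters separate the hypothetical PC gamers with low cybersickness from the nongamers with high cybersickness, mirroring the kind of profile contrast the study reports.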

  1. Interactive visualization to advance earthquake simulation

    USGS Publications Warehouse

    Kellogg, L.H.; Bawden, G.W.; Bernardin, T.; Billen, M.; Cowgill, E.; Hamann, B.; Jadamec, M.; Kreylos, O.; Staadt, O.; Sumner, D.

    2008-01-01

    The geological sciences are challenged to manage and interpret increasing volumes of data as observations and simulations increase in size and complexity. For example, simulations of earthquake-related processes typically generate complex, time-varying data sets in two or more dimensions. To facilitate interpretation and analysis of these data sets, to evaluate the underlying models, and to drive future calculations, we have developed methods of interactive visualization with a special focus on using immersive virtual reality (VR) environments to interact with models of Earth's surface and interior. Virtual mapping tools allow virtual "field studies" in inaccessible regions. Interactive tools allow us to manipulate shapes in order to construct models of geological features for geodynamic models, while feature extraction tools support quantitative measurement of structures that emerge from numerical simulation or field observations, thereby enabling us to improve our interpretation of the dynamical processes that drive earthquakes. VR has traditionally been used primarily as a presentation tool, albeit with active navigation through data. Reaping the full intellectual benefits of immersive VR as a tool for scientific analysis requires building on the method's strengths, that is, using both 3D perception and interaction with observed or simulated data. This approach also takes advantage of the specialized skills of geological scientists who are trained to interpret the often limited geological and geophysical data available from field observations. © Birkhäuser 2008.

  2. Exploring Design Requirements for Repurposing Dental Virtual Patients From the Web to Second Life: A Focus Group Study

    PubMed Central

    Antoniou, Panagiotis E; Athanasopoulou, Christina A; Dafli, Eleni

    2014-01-01

    Background Since their inception, virtual patients have provided health care educators with a way to engage learners in an experience simulating the clinician’s environment without danger to learners and patients. This has led this learning modality to be accepted as an essential component of medical education. With the advent of the visually and audio-rich 3-dimensional multi-user virtual environment (MUVE), a new deployment platform has emerged for educational content. Immersive, highly interactive, multimedia-rich, MUVEs that seamlessly foster collaboration provide a new hotbed for the deployment of medical education content. Objective This work aims to assess the suitability of the Second Life MUVE as a virtual patient deployment platform for undergraduate dental education, and to explore the requirements and specifications needed to meaningfully repurpose Web-based virtual patients in MUVEs. Methods Through the scripting capabilities and available art assets in Second Life, we repurposed an existing Web-based periodontology virtual patient into Second Life. Through a series of point-and-click interactions and multiple-choice queries, the user experienced a specific periodontology case and was asked to provide the optimal responses for each of the challenges of the case. A focus group of 9 undergraduate dentistry students experienced both the Web-based and the Second Life version of this virtual patient. The group convened 3 times and discussed relevant issues such as the group’s computer literacy, the assessment of Second Life as a virtual patient deployment platform, and compared the Web-based and MUVE-deployed virtual patients. Results A comparison between the Web-based and the Second Life virtual patient revealed the inherent advantages of the more experiential and immersive Second Life virtual environment. However, several challenges for the successful repurposing of virtual patients from the Web to the MUVE were identified. 
The identified challenges for repurposing of Web virtual patients to the MUVE platform from the focus group study were (1) increased case complexity to facilitate the user’s gaming preconception in a MUVE, (2) necessity to decrease textual narration and provide the pertinent information in a more immersive sensory way, and (3) requirement to allow the user to actuate the solutions of problems instead of describing them through narration. Conclusions For a successful systematic repurposing effort of virtual patients to MUVEs such as Second Life, the best practices of experiential and immersive game design should be organically incorporated in the repurposing workflow (automated or not). These findings are pivotal in an era in which open educational content is transferred to and shared among users, learners, and educators of various open repositories/environments. PMID:24927470

  3. Virtual reality and telerobotics applications of an Address Recalculation Pipeline

    NASA Technical Reports Server (NTRS)

    Regan, Matthew; Pose, Ronald

    1994-01-01

    The technology described in this paper was designed to reduce the latency of responses to user interactions in immersive virtual reality environments. It is also ideally suited to telerobotic applications such as interaction with remote robotic manipulators in space or in deep-sea operations. In such circumstances, the significant latency in response to user stimulus caused by communication delays, and the disturbing jerkiness due to low and unpredictable frame rates in compressed video feedback or computationally limited virtual worlds, can be masked by our techniques. The user is provided with highly responsive visual feedback independent of the communication or computational delays involved in providing physical video feedback or in rendering virtual world images. Virtual and physical environments can be combined seamlessly using these techniques.

  4. Towards Gesture-Based Multi-User Interactions in Collaborative Virtual Environments

    NASA Astrophysics Data System (ADS)

    Pretto, N.; Poiesi, F.

    2017-11-01

    We present a virtual reality (VR) setup that enables multiple users to participate in collaborative virtual environments and interact via gestures. A collaborative VR session is established through a network of users that is composed of a server and a set of clients. The server manages the communication amongst clients and is created by one of the users. Each user's VR setup consists of a Head Mounted Display (HMD) for immersive visualisation, a hand tracking system to interact with virtual objects and a single-hand joypad to move in the virtual environment. We use Google Cardboard as an HMD for the VR experience and a Leap Motion for hand tracking, thus making our solution low cost. We evaluate our VR setup through a forensics use case, where real-world objects pertaining to a simulated crime scene are included in a VR environment, acquired using a smartphone-based 3D reconstruction pipeline. Users can interact using virtual gesture-based tools such as pointers and rulers.

  5. Immersive virtual reality simulations in nursing education.

    PubMed

    Kilmon, Carol A; Brown, Leonard; Ghosh, Sumit; Mikitiuk, Artur

    2010-01-01

    This article explores immersive virtual reality as a potential educational strategy for nursing education and describes an immersive learning experience now being developed for nurses. This pioneering project is a virtual reality application targeting speed and accuracy of nurse response in emergency situations requiring cardiopulmonary resuscitation. Other potential uses and implications for the development of virtual reality learning programs are discussed.

  6. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Eric A. Wernert; William R. Sherman; Patrick O'Leary

    Immersive visualization makes use of the medium of virtual reality (VR) - it is a subset of virtual reality focused on the application of VR technologies to scientific and information visualization. As the name implies, there is a particular focus on the physically immersive aspect of VR that more fully engages the perceptual and kinesthetic capabilities of the scientist with the goal of producing greater insight. The immersive visualization community is uniquely positioned to address the analysis needs of the wide spectrum of domain scientists who are becoming increasingly overwhelmed by data. The outputs of computational science simulations and high-resolution sensors are creating a data deluge. Data is coming in faster than it can be analyzed, and there are countless opportunities for discovery that are missed as the data speeds by. By more fully utilizing the scientist's visual and other sensory systems, and by offering a more natural user interface with which to interact with computer-generated representations, immersive visualization offers great promise in taming this data torrent. However, increasing the adoption of immersive visualization in scientific research communities can only happen by simultaneously lowering the engagement threshold while raising the measurable benefits of adoption. Scientists' time spent immersed with their data will thus be rewarded with higher productivity, deeper insight, and improved creativity. Immersive visualization ties together technologies and methodologies from a variety of related but frequently disjoint areas, including hardware, software and human-computer interaction (HCI) disciplines. In many ways, hardware is a solved problem. There are well-established technologies, including large walk-in systems such as the CAVE™ and head-based systems such as the Wide-5™. 
The advent of new consumer-level technologies now enables an entirely new generation of immersive displays, with smaller footprints and costs, widening the potential consumer base. While one would be hard-pressed to call software a solved problem, we now understand considerably more about best practices for designing and developing sustainable, scalable software systems, and we have useful software examples that illuminate the way to even better implementations. As with any research endeavour, HCI will always be exploring new topics in interface design, but we now have a sizable knowledge base of the strengths and weaknesses of the human perceptual systems, and we know how to design effective interfaces for immersive systems. So, in a research landscape with a clear need for better visualization and analysis tools, a methodology in immersive visualization that has been shown to effectively address some of those needs, and vastly improved supporting technologies and knowledge of hardware, software, and HCI, why hasn't immersive visualization 'caught on' more with scientists? What can we do as a community of immersive visualization researchers and practitioners to facilitate greater adoption by scientific communities, so as to make the transition from 'the promise of virtual reality' to 'the reality of virtual reality'?

  7. eduCRATE--a Virtual Hospital architecture.

    PubMed

    Stoicu-Tivadar, Lăcrimioara; Stoicu-Tivadar, Vasile; Berian, Dorin; Drăgan, Simona; Serban, Alexandru; Serban, Corina

    2014-01-01

    eduCRATE is a complex project proposal which aims to develop a virtual learning environment offering interactive digital content through original and integrated solutions using cloud computing, complex multimedia systems in virtual space and personalized design with avatars. Compared to existing similar products, the project brings the novelty of using languages for medical guides in order to ensure a maximum of flexibility. The Virtual Hospital simulations will create interactive clinical scenarios for which students will find solutions for positive diagnosis and therapeutic management. The solution based on cloud computing and immersive multimedia is an attractive option in education because it is economical and matches the current working style of the young generation it addresses.

  8. Ontological implications of being in immersive virtual environments

    NASA Astrophysics Data System (ADS)

    Morie, Jacquelyn F.

    2008-02-01

    The idea of Virtual Reality once conjured up visions of new territories to explore, and expectations of awaiting worlds of wonder. VR has matured to become a practical tool for therapy, medicine and commercial interests, yet artists, in particular, continue to expand the possibilities for the medium. Artistic virtual environments created over the past two decades probe the phenomenological nature of these virtual environments. When we inhabit a fully immersive virtual environment, we have entered into a new form of Being. Not only does our body continue to exist in the real, physical world, but we are also embodied within the virtual by means of technology that translates our bodied actions into interactions with the virtual environment. Very few states in human existence allow this bifurcation of our Being, where we can exist simultaneously in two spaces at once, with the possible exception of metaphysical states such as shamanistic trance and out-of-body experiences. This paper discusses the nature of this simultaneous Being, how we enter the virtual space, what forms of persona we can don there, what forms of spaces we can inhabit, and what type of wondrous experiences we can both hope for and expect.

  9. The German VR Simulation Realism Scale--psychometric construction for virtual reality applications with virtual humans.

    PubMed

    Poeschl, Sandra; Doering, Nicola

    2013-01-01

    Virtual training applications with high levels of immersion or fidelity (for example for social phobia treatment) produce high levels of presence and therefore belong to the most successful Virtual Reality developments. Whereas display and interaction fidelity (as sub-dimensions of immersion) and their influence on presence are well researched, realism of the displayed simulation depends on the specific application and is therefore difficult to measure. We propose to measure simulation realism by using a self-report questionnaire. The German VR Simulation Realism Scale for VR training applications was developed based on a translation of scene realism items from the Witmer-Singer Presence Questionnaire. Items for realism of virtual humans (for example for social phobia training applications) were supplemented. A sample of N = 151 students rated simulation realism of a Fear of Public Speaking application. Four factors were derived by item analysis and principal component analysis (Varimax rotation), representing Scene Realism, Audience Behavior, Audience Appearance and Sound Realism. The scale developed can be used as a starting point for future research and measurement of simulation realism for applications including virtual humans.

  10. ConfocalVR: Immersive Visualization Applied to Confocal Microscopy.

    PubMed

    Stefani, Caroline; Lacy-Hulbert, Adam; Skillman, Thomas

    2018-06-24

    ConfocalVR is a virtual reality (VR) application created to improve the ability of researchers to study the complexity of cell architecture. Confocal microscopes take pictures of fluorescently labeled proteins or molecules at different focal planes to create a stack of 2D images throughout the specimen. Current software applications reconstruct the 3D image and render it as a 2D projection onto a computer screen, where users need to rotate the image to expose the full 3D structure. This process is mentally taxing, breaks down if you stop the rotation, and does not take advantage of the eye's full field of view. ConfocalVR exploits consumer-grade VR systems to fully immerse the user in the 3D cellular image. In this virtual environment the user can: 1) adjust image viewing parameters without leaving the virtual space, 2) reach out and grab the image to quickly rotate and scale it to focus on key features, and 3) interact with other users in a shared virtual space, enabling real-time collaborative exploration and discussion. We found that immersive VR technology allows the user to rapidly understand cellular architecture and protein or molecule distribution. We note that it is impossible to understand the value of immersive visualization without experiencing it first hand, so we encourage readers to get access to a VR system, download this software, and evaluate it for yourself. The ConfocalVR software is available for download at http://www.confocalvr.com, and is free for nonprofits. Copyright © 2018. Published by Elsevier Ltd.

  11. Re-Dimensional Thinking in Earth Science: From 3-D Virtual Reality Panoramas to 2-D Contour Maps

    ERIC Educational Resources Information Center

    Park, John; Carter, Glenda; Butler, Susan; Slykhuis, David; Reid-Griffin, Angelia

    2008-01-01

    This study examines the relationship of gender and spatial perception on student interactivity with contour maps and non-immersive virtual reality. Eighteen eighth-grade students elected to participate in a six-week activity-based course called "3-D GeoMapping." The course included nine days of activities related to topographic mapping.…

  12. Problem-Based Learning Spanning Real and Virtual Words: A Case Study in Second Life

    ERIC Educational Resources Information Center

    Good, Judith; Howland, Katherine; Thackray, Liz

    2008-01-01

    There is a growing use of immersive virtual environments for educational purposes. However, much of this activity is not yet documented in the public domain, or is descriptive rather than analytical. This paper presents a case study in which university students were tasked with building an interactive learning experience using Second Life as a…

  13. Immersive virtual reality improves movement patterns in patients after ACL reconstruction: implications for enhanced criteria-based return-to-sport rehabilitation.

    PubMed

    Gokeler, Alli; Bisschop, Marsha; Myer, Gregory D; Benjaminse, Anne; Dijkstra, Pieter U; van Keeken, Helco G; van Raay, Jos J A M; Burgerhof, Johannes G M; Otten, Egbert

    2016-07-01

    The purpose of this study was to evaluate the influence of immersion in a virtual reality environment on knee biomechanics in patients after ACL reconstruction (ACLR). It was hypothesized that virtual reality techniques aimed at changing attentional focus would influence altered knee flexion angle, knee extension moment and peak vertical ground reaction force (vGRF) in patients following ACLR. Twenty athletes following ACLR and 20 healthy controls (CTRL) performed a step-down task in both a non-virtual reality environment and a virtual reality environment displaying a pedestrian traffic scene. A motion analysis system and force plates were used to measure kinematics and kinetics during the step-down task to analyse each single-leg landing. A significant main effect was found for environment for knee flexion excursion (P = n.s.). Significant interaction differences were found between environment and groups for vGRF (P = 0.004), knee moment (P < 0.001), knee angle at peak vGRF (P = 0.01) and knee flexion excursion (P = 0.03). There was a larger effect of the virtual reality environment on knee biomechanics in patients after ACLR than in controls. Patients after ACLR immersed in the virtual reality environment demonstrated knee joint biomechanics that approximate those of CTRL. The results of this study indicate that a realistic virtual reality scenario may distract patients after ACLR from conscious motor control. Application of clinically available technology may aid current rehabilitation programmes in targeting altered movement patterns after ACLR. Diagnostic study, Level III.

  14. Presence Relates to Distinct Outcomes in Two Virtual Environments Employing Different Learning Modalities

    PubMed Central

    Persky, Susan; Kaphingst, Kimberly A.; McCall, Cade; Lachance, Christina; Beall, Andrew C.; Blascovich, Jim

    2009-01-01

    Presence in virtual learning environments (VLEs) has been associated with a number of outcome factors related to a user’s ability and motivation to learn. The extant but relatively small body of research suggests that a high level of presence is related to better performance on learning outcomes in VLEs. Different configurations of form and content variables such as those associated with active (self-driven, interactive activities) versus didactic (reading or lecture) learning may, however, influence how presence operates and on what content it operates. We compared the influence of presence between two types of immersive VLEs (i.e., active versus didactic techniques) on comprehension and engagement-related outcomes. The findings revealed that the active VLE promoted greater presence. Although we found no relationship between presence and learning comprehension outcomes for either virtual environment, presence was related to information engagement variables in the didactic immersive VLE but not the active environment. Results demonstrate that presence is not uniformly elicited or effective across immersive VLEs. Educational delivery mode and environment complexity may influence the impact of presence on engagement. PMID:19366319

  16. Usability Testing of an Interactive Virtual Reality Distraction Intervention to Reduce Procedural Pain in Children and Adolescents With Cancer.

    PubMed

    Birnie, Kathryn A; Kulandaivelu, Yalinie; Jibb, Lindsay; Hroch, Petra; Positano, Karyn; Robertson, Simon; Campbell, Fiona; Abla, Oussama; Stinson, Jennifer

    2018-06-01

    Needle procedures are among the most distressing aspects of pediatric cancer-related treatment. Virtual reality (VR) distraction offers promise for needle-related pain and distress given its highly immersive and interactive virtual environment. This study assessed the usability (ease of use and understanding, acceptability) of a custom VR intervention for children with cancer undergoing implantable venous access device (IVAD) needle insertion. Three iterative cycles of mixed-method usability testing with semistructured interviews were undertaken to refine the VR. Participants included 17 children and adolescents (8-18 years old) with cancer who used the VR intervention prior to or during IVAD access. Most participants reported the VR as easy to use (82%) and understand (94%), and would like to use it during subsequent needle procedures (94%). Based on usability testing, refinements were made to VR hardware, software, and clinical implementation. Refinements focused on increasing responsiveness, interaction, and immersion of the VR program, reducing head movement for VR interaction, and enabling participant alerts to steps of the procedure by clinical staff. No adverse events of nausea or dizziness were reported. The VR intervention was deemed acceptable and safe. Next steps include assessing feasibility and effectiveness of the VR intervention for pain and distress.

  17. Machinima Interventions: Innovative Approaches to Immersive Virtual World Curriculum Integration

    ERIC Educational Resources Information Center

    Middleton, Andrew John; Mather, Richard

    2008-01-01

    The educational value of Immersive Virtual Worlds (IVWs) seems to be in their social immersive qualities and as an accessible simulation technology. In contrast to these synchronous applications this paper discusses the use of educational machinima developed in IVW virtual film sets. It also introduces the concept of media intervention, proposing…

  18. Data Visualization Using Immersive Virtual Reality Tools

    NASA Astrophysics Data System (ADS)

    Cioc, Alexandru; Djorgovski, S. G.; Donalek, C.; Lawler, E.; Sauer, F.; Longo, G.

    2013-01-01

    The growing complexity of scientific data poses serious challenges for effective visualization. Data sets, e.g., catalogs of objects detected in sky surveys, can have a very high dimensionality, ~ 100 - 1000. Visualizing such hyper-dimensional data parameter spaces is essentially impossible, but there are ways of visualizing up to ~ 10 dimensions in a pseudo-3D display. We have been experimenting with the emerging technologies of immersive virtual reality (VR) as a platform for scientific, interactive, collaborative data visualization. Our initial experiments used the virtual world of Second Life, and more recently VR worlds based on its open source code, OpenSimulator. There we can visualize up to ~ 100,000 data points in ~ 7 - 8 dimensions (3 spatial and others encoded as shapes, colors, sizes, etc.), in an immersive virtual space where scientists can interact with their data and with each other. We are now developing a more scalable visualization environment using the popular (practically an emerging standard) Unity 3D Game Engine, coded using C#, JavaScript, and the Unity Scripting Language. This visualization tool can be used through a standard web browser or a standalone browser of its own. Rather than merely plotting data points, the application creates interactive three-dimensional objects whose shapes, colors, sizes, and of course XYZ positions encode various dimensions of the parameter space and can be assigned interactively. Multiple users can navigate through this data space simultaneously, either with their own independent vantage points or with a shared view. At this stage ~ 100,000 data points can be easily visualized within seconds on a simple laptop. The displayed data points can contain linked information; e.g., upon clicking a data point, a webpage with additional information can be rendered within the 3D world. A range of functionalities has already been deployed, and more are being added. 
We expect to make this visualization tool freely available to the academic community within a few months, on an experimental (beta testing) basis.

  19. Estimating the gaze of a virtuality human.

    PubMed

    Roberts, David J; Rae, John; Duckworth, Tobias W; Moore, Carl M; Aspin, Rob

    2013-04-01

    The aim of our experiment is to determine if eye-gaze can be estimated from a virtuality human: to within the accuracies that underpin social interaction; and reliably across gaze poses and camera arrangements likely in everyday settings. The scene is set by explaining why Immersive Virtuality Telepresence has the potential to meet the grand challenge of faithfully communicating both the appearance and the focus of attention of a remote human participant within a shared 3D computer-supported context. Within the experiment, n=22 participants rotated static 3D virtuality humans, reconstructed from surround images, until they felt most looked at. The dependent variable was absolute angular error, which was compared to that underpinning social gaze behaviour in the natural world. Independent variables were 1) relative orientations of eye, head and body of the captured subject; and 2) subset of cameras used to texture the form. Analysis looked for statistical and practical significance and qualitative corroborating evidence. The analysed results tell us much about the importance and detail of the relationship between gaze pose, method of video-based reconstruction, and camera arrangement. They tell us that virtuality can reproduce gaze to an accuracy useful in social interaction, but with the adopted method of Video Based Reconstruction, this is highly dependent on the combination of gaze pose and camera arrangement. This suggests changes in the VBR approach in order to allow more flexible camera arrangements. The work is of interest to those wanting to support expressive meetings that are both socially and spatially situated, and particularly those using or building Immersive Virtuality Telepresence to accomplish this. It is also of relevance to the use of virtuality humans in applications ranging from the study of human interactions to gaming and the crossing of the stage line in films and TV.

  20. The perception of spatial layout in real and virtual worlds.

    PubMed

    Arthur, E J; Hancock, P A; Chrysler, S T

    1997-01-01

    As human-machine interfaces grow more immersive and graphically oriented, virtual environment systems become more prominent as the medium for human-machine communication. Often, virtual environments (VE) are built to provide exact metrical representations of existing or proposed physical spaces. However, it is not known how individuals develop representational models of the spaces in which they are immersed, and how those models may be distorted with respect to both the virtual and real-world equivalents. To evaluate the process of model development, the present experiment examined participants' ability to reproduce a complex spatial layout of objects, having experienced them previously under different viewing conditions. The layout consisted of nine common objects arranged on a flat plane. These objects could be viewed in a free binocular virtual condition, a free binocular real-world condition, and in a static monocular view of the real world. The first two allowed active exploration of the environment, while the latter condition allowed the participant only a passive opportunity to observe from a single viewpoint. Viewing condition was a between-subject variable with 10 participants randomly assigned to each condition. Performance was assessed using mapping accuracy and triadic comparisons of relative inter-object distances. Mapping results showed a significant effect of viewing condition where, interestingly, the static monocular condition was superior to both the active virtual and real binocular conditions. Results for the triadic comparisons showed a significant interaction of gender by viewing condition, in which males were more accurate than females. These results suggest that the situation model resulting from interaction with a virtual environment was indistinguishable from interaction with real objects, at least within the constraints of the present procedure.

  1. Real-time interactive virtual tour on the World Wide Web (WWW)

    NASA Astrophysics Data System (ADS)

    Yoon, Sanghyuk; Chen, Hai-jung; Hsu, Tom; Yoon, Ilmi

    2003-12-01

    Web-based Virtual Tour has become a desirable and in-demand application, yet a challenging one due to the nature of a web application's running environment, such as limited bandwidth and no guarantee of high computation power on the client side. The image-based rendering approach has attractive advantages over the traditional 3D rendering approach in such web applications. The traditional approach, such as VRML, requires a labor-intensive 3D modeling process, high bandwidth and high computation power, especially for photo-realistic virtual scenes. QuickTime VR and IPIX, as examples of the image-based approach, use panoramic photos, and the virtual scenes can be generated directly from photos, skipping the modeling process. However, these image-based approaches may require special cameras or effort to take panoramic views, and they provide only a fixed-point look-around and zooming in and out rather than a 'walk around', which is a very important feature for providing an immersive experience to virtual tourists. The Web-based Virtual Tour using Tour into the Picture employs pseudo-3D geometry with an image-based rendering approach to provide viewers with the immersive experience of walking around the virtual space using several snapshots of conventional photos.

  2. Enabling scientific workflows in virtual reality

    USGS Publications Warehouse

    Kreylos, O.; Bawden, G.; Bernardin, T.; Billen, M.I.; Cowgill, E.S.; Gold, R.D.; Hamann, B.; Jadamec, M.; Kellogg, L.H.; Staadt, O.G.; Sumner, D.Y.

    2006-01-01

    To advance research and improve the scientific return on data collection and interpretation efforts in the geosciences, we have developed methods of interactive visualization, with a special focus on immersive virtual reality (VR) environments. The Earth sciences employ a strongly visual approach to the measurement and analysis of geologic data because of the spatial and temporal scales over which such data range. As observations and simulations increase in size and complexity, the Earth sciences are challenged to manage and interpret increasing amounts of data. Reaping the full intellectual benefits of immersive VR requires us to tailor exploratory approaches to scientific problems. These applications build on the visualization method's strengths, using both 3D perception and interaction with data and models, to take advantage of the skills and training of the geological scientists exploring their data in the VR environment. This interactive approach has enabled us to develop a suite of tools that are adaptable to a range of problems in the geosciences and beyond. Copyright © 2008 by the Association for Computing Machinery, Inc.

  3. Playing in or out of character: user role differences in the experience of interactive storytelling.

    PubMed

    Roth, Christian; Vermeulen, Ivar; Vorderer, Peter; Klimmt, Christoph; Pizzi, David; Lugrin, Jean-Luc; Cavazza, Marc

    2012-11-01

    Interactive storytelling (IS) is a promising new entertainment technology synthesizing preauthored narrative with dynamic user interaction. Existing IS prototypes employ different modes to involve users in a story, ranging from individual avatar control to comprehensive control over the virtual environment. The current experiment tested whether different player modes (exerting local vs. global influence) yield different user experiences (e.g., senses of immersion vs. control). A within-subject design involved 34 participants playing the cinematic IS drama "Emo Emma" both in the local (actor) and in the global (ghost) mode. The latter mode allowed free movement in the virtual environment and hidden influence on characters, objects, and story development. As expected, control-related experiential qualities (effectance, autonomy, flow, and pride) were more intense for players in the global (ghost) mode. Immersion-related experiences did not differ across modes. Additionally, men preferred the sense of command facilitated by the ghost mode, whereas women preferred the sense of involvement facilitated by the actor mode.

  4. Mathematical Basis of Knowledge Discovery and Autonomous Intelligent Architectures - Technology for the Creation of Virtual objects in the Real World

    DTIC Science & Technology

    2005-12-14

    control of position/orientation of mobile TV cameras. 9 Unit 9 Force interaction system Unit 6 Helmet-mounted displays robot-like device drive...joints of the master arm (see Unit 1) whose joint coordinates are tracked by the virtual manipulator. Unit 6. Two displays built into the helmet...special device for simulating the tactile-kinaesthetic effect of immersion. When the virtual body is a manipulator it comprises: − master arm with 6

  5. Not Just a Game … When We Play Together, We Learn Together: Interactive Virtual Environments and Gaming Engines for Geospatial Visualization

    NASA Astrophysics Data System (ADS)

    Shipman, J. S.; Anderson, J. W.

    2017-12-01

    An ideal tool for ecologists and land managers to investigate the impacts of both projected environmental changes and policy alternatives is the creation of immersive, interactive, virtual landscapes. As a new frontier in visualizing and understanding geospatial data, virtual landscapes require a new toolbox for data visualization that includes traditional GIS tools as well as less common tools such as the Unity3D game engine. Game engines provide capabilities not only to explore data but also to build and interact with dynamic models collaboratively. These virtual worlds can display and illustrate data in ways that are often more understandable and plausible to both stakeholders and policy makers than is achieved using traditional maps. Within this context we will present funded research that utilizes virtual landscapes for geographic visualization and decision support among varied stakeholders. We will highlight the challenges and lessons learned when developing interactive virtual environments that require large multidisciplinary team efforts with varied competencies. The results will emphasize the importance of visualization and interactive virtual environments and their link with emerging research disciplines within Visual Analytics.

  6. Eye height scaling of absolute size in immersive and nonimmersive displays

    NASA Technical Reports Server (NTRS)

    Dixon, M. W.; Wraga, M.; Proffitt, D. R.; Williams, G. C.; Kaiser, M. K. (Principal Investigator)

    2000-01-01

    Eye-height (EH) scaling of absolute height was investigated in three experiments. In Experiment 1, standing observers viewed cubes in an immersive virtual environment. Observers' center of projection was placed at actual EH and at 0.7 times actual EH. Observers' size judgments revealed that the EH manipulation was 76.8% effective. In Experiment 2, seated observers viewed the same cubes on an interactive desktop display; however, no effect of EH was found in response to the simulated EH manipulation. Experiment 3 tested standing observers in the immersive environment with the field of view reduced to match that of the desktop. Comparable to Experiment 1, the effect of EH was 77%. These results suggest that EH scaling is not generally used when people view an interactive desktop display because the altitude of the center of projection is indeterminate. EH scaling is spontaneously evoked, however, in immersive environments.
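The eye-height scaling account in this abstract can be expressed numerically. The sketch below is our illustration of the logic, not the authors' analysis: an observer who scales size by eye height should judge objects larger when the simulated viewpoint is lowered to 0.7 times actual EH, and the reported 76.8% effectiveness interpolates between no effect and that full prediction.

```python
# Hedged sketch of eye-height (EH) scaling: an object subtending h
# eye-heights is judged to be h * assumed_EH tall. Lowering the simulated
# viewpoint to 0.7x actual EH should inflate size judgments by 1/0.7; an
# "effectiveness" factor expresses how much of that predicted shift
# observers actually show.

def predicted_judged_size(true_size, eh_scale, effectiveness=1.0):
    """true_size: physical object height; eh_scale: simulated/actual EH ratio.
    Full EH scaling predicts judged = true_size / eh_scale; effectiveness
    (0..1) interpolates between no effect and the full prediction."""
    full = true_size / eh_scale
    return true_size + effectiveness * (full - true_size)

cube = 1.0  # meters, hypothetical object
print(predicted_judged_size(cube, 0.7, 1.0))    # ~1.43 under full EH scaling
print(predicted_judged_size(cube, 0.7, 0.768))  # ~1.33, the reported 76.8%
```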

  7. Interactive Visualization to Advance Earthquake Simulation

    NASA Astrophysics Data System (ADS)

    Kellogg, Louise H.; Bawden, Gerald W.; Bernardin, Tony; Billen, Magali; Cowgill, Eric; Hamann, Bernd; Jadamec, Margarete; Kreylos, Oliver; Staadt, Oliver; Sumner, Dawn

    2008-04-01

    The geological sciences are challenged to manage and interpret increasing volumes of data as observations and simulations increase in size and complexity. For example, simulations of earthquake-related processes typically generate complex, time-varying data sets in two or more dimensions. To facilitate interpretation and analysis of these data sets, evaluate the underlying models, and drive future calculations, we have developed methods of interactive visualization with a special focus on using immersive virtual reality (VR) environments to interact with models of Earth’s surface and interior. Virtual mapping tools allow virtual “field studies” in inaccessible regions. Interactive tools allow us to manipulate shapes in order to construct models of geological features for geodynamic models, while feature extraction tools support quantitative measurement of structures that emerge from numerical simulation or field observations, thereby enabling us to improve our interpretation of the dynamical processes that drive earthquakes. VR has traditionally been used primarily as a presentation tool, albeit with active navigation through data. Reaping the full intellectual benefits of immersive VR as a tool for scientific analysis requires building on the method’s strengths, that is, using both 3D perception and interaction with observed or simulated data. This approach also takes advantage of the specialized skills of geological scientists who are trained to interpret the often limited geological and geophysical data available from field observations.

  8. The Efficacy of an Immersive 3D Virtual versus 2D Web Environment in Intercultural Sensitivity Acquisition

    ERIC Educational Resources Information Center

    Coffey, Amy Jo; Kamhawi, Rasha; Fishwick, Paul; Henderson, Julie

    2017-01-01

    Relatively few studies have empirically tested computer-based immersive virtual environments' efficacy in teaching or enhancing pro-social attitudes, such as intercultural sensitivity. This channel study experiment was conducted (N = 159) to compare what effects, if any, an immersive 3D virtual environment would have upon subjects' intercultural…

  9. Simulation Exploration through Immersive Parallel Planes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Brunhart-Lupo, Nicholas J; Bush, Brian W; Gruchalla, Kenny M

    We present a visualization-driven simulation system that tightly couples systems-dynamics simulations with an immersive virtual environment to allow analysts to rapidly develop and test hypotheses in a high-dimensional parameter space. To accomplish this, we generalize the two-dimensional parallel-coordinates statistical graphic as an immersive 'parallel-planes' visualization for multivariate time series emitted by simulations running in parallel with the visualization. In contrast to traditional parallel coordinates, which map the multivariate dimensions onto coordinate axes represented by a series of parallel lines, we map pairs of the multivariate dimensions onto a series of parallel rectangles. As in parallel coordinates, each individual observation in the dataset is mapped to a polyline whose vertices coincide with its coordinate values. Regions of the rectangles can be 'brushed' to highlight and select observations of interest; a 'slider' control allows the user to filter the observations by their time coordinate. In an immersive virtual environment, users interact with the parallel planes using a joystick that can select regions on the planes, manipulate selections, and filter time. The brushing and selection actions are used both to explore existing data and to launch additional simulations corresponding to the visually selected portions of the input parameter space. As soon as the new simulations complete, their resulting observations are displayed in the virtual environment. This tight feedback loop between simulation and immersive analytics accelerates users' realization of insights about the simulation and its output.
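The parallel-planes mapping described in this abstract can be sketched in a few lines. The following is a schematic illustration with a hypothetical data layout (observations as flat tuples of paired dimensions), not the system's actual code:

```python
# Schematic sketch of the parallel-planes idea: each pair of dimensions
# becomes one 2D plane, an observation becomes a polyline through its
# per-plane points, and a brush rectangle on one plane selects the
# observations whose polyline passes through it.

def to_polyline(obs):
    """obs: flat tuple of 2k values -> k per-plane (x, y) vertices."""
    return [(obs[i], obs[i + 1]) for i in range(0, len(obs), 2)]

def brush(observations, plane, rect):
    """Select observations whose vertex on `plane` lies inside
    rect = (x0, y0, x1, y1)."""
    x0, y0, x1, y1 = rect
    hits = []
    for obs in observations:
        x, y = to_polyline(obs)[plane]
        if x0 <= x <= x1 and y0 <= y <= y1:
            hits.append(obs)
    return hits

data = [(0.1, 0.2, 0.5, 0.5), (0.9, 0.9, 0.4, 0.6)]  # two 4-D observations
print(brush(data, plane=1, rect=(0.3, 0.4, 0.6, 0.7)))  # both pass plane 1
```

In the actual system the brush is drawn with a joystick on a rectangle in the immersive environment, and a time slider applies a further filter on the time coordinate of each observation.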

  11. Interactive Molecular Graphics for Augmented Reality Using HoloLens.

    PubMed

    Müller, Christoph; Krone, Michael; Huber, Markus; Biener, Verena; Herr, Dominik; Koch, Steffen; Reina, Guido; Weiskopf, Daniel; Ertl, Thomas

    2018-06-13

    Immersive technologies like stereo rendering, virtual reality, or augmented reality (AR) are often used in the field of molecular visualisation. Modern, comparably lightweight and affordable AR headsets like Microsoft's HoloLens open up new possibilities for immersive analytics in molecular visualisation. A crucial factor for a comprehensive analysis of molecular data in AR is the rendering speed. HoloLens, however, has limited hardware capabilities due to requirements like battery life, fanless cooling and weight. Consequently, insights from best practices for powerful desktop hardware may not be transferable. Therefore, we evaluate the capabilities of the HoloLens hardware for modern, GPU-enabled, high-quality rendering methods for the space-filling model commonly used in molecular visualisation. We also assess the scalability for large molecular data sets. Based on the results, we discuss ideas and possibilities for immersive molecular analytics. Besides more obvious benefits like the stereoscopic rendering offered by the device, this specifically includes natural user interfaces that use physical navigation instead of the traditional virtual one. Furthermore, we consider different scenarios for such an immersive system, ranging from educational use to collaborative scenarios.

  12. Immersive Training Systems: Virtual Reality and Education and Training.

    ERIC Educational Resources Information Center

    Psotka, Joseph

    1995-01-01

    Describes virtual reality (VR) technology and VR research on education and training. Focuses on immersion as the key added value of VR, analyzes cognitive variables connected to immersion, how it is generated in synthetic environments and its benefits. Discusses value of tracked, immersive visual displays over nonimmersive simulations. Contains 78…

  13. Inclusion of Immersive Virtual Learning Environments and Visual Control Systems to Support the Learning of Students with Asperger Syndrome

    ERIC Educational Resources Information Center

    Lorenzo, Gonzalo; Pomares, Jorge; Lledo, Asuncion

    2013-01-01

    This paper presents the use of immersive virtual reality systems in educational interventions with Asperger students. The starting point of this study is these students' cognitive style, which requires an explicit teaching style supported by visual aids and highly structured environments. The proposed immersive virtual reality…

  14. Visualizing the process of interaction in a 3D environment

    NASA Astrophysics Data System (ADS)

    Vaidya, Vivek; Suryanarayanan, Srikanth; Krishnan, Kajoli; Mullick, Rakesh

    2007-03-01

    As the imaging modalities used in medicine transition to increasingly three-dimensional data, the question of how best to interact with and analyze these data becomes ever more pressing. Immersive virtual reality systems seem to hold promise in tackling this, but how individuals learn and interact in these environments is not fully understood. Here we attempt to show some methods by which user interaction in a virtual reality environment can be visualized, and how this can allow us to gain greater insight into the process of interaction and learning in these systems. We also explore the possibility of using this method to improve understanding and management of ergonomic issues within an interface.

  15. Cognitive evaluation for the diagnosis of Alzheimer's disease based on Turing Test and Virtual Environments.

    PubMed

    Fernandez Montenegro, Juan Manuel; Argyriou, Vasileios

    2017-05-01

    Alzheimer's screening tests are commonly used by doctors to diagnose the patient's condition and stage as early as possible. Most of these tests are based on pen-and-paper interaction and do not embrace the advantages provided by new technologies. This paper proposes novel Alzheimer's screening tests based on virtual environments and game principles using new immersive technologies combined with advanced Human Computer Interaction (HCI) systems. These new tests are focused on the immersion of the patient in a virtual room, in order to mislead and deceive the patient's mind. In addition, we propose two novel variations of the Turing Test as a method to detect dementia. As a result, four tests are introduced, demonstrating the wide range of screening mechanisms that could be designed using virtual environments and game concepts. The proposed tests are focused on the evaluation of memory loss related to common objects, recent conversations and events; the diagnosis of problems in expressing and understanding language; the ability to recognize abnormalities; and the ability to differentiate between virtual worlds and reality, or humans and machines. The proposed screening tests were evaluated and tested using both patients and healthy adults in a comparative study with state-of-the-art Alzheimer's screening tests. The results show the capacity of the new tests to distinguish healthy people from Alzheimer's patients. Copyright © 2017. Published by Elsevier Inc.

  16. Linking Immersive Virtual Field Trips with an Adaptive Learning Platform

    NASA Astrophysics Data System (ADS)

    Bruce, G.; Taylor, W.; Anbar, A. D.; Semken, S. C.; Buxner, S.; Mead, C.; El-Moujaber, E.; Summons, R. E.; Oliver, C.

    2016-12-01

    The use of virtual environments in science education has been constrained by the difficulty of guiding a learner's actions within those environments. In this work, we demonstrate how advances in education software technology allow educators to create interactive learning experiences that respond and adapt intelligently to learner input within the virtual environment. This innovative technology provides a far greater capacity for delivering authentic inquiry-driven educational experiences in unique settings from around the world. Our immersive virtual field trips (iVFT) bring students virtually to geologically significant but inaccessible environments, where they learn through authentic practices of scientific inquiry. In one recent example, students explore the fossil beds in Nilpena, South Australia to learn about the Ediacaran fauna. Students interactively engage in 360° recreations of the environment, uncover the nature of the historical ecosystem by identifying fossils with a dichotomous key, explore actual fossil beds in high-resolution imagery, and reconstruct what an ecosystem might have looked like millions of years ago in an interactive simulation. With the new capacity to connect actions within the iVFT to an intelligent tutoring system, these learning experiences can be tracked, guided, and tailored individually to the immediate actions of the student. This new capacity also has great potential for learning designers to take a data-driven approach to lesson improvement and for education researchers to study learning in virtual environments. Thus, we expect iVFT to be fertile ground for novel research. Such iVFT are currently in use in several introductory classes offered online at Arizona State University in anthropology, introductory biology, and astrobiology, reaching thousands of students to date. Drawing from these experiences, we are designing a curriculum for historical geology that will be built around iVFT-based exploration of Earth history.

  17. Using voice input and audio feedback to enhance the reality of a virtual experience

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Miner, N.E.

    1994-04-01

    Virtual Reality (VR) is a rapidly emerging technology which allows participants to experience a virtual environment through stimulation of the participant's senses. Intuitive and natural interactions with the virtual world help to create a realistic experience. Typically, a participant is immersed in a virtual environment through the use of a 3-D viewer. Realistic, computer-generated environment models and accurate tracking of a participant's view are important factors for adding realism to a virtual experience. Stimulating a participant's sense of sound and providing a natural form of communication for interacting with the virtual world are equally important. This paper discusses the advantages and importance of incorporating voice recognition and audio feedback capabilities into a virtual world experience. Various approaches and levels of complexity are discussed. Examples of the use of voice and sound are presented through the description of a research application developed in the VR laboratory at Sandia National Laboratories.

  18. Experience with V-STORE: considerations on presence in virtual environments for effective neuropsychological rehabilitation of executive functions.

    PubMed

    Lo Priore, Corrado; Castelnuovo, Gianluca; Liccione, Diego; Liccione, Davide

    2003-06-01

    The paper discusses the use of immersive virtual reality (IVR) systems for the cognitive rehabilitation of dysexecutive syndrome, usually caused by prefrontal brain injuries. Compared with classical paper-and-pencil (P&P) and flat-screen computer rehabilitative tools, IVR systems might prove capable of evoking a more intense and compelling sense of presence, thanks to the highly naturalistic subject-environment interaction they allow. Within a constructivist framework applied to holistic rehabilitation, we suggest that this difference might enhance the ecological validity of cognitive training, partly overcoming the implicit limits of a lab setting, which seem to affect non-immersive procedures especially when applied to dysexecutive symptoms. We tested presence in a pilot study of a new VR-based rehabilitation tool for executive functions, V-Store, which allows patients to explore a virtual environment where they solve six series of tasks, ordered by complexity and designed to stimulate executive functions, programming, categorical abstraction, short-term memory, and attention. We compared the sense of presence experienced by unskilled normal subjects, randomly assigned to immersive or non-immersive (flat-screen) sessions of V-Store, through four different indices: a self-report questionnaire, a psychophysiological measure (GSR, skin conductance), a neuropsychological measure (incidental recall memory test related to auditory information coming from the "real" environment), and a count of breaks in presence (BIPs). Preliminary results show a significantly higher GSR response during tasks in the immersive group; the neuropsychological data (fewer recalled elements from "reality") and fewer BIPs show a congruent but non-significant advantage for the immersive condition; no differences were evident from the self-report questionnaire. A larger experimental group is currently under examination to evaluate the significance of these data, which might also prove interesting with respect to the question of objective versus subjective measures of presence.

  19. Evaluating Remapped Physical Reach for Hand Interactions with Passive Haptics in Virtual Reality.

    PubMed

    Han, Dustin T; Suhail, Mohamed; Ragan, Eric D

    2018-04-01

    Virtual reality often uses motion tracking to incorporate physical hand movements into interaction techniques for selection and manipulation of virtual objects. To increase realism and allow direct hand interaction, real-world physical objects can be aligned with virtual objects to provide tactile feedback and physical grasping. However, unless a physical space is custom configured to match a specific virtual reality experience, the ability to perfectly match the physical and virtual objects is limited. Our research addresses this challenge by studying methods that allow one physical object to be mapped to multiple virtual objects that can exist at different virtual locations in an egocentric reference frame. We study two such techniques: one that introduces a static translational offset between the virtual and physical hand before a reaching action, and one that dynamically interpolates the position of the virtual hand during a reaching motion. We conducted two experiments to assess how the two methods affect reaching effectiveness, comfort, and ability to adapt to the remapping techniques when reaching for objects with different types of mismatches between physical and virtual locations. We also present a case study to demonstrate how the hand remapping techniques could be used in an immersive game application to support realistic hand interaction while optimizing usability. Overall, the translational technique performed better than the interpolated reach technique and was more robust for situations with larger mismatches between virtual and physical objects.
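The two remapping techniques compared in this abstract can be sketched as follows; the function names and the distance-based blend schedule are illustrative assumptions, not the paper's implementation:

```python
# Sketch of the two hand-remapping techniques described above. Both map
# the tracked physical hand to a virtual hand so that one physical prop
# can stand in for a virtual object at a different location.

def static_offset(physical_hand, virtual_target, physical_target):
    """Apply a constant translational offset before the reach begins."""
    offset = tuple(v - p for v, p in zip(virtual_target, physical_target))
    return tuple(h + o for h, o in zip(physical_hand, offset))

def interpolated_reach(physical_hand, start, virtual_target, physical_target):
    """Blend the offset in progressively as the hand approaches the prop."""
    total = sum((pt - s) ** 2 for pt, s in zip(physical_target, start)) ** 0.5
    gone = sum((h - s) ** 2 for h, s in zip(physical_hand, start)) ** 0.5
    alpha = min(gone / total, 1.0) if total > 0 else 1.0
    offset = tuple(v - p for v, p in zip(virtual_target, physical_target))
    return tuple(h + alpha * o for h, o in zip(physical_hand, offset))

# Halfway through the reach, only half of the offset is applied:
print(interpolated_reach((0.5, 0, 0), (0, 0, 0), (1.2, 0, 0.3), (1.0, 0, 0)))
```

With the static technique the full offset is present from the start of the reach; with the interpolated technique the virtual hand drifts away from the physical hand gradually, so both hands coincide with their respective targets only at the end of the motion.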

  20. Restorative effects of virtual nature settings.

    PubMed

    Valtchanov, Deltcho; Barton, Kevin R; Ellard, Colin

    2010-10-01

    Previous research regarding the potential benefits of exposing individuals to surrogate nature (photographs and videos) has found that such immersion results in restorative effects such as increased positive affect, decreased negative affect, and decreased stress. In the current experiment, we examined whether immersion in a virtual computer-generated nature setting could produce restorative effects. Twenty-two participants were equally divided between two conditions, while controlling for gender. In each condition, participants performed a stress-induction task, and were then immersed in virtual reality (VR) for 10 minutes. The control condition featured a slide show in VR, and the nature experimental condition featured an active exploration of a virtual forest. Participants in the nature condition were found to exhibit increased positive affect and decreased stress after immersion in VR when compared to those in the control condition. The results suggest that immersion in virtual nature settings has similar beneficial effects as exposure to surrogate nature. These results also suggest that VR can be used as a tool to study and understand restorative effects.

  1. [Virtual reality therapy in anxiety disorders].

    PubMed

    Mitrousia, V; Giotakos, O

    2016-01-01

    During the last decade a number of studies have been conducted to examine whether virtual reality exposure therapy can be an alternative form of therapy for the treatment of mental disorders, particularly anxiety disorders. Imaginal exposure therapy, one of the components of Cognitive Behavioral Therapy, cannot be easily applied to all patients, and in such cases virtual reality can be used as an alternative or supportive psychotherapeutic technique. Most studies using virtual reality have focused on anxiety disorders, mainly specific phobias, but some extend to other conditions such as eating disorders, drug dependence, pain control, and palliative care and rehabilitation. The main characteristics of virtual reality therapy are "interaction", "immersion", and "presence". High levels of "immersion" and "presence" are associated with increased response to exposure therapy in virtual environments, as well as better therapeutic outcomes and sustained therapeutic gains. Typical devices used to achieve the patient's immersion are head-mounted displays (HMDs), which are for individual use, and the Cave Automatic Virtual Environment (CAVE), which supports multiple users. The disadvantages of virtual reality therapy lie in the specialized technology skills required, the cost of the devices, and side effects. Therapists need training in order to operate the software and hardware and to adjust them to each case's needs. Device costs are high but decrease steadily as technology improves. Immersion during virtual reality therapy can induce mild and temporary side effects such as nausea, dizziness, or headache. To date, however, experience shows that virtual reality offers several advantages.
    The patient's avoidance of exposure to phobic stimuli is reduced through the use of virtual reality, since the patient can be exposed to them as many times as desired and under the supervision of the therapist. The technique takes place in the therapist's office, which ensures confidentiality and privacy. The therapist is able to control unpredictable events that can occur during the patient's exposure in real environments; above all, the therapist can control the intensity of exposure and adapt it to the patient's needs. Virtual reality can prove particularly useful in some specific psychological conditions, for instance in patients with post-traumatic stress disorder (PTSD), who are prone to avoiding reminders of the traumatic event. Exposure in virtual reality can solve this problem by providing the patient with a large number of stimuli that activate the senses and cause the necessary physiological and psychological anxiety reactions, regardless of the patient's willingness or ability to recall the traumatic event in imagination.

  2. Virtual Heritage Tours: Developing Interactive Narrative-Based Environments for Historical Sites

    NASA Astrophysics Data System (ADS)

    Tuck, Deborah; Kuksa, Iryna

    In the last decade there has been noticeable growth in the use of virtual reality (VR) technologies for reconstructing cultural heritage sites. However, many of these virtual reconstructions evidence little of the sites' social histories. Narrating the Past is a research project that aims to address this issue by investigating methods for embedding social histories within cultural heritage sites and by creating narrative-based virtual environments (VEs) within them. The project aims to enhance the visitor's knowledge and understanding by developing a navigable 3D story space in which participants are immersed. This has the potential to create a malleable virtual environment allowing visitors to configure their own narrative paths.

  3. A hardware and software architecture to deal with multimodal and collaborative interactions in multiuser virtual reality environments

    NASA Astrophysics Data System (ADS)

    Martin, P.; Tseu, A.; Férey, N.; Touraine, D.; Bourdot, P.

    2014-02-01

Most advanced immersive devices provide a collaborative environment in which several users each have their own head-tracked stereoscopic point of view. Combined with commonly used interactive features such as voice and gesture recognition, 3D mice, haptic feedback, and spatialized audio rendering, these environments should faithfully reproduce a real context. However, even though many studies have been carried out on multimodal systems, we are far from definitively solving the issue of multimodal fusion, which consists in merging multimodal events coming from users and devices into interpretable commands performed by the application. Multimodality and collaboration have often been studied separately, despite the fact that these two aspects share interesting similarities. We discuss how we address this problem through the design and implementation of a supervisor that is able to deal with both multimodal fusion and collaborative aspects. The aim of this supervisor is to merge users' inputs from virtual reality devices in order to control immersive multi-user applications. We approach this problem from a practical point of view, because the main requirements of the supervisor were defined by an industrial task proposed by our automotive partner, which has to be performed with multimodal and collaborative interactions in a co-located multi-user environment. In this task, two co-located workers on a virtual assembly chain have to cooperate to insert a seat into the bodywork of a car, using haptic devices to feel collisions and to manipulate objects, and combining speech recognition and two-handed gesture recognition as multimodal instructions. Besides the architectural aspects of this supervisor, we describe how we ensure the modularity of our solution so that it can be applied to different virtual reality platforms, interactive contexts, and virtual contents. 
A virtual context observer included in this supervisor was specifically designed to be independent of the content of the targeted application's virtual scene, and is used to report high-level interactive and collaborative events. This context observer allows the supervisor to merge these interactive and collaborative events, but it is also used to deal with new issues arising from our observation of two co-located users performing this assembly task in an immersive device. We highlight the fact that when speech recognition features are provided to the two users, it is necessary to detect automatically, according to the interactive context, whether the vocal instructions must be translated into commands to be performed by the machine, or whether they are part of the natural communication necessary for collaboration. Information from this context observer indicating that a user is looking at his or her collaborator is important for detecting whether the user is talking to that partner. Moreover, as the users are physically co-located and head tracking is used to provide high-fidelity stereoscopic rendering and natural walking navigation in the virtual scene, we have to deal with collisions and screen occlusion between the co-located users in the physical workspace. The working area and focus of each user, computed and reported by the context observer, are necessary to prevent or avoid these situations.
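The fusion step this record describes, merging timestamped events from several modalities into one interpretable command, can be sketched as follows. This is a minimal illustration only, not the authors' supervisor: the class and event names are invented, and the real system also handles collaborative context, haptics, and device abstraction.

```python
from dataclasses import dataclass

# Hypothetical sketch of multimodal fusion: merge speech and gesture
# events from the same user that arrive close together in time into
# a single command. Names (Event, FusionSupervisor) are illustrative.

@dataclass
class Event:
    user: str       # which co-located user produced the event
    modality: str   # "speech" or "gesture"
    payload: str    # recognized word or gesture label
    t: float        # timestamp in seconds

class FusionSupervisor:
    def __init__(self, window=1.0):
        self.window = window            # max time gap for fusing events
        self.pending = []               # unfused events

    def push(self, ev):
        """Buffer an event; fuse it with a pending complementary one."""
        for other in self.pending:
            if (other.user == ev.user
                    and other.modality != ev.modality
                    and abs(other.t - ev.t) <= self.window):
                self.pending.remove(other)
                return self._fuse(other, ev)
        self.pending.append(ev)
        return None

    def _fuse(self, a, b):
        speech = a if a.modality == "speech" else b
        gesture = b if speech is a else a
        # e.g. "insert" (speech) + "point-at-seat" (gesture)
        return (speech.user, f"{speech.payload}:{gesture.payload}")

sup = FusionSupervisor(window=1.0)
assert sup.push(Event("worker1", "speech", "insert", 0.2)) is None
cmd = sup.push(Event("worker1", "gesture", "point-at-seat", 0.6))
print(cmd)  # ('worker1', 'insert:point-at-seat')
```

A time window of this kind is one common way to decide whether two unimodal events belong to the same instruction; the paper's context observer additionally gates fusion on gaze and collaborative state.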

  4. Being There in the Midst of the Story: How Immersive Journalism Affects Our Perceptions and Cognitions.

    PubMed

    Sundar, S Shyam; Kang, Jin; Oprean, Danielle

    2017-11-01

    Immersive journalism in the form of virtual reality (VR) headsets and 360°-video is becoming more mainstream and is much touted for inducing greater "presence" than traditional text. But, does this presence influence psychological outcomes of reading news, such as memory for story content, perceptions of credibility, and empathy felt toward story characters? We propose that two key technological affordances of VR (modality and interactivity) are responsible for triggering three presence-related cognitive heuristics (being-there, interaction, and realism), which influence news readers' memory and their perceptions of credibility, empathy, and story-sharing intentions. We report a 3 (storytelling medium: VR vs. 360°-video vs. Text) × 2 (story: "The displaced" and "The click effect") mixed-factorial experiment, in which participants (N = 129) experienced two New York Times stories (that differed in their emotional intensity) using one of the three mediums (VR, 360°-video, Text). Participants who experienced the stories using VR and 360°-video outperformed those who read the same stories using text with pictures, not only on such presence-related outcomes as being-there, interaction, and realism, but also on perceived source credibility, story-sharing intention, and feelings of empathy. Moreover, we found that senses of being-there, interaction, and realism mediated the relationship between storytelling medium and reader perceptions of credibility, story recall, and story-sharing intention. These findings have theoretical implications for the psychology of virtual reality, and practical applications for immersive journalism in particular and interactive media in general.

  5. Wireless physiological monitoring and ocular tracking: 3D calibration in a fully-immersive virtual health care environment.

    PubMed

    Zhang, Lelin; Chi, Yu Mike; Edelstein, Eve; Schulze, Jurgen; Gramann, Klaus; Velasquez, Alvaro; Cauwenberghs, Gert; Macagno, Eduardo

    2010-01-01

    Wireless physiological/neurological monitoring in virtual reality (VR) offers a unique opportunity for unobtrusively quantifying human responses to precisely controlled and readily modulated VR representations of health care environments. Here we present such a wireless, light-weight head-mounted system for measuring electrooculogram (EOG) and electroencephalogram (EEG) activity in human subjects interacting with and navigating in the Calit2 StarCAVE, a five-sided immersive 3-D visualization VR environment. The system can be easily expanded to include other measurements, such as cardiac activity and galvanic skin responses. We demonstrate the capacity of the system to track focus of gaze in 3-D and report a novel calibration procedure for estimating eye movements from responses to the presentation of a set of dynamic visual cues in the StarCAVE. We discuss cyber and clinical applications that include a 3-D cursor for visual navigation in VR interactive environments, and the monitoring of neurological and ocular dysfunction in vision/attention disorders.

  6. Time Series Data Visualization in World Wide Telescope

    NASA Astrophysics Data System (ADS)

    Fay, J.

WorldWide Telescope provides a rich set of time series visualizations for both archival and real-time data. WWT consists of desktop tools for interactive, immersive visualization and HTML5 web-based controls that can be embedded in customized web pages. WWT supports a range of display options including full dome, power walls, stereo, and virtual reality headsets.

  7. Generating Contextual Descriptions of Virtual Reality (VR) Spaces

    NASA Astrophysics Data System (ADS)

    Olson, D. M.; Zaman, C. H.; Sutherland, A.

    2017-12-01

    Virtual reality holds great potential for science communication, education, and research. However, interfaces for manipulating data and environments in virtual worlds are limited and idiosyncratic. Furthermore, speech and vision are the primary modalities by which humans collect information about the world, but the linking of visual and natural language domains is a relatively new pursuit in computer vision. Machine learning techniques have been shown to be effective at image and speech classification, as well as at describing images with language (Karpathy 2016), but have not yet been used to describe potential actions. We propose a technique for creating a library of possible context-specific actions associated with 3D objects in immersive virtual worlds based on a novel dataset generated natively in virtual reality containing speech, image, gaze, and acceleration data. We will discuss the design and execution of a user study in virtual reality that enabled the collection and the development of this dataset. We will also discuss the development of a hybrid machine learning algorithm linking vision data with environmental affordances in natural language. Our findings demonstrate that it is possible to develop a model which can generate interpretable verbal descriptions of possible actions associated with recognized 3D objects within immersive VR environments. This suggests promising applications for more intuitive user interfaces through voice interaction within 3D environments. It also demonstrates the potential to apply vast bodies of embodied and semantic knowledge to enrich user interaction within VR environments. This technology would allow for applications such as expert knowledge annotation of 3D environments, complex verbal data querying and object manipulation in virtual spaces, and computer-generated, dynamic 3D object affordances and functionality during simulations.
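The affordance-description idea in this record can be illustrated with a toy lookup standing in for the learned model. Everything below is hypothetical: the affordance table, function names, and output template are invented, and the authors' hybrid machine learning algorithm (trained on speech, image, gaze, and acceleration data) is far richer than this sketch.

```python
# Illustrative sketch (not the authors' model): map recognized 3D
# object labels to context-specific affordances and render a short
# verbal description. A learned model would replace the lookup table.

AFFORDANCES = {
    "door":   ["open", "close", "knock on"],
    "lever":  ["pull", "push"],
    "beaker": ["pick up", "pour", "place down"],
}

def describe_actions(label, context="lab"):
    """Generate a verbal description of possible actions on an object."""
    verbs = AFFORDANCES.get(label)
    if not verbs:
        return f"No known actions for '{label}'."
    options = ", ".join(verbs)
    return f"In the {context}, you could {options} the {label}."

print(describe_actions("beaker"))
# In the lab, you could pick up, pour, place down the beaker.
```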

  8. Immersive Education, an Annotated Webliography

    ERIC Educational Resources Information Center

    Pricer, Wayne F.

    2011-01-01

    In this second installment of a two-part feature on immersive education a webliography will provide resources discussing the use of various types of computer simulations including: (a) augmented reality, (b) virtual reality programs, (c) gaming resources for teaching with technology, (d) virtual reality lab resources, (e) virtual reality standards…

  9. Productive confusions: learning from simulations of pandemic virus outbreaks in Second Life

    NASA Astrophysics Data System (ADS)

    Cárdenas, Micha; Greci, Laura S.; Hurst, Samantha; Garman, Karen; Hoffman, Helene; Huang, Ricky; Gates, Michael; Kho, Kristen; Mehrmand, Elle; Porteous, Todd; Calvitti, Alan; Higginbotham, Erin; Agha, Zia

    2011-03-01

    Users of immersive virtual reality environments have reported a wide variety of side and after effects including the confusion of characteristics of the real and virtual worlds. Perhaps this side effect of confusing the virtual and real can be turned around to explore the possibilities for immersion with minimal technological support in virtual world group training simulations. This paper will describe observations from my time working as an artist/researcher with the UCSD School of Medicine (SoM) and Veterans Administration San Diego Healthcare System (VASDHS) to develop trainings for nurses, doctors and Hospital Incident Command staff that simulate pandemic virus outbreaks. By examining moments of slippage between realities, both into and out of the virtual environment, moments of the confusion of boundaries between real and virtual, we can better understand methods for creating immersion. I will use the mixing of realities as a transversal line of inquiry, borrowing from virtual reality studies, game studies, and anthropological studies to better understand the mechanisms of immersion in virtual worlds. Focusing on drills conducted in Second Life, I will examine moments of training to learn the software interface, moments within the drill and interviews after the drill.

  10. Immersive virtual reality as a teaching tool for neuroanatomy.

    PubMed

    Stepan, Katelyn; Zeiger, Joshua; Hanchuk, Stephanie; Del Signore, Anthony; Shrivastava, Raj; Govindaraj, Satish; Iloreta, Alfred

    2017-10-01

    Three-dimensional (3D) computer modeling and interactive virtual reality (VR) simulation are validated teaching techniques used throughout medical disciplines. Little objective data exists supporting its use in teaching clinical anatomy. Learner motivation is thought to limit the rate of utilization of such novel technologies. The purpose of this study is to evaluate the effectiveness, satisfaction, and motivation associated with immersive VR simulation in teaching medical students neuroanatomy. Images of normal cerebral anatomy were reconstructed from human Digital Imaging and Communications in Medicine (DICOM) computed tomography (CT) imaging and magnetic resonance imaging (MRI) into 3D VR formats compatible with the Oculus Rift VR System, a head-mounted display with tracking capabilities allowing for an immersive VR experience. The ventricular system and cerebral vasculature were highlighted and labeled to create a focused interactive model. We conducted a randomized controlled study with 66 medical students (33 in both the control and experimental groups). Pertinent neuroanatomical structures were studied using either online textbooks or the VR interactive model, respectively. We then evaluated the students' anatomy knowledge, educational experience, and motivation (using the Instructional Materials Motivation Survey [IMMS], a previously validated assessment). There was no significant difference in anatomy knowledge between the 2 groups on preintervention, postintervention, or retention quizzes. The VR group found the learning experience to be significantly more engaging, enjoyable, and useful (all p < 0.01) and scored significantly higher on the motivation assessment (p < 0.01). Immersive VR educational tools awarded a more positive learner experience and enhanced student motivation. However, the technology was equally as effective as the traditional text books in teaching neuroanatomy. © 2017 ARS-AAOA, LLC.

  11. Interactive Learning Environment: Web-based Virtual Hydrological Simulation System using Augmented and Immersive Reality

    NASA Astrophysics Data System (ADS)

    Demir, I.

    2014-12-01

Recent developments in internet technologies make it possible to manage and visualize large data sets on the web. Novel visualization techniques and interactive user interfaces allow users to create realistic environments and interact with data to gain insight from simulations and environmental observations. The hydrological simulation system is a web-based 3D interactive learning environment for teaching hydrological processes and concepts. The simulation system provides a visually striking platform with realistic terrain information and water simulation. Students can create or load predefined scenarios, control environmental parameters, and evaluate environmental mitigation alternatives. The web-based simulation system provides an environment for students to learn about hydrological processes (e.g., flooding and flood damage) and the effects of development and human activity in the floodplain. The system utilizes the latest web technologies and the graphics processing unit (GPU) for water simulation and object collisions on the terrain. Users can access the system in three visualization modes: virtual reality, augmented reality, and immersive reality using a heads-up display. The system provides various scenarios customized to fit the age and education level of various users. This presentation provides an overview of the web-based flood simulation system and demonstrates its capabilities in various visualization and interaction modes.

  12. How incorporation of scents could enhance immersive virtual experiences

    PubMed Central

    Ischer, Matthieu; Baron, Naëm; Mermoud, Christophe; Cayeux, Isabelle; Porcherot, Christelle; Sander, David; Delplanque, Sylvain

    2014-01-01

    Under normal everyday conditions, senses all work together to create experiences that fill a typical person's life. Unfortunately for behavioral and cognitive researchers who investigate such experiences, standard laboratory tests are usually conducted in a nondescript room in front of a computer screen. They are very far from replicating the complexity of real world experiences. Recently, immersive virtual reality (IVR) environments became promising methods to immerse people into an almost real environment that involves more senses. IVR environments provide many similarities to the complexity of the real world and at the same time allow experimenters to constrain experimental parameters to obtain empirical data. This can eventually lead to better treatment options and/or new mechanistic hypotheses. The idea that increasing sensory modalities improve the realism of IVR environments has been empirically supported, but the senses used did not usually include olfaction. In this technology report, we will present an odor delivery system applied to a state-of-the-art IVR technology. The platform provides a three-dimensional, immersive, and fully interactive visualization environment called “Brain and Behavioral Laboratory—Immersive System” (BBL-IS). The solution we propose can reliably deliver various complex scents during different virtual scenarios, at a precise time and space and without contamination of the environment. The main features of this platform are: (i) the limited cross-contamination between odorant streams with a fast odor delivery (< 500 ms), (ii) the ease of use and control, and (iii) the possibility to synchronize the delivery of the odorant with pictures, videos or sounds. How this unique technology could be used to investigate typical research questions in olfaction (e.g., emotional elicitation, memory encoding or attentional capture by scents) will also be addressed. PMID:25101017

  13. Augmented Virtuality: A Real-time Process for Presenting Real-world Visual Sensory Information in an Immersive Virtual Environment for Planetary Exploration

    NASA Astrophysics Data System (ADS)

    McFadden, D.; Tavakkoli, A.; Regenbrecht, J.; Wilson, B.

    2017-12-01

Virtual Reality (VR) and Augmented Reality (AR) applications have recently seen impressive growth, thanks to the advent of commercial Head-Mounted Displays (HMDs). This new visualization era has opened the possibility of presenting researchers from multiple disciplines with data visualization techniques not possible via traditional 2D screens. In a purely VR environment, researchers are presented with visual data in a virtual environment, whereas in a purely AR application, a virtual object is projected into the real world with which researchers can interact. There are several limitations to purely VR or AR applications when taken within the context of remote planetary exploration. For example, in a purely VR environment, the contents of the planet surface (e.g., rocks, terrain, or other features) must be created off-line from a multitude of images using image processing techniques to generate 3D mesh data that will populate the virtual surface of the planet. This process usually takes a tremendous amount of computational resources and cannot be delivered in real time. As an alternative, video frames may be superimposed on the virtual environment to save processing time. However, such rendered video frames will lack 3D visual information, i.e., depth information. In this paper, we present a technique that utilizes a remotely situated robot's stereoscopic cameras to provide a live visual feed from the real world into the virtual environment in which planetary scientists are immersed. Moreover, the proposed technique blends the virtual environment with the real world in such a way as to preserve both the depth and visual information from the real world while allowing for the sensation of immersion when the entire sequence is viewed via an HMD such as the Oculus Rift. The figure shows the virtual environment with an overlay of the real-world stereoscopic video being presented in real time into the virtual environment. 
Notice the preservation of the object's shape, shadows, and depth information. The distortions shown in the image are due to the rendering of the stereoscopic data into a 2D image for the purposes of taking screenshots.
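The depth-preserving blend described in this record can be approximated by a per-pixel z-test, sketched below under the assumption that aligned RGB and depth maps are available for both the rendered scene and the stereo feed. The function and array names are illustrative, not from the paper.

```python
import numpy as np

# Hedged sketch of depth-aware compositing for "augmented virtuality":
# a real-world stereo camera frame is blended into a rendered virtual
# frame by comparing per-pixel depth, keeping whichever surface is
# nearer to the viewer. Array names and shapes are illustrative.

def composite(virtual_rgb, virtual_depth, real_rgb, real_depth):
    """Per-pixel z-test: keep the nearer of virtual and real content."""
    near_real = real_depth < virtual_depth            # boolean mask (H, W)
    return np.where(near_real[..., None], real_rgb, virtual_rgb)

H, W = 2, 2
virtual_rgb   = np.zeros((H, W, 3), dtype=np.uint8)       # black scene
real_rgb      = np.full((H, W, 3), 255, dtype=np.uint8)   # white feed
virtual_depth = np.full((H, W), 5.0)
real_depth    = np.array([[1.0, 9.0], [9.0, 1.0]])        # mixed depths

frame = composite(virtual_rgb, virtual_depth, real_rgb, real_depth)
print(frame[0, 0], frame[0, 1])  # [255 255 255] [0 0 0]
```

In practice the real-world depth map would come from stereo disparity estimation on the robot's camera pair, which is where most of the computational cost lies.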

  14. Design of a 3D Navigation Technique Supporting VR Interaction

    NASA Astrophysics Data System (ADS)

    Boudoin, Pierre; Otmane, Samir; Mallem, Malik

    2008-06-01

Multimodality is a powerful paradigm for increasing the realism and ease of interaction in Virtual Environments (VEs). In particular, the search for new metaphors and techniques for 3D interaction adapted to the navigation task is an important stage in the realization of future 3D interaction systems that support multimodality, in order to increase efficiency and usability. In this paper we propose a new multimodal 3D interaction model called Fly Over. This model is especially devoted to the navigation task. We present a qualitative comparison between Fly Over and a classical navigation technique called gaze-directed steering. The results from a preliminary evaluation on the IBISC semi-immersive Virtual Reality/Augmented Reality EVR@ platform show that Fly Over is a user-friendly and efficient navigation technique.

  15. Immersive Interaction, Manipulation and Analysis of Large 3D Datasets for Planetary and Earth Sciences

    NASA Astrophysics Data System (ADS)

    Pariser, O.; Calef, F.; Manning, E. M.; Ardulov, V.

    2017-12-01

We will present an implementation and study of several use cases of utilizing Virtual Reality (VR) for immersive display, interaction, and analysis of large and complex 3D datasets. These datasets have been acquired by instruments across several Earth, planetary, and solar space robotics missions. First, we will describe the architecture of the common application framework that was developed to load data, interface with VR display devices, and program input controllers in various computing environments. Tethered and portable VR technologies will be contrasted and the advantages of each highlighted. We will then present experimental immersive analytics visual constructs that enable augmentation of 3D datasets with 2D ones such as images and statistical and abstract data. We will conclude by presenting a comparative analysis with traditional visualization applications and sharing the feedback provided by our users: scientists and engineers.

  16. The Effect of Realistic Appearance of Virtual Characters in Immersive Environments - Does the Character's Personality Play a Role?

    PubMed

    Zibrek, Katja; Kokkinara, Elena; Mcdonnell, Rachel

    2018-04-01

    Virtual characters that appear almost photo-realistic have been shown to induce negative responses from viewers in traditional media, such as film and video games. This effect, described as the uncanny valley, is the reason why realism is often avoided when the aim is to create an appealing virtual character. In Virtual Reality, there have been few attempts to investigate this phenomenon and the implications of rendering virtual characters with high levels of realism on user enjoyment. In this paper, we conducted a large-scale experiment on over one thousand members of the public in order to gather information on how virtual characters are perceived in interactive virtual reality games. We were particularly interested in whether different render styles (realistic, cartoon, etc.) would directly influence appeal, or if a character's personality was the most important indicator of appeal. We used a number of perceptual metrics such as subjective ratings, proximity, and attribution bias in order to test our hypothesis. Our main result shows that affinity towards virtual characters is a complex interaction between the character's appearance and personality, and that realism is in fact a positive choice for virtual characters in virtual reality.

  17. Immersion and the illusion of presence in virtual reality.

    PubMed

    Slater, Mel

    2018-05-21

    This commentary briefly reviews the history of virtual reality and its use for psychology research, and clarifies the concepts of immersion and the illusion of presence. © 2018 The British Psychological Society.

  18. Learning Relative Motion Concepts in Immersive and Non-immersive Virtual Environments

    NASA Astrophysics Data System (ADS)

    Kozhevnikov, Michael; Gurlitt, Johannes; Kozhevnikov, Maria

    2013-12-01

    The focus of the current study is to understand which unique features of an immersive virtual reality environment have the potential to improve learning relative motion concepts. Thirty-seven undergraduate students learned relative motion concepts using computer simulation either in immersive virtual environment (IVE) or non-immersive desktop virtual environment (DVE) conditions. Our results show that after the simulation activities, both IVE and DVE groups exhibited a significant shift toward a scientific understanding in their conceptual models and epistemological beliefs about the nature of relative motion, and also a significant improvement on relative motion problem-solving tests. In addition, we analyzed students' performance on one-dimensional and two-dimensional questions in the relative motion problem-solving test separately and found that after training in the simulation, the IVE group performed significantly better than the DVE group on solving two-dimensional relative motion problems. We suggest that egocentric encoding of the scene in IVE (where the learner constitutes a part of a scene they are immersed in), as compared to allocentric encoding on a computer screen in DVE (where the learner is looking at the scene from "outside"), is more beneficial than DVE for studying more complex (two-dimensional) relative motion problems. Overall, our findings suggest that such aspects of virtual realities as immersivity, first-hand experience, and the possibility of changing different frames of reference can facilitate understanding abstract scientific phenomena and help in displacing intuitive misconceptions with more accurate mental models.
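The two-dimensional relative-motion reasoning that the simulation in this record teaches reduces to vector addition of velocities between frames of reference. A small worked example, with invented numbers not taken from the study, is:

```python
import numpy as np

# A worked two-dimensional relative-motion example of the kind such
# simulations teach: a boat heads north at 4 m/s relative to a river
# that flows east at 3 m/s relative to the ground.

v_boat_water   = np.array([0.0, 4.0])  # boat rel. water, (east, north) m/s
v_water_ground = np.array([3.0, 0.0])  # water rel. ground

# Velocities add across frames:
#   v(boat/ground) = v(boat/water) + v(water/ground)
v_boat_ground = v_boat_water + v_water_ground
speed = np.linalg.norm(v_boat_ground)

print(v_boat_ground, speed)  # [3. 4.] 5.0
```

The common misconception the abstract alludes to is treating the boat's ground speed as 4 m/s; the frame change makes it 5 m/s along a diagonal, which is exactly the kind of two-dimensional case where the immersive condition helped most.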

  19. Full Immersive Virtual Environment Cave[TM] in Chemistry Education

    ERIC Educational Resources Information Center

    Limniou, M.; Roberts, D.; Papadopoulos, N.

    2008-01-01

By comparing two-dimensional (2D) chemical animations designed for the computer desktop with three-dimensional (3D) chemical animations designed for the fully immersive virtual reality environment CAVE[TM], we studied how virtual reality environments could raise students' interest and motivation for learning. By using 3ds max[TM], we can visualize…

  20. A Framework for Aligning Instructional Design Strategies with Affordances of CAVE Immersive Virtual Reality Systems

    ERIC Educational Resources Information Center

    Ritz, Leah T.; Buss, Alan R.

    2016-01-01

    Increasing availability of immersive virtual reality (IVR) systems, such as the Cave Automatic Virtual Environment (CAVE) and head-mounted displays, for use in education contexts is providing new opportunities and challenges for instructional designers. By highlighting the affordances of IVR specific to the CAVE, the authors emphasize the…

  1. Student Responses to Their Immersion in a Virtual Environment.

    ERIC Educational Resources Information Center

    Taylor, Wayne

    Undertaken in conjunction with a larger study that investigated the educational efficacy of students building their own virtual worlds, this study measures the reactions of students in grades 4-12 to the experience of being immersed in virtual reality (VR). The study investigated the sense of "presence" experienced by the students, the…

  2. Virtually the ultimate research lab.

    PubMed

    Kulik, Alexander

    2018-04-26

Virtual reality (VR) can serve as a viable platform for psychological research. The real world, with its many uncontrolled variables, can be masked to immerse participants in complex interactive environments that are under full experimental control. However, like any other laboratory setting, these simulations are not perceived identically to reality, and they also afford different behaviour. We need a better understanding of these differences, which are often related to parameters of the technical setup, to support valid interpretations of experimental results. © 2018 The British Psychological Society.

  3. Initial Assessment of Human Performance Using the Gaiter Interaction Technique to Control Locomotion in Fully Immersive Virtual Environments

    DTIC Science & Technology

    2004-06-30

virtual space, as well as to match specific attributes of natural locomotion, such as perceived velocity and caloric expenditure. Moreover, the...wide range of postural motions (e.g., crouching, jumping, and bending to look around objects) with gestural stepping motions. The attributes of in...approximately 8 × 8 × 8 ft. The harness itself was an initial design made out of PVC pipe at the waist and above the head, with rope connecting the

  4. Artificial Versus Video-Based Immersive Virtual Surroundings: Analysis of Performance and User's Preference.

    PubMed

    Huber, Tobias; Paschold, Markus; Hansen, Christian; Lang, Hauke; Kneist, Werner

    2018-06-01

Immersive virtual reality (VR) laparoscopy simulation connects VR simulation with head-mounted displays to increase presence during VR training. The goal of the present study was to compare two different surroundings with respect to performance and users' preference. With a custom immersive virtual reality laparoscopy simulator, an artificially created VR operating room (AVR) and a highly immersive VR operating room (IVR) were compared. Participants (n = 30) performed 3 tasks (peg transfer, fine dissection, and cholecystectomy) in AVR and IVR in a crossover study design. No overall difference in virtual laparoscopic performance was obtained when comparing results from AVR with IVR. Most participants preferred the IVR surrounding (n = 24). Experienced participants (n = 10) performed significantly better than novices (n = 10) in all tasks regardless of the surrounding (P < .05). Participants with limited experience (n = 10) showed differing results. Presence, immersion, and exhilaration were significantly higher in IVR. Two thirds assumed that IVR would have a positive influence on their laparoscopic simulator use. This first study comparing AVR and IVR did not reveal differences in virtual laparoscopic performance. IVR is considered the more realistic surrounding and is therefore preferred by the participants.

  5. Employing immersive virtual environments for innovative experiments in health care communication.

    PubMed

    Persky, Susan

    2011-03-01

This report reviews the literature for studies that employ immersive virtual environment technology to conduct experimental studies in health care communication. Advantages and challenges of using these tools for research in this area are also discussed. A literature search was conducted using the Scopus database. Results were hand-searched to identify the body of studies, conducted since 1995, that are related to the report objective. The review identified four relevant studies that stem from two unique projects. One project focused on the impact of a clinician's characteristics and behavior on health care communication; the other focused on the characteristics of the patient. Both projects illustrate key methodological advantages conferred by immersive virtual environments, including the ability to maintain simultaneously high experimental control and realism, the ability to manipulate variables in new ways, and unique behavioral measurement opportunities. Though implementation challenges exist for immersive virtual environment-based research methods, given the technology's unique capabilities, the benefits can outweigh the costs in many instances. Immersive virtual environments may therefore prove an important addition to the array of tools available for advancing our understanding of communication in health care. Published by Elsevier Ireland Ltd.

  6. Enhancing the immersive reality of virtual simulators for easily accessible laparoscopic surgical training

    NASA Astrophysics Data System (ADS)

    McKenna, Kyra; McMenemy, Karen; Ferguson, R. S.; Dick, Alistair; Potts, Stephen

    2008-02-01

Computer simulators are a popular method of training surgeons in the techniques of laparoscopy. However, for the trainee to feel totally immersed in the process, the graphical display should be as lifelike as possible, and two-handed force-feedback interaction is required. This paper reports on how a compelling immersive experience can be delivered at low cost using commonly available hardware components. Three specific themes are brought together. Firstly, programmable shaders executing on standard PC graphics adapters deliver the appearance of anatomical realism, including effects of translucent tissue surfaces, semi-transparent membranes, multilayer image texturing, and real-time shadowing. Secondly, relatively inexpensive 'off the shelf' force-feedback devices contribute to a holistic immersive experience. The final element described is the custom software that brings these together with hierarchically organized and optimized polygonal models of abdominal anatomy.

  7. VILLAGE--Virtual Immersive Language Learning and Gaming Environment: Immersion and Presence

    ERIC Educational Resources Information Center

    Wang, Yi Fei; Petrina, Stephen; Feng, Francis

    2017-01-01

    3D virtual worlds are promising for immersive learning in English as a Foreign Language (EFL). Unlike English as a Second Language (ESL), EFL typically takes place in the learners' home countries, and the potential of the language is limited by geography. Although learning contexts where English is spoken is important, in most EFL courses at the…

  8. Applied virtual reality at the Research Triangle Institute

    NASA Technical Reports Server (NTRS)

    Montoya, R. Jorge

    1994-01-01

Virtual Reality (VR) is a way for humans to use computers in visualizing, manipulating, and interacting with large geometric databases. This paper describes a VR infrastructure and its application to marketing, modeling, architectural walk-through, and training problems. VR integration techniques used in these applications are based on a uniform approach which promotes portability and reusability of developed modules. For each problem, a 3D object database is created using data captured by hand or electronically. The objects' realism is enhanced through either procedural or photo textures. The virtual environment is created and populated with the database using software tools which also support interaction with, and immersion in, the environment. These capabilities are augmented by other sensory channels such as voice recognition, 3D sound, and tracking. Four applications are presented: a virtual furniture showroom, virtual reality models of the North Carolina Global TransPark, a walk through the Dresden Frauenkirche, and a maintenance training simulator for the National Guard.

  9. Low cost heads-up virtual reality (HUVR) with optical tracking and haptic feedback

    NASA Astrophysics Data System (ADS)

    Margolis, Todd; DeFanti, Thomas A.; Dawe, Greg; Prudhomme, Andrew; Schulze, Jurgen P.; Cutchin, Steve

    2011-03-01

    Researchers at the University of California, San Diego, have created a new, relatively low-cost augmented reality system that enables users to touch the virtual environment they are immersed in. The Heads-Up Virtual Reality device (HUVR) couples a consumer 3D HD flat screen TV with a half-silvered mirror to project any graphic image onto the user's hands and into the space surrounding them. With his or her head position optically tracked to generate the correct perspective view, the user maneuvers a force-feedback (haptic) device to interact with the 3D image, literally 'touching' the object's angles and contours as if it were a tangible physical object. HUVR can be used for training and education in structural and mechanical engineering, archaeology and medicine, as well as for other tasks that require hand-eye coordination. One of the most distinctive characteristics of HUVR is that users can place their hands inside the virtual environment without occluding the 3D image. Built using open-source software and consumer-level hardware, HUVR offers users a tactile experience in an immersive environment that is functional, affordable and scalable.

  10. Sculpting 3D worlds with music: advanced texturing techniques

    NASA Astrophysics Data System (ADS)

    Greuel, Christian; Bolas, Mark T.; Bolas, Niko; McDowall, Ian E.

    1996-04-01

    Sound within the virtual environment is often considered to be secondary to the graphics. In a typical scenario, either audio cues are locally associated with specific 3D objects or a general aural ambiance is supplied in order to alleviate the sterility of an artificial experience. This paper discusses a completely different approach, in which cues are extracted from live or recorded music in order to create geometry and control object behaviors within a computer-generated environment. Advanced texturing techniques used to generate complex stereoscopic images are also discussed. By analyzing music for standard audio characteristics such as rhythm and frequency, information is extracted and repackaged for processing. With the Soundsculpt Toolkit, this data is mapped onto individual objects within the virtual environment, along with one or more predetermined behaviors. Mapping decisions are implemented with a user-definable schedule and are based on the aesthetic requirements of directors and designers. This provides for visually active, immersive environments in which virtual objects behave in real-time correlation with the music. The resulting music-driven virtual reality opens up several possibilities for new types of artistic and entertainment experiences, such as fully immersive 3D 'music videos' and interactive landscapes for live performance.
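
    The analyze-then-map pipeline described above can be sketched in a few lines. This is a minimal illustration, not the Soundsculpt Toolkit's actual API; the feature set (RMS loudness, dominant frequency) and the mapping to object scale and hue are assumed examples.

```python
import numpy as np

def extract_features(samples, rate, frame_len=2048):
    """Split mono audio into frames; return (loudness, dominant_hz) per frame."""
    feats = []
    for start in range(0, len(samples) - frame_len, frame_len):
        frame = samples[start:start + frame_len]
        loudness = float(np.sqrt(np.mean(frame ** 2)))   # RMS energy
        spectrum = np.abs(np.fft.rfft(frame))
        dominant_hz = float(np.argmax(spectrum) * rate / frame_len)
        feats.append((loudness, dominant_hz))
    return feats

def map_to_behavior(loudness, dominant_hz):
    """Map audio features onto illustrative virtual-object parameters."""
    return {
        "scale": 1.0 + 2.0 * loudness,          # louder frames -> larger object
        "hue": min(dominant_hz / 4000.0, 1.0),  # higher pitch -> shifted color
    }
```

    A scheduler in the spirit of the paper would evaluate such mappings once per frame and drive object behaviors in real time.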

  11. Hybrid 2-D and 3-D Immersive and Interactive User Interface for Scientific Data Visualization

    DTIC Science & Technology

    2017-08-01

    Keywords: 3-D interactive visualization, scientific visualization, virtual reality, real-time ray tracing. [Fragmentary abstract: the VR and scientific visualization communities mostly have different research priorities; for the VR community the emphasis is the ability to support real-time user interaction, while beyond a user-friendly software and hardware setup, scientists also need to be able to perform their usual workflows in the real world.]

  12. A Proposed Treatment for Visual Field Loss caused by Traumatic Brain Injury using Interactive Visuotactile Virtual Environment

    NASA Astrophysics Data System (ADS)

    Farkas, Attila J.; Hajnal, Alen; Shiratuddin, Mohd F.; Szatmary, Gabriella

    In this paper, we propose a novel approach that uses interactive virtual environment technology in Vision Restoration Therapy for visual field loss caused by Traumatic Brain Injury. We call the new system the Interactive Visuotactile Virtual Environment, and it holds the promise of expanding the scope of existing rehabilitation techniques. Traditional vision rehabilitation methods are based on passive psychophysical training procedures and can last up to six months before any modest improvements can be seen in patients. A highly immersive and interactive virtual environment will allow the patient to practice everyday activities such as object identification and object manipulation through the use of 3D motion-sensing handheld devices such as a data glove or the Nintendo Wiimote. Employing both perceptual and action components in the training procedures holds the promise of more efficient sensorimotor rehabilitation. Increased stimulation of visual and sensorimotor areas of the brain should facilitate a comprehensive recovery of visuomotor function by exploiting the plasticity of the central nervous system. Integrated with a motion tracking system and an eye tracking device, the interactive virtual environment allows for the creation and manipulation of a wide variety of stimuli, as well as real-time recording of hand, eye and body movements and coordination. The goal of the project is to design a cost-effective and efficient vision restoration system.

  13. From stereoscopic recording to virtual reality headsets: Designing a new way to learn surgery.

    PubMed

    Ros, M; Trives, J-V; Lonjon, N

    2017-03-01

    To improve surgical practice, there are several different approaches to simulation. Thanks to wearable technologies, recording 3D movies is now easy. The development of virtual reality headsets makes it possible to imagine a different way of watching these videos: using dedicated software to increase interactivity in a 3D immersive experience. The objective was to record 3D movies from the main surgeon's perspective, to watch the files using virtual reality headsets, and to validate their pedagogic interest. Surgical procedures were recorded using a system combining two side-by-side cameras placed on a helmet. We added two LEDs just below the cameras to enhance luminosity. Two files were obtained in mp4 format and edited using dedicated software to create 3D movies. The files obtained were then played using a virtual reality headset. Surgeons who tried the immersive experience completed a questionnaire to evaluate the interest of this procedure for surgical learning. Twenty surgical procedures were recorded. The movies capture a scene which extends 180° horizontally and 90° vertically. The immersive experience created by the device conveys a genuine feeling of being in the operating room and seeing the procedure first-hand through the eyes of the main surgeon. All surgeons indicated that they believe in the pedagogical interest of this method. We succeeded in recording the main surgeon's point of view in 3D and watching it on a virtual reality headset. This new approach enhances the understanding of surgery; most of the surgeons appreciated its pedagogic value. This method could be an effective learning tool in the future. Copyright © 2016. Published by Elsevier Masson SAS.
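
    The two camera streams from a helmet rig like the one described above are typically packed into a single side-by-side frame, the layout most VR video players accept for stereoscopic playback. The authors used dedicated editing software; the sketch below only illustrates the frame-level packing, with `pack_side_by_side` as a hypothetical helper name.

```python
import numpy as np

def pack_side_by_side(left, right):
    """Pack matching left/right camera frames (H x W x 3 arrays) into one
    side-by-side stereo frame: width doubles, height is unchanged."""
    if left.shape != right.shape:
        raise ValueError("left and right frames must have identical shapes")
    return np.hstack([left, right])
```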

  14. The Virtual Shop: A new immersive virtual reality environment and scenario for the assessment of everyday memory.

    PubMed

    Ouellet, Émilie; Boller, Benjamin; Corriveau-Lecavalier, Nick; Cloutier, Simon; Belleville, Sylvie

    2018-06-01

    Assessing and predicting memory performance in everyday life is a common assignment for neuropsychologists. However, most traditional neuropsychological tasks are not conceived to capture everyday memory performance. The Virtual Shop is a fully immersive task developed to assess memory in a more ecological way than traditional neuropsychological assessments. Two studies were undertaken to assess the feasibility of the Virtual Shop and to appraise its ecological and construct validity. In study 1, 20 younger and 19 older adults completed the Virtual Shop task to evaluate its level of difficulty and the way the participants interacted with the VR material. The construct validity was examined with the contrasted-group method, by comparing the performance of younger and older adults. In study 2, 35 individuals with subjective cognitive decline completed the Virtual Shop task. Performance was correlated with an existing questionnaire evaluating everyday memory in order to appraise its ecological validity. To add further support to its construct validity, performance was correlated with traditional episodic memory and executive tasks. All participants successfully completed the Virtual Shop. The task had an appropriate level of difficulty that helped differentiate younger and older adults, supporting the feasibility and construct validity of the task. The performance on the Virtual Shop was significantly and moderately correlated with the performance on the questionnaire and on the traditional memory and executive tasks. Results support the feasibility and both the ecological and construct validity of the Virtual Shop. Copyright © 2018 The Authors. Published by Elsevier B.V. All rights reserved.

  15. Virtual exertions: evoking the sense of exerting forces in virtual reality using gestures and muscle activity.

    PubMed

    Chen, Karen B; Ponto, Kevin; Tredinnick, Ross D; Radwin, Robert G

    2015-06-01

    This study was a proof of concept for virtual exertions, a novel method that involves the use of body tracking and electromyography for grasping and moving projections of objects in virtual reality (VR). The user views objects in his or her hands during rehearsed co-contractions of the same agonist-antagonist muscles normally used for the desired activities to suggest exerting forces. Unlike physical objects, virtual objects are images and lack mass. There is currently no practical physically demanding way to interact with virtual objects to simulate strenuous activities. Eleven participants grasped and lifted similar physical and virtual objects of various weights in an immersive 3-D Cave Automatic Virtual Environment. Muscle activity, localized muscle fatigue, ratings of perceived exertions, and NASA Task Load Index were measured. Additionally, the relationship between levels of immersion (2-D vs. 3-D) was studied. Although the overall magnitude of biceps activity and workload were greater in VR, muscle activity trends and fatigue patterns for varying weights within VR and physical conditions were the same. Perceived exertions for varying weights were not significantly different between VR and physical conditions. Perceived exertion levels and muscle activity patterns corresponded to the assigned virtual loads, which supported the hypothesis that the method evoked the perception of physical exertions and showed that the method was promising. Ultimately this approach may offer opportunities for research and training individuals to perform strenuous activities under potentially safer conditions that mimic situations while seeing their own body and hands relative to the scene. © 2014, Human Factors and Ergonomics Society.

  16. The interplays among technology and content, immersant and VE

    NASA Astrophysics Data System (ADS)

    Song, Meehae; Gromala, Diane; Shaw, Chris; Barnes, Steven J.

    2010-01-01

    The research program aims to explore and examine the fine balance necessary for maintaining the interplays between technology and the immersant, including identifying qualities that contribute to creating and maintaining a sense of "presence" and "immersion" in an immersive virtual reality (IVR) experience. Building upon and extending previous work, we compare sitting meditation with walking meditation in a virtual environment (VE). The Virtual Meditative Walk, a new work-in-progress, integrates VR and biofeedback technologies with a self-directed, uni-directional treadmill. As immersants learn how to meditate while walking, robust, real-time biofeedback technology continuously measures breathing, skin conductance and heart rate. The physiological states of the immersant will in turn affect the audio and stereoscopic visual media through shutter glasses. We plan to test the potential benefits and limitations of this physically active form of meditation against data from a sitting form of meditation. A mixed-methods approach to testing user outcomes parallels the knowledge bases of the collaborative team: a physician, computer scientists and artists.

  17. Journey to the centre of the cell: Virtual reality immersion into scientific data.

    PubMed

    Johnston, Angus P R; Rae, James; Ariotti, Nicholas; Bailey, Benjamin; Lilja, Andrew; Webb, Robyn; Ferguson, Charles; Maher, Sheryl; Davis, Thomas P; Webb, Richard I; McGhee, John; Parton, Robert G

    2018-02-01

    Visualization of scientific data is crucial not only for scientific discovery but also to communicate science and medicine to both experts and a general audience. Until recently, we have been limited to visualizing the three-dimensional (3D) world of biology in 2 dimensions. Renderings of 3D cells are still traditionally displayed using two-dimensional (2D) media, such as on a computer screen or paper. However, the advent of consumer grade virtual reality (VR) headsets such as Oculus Rift and HTC Vive means it is now possible to visualize and interact with scientific data in a 3D virtual world. In addition, new microscopic methods provide an unprecedented opportunity to obtain new 3D data sets. In this perspective article, we highlight how we have used cutting edge imaging techniques to build a 3D virtual model of a cell from serial block-face scanning electron microscope (SBEM) imaging data. This model allows scientists, students and members of the public to explore and interact with a "real" cell. Early testing of this immersive environment indicates a significant improvement in students' understanding of cellular processes and points to a new future of learning and public engagement. In addition, we speculate that VR can become a new tool for researchers studying cellular architecture and processes by populating VR models with molecular data. © 2017 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.

  18. Software for math and science education for the deaf.

    PubMed

    Adamo-Villani, Nicoletta; Wilbur, Ronnie

    2010-01-01

    In this article, we describe the development of two novel approaches to teaching math and science concepts to deaf children using 3D animated interactive software. One approach, Mathsigner, is non-immersive and the other, SMILE, is a virtual reality immersive environment. The content is curriculum-based, and the animated signing characters are constructed with state-of-the-art technology and design. We report preliminary development findings of usability and appeal based on programme features (e.g. 2D/3D, immersiveness, interaction type, avatar and interface design) and subject features (hearing status, gender and age). Programme features of 2D/3D, immersiveness and interaction type were very much affected by subject features. Among subject features, we find significant effects of hearing status (deaf children take longer and make more mistakes than hearing children) and gender (girls take longer than boys; girls prefer immersive environments rather than desktop presentation; girls are more interested in content than technology compared to boys). For avatar type, we found a preference for seamless, deformable characters over segmented ones. For interface comparisons, there were no subject effects, but an animated interface resulted in reduced time to task completion compared to static interfaces with and without sound and highlighting. These findings identify numerous features that affect software design and appeal and suggest that designers must be careful in their assumptions during programme development.

  19. Research on three-dimensional visualization based on virtual reality and Internet

    NASA Astrophysics Data System (ADS)

    Wang, Zongmin; Yang, Haibo; Zhao, Hongling; Li, Jiren; Zhu, Qiang; Zhang, Xiaohong; Sun, Kai

    2007-06-01

    To disclose and display water information, a three-dimensional visualization system based on Virtual Reality (VR) and the Internet was researched, both for demonstrating a "digital water conservancy" application and for routine reservoir management. To explore and mine in-depth information, after building a high-resolution DEM of reliable quality, topographical analysis, visibility analysis and reservoir volume computation are studied. In addition, parameters including slope, water level and NDVI are selected to classify easy-landslide zones in the water-level-fluctuating zone of the reservoir area. To establish the virtual reservoir scene, two kinds of methods are used for delivering immersion, interaction and imagination (3I). The first virtual scene contains more detailed textures to increase realism, running on a graphical workstation with the virtual reality engine Open Scene Graph (OSG). The second virtual scene is for Internet users, with fewer details to assure fluent rendering speed.

  20. Development and Deployment of a Library of Industrially Focused Advanced Immersive VR Learning Environments

    ERIC Educational Resources Information Center

    Cameron, Ian; Crosthwaite, Caroline; Norton, Christine; Balliu, Nicoleta; Tadé, Moses; Hoadley, Andrew; Shallcross, David; Barton, Geoff

    2008-01-01

    This work presents a unique education resource for both process engineering students and the industry workforce. The learning environment is based around spherical imagery of real operating plants coupled with interactive embedded activities and content. This Virtual Reality (VR) learning tool has been developed by applying aspects of relevant…

  1. An Interactive Virtual Tour of a Milk Powder Plant

    ERIC Educational Resources Information Center

    Herritsch, Alfred; Rahim, Elin Abdul; Fee, Conan J.; Morison, Ken R.; Gostomski, Peter A.

    2013-01-01

    Immersive learning applications in chemical and process engineering are creating the opportunity to bring entire process plants to the student. While meant to complement field trips, in some cases, this is the only opportunity for students to engage with certain industrial sites due to site regulations (health and safety, hygiene, intellectual…

  2. Blended Learning Environments: Using Social Networking Sites to Enhance the First Year Experience

    ERIC Educational Resources Information Center

    McCarthy, Joshua

    2010-01-01

    This study explores blending virtual and physical learning environments to enhance the experience of first year by immersing students into university culture through social and academic interaction between peers. It reports on the progress made from 2008 to 2009 using an existing academic platform, the first year design elective course…

  3. Evaluation of knowledge transfer in an immersive virtual learning environment for the transportation community.

    DOT National Transportation Integrated Search

    2014-05-01

    Immersive Virtual Learning Environments (IVLEs) are extensively used in training, but few rigorous scientific investigations regarding the transfer of learning have been conducted. Measurement of learning transfer through evaluative methods is key ...

  4. Simultaneous neural and movement recording in large-scale immersive virtual environments.

    PubMed

    Snider, Joseph; Plank, Markus; Lee, Dongpyo; Poizner, Howard

    2013-10-01

    Virtual reality (VR) allows precise control and manipulation of rich, dynamic stimuli that, when coupled with on-line motion capture and neural monitoring, can provide a powerful means both of understanding brain-behavioral relations in the high-dimensional world and of assessing and treating a variety of neural disorders. Here we present a system that combines state-of-the-art, fully immersive, 3D, multi-modal VR with temporally aligned electroencephalographic (EEG) recordings. The VR system is dynamic and interactive across visual, auditory, and haptic interactions, providing sight, sound, touch, and force. Crucially, it does so with simultaneous EEG recordings while subjects actively move about a 20 ft × 20 ft space. The overall end-to-end latency between real movement and its simulated movement in the VR is approximately 40 ms. Spatial precision of the various devices is on the order of millimeters. The temporal alignment with the neural recordings is accurate to within approximately 1 ms. This powerful combination of systems opens up a new window into brain-behavioral relations and a new means of assessment and rehabilitation of individuals with motor and other disorders.
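
    Temporal alignment of a high-rate EEG stream with lower-rate tracker samples on a shared clock can be done by nearest-neighbour lookup, as sketched below. This is a generic illustration with assumed timestamp arrays, not the authors' actual synchronization pipeline.

```python
import numpy as np

def nearest_sample_alignment(eeg_times, mocap_times, mocap_values):
    """For each EEG timestamp, pick the motion-capture sample nearest in time.

    eeg_times, mocap_times: 1-D sorted arrays of seconds on a shared clock.
    mocap_values: samples aligned with mocap_times along the first axis.
    """
    idx = np.clip(np.searchsorted(mocap_times, eeg_times), 1, len(mocap_times) - 1)
    left = idx - 1
    # Choose whichever neighbour is closer in time to the EEG sample.
    use_left = (eeg_times - mocap_times[left]) <= (mocap_times[idx] - eeg_times)
    return mocap_values[np.where(use_left, left, idx)]
```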

  5. Manipulation of volumetric patient data in a distributed virtual reality environment.

    PubMed

    Dech, F; Ai, Z; Silverstein, J C

    2001-01-01

    Due to increases in network speed and bandwidth, distributed exploration of medical data in immersive Virtual Reality (VR) environments is becoming increasingly feasible. The volumetric display of radiological data in such environments presents a unique set of challenges. The sheer size and complexity of the datasets involved not only make them difficult to transmit to remote sites, but these datasets also require extensive user interaction in order to make them understandable to the investigator and manageable to the rendering hardware. A sophisticated VR user interface is required in order for the clinician to focus on the aspects of the data that will provide educational and/or diagnostic insight. We will describe a software system of data acquisition, data display, Tele-Immersion, and data manipulation that supports interactive, collaborative investigation of large radiological datasets. The hardware required in this strategy is still at the high end of the graphics workstation market. Future software ports to Linux and NT, along with the rapid development of PC graphics cards, open the possibility for later work with Linux or NT PCs and PC clusters.

  6. [Virtual reality in neurosurgery].

    PubMed

    Tronnier, V M; Staubert, A; Bonsanto, M M; Wirtz, C R; Kunze, S

    2000-03-01

    Virtual reality enables users to immerse themselves in a virtual three-dimensional world and to interact in this world. The simulation is different from the kind in computer games, in which the viewer is active but acts in a nonrealistic world, or on the TV screen, where we are passively driven in an active world. In virtual reality elements look realistic, they change their characteristics and have almost real-world unpredictability. Virtual reality is not only implemented in gambling dens and the entertainment industry but also in manufacturing processes (cars, furniture etc.), military applications and medicine. Especially the last two areas are strongly correlated, because telemedicine or telesurgery was originated for military reasons to operate on war victims from a secure distance or to perform surgery on astronauts in an orbiting space station. In medicine and especially neurosurgery virtual-reality methods are used for education, surgical planning and simulation on a virtual patient.

  7. Vroom: designing an augmented environment for remote collaboration in digital cinema production

    NASA Astrophysics Data System (ADS)

    Margolis, Todd; Cornish, Tracy

    2013-03-01

    As media technologies become increasingly affordable, compact and inherently networked, new generations of telecollaborative platforms continue to arise which integrate these new affordances. Virtual reality has been primarily concerned with creating simulations of environments that can transport participants to real or imagined spaces that replace the "real world". Meanwhile, Augmented Reality systems have evolved to interleave objects from Virtual Reality environments into the physical landscape. Perhaps now there is a new class of systems that reverse this precept to enhance dynamic media landscapes and immersive physical display environments to enable intuitive data exploration through collaboration. Vroom (Virtual Room) is a next-generation reconfigurable tiled display environment in development at the California Institute for Telecommunications and Information Technology (Calit2) at the University of California, San Diego. Vroom enables freely scalable digital collaboratories, connecting distributed, high-resolution visualization resources for collaborative work in the sciences, engineering and the arts. Vroom transforms a physical space into an immersive media environment with large-format interactive display surfaces, video teleconferencing and spatialized audio built on a high-speed optical network backbone. Vroom enables group collaboration for local and remote participants to share knowledge and experiences. Possible applications include: remote learning, command and control, storyboarding, post-production editorial review, high-resolution video playback, 3D visualization, screencasting and image, video and multimedia file sharing. To support these various scenarios, Vroom features support for multiple user interfaces (optical tracking, touch UI, gesture interface, etc.), support for directional and spatialized audio, giga-pixel image interactivity, 4K video streaming, 3D visualization and telematic production.
This paper explains the design process that has been utilized to make Vroom an accessible and intuitive immersive environment for remote collaboration specifically for digital cinema production.

  8. Virtual reality environments for post-stroke arm rehabilitation.

    PubMed

    Subramanian, Sandeep; Knaut, Luiz A; Beaudoin, Christian; McFadyen, Bradford J; Feldman, Anatol G; Levin, Mindy F

    2007-06-22

    Optimal practice and feedback elements are essential requirements for maximal motor recovery in patients with motor deficits due to central nervous system lesions. A virtual environment (VE) was created that incorporates practice and feedback elements necessary for maximal motor recovery. It permits varied and challenging practice in a motivating environment that provides salient feedback. The VE gives the user knowledge of results feedback about motor behavior and knowledge of performance feedback about the quality of pointing movements made in a virtual elevator. Movement distances are related to length of body segments. We describe an immersive and interactive experimental protocol developed in a virtual reality environment using the CAREN system. The VE can be used as a training environment for the upper limb in patients with motor impairments.

  9. Virtual worlds: a new frontier for nurse education?

    PubMed

    Green, Janet; Wyllie, Aileen; Jackson, Debra

    2014-01-01

    Virtual worlds have the potential to offer nursing students social networking and learning opportunities through the use of collaborative and immersive learning. If nursing educators are to stay abreast of contemporary learning opportunities, an exploration of the potential benefits of virtual worlds and their possibilities is needed. Literature was sourced that explored virtual worlds and their use in education, but not nursing education specifically. It is clear that immersive learning has positive benefits for nursing; however, the best way to approach virtual reality in nursing education has yet to be ascertained.

  10. Learning immersion without getting wet

    NASA Astrophysics Data System (ADS)

    Aguilera, Julieta C.

    2012-03-01

    This paper describes the teaching of an immersive environments class in the Spring of 2011. The class had students from undergraduate as well as graduate art-related majors. Their digital backgrounds and interests were also diverse. These variables were channeled as different approaches throughout the semester. Class components included fundamentals of stereoscopic computer graphics to explore spatial depth, 3D modeling and skeleton animation to explore presence, exposure to formats like a stereo projection wall and dome environments to compare field of view across devices, and finally, interaction and tracking to explore issues of embodiment. All these components were supported by theoretical readings discussed in class. Guest artists presented their work in Virtual Reality, dome environments and other immersive formats. Museum professionals also introduced students to space science visualizations, which utilize immersive formats. Here I present the assignments and their outcomes, together with insights as to how the creation of immersive environments can be learned through constraints that expose students to situations of embodied cognition.

  11. Saliency in VR: How Do People Explore Virtual Environments?

    PubMed

    Sitzmann, Vincent; Serrano, Ana; Pavel, Amy; Agrawala, Maneesh; Gutierrez, Diego; Masia, Belen; Wetzstein, Gordon

    2018-04-01

    Understanding how people explore immersive virtual environments is crucial for many applications, such as designing virtual reality (VR) content, developing new compression algorithms, or learning computational models of saliency or visual attention. Whereas a body of recent work has focused on modeling saliency in desktop viewing conditions, VR is very different from these conditions in that viewing behavior is governed by stereoscopic vision and by the complex interaction of head orientation, gaze, and other kinematic constraints. To further our understanding of viewing behavior and saliency in VR, we capture and analyze gaze and head orientation data of 169 users exploring stereoscopic, static omni-directional panoramas, for a total of 1980 head and gaze trajectories for three different viewing conditions. We provide a thorough analysis of our data, which leads to several important insights, such as the existence of a particular fixation bias, which we then use to adapt existing saliency predictors to immersive VR conditions. In addition, we explore other applications of our data and analysis, including automatic alignment of VR video cuts, panorama thumbnails, panorama video synopsis, and saliency-based compression.
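
    The equatorial fixation bias reported above suggests a simple way to adapt an existing desktop saliency predictor to omnidirectional panoramas: re-weight each row of the saliency map by latitude. The Gaussian bias shape and its width in the sketch below are illustrative assumptions, not the paper's fitted model.

```python
import numpy as np

def apply_equator_bias(saliency, sigma_deg=20.0):
    """Re-weight an equirectangular saliency map toward the equator.

    saliency: H x W array for a 360-degree panorama (row 0 = +90 deg latitude).
    Returns a map normalized to sum to 1.
    """
    h, _ = saliency.shape
    lat = np.linspace(90.0, -90.0, h)                     # latitude per row
    bias = np.exp(-(lat ** 2) / (2.0 * sigma_deg ** 2))   # Gaussian around 0 deg
    biased = saliency * bias[:, None]
    total = biased.sum()
    return biased / total if total > 0 else biased
```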

  12. Multi-Robot Interfaces and Operator Situational Awareness: Study of the Impact of Immersion and Prediction

    PubMed Central

    Peña-Tapia, Elena; Martín-Barrio, Andrés; Olivares-Méndez, Miguel A.

    2017-01-01

    Multi-robot missions are a challenge for operators in terms of workload and situational awareness. These operators have to receive data from the robots, extract information, understand the situation properly, make decisions, generate the adequate commands, and send them to the robots. The consequences of excessive workload and lack of awareness can vary from inefficiencies to accidents. This work focuses on the study of future operator interfaces of multi-robot systems, taking into account relevant issues such as multimodal interactions, immersive devices, predictive capabilities and adaptive displays. Specifically, four interfaces have been designed and developed: a conventional, a predictive conventional, a virtual reality and a predictive virtual reality interface. The four interfaces have been validated by the performance of twenty-four operators that supervised eight multi-robot missions of fire surveillance and extinguishing. The results of the workload and situational awareness tests show that virtual reality improves the situational awareness without increasing the workload of operators, whereas the effects of predictive components are not significant and depend on their implementation. PMID:28749407

  13. An Interactive Logistics Centre Information Integration System Using Virtual Reality

    NASA Astrophysics Data System (ADS)

    Hong, S.; Mao, B.

    2018-04-01

    The logistics industry plays a very important role in the operation of modern cities. Meanwhile, the development of the logistics industry has given rise to various problems that urgently need to be solved, such as the safety of logistics products. This paper combines the study of logistics traceability and logistics centre environmental safety supervision with virtual reality technology, and creates an interactive logistics centre information integration system. The proposed system utilizes the immersive characteristic of virtual reality to simulate the real logistics centre scene distinctly, so that operations staff can conduct safety supervision training at any time without regional restrictions. On the one hand, a large number of sensor data can be used to simulate a variety of disaster emergency situations; on the other hand, collecting personnel operation data to analyse improper operations can greatly improve training efficiency.

  14. Virtual community centre for power wheelchair training: Experience of children and clinicians.

    PubMed

    Torkia, Caryne; Ryan, Stephen E; Reid, Denise; Boissy, Patrick; Lemay, Martin; Routhier, François; Contardo, Resi; Woodhouse, Janet; Archambault, Phillipe S

    2017-11-02

    To: 1) characterize the overall experience of using the McGill immersive wheelchair - community centre (miWe-CC) simulator; and 2) investigate the experience of presence (i.e., the sense of being in the virtual rather than in the real, physical environment) while driving a power wheelchair (PW) in the miWe-CC. A qualitative research design with structured interviews was used. Fifteen clinicians and 11 children were interviewed after driving a PW in the miWe-CC simulator. Data were analyzed using conventional and directed content analysis approaches. Overall, participants enjoyed using the simulator and experienced a sense of presence in the virtual space: they felt present in the virtual environment, involved and focused on driving the virtual PW rather than on the surroundings of the actual room. Participants reported several similarities between the virtual community centre layout and activities of the miWe-CC and the day-to-day reality of paediatric PW users. The simulator matched participants' expectations of real-life PW use and shows promise for improving the driving skills of new PW users. Implications for rehabilitation: Among young users, the McGill immersive wheelchair (miWe) simulator provides an experience of presence within the virtual environment. This experience of presence is generated by a sense of being in the virtual scene; a sense of being involved, engaged, and focused on interacting within the virtual environment; and the perception that the virtual environment is consistent with the real world. The miWe is a relevant and accessible approach, complementary to real-world power wheelchair training for young users.

  15. The Evolution of Constructivist Learning Environments: Immersion in Distributed, Virtual Worlds.

    ERIC Educational Resources Information Center

    Dede, Chris

    1995-01-01

    Discusses the evolution of constructivist learning environments and examines the collaboration of simulated software models, virtual environments, and evolving mental models via immersion in artificial realities. A sidebar gives a realistic example of a student navigating through cyberspace. (JMV)

  16. Research on Intelligent Synthesis Environments

    NASA Technical Reports Server (NTRS)

    Noor, Ahmed K.; Lobeck, William E.

    2002-01-01

    Four research activities related to Intelligent Synthesis Environment (ISE) have been performed under this grant. The four activities are: 1) non-deterministic approaches that incorporate technologies such as intelligent software agents, visual simulations and other ISE technologies; 2) virtual labs that leverage modeling, simulation and information technologies to create an immersive, highly interactive virtual environment tailored to the needs of researchers and learners; 3) advanced learning modules that incorporate advanced instructional, user interface and intelligent agent technologies; and 4) assessment and continuous improvement of engineering team effectiveness in distributed collaborative environments.

  17. Research on Intelligent Synthesis Environments

    NASA Astrophysics Data System (ADS)

    Noor, Ahmed K.; Loftin, R. Bowen

    2002-12-01

    Four research activities related to Intelligent Synthesis Environment (ISE) have been performed under this grant. The four activities are: 1) non-deterministic approaches that incorporate technologies such as intelligent software agents, visual simulations and other ISE technologies; 2) virtual labs that leverage modeling, simulation and information technologies to create an immersive, highly interactive virtual environment tailored to the needs of researchers and learners; 3) advanced learning modules that incorporate advanced instructional, user interface and intelligent agent technologies; and 4) assessment and continuous improvement of engineering team effectiveness in distributed collaborative environments.

  18. Can hazard risk be communicated through a virtual experience?

    PubMed

    Mitchell, J T

    1997-09-01

    Cyberspace, defined by William Gibson as a consensual hallucination, now refers to all computer-generated interactive environments. Virtual reality, one of a class of interactive cyberspaces, allows us to create and interact directly with objects not available in the everyday world. Despite successes in the entertainment and aviation industries, this technology has been called a 'solution in search of a problem'. The purpose of this commentary is to suggest such a problem: the inability to acquire experience with a hazard to motivate mitigation. Direct experience with a hazard has been demonstrated to be a powerful incentive to adopt mitigation measures. While we lack the ability to summon hazard events at will in order to gain access to that experience, a virtual environment can provide an arena where potential victims are exposed to a hazard's effects. Immersion as an active participant within the hazard event through virtual reality may stimulate users to undertake mitigation steps that might otherwise remain undone. This paper details the possible directions in which virtual reality may be applied to hazards mitigation through a discussion of the technology, the role of hazard experience, the creation of a hazard simulation, and the issues constraining implementation.

  19. Studying and Treating Schizophrenia Using Virtual Reality: A New Paradigm

    PubMed Central

    Freeman, Daniel

    2008-01-01

    Understanding schizophrenia requires consideration of patients’ interactions in the social world. Misinterpretation of other peoples’ behavior is a key feature of persecutory ideation. The occurrence and intensity of hallucinations is affected by the social context. Negative symptoms such as anhedonia, asociality, and blunted affect reflect difficulties in social interactions. Withdrawal and avoidance of other people is frequent in schizophrenia, leading to isolation and rumination. The use of virtual reality (VR)—interactive immersive computer environments—allows one of the key variables in understanding psychosis, social environments, to be controlled, providing exciting applications to research and treatment. Seven applications of virtual social environments to schizophrenia are set out: symptom assessment, identification of symptom markers, establishment of predictive factors, tests of putative causal factors, investigation of the differential prediction of symptoms, determination of toxic elements in the environment, and development of treatment. The initial VR studies of persecutory ideation, which illustrate the ascription of personalities and mental states to virtual people, are highlighted. VR, suitably applied, holds great promise in furthering the understanding and treatment of psychosis. PMID:18375568

  20. Exploring 4D Flow Data in an Immersive Virtual Environment

    NASA Astrophysics Data System (ADS)

    Stevens, A. H.; Butkiewicz, T.

    2017-12-01

    Ocean models help us to understand and predict a wide range of intricate physical processes which comprise the atmospheric and oceanic systems of the Earth. Because these models output an abundance of complex time-varying three-dimensional (i.e., 4D) data, effectively conveying the myriad information from a given model poses a significant visualization challenge. The majority of the research effort into this problem has concentrated around synthesizing and examining methods for representing the data itself; by comparison, relatively few studies have looked into the potential merits of various viewing conditions and virtual environments. We seek to improve our understanding of the benefits offered by current consumer-grade virtual reality (VR) systems through an immersive, interactive 4D flow visualization system. Our dataset is a Regional Ocean Modeling System (ROMS) model representing a 12-hour tidal cycle of the currents within New Hampshire's Great Bay estuary. The model data was loaded into a custom VR particle system application using the OpenVR software library and the HTC Vive hardware, which tracks a headset and two six-degree-of-freedom (6DOF) controllers within a 5m-by-5m area. The resulting visualization system allows the user to coexist in the same virtual space as the data, enabling rapid and intuitive analysis of the flow model through natural interactions with the dataset and within the virtual environment. Whereas a traditional computer screen typically requires the user to reposition a virtual camera in the scene to obtain the desired view of the data, in virtual reality the user can simply move their head to the desired viewpoint, completely eliminating the mental context switches from data exploration/analysis to view adjustment and back. 
The tracked controllers become tools to quickly manipulate (reposition, reorient, and rescale) the dataset and to interrogate it by, e.g., releasing dye particles into the flow field, probing scalar velocities, placing a cutting plane through a region of interest, etc. It is hypothesized that the advantages afforded by head-tracked viewing and 6DOF interaction devices will lead to faster and more efficient examination of 4D flow data. A human factors study is currently being prepared to empirically evaluate this method of visualization and interaction.
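    The dye-particle interrogation described above rests on a standard technique: advecting massless particles through a time-varying gridded velocity field, interpolating both in space and between stored model time steps. A minimal sketch follows; the function names and the Euler time step are illustrative assumptions, not the authors' implementation (a production system would typically use a higher-order integrator and the full 3D ROMS grid).

    ```python
    import numpy as np

    def bilinear_sample(field, x, y):
        """Bilinearly interpolate a 2D scalar field at grid coordinates (x, y)."""
        x0, y0 = int(np.floor(x)), int(np.floor(y))
        x1 = min(x0 + 1, field.shape[1] - 1)
        y1 = min(y0 + 1, field.shape[0] - 1)
        fx, fy = x - x0, y - y0
        top = field[y0, x0] * (1 - fx) + field[y0, x1] * fx
        bot = field[y1, x0] * (1 - fx) + field[y1, x1] * fx
        return top * (1 - fy) + bot * fy

    def advect(particles, u_frames, v_frames, t, dt):
        """Advance particles one forward-Euler step through a time-varying field.

        u_frames, v_frames: arrays of shape (T, H, W), one velocity frame per
        stored model time step; t is blended linearly between adjacent frames.
        """
        t0 = int(np.floor(t)) % u_frames.shape[0]
        t1 = (t0 + 1) % u_frames.shape[0]
        a = t - np.floor(t)
        out = []
        for x, y in particles:
            u = (1 - a) * bilinear_sample(u_frames[t0], x, y) \
                + a * bilinear_sample(u_frames[t1], x, y)
            v = (1 - a) * bilinear_sample(v_frames[t0], x, y) \
                + a * bilinear_sample(v_frames[t1], x, y)
            out.append((x + u * dt, y + v * dt))
        return out
    ```

    Re-running this step each rendered frame, seeded wherever the controller releases dye, yields the streak-like traces the abstract describes.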

  1. The Virtual Pelvic Floor, a tele-immersive educational environment.

    PubMed Central

    Pearl, R. K.; Evenhouse, R.; Rasmussen, M.; Dech, F.; Silverstein, J. C.; Prokasy, S.; Panko, W. B.

    1999-01-01

    This paper describes the development of the Virtual Pelvic Floor, a new method of teaching the complex anatomy of the pelvic region utilizing virtual reality and advanced networking technology. Virtual reality technology allows improved visualization of three-dimensional structures over conventional media because it supports stereo vision, viewer-centered perspective, large angles of view, and interactivity. Two or more ImmersaDesk systems, drafting table format virtual reality displays, are networked together providing an environment where teacher and students share a high quality three-dimensional anatomical model, and are able to converse, see each other, and to point in three dimensions to indicate areas of interest. This project was realized by the teamwork of surgeons, medical artists and sculptors, computer scientists, and computer visualization experts. It demonstrates the future of virtual reality for surgical education and applications for the Next Generation Internet. PMID:10566378

  2. The effects of immersiveness on physiology.

    PubMed

    Wiederhold, B K; Davis, R; Wiederhold, M D

    1998-01-01

    The effects of varying levels of immersion in virtual reality environments on participants' heart rate, respiration rate, peripheral skin temperature, and skin resistance levels were examined. Subjective reports of presence were also noted. Participants were presented with a virtual environment of an airplane flight, both as seen on a two-dimensional computer screen and as seen from within a head-mounted display. Subjects were randomly assigned to different presentation orders, but all subjects received both conditions. Differences between the non-phobics' and the phobics' physiological responses when placed in a virtual environment related to the phobia were noted, as were changes in physiology based on degree of immersion.

  3. Virtually Ostracized: Studying Ostracism in Immersive Virtual Environments

    PubMed Central

    Wesselmann, Eric D.; Law, Alvin Ty; Williams, Kipling D.

    2012-01-01

    Electronic-based communication (such as Immersive Virtual Environments; IVEs) may offer new ways of satisfying the need for social connection, but they also provide ways this need can be thwarted. Ostracism, being ignored and excluded, is a common social experience that threatens fundamental human needs (i.e., belonging, control, self-esteem, and meaningful existence). Previous ostracism research has made use of a variety of paradigms, including minimal electronic-based interactions (e.g., Cyberball) and communication (e.g., chatrooms and Short Message Services). These paradigms, however, lack the mundane realism that many IVEs now offer. Further, IVE paradigms designed to measure ostracism may allow researchers to test more nuanced hypotheses about the effects of ostracism. We created an IVE in which ostracism could be manipulated experimentally, emulating a previously validated minimal ostracism paradigm. We found that participants who were ostracized in this IVE experienced the same negative effects demonstrated in other ostracism paradigms, providing, to our knowledge, the first evidence of the negative effects of ostracism in virtual environments. Though further research directly exploring these effects in online virtual environments is needed, this research suggests that individuals encountering ostracism in other virtual environments (such as massively multiplayer online role playing games; MMORPGs) may experience negative effects similar to those of being ostracized in real life. This possibility may have serious implications for individuals who are marginalized in their real life and turn to IVEs to satisfy their need for social connection. PMID:22897472

  4. Virtually ostracized: studying ostracism in immersive virtual environments.

    PubMed

    Kassner, Matthew P; Wesselmann, Eric D; Law, Alvin Ty; Williams, Kipling D

    2012-08-01

    Electronic-based communication (such as Immersive Virtual Environments; IVEs) may offer new ways of satisfying the need for social connection, but they also provide ways this need can be thwarted. Ostracism, being ignored and excluded, is a common social experience that threatens fundamental human needs (i.e., belonging, control, self-esteem, and meaningful existence). Previous ostracism research has made use of a variety of paradigms, including minimal electronic-based interactions (e.g., Cyberball) and communication (e.g., chatrooms and Short Message Services). These paradigms, however, lack the mundane realism that many IVEs now offer. Further, IVE paradigms designed to measure ostracism may allow researchers to test more nuanced hypotheses about the effects of ostracism. We created an IVE in which ostracism could be manipulated experimentally, emulating a previously validated minimal ostracism paradigm. We found that participants who were ostracized in this IVE experienced the same negative effects demonstrated in other ostracism paradigms, providing, to our knowledge, the first evidence of the negative effects of ostracism in virtual environments. Though further research directly exploring these effects in online virtual environments is needed, this research suggests that individuals encountering ostracism in other virtual environments (such as massively multiplayer online role playing games; MMORPGs) may experience negative effects similar to those of being ostracized in real life. This possibility may have serious implications for individuals who are marginalized in their real life and turn to IVEs to satisfy their need for social connection.

  5. High-immersion three-dimensional display of the numerical computer model

    NASA Astrophysics Data System (ADS)

    Xing, Shujun; Yu, Xunbo; Zhao, Tianqi; Cai, Yuanfa; Chen, Duo; Chen, Zhidong; Sang, Xinzhu

    2013-08-01

    High-immersion three-dimensional (3D) displays are valuable tools for many applications, such as architectural design and construction, industrial design, aeronautics, scientific research, entertainment, media advertisement, and military use. However, most technologies present 3D content on screens mounted parallel to a wall, in front of the viewer, which reduces the sense of immersion. To obtain correct multi-view stereo ground images, the cameras' photosensitive surfaces should be parallel to the common focal plane, and the cameras' optical axes should be offset toward the centre of that plane in both the vertical and horizontal directions. It is common to use virtual cameras, ideal pinhole cameras, to display a 3D model in a computer system, and virtual cameras can simulate this shooting method for multi-view ground-based stereo images. Here, two virtual shooting methods for ground-based high-immersion 3D display are presented. The position of each virtual camera is determined by the viewer's eye position in the real world. When the observer stands within the circumcircle of the 3D ground display, offset perspective projection virtual cameras are used; when the observer stands outside the circumcircle, offset perspective projection virtual cameras and orthogonal projection virtual cameras are adopted together. This paper mainly discusses the parameter settings of the virtual cameras: the near clip plane setting is the key point of the first method, while the rotation angle of the virtual cameras is the key point of the second. To validate the results, D3D and OpenGL were used to render scenes from different viewpoints and generate stereoscopic images. A realistic visualization system for 3D models, viewed horizontally, is constructed and demonstrated, providing high-immersion 3D visualization; the displayed 3D scenes are compared with real objects in the real world.
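    The offset perspective projection mentioned above corresponds to the standard asymmetric (off-axis) frustum used for viewer-tracked displays: the near-plane bounds are found by projecting the screen edges onto the near plane from the tracked eye position. The sketch below assumes the simplest case of a screen lying in the z=0 plane; the function name and parameterization are illustrative, not taken from the paper.

    ```python
    import numpy as np

    def off_axis_frustum(eye, screen_lo, screen_hi, near, far):
        """Asymmetric frustum bounds for a viewer-tracked planar screen.

        eye: (ex, ey, ez) with the screen in the z=0 plane and ez > 0;
        screen_lo / screen_hi: (x, y) of the screen's lower-left and
        upper-right corners. Returns (l, r, b, t) at the near plane,
        in the form consumed by glFrustum-style projection setup.
        """
        ex, ey, ez = eye
        scale = near / ez  # similar triangles: project screen edges to near plane
        l = (screen_lo[0] - ex) * scale
        r = (screen_hi[0] - ex) * scale
        b = (screen_lo[1] - ey) * scale
        t = (screen_hi[1] - ey) * scale
        return l, r, b, t
    ```

    With the eye centred in front of the screen this reduces to an ordinary symmetric frustum; as the tracked viewer moves sideways the frustum skews, which is exactly the offset behaviour the abstract describes.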

  6. The Student Experience With Varying Immersion Levels of Virtual Reality Simulation.

    PubMed

    Farra, Sharon L; Smith, Sherrill J; Ulrich, Deborah L

    With increasing use of virtual reality simulation (VRS) in nursing education and given the vast array of technologies available, a variety of levels of immersion and experiences can be provided to students. This study explored two different levels of immersive VRS capability. Study participants included baccalaureate nursing students from three universities across four campuses. Students were trained in the skill of decontamination using traditional methods or with VRS options of mouse and keyboard or head-mounted display technology. Results of focus group interviews reflect the student experience and satisfaction with two different immersive levels of VRS.

  7. Level of Immersion in Virtual Environments Impacts the Ability to Assess and Teach Social Skills in Autism Spectrum Disorder

    PubMed Central

    Bugnariu, Nicoleta L.

    2016-01-01

    Abstract Virtual environments (VEs) may be useful for delivering social skills interventions to individuals with autism spectrum disorder (ASD). Immersive VEs provide opportunities for individuals with ASD to learn and practice skills in a controlled replicable setting. However, not all VEs are delivered using the same technology, and the level of immersion differs across settings. We group studies into low-, moderate-, and high-immersion categories by examining five aspects of immersion. In doing so, we draw conclusions regarding the influence of this technical manipulation on the efficacy of VEs as a tool for assessing and teaching social skills. We also highlight ways in which future studies can advance our understanding of how manipulating aspects of immersion may impact intervention success. PMID:26919157

  8. Virtual reality training improves balance function.

    PubMed

    Mao, Yurong; Chen, Peiming; Li, Le; Huang, Dongfeng

    2014-09-01

    Virtual reality is a new technology that simulates a three-dimensional virtual world on a computer and enables the generation of visual, audio, and haptic feedback for the full immersion of users. Users can interact with and observe objects in three-dimensional visual space without limitation. At present, virtual reality training has been widely used in rehabilitation therapy for balance dysfunction. This paper summarizes related articles and other articles suggesting that virtual reality training can improve balance dysfunction in patients after neurological diseases. When patients perform virtual reality training, the prefrontal, parietal cortical areas and other motor cortical networks are activated. These activations may be involved in the reconstruction of neurons in the cerebral cortex. Growing evidence from clinical studies reveals that virtual reality training improves the neurological function of patients with spinal cord injury, cerebral palsy and other neurological impairments. These findings suggest that virtual reality training can activate the cerebral cortex and improve the spatial orientation capacity of patients, thus facilitating the cortex to control balance and increase motion function.

  9. Virtual reality training improves balance function

    PubMed Central

    Mao, Yurong; Chen, Peiming; Li, Le; Huang, Dongfeng

    2014-01-01

    Virtual reality is a new technology that simulates a three-dimensional virtual world on a computer and enables the generation of visual, audio, and haptic feedback for the full immersion of users. Users can interact with and observe objects in three-dimensional visual space without limitation. At present, virtual reality training has been widely used in rehabilitation therapy for balance dysfunction. This paper summarizes related articles and other articles suggesting that virtual reality training can improve balance dysfunction in patients after neurological diseases. When patients perform virtual reality training, the prefrontal, parietal cortical areas and other motor cortical networks are activated. These activations may be involved in the reconstruction of neurons in the cerebral cortex. Growing evidence from clinical studies reveals that virtual reality training improves the neurological function of patients with spinal cord injury, cerebral palsy and other neurological impairments. These findings suggest that virtual reality training can activate the cerebral cortex and improve the spatial orientation capacity of patients, thus facilitating the cortex to control balance and increase motion function. PMID:25368651

  10. InSPAL: A Novel Immersive Virtual Learning Programme.

    PubMed

    Byrne, Julia; Ip, Horace H S; Shuk-Ying Lau, Kate; Chen Li, Richard; Tso, Amy; Choi, Catherine

    2015-01-01

    In this paper we introduce the Interactive Sensory Program for Affective Learning (InSPAL), a pioneering virtual learning programme designed for severely intellectually disabled (SID) students, who have cognitive deficiencies and other sensory-motor handicaps and thus need more help and attention in overcoming their learning difficulties. By combining and integrating interactive media and virtual reality technology with the principles of art therapy and relevant pedagogical techniques, InSPAL aims to strengthen SID students' pre-learning abilities, promote their self-awareness, decrease behavioural interference with learning and social interaction, enhance their communication, and thus improve their quality of life. Results of our study show that students who went through the programme were more focused, and their ability to do things independently increased by 15%. Moreover, 50% of the students showed a marked improvement in their ability to raise their hands in response, thus increasing their communication skills. The use of therapeutic interventions enabled better control of body, mind and emotions, resulting in greater performance and better participation.

  11. Improving post-stroke cognitive and behavioral abnormalities by using virtual reality: A case report on a novel use of nirvana.

    PubMed

    De Luca, Rosaria; Torrisi, Michele; Piccolo, Adriana; Bonfiglio, Giovanni; Tomasello, Provvidenza; Naro, Antonino; Calabrò, Rocco Salvatore

    2017-10-11

    Cognitive impairment, as well as mood and anxiety disorders, occurs frequently in patients following stroke. The aim of this study was to evaluate the effects of a combined rehabilitative treatment using conventional relaxation and respiratory techniques in a specific rehabilitative virtual environment (using Bts-Nirvana). A 58-year-old woman, affected by hemorrhagic stroke, underwent two different rehabilitation trainings: standard relaxation techniques alone in a common clinical setting, and the same psychological approach in a semi-immersive virtual environment with augmented sensorial (audio-video) and motor feedback (sensory-motor interaction). We evaluated the patient's cognitive and psychological profile before and after each training, using a specific psychometric battery designed to assess cognitive status and attention processes and to detect mood alterations, anxiety, and coping strategies. Only at the end of the combined approach did we observe a significant improvement in attention and memory functions, with nearly complete relief of anxiety symptoms and an improvement in coping strategies. Relaxation and respiratory techniques in a semi-immersive virtual reality environment, using Bts-Nirvana, may be a promising tool for improving attention processes, coping strategies, and anxiety in individuals with neurological disorders, including stroke.

  12. The impact of self-avatars on trust and collaboration in shared virtual environments.

    PubMed

    Pan, Ye; Steed, Anthony

    2017-01-01

    A self-avatar is known to have a potentially significant impact on the user's experience of the immersive content but it can also affect how users interact with each other in a shared virtual environment (SVE). We implemented an SVE for a consumer virtual reality system where each user's body could be represented by a jointed self-avatar that was dynamically controlled by head and hand controllers. We investigated the impact of a self-avatar on collaborative outcomes such as completion time and trust formation during competitive and cooperative tasks. We used two different embodiment levels: no self-avatar and self-avatar, and compared these to an in-person face to face version of the tasks. We found that participants could finish the task more quickly when they cooperated than when they competed, for both the self-avatar condition and the face to face condition, but not for the no self-avatar condition. In terms of trust formation, both the self-avatar condition and the face to face condition led to higher scores than the no self-avatar condition; however, collaboration style had no significant effect on trust built between partners. The results are further evidence of the importance of a self-avatar representation in immersive virtual reality.

  13. The impact of self-avatars on trust and collaboration in shared virtual environments

    PubMed Central

    Steed, Anthony

    2017-01-01

    A self-avatar is known to have a potentially significant impact on the user’s experience of the immersive content but it can also affect how users interact with each other in a shared virtual environment (SVE). We implemented an SVE for a consumer virtual reality system where each user’s body could be represented by a jointed self-avatar that was dynamically controlled by head and hand controllers. We investigated the impact of a self-avatar on collaborative outcomes such as completion time and trust formation during competitive and cooperative tasks. We used two different embodiment levels: no self-avatar and self-avatar, and compared these to an in-person face to face version of the tasks. We found that participants could finish the task more quickly when they cooperated than when they competed, for both the self-avatar condition and the face to face condition, but not for the no self-avatar condition. In terms of trust formation, both the self-avatar condition and the face to face condition led to higher scores than the no self-avatar condition; however, collaboration style had no significant effect on trust built between partners. The results are further evidence of the importance of a self-avatar representation in immersive virtual reality. PMID:29240837

  14. The effect of visual and interaction fidelity on spatial cognition in immersive virtual environments.

    PubMed

    Mania, Katerina; Wooldridge, Dave; Coxon, Matthew; Robinson, Andrew

    2006-01-01

    Accuracy of memory performance per se is an imperfect reflection of the cognitive activity (awareness states) that underlies performance in memory tasks. The aim of this research is to investigate the effect of varied visual and interaction fidelity of immersive virtual environments on memory awareness states. A between groups experiment was carried out to explore the effect of rendering quality on location-based recognition memory for objects and associated states of awareness. The experimental space, consisting of two interconnected rooms, was rendered either flat-shaded or using radiosity rendering. The computer graphics simulations were displayed on a stereo head-tracked Head Mounted Display. Participants completed a recognition memory task after exposure to the experimental space and reported one of four states of awareness following object recognition. These reflected the level of visual mental imagery involved during retrieval, the familiarity of the recollection, and also included guesses. Experimental results revealed variations in the distribution of participants' awareness states across conditions while memory performance failed to reveal any. Interestingly, results revealed a higher proportion of recollections associated with mental imagery in the flat-shaded condition. These findings comply with similar effects revealed in two earlier studies summarized here, which demonstrated that the less "naturalistic" interaction interface or interface of low interaction fidelity provoked a higher proportion of recognitions based on visual mental images.

  15. Distributed interactive virtual environments for collaborative experiential learning and training independent of distance over Internet2.

    PubMed

    Alverson, Dale C; Saiki, Stanley M; Jacobs, Joshua; Saland, Linda; Keep, Marcus F; Norenberg, Jeffrey; Baker, Rex; Nakatsu, Curtis; Kalishman, Summers; Lindberg, Marlene; Wax, Diane; Mowafi, Moad; Summers, Kenneth L; Holten, James R; Greenfield, John A; Aalseth, Edward; Nickles, David; Sherstyuk, Andrei; Haines, Karen; Caudell, Thomas P

    2004-01-01

    Medical knowledge and skills essential for tomorrow's healthcare professionals continue to change faster than ever before, creating new demands in medical education. Project TOUCH (Telehealth Outreach for Unified Community Health) has been developing methods to enhance learning by coupling innovations in medical education with advanced technology in high-performance computing and next-generation Internet2, embedded in virtual reality environments (VRE), artificial intelligence and experiential active learning. Simulations have been used in education and training to allow learners to make mistakes safely in lieu of real-life situations, learn from those mistakes and ultimately improve performance by subsequently avoiding them. Distributed virtual interactive environments are used over distance to enable learning and participation in dynamic, problem-based, clinical, artificial intelligence rules-based virtual simulations. The virtual reality patient is programmed to change dynamically over time and respond to the learner's manipulations. Participants are fully immersed within the VRE platform using a head-mounted display and tracker system. Navigation, locomotion and handling of objects are accomplished using a joy-wand. Distribution is managed via the Internet2 Access Grid using point-to-point or multi-casting connectivity, through which the participants can interact. Medical students in Hawaii and New Mexico (NM) participated collaboratively in problem solving and managing a simulated patient with a closed head injury in the VRE; dividing tasks, handing off objects, and functioning as a team. Students stated that opportunities to make mistakes and repeat actions in the VRE were extremely helpful in learning specific principles. The VRE created higher performance expectations and some anxiety among VRE users. VRE orientation was adequate, but students needed time to adapt and practice in order to improve efficiency. 
This was also demonstrated successfully between Western Australia and UNM. We successfully demonstrated the ability to fully immerse participants in a distributed virtual environment, independent of distance, for collaborative team interaction in medical simulation designed for education and training. The ability to make mistakes in a safe environment is well received by students and has a positive impact on their understanding, as well as their memory of the principles involved in correcting those mistakes. Bringing people together as virtual teams for interactive experiential learning and collaborative training, independent of distance, provides a platform for distributed "just-in-time" training, performance assessment and credentialing. Further validation is necessary to determine the potential value of the distributed VRE in knowledge transfer and improved future performance, and should entail training participants to competence in using these tools.

  16. New Desktop Virtual Reality Technology in Technical Education

    ERIC Educational Resources Information Center

    Ausburn, Lynna J.; Ausburn, Floyd B.

    2008-01-01

    Virtual reality (VR) that immerses users in a 3D environment through use of headwear, body suits, and data gloves has demonstrated effectiveness in technical and professional education. Immersive VR is highly engaging and appealing to technically skilled young Net Generation learners. However, technical difficulty and very high costs have kept…

  17. Situating Pedagogies, Positions and Practices in Immersive Virtual Worlds

    ERIC Educational Resources Information Center

    Savin-Baden, Maggi; Gourlay, Lesley; Tombs, Cathy; Steils, Nicole; Tombs, Gemma; Mawer, Matt

    2010-01-01

    Background: The literature on immersive virtual worlds and e-learning to date largely indicates that technology has led the pedagogy. Although rationales for implementing e-learning have included flexibility of provision and supporting diversity, none of these recommendations has helped to provide strong pedagogical location. Furthermore, there is…

  18. A Virtual World for Collaboration: The AETZone

    ERIC Educational Resources Information Center

    Cheney, Amelia W.; Sanders, Robert L.; Matzen, Nita J.; Bronack, Stephen C.; Riedl, Richard E.; Tashner, John H.

    2009-01-01

    Participation in learning communities, and the construction of knowledge in communities of practice, are important considerations in the use of 3D immersive worlds. This article describes the creation of this type of learning environment in AETZone, an immersive virtual environment in use within graduate programs at Appalachian State University…

  19. Building a Collaborative Online Literary Experience

    ERIC Educational Resources Information Center

    Essid, Joe; Wilde, Fran

    2011-01-01

    Effective virtual simulations can embed participants in imaginary worlds. Researchers working in virtual worlds and gaming often refer to "immersion," a state in which a participant or player loses track of time and becomes one with the simulation. Immersive settings have been shown to deepen learning. Ken Hudson's work with students…

  20. Experiencing Soil Science from your office through virtual experiences

    NASA Astrophysics Data System (ADS)

    Beato, M. Carmen; González-Merino, Ramón; Campillo, M. Carmen; Fernández-Ahumada, Elvira; Ortiz, Leovigilda; Taguas, Encarnación V.; Guerrero, José Emilio

    2017-04-01

    Currently, numerous tools based on the new information and communication technologies offer a wide range of possibilities for the implementation of interactive methodologies in Education and Science. In particular, virtual reality and immersive worlds - artificially generated computer environments where users interact through a figurative individual that represents them in that environment (their "avatar") - have been identified as the technology that will change the way we live, particularly in educational terms, product development and entertainment areas (Schmorrow, 2009). Gisbert-Cervera et al. (2011) consider that 3D worlds in education, among others, provide a unique environment for training and the exchange of knowledge, allowing goal-oriented reflection to support activities and achieve learning outcomes. In Soil Sciences, the experimental component is essential to acquire the knowledge necessary to understand the biogeochemical processes taking place and their interactions with time, climate, topography and the living organisms present. In this work, an immersive virtual environment which reproduces a series of pits has been developed to evaluate and differentiate soil characteristics such as texture, structure, consistency, color and other physical-chemical and biological properties for educational purposes. Bibliographical material such as pictures, books and papers was collected in order to organize the information needed and to build the soil profiles into the virtual environment. The virtual recreation was built with Unreal Engine 4 (UE4; https://www.unrealengine.com/unreal-engine-4). This program was chosen because it provides two toolsets for programmers that can be used in tandem to accelerate development workflows. In addition, Unreal Engine 4 technology powers hundreds of games as well as real-time 3D films, training simulations and visualizations, and it produces very realistic graphics. 
For the evaluation of its impact and its usefulness in teaching, a series of surveys will be presented to undergraduate students and teachers. REFERENCES: Gisbert-Cervera M., Esteve-Gonzalez V., Camacho-Marti M.M. (2011). Delve into the Deep: Learning Potential in Metaverses and 3D Worlds. eLearning Papers (25). ISSN: 1887-1542. Schmorrow D.D. (2009). Why virtual? Theoretical Issues in Ergonomics Science 10(3): 279-282.

  1. The Components of Effective Teacher Training in the Use of Three-Dimensional Immersive Virtual Worlds for Learning and Instruction Purposes: A Literature Review

    ERIC Educational Resources Information Center

    Nussli, Natalie; Oh, Kevin

    2014-01-01

    The overarching question that guides this review is to identify the key components of effective teacher training in virtual schooling, with a focus on three-dimensional (3D) immersive virtual worlds (IVWs). The process of identifying the essential components of effective teacher training in the use of 3D IVWs will be described step-by-step. First,…

  2. The influence of action on episodic memory: a virtual reality study.

    PubMed

    Plancher, Gaën; Barra, Julien; Orriols, Eric; Piolino, Pascale

    2013-01-01

    A range of empirical findings suggest that active learning is important for memory. However, few studies have focused on the mechanisms underlying this enactment effect in episodic memory using complex environments. Research using virtual reality has yielded inconsistent results. We postulated that the effect of action depends on the degree of interaction with the environment and freedom in the planning of an itinerary. To test these hypotheses, we disentangled the interaction and planning components of action to investigate whether each enhances factual and spatial memory. Seventy-two participants (36 male and 36 female) explored a virtual town in one of three experimental conditions: (a) a passive condition in which participants were immersed as a passenger in the car (no interaction, no planning); (b) a planning-only condition (the subject chose the itinerary but did not drive the car); (c) an interaction-only condition (the subject drove the car but the itinerary was fixed). We found that itinerary choice and motor control both enhanced spatial memory, while factual memory was impaired by online motor control. The role of action in memory is discussed.

  3. a Low-Cost and Lightweight 3d Interactive Real Estate-Purposed Indoor Virtual Reality Application

    NASA Astrophysics Data System (ADS)

    Ozacar, K.; Ortakci, Y.; Kahraman, I.; Durgut, R.; Karas, I. R.

    2017-11-01

    Interactive 3D architectural indoor design has become more popular since it began to benefit from Virtual Reality (VR) technologies. VR brings computer-generated 3D content to real-life scale and enables users to observe immersive indoor environments and modify them directly. This opportunity enables buyers to purchase a property off-the-plan more cheaply through virtual models. Instead of showing the property through 2D plans or renders, the visualized interior architecture of an on-sale unbuilt property is demonstrated beforehand so that investors have an impression as if they were in the physical building. However, current applications either use highly resource-consuming software, are non-interactive, or require specialists to create such environments. In this study, we have created a real-estate-purposed, low-cost, high-quality, fully interactive VR application that provides a realistic interior architecture of the property using free and lightweight software: Sweet Home 3D and Unity. A preliminary study showed that participants generally liked the proposed real-estate-purposed VR application and that it satisfied the expectations of property buyers.

  4. Semi-Immersive Virtual Turbine Engine Simulation System

    NASA Astrophysics Data System (ADS)

    Abidi, Mustufa H.; Al-Ahmari, Abdulrahman M.; Ahmad, Ali; Darmoul, Saber; Ameen, Wadea

    2018-05-01

    The design and verification of assembly operations are essential for planning product production operations. Recently, virtual prototyping has witnessed tremendous progress, and has reached a stage where current environments enable rich and multi-modal interaction between designers and models through stereoscopic visuals, surround sound, and haptic feedback. The benefits of building and using Virtual Reality (VR) models in assembly process verification are discussed in this paper. We present the virtual assembly (VA) of an aircraft turbine engine. The assembly parts and sequences are explained using a virtual reality design system. The system enables stereoscopic visuals, surround sound, and ample and intuitive interaction with the developed models. A special software architecture is suggested to describe the assembly parts and assembly sequence in VR. A collision detection mechanism is employed that provides visual feedback to check the interference between components. The system is tested for virtual prototyping and assembly sequencing of a turbine engine. We show that the developed system is comprehensive in terms of VR feedback mechanisms, which include visual, auditory, and tactile as well as force feedback. The system is shown to be effective and efficient for validating the design of assembly, part design, and operations planning.
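
The interference check described in this abstract can be sketched with an axis-aligned bounding-box (AABB) overlap test, the usual first pass in virtual-assembly collision detection. This is a minimal illustration, not the paper's implementation; the part names and box extents are invented.

```python
from dataclasses import dataclass

@dataclass
class AABB:
    """Axis-aligned bounding box given by its min and max corners."""
    lo: tuple  # (x, y, z) minimum corner
    hi: tuple  # (x, y, z) maximum corner

def intersects(a: AABB, b: AABB) -> bool:
    """Boxes overlap iff their extents overlap on every axis."""
    return all(a.lo[i] <= b.hi[i] and b.lo[i] <= a.hi[i] for i in range(3))

# Hypothetical example: a blade being slid toward a hub during assembly.
blade = AABB((0.0, 0.0, 0.0), (1.0, 0.2, 0.2))
hub   = AABB((0.9, 0.0, 0.0), (1.5, 1.0, 1.0))

if intersects(blade, hub):
    print("interference: highlight parts")  # the visual-feedback cue
```

Production systems typically refine such broad-phase hits with exact mesh-level tests, but the per-axis interval check above is the standard filter.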

  5. Two Impurities in a Bose-Einstein Condensate: From Yukawa to Efimov Attracted Polarons

    NASA Astrophysics Data System (ADS)

    Naidon, Pascal

    2018-04-01

    The well-known Yukawa and Efimov potentials are two different mediated interaction potentials. The first one arises in quantum field theory from the exchange of virtual particles. The second one is mediated by a real particle resonantly interacting with two other particles. This Letter shows how two impurities immersed in a Bose-Einstein condensate can exhibit both phenomena. For a weak attraction with the condensate, the two impurities form two polarons that interact through a weak Yukawa attraction mediated by virtual excitations. For a resonant attraction with the condensate, the exchanged excitation becomes a real boson and the mediated interaction changes to a strong Efimov attraction that can bind the two polarons. The resulting bipolarons turn into in-medium Efimov trimers made of the two impurities and one boson. Evidence of this physics could be seen in ultracold mixtures of atoms.

  6. Virtual environment display for a 3D audio room simulation

    NASA Technical Reports Server (NTRS)

    Chapin, William L.; Foster, Scott H.

    1992-01-01

    The development of a virtual environment simulation system integrating a 3D acoustic audio model with an immersive 3D visual scene is discussed. The system complements the acoustic model and is specified to: allow the listener to freely move about the space, a room of manipulable size, shape, and audio character, while interactively relocating the sound sources; reinforce the listener's feeling of telepresence in the acoustical environment with visual and proprioceptive sensations; enhance the audio with the graphic and interactive components, rather than overwhelm or reduce it; and serve as a research testbed and technology transfer demonstration. The hardware/software designs of two demonstration systems, one installed and one portable, are discussed through the development of four iterative configurations.

  7. Wildcard: A wearable virtual reality storytelling tool for children with intellectual developmental disability.

    PubMed

    Gelsomini, Mirko; Garzotto, Franca; Montesano, Daniele; Occhiuto, Daniele

    2016-08-01

    Our research aims at supporting existing therapies for children with intellectual and developmental disorders (IDD). Personal and social autonomy is the desired end state, enabling a smooth integration into the real world. We developed and tested a framework for storytelling and learning activities that exploits an immersive virtual reality viewer to interact with target users. We co-designed our system with experts from the medical sector, identifying features that allow patients to stay focused on the exercises to perform. Our approach triggers a learning process for a seamless assimilation of common behavioral skills useful in everyday life. This paper highlights the technological challenges in healthcare and discusses cutting-edge interaction paradigms.

  8. Enhancing Tele-robotics with Immersive Virtual Reality

    DTIC Science & Technology

    2017-11-03

    graduate and undergraduate students within the Digital Gaming and Simulation, Computer Science, and psychology programs have actively collaborated...investigates the use of artificial intelligence and visual computing. Numerous fields across the human-computer interaction and gaming research areas...invested in digital gaming and simulation to cognitively stimulate humans by computers, forming a $10.5B industry [1]. On the other hand, cognitive

  9. Image-Based Virtual Tours and 3d Modeling of Past and Current Ages for the Enhancement of Archaeological Parks: the Visualversilia 3d Project

    NASA Astrophysics Data System (ADS)

    Castagnetti, C.; Giannini, M.; Rivola, R.

    2017-05-01

    The research project VisualVersilia 3D aims at offering a new way to promote the territory and its heritage by matching the traditional reading of the document with the potential of modern communication technologies for cultural tourism. Recently, research on the use of new technologies applied to cultural heritage has turned its attention mainly to technologies that reconstruct and narrate the complexity of the territory and its heritage, including 3D scanning, 3D printing and augmented reality. Some museums and archaeological sites already exploit the potential of digital tools to preserve and spread their heritage, but interactive services involving tourists in an immersive and more modern experience are still rare. The innovation of the project consists in the development of a methodology for documenting current and past historical ages and integrating their 3D visualizations with rendering capable of returning an immersive virtual reality for a successful enhancement of the heritage. The project implements the methodology in the archaeological complex of Massaciuccoli, one of the best-preserved Roman sites of the Versilia area (Tuscany, Italy). The activities of the project briefly consist in developing: 1. a virtual tour of the site in its current configuration, based on spherical images enhanced by texts, graphics and audio guides, enabling both an immersive and a remote tourist experience; 2. a 3D reconstruction of the evidence and buildings in their current condition, for documentation and conservation purposes, based on a complete metric survey carried out through laser scanning; 3. 3D virtual reconstructions of the main historical periods, based on historical investigation and analysis of the acquired data.

  10. Scalable metadata environments (MDE): artistically impelled immersive environments for large-scale data exploration

    NASA Astrophysics Data System (ADS)

    West, Ruth G.; Margolis, Todd; Prudhomme, Andrew; Schulze, Jürgen P.; Mostafavi, Iman; Lewis, J. P.; Gossmann, Joachim; Singh, Rajvikram

    2014-02-01

    Scalable Metadata Environments (MDEs) are an artistic approach to designing immersive environments for large-scale data exploration in which users interact with data by forming multiscale patterns that they alternately disrupt and reform. Developed and prototyped as part of an art-science research collaboration, we define an MDE as a 4D virtual environment structured by quantitative and qualitative metadata describing multidimensional data collections. Entire data sets (e.g., tens of millions of records) can be visualized and sonified at multiple scales and at different levels of detail so they can be explored interactively in real time within MDEs. They are designed to reflect similarities and differences in the underlying data or metadata such that patterns can be visually/aurally sorted in an exploratory fashion by an observer who is not familiar with the details of the mapping from data to visual, auditory or dynamic attributes. While many approaches for visual and auditory data mining exist, MDEs are distinct in that they utilize qualitative and quantitative data and metadata to construct multiple interrelated conceptual coordinate systems. These "regions" function as conceptual lattices for scalable auditory and visual representations within virtual environments computationally driven by multi-GPU, CUDA-enabled fluid dynamics systems.
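
The multiscale idea in this abstract, aggregating the same records at different levels of detail so patterns emerge without knowledge of the underlying mapping, can be sketched as follows. The metadata fields and records are hypothetical illustrations, not the MDE data model.

```python
from collections import Counter

# Hypothetical records: (year, category, size) metadata tuples.
records = [
    (2001, "alpha", 1200), (2001, "alpha", 900),
    (2001, "beta", 1500), (2008, "beta", 1100),
    (2008, "gamma", 2000), (2008, "gamma", 1800),
]

def aggregate(records, key):
    """Collapse records onto a metadata axis; counts could drive glyph size."""
    return Counter(key(r) for r in records)

# Coarse scale: one aggregate per year.
coarse = aggregate(records, key=lambda r: r[0])
# Fine scale: the same records re-bucketed by (year, category).
fine = aggregate(records, key=lambda r: (r[0], r[1]))

print(coarse, fine)
```

In an actual MDE the counts would be mapped to visual or auditory attributes; the point here is only that one record set supports several interrelated coordinate systems.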

  11. Utility of virtual reality environments to examine physiological reactivity and subjective distress in adults who stutter.

    PubMed

    Brundage, Shelley B; Brinton, James M; Hancock, Adrienne B

    2016-12-01

    Virtual reality environments (VREs) allow for immersion in speaking environments that mimic real-life interactions while maintaining researcher control. VREs have been used successfully to engender arousal in other disorders. The purpose of this study was to investigate the utility of virtual reality environments to examine physiological reactivity and subjective ratings of distress in persons who stutter (PWS). Subjective and objective measures of arousal were collected from 10 PWS during four-minute speeches to a virtual audience and to a virtual empty room. Stuttering frequency and physiological measures (skin conductance level and heart rate) did not differ across speaking conditions, but subjective ratings of distress were significantly higher in the virtual audience condition than in the virtual empty room. VREs have utility in elevating subjective ratings of distress in PWS, and they have the potential to be useful tools for practicing treatment targets in a safe, controlled, and systematic manner.

  12. Real-time, interactive, visually updated simulator system for telepresence

    NASA Technical Reports Server (NTRS)

    Schebor, Frederick S.; Turney, Jerry L.; Marzwell, Neville I.

    1991-01-01

    Time delays and limited sensory feedback of remote telerobotic systems tend to disorient teleoperators and dramatically decrease the operator's performance. To remove the effects of time delays, key components of a prototype forward simulation subsystem, the Global-Local Environment Telerobotic Simulator (GLETS), were designed and developed to buffer the operator from the remote task. GLETS totally immerses an operator in a real-time, interactive, simulated, visually updated artificial environment of the remote telerobotic site. Using GLETS, the operator will, in effect, enter a telerobotic virtual reality and can easily form a gestalt of the virtual 'local site' that matches the operator's normal interactions with the remote site. In addition to use in space-based telerobotics, GLETS, due to its extendable architecture, can also be used in other teleoperational environments such as toxic material handling, construction, and undersea exploration.

  13. Immersive Virtual Worlds in University-Level Human Geography Courses

    ERIC Educational Resources Information Center

    Dittmer, Jason

    2010-01-01

    This paper addresses the potential for increased deployment of immersive virtual worlds in higher geographic education. An account of current practice regarding popular culture in the geography classroom is offered, focusing on the objectification of popular culture rather than its constitutive role vis-a-vis place. Current e-learning practice is…

  14. Feasibility of Using an Augmented Immersive Virtual Reality Learning Environment to Enhance Music Conducting Skills

    ERIC Educational Resources Information Center

    Orman, Evelyn K.; Price, Harry E.; Russell, Christine R.

    2017-01-01

    Acquiring nonverbal skills necessary to appropriately communicate and educate members of performing ensembles is essential for wind band conductors. Virtual reality learning environments (VRLEs) provide a unique setting for developing these proficiencies. For this feasibility study, we used an augmented immersive VRLE to enhance eye contact, torso…

  15. The Utility of Using Immersive Virtual Environments for the Assessment of Science Inquiry Learning

    ERIC Educational Resources Information Center

    Code, Jillianne; Clarke-Midura, Jody; Zap, Nick; Dede, Chris

    2013-01-01

    Determining the effectiveness of any educational technology depends upon teachers' and learners' perception of the functional utility of that tool for teaching, learning, and assessment. The Virtual Performance project at Harvard University is developing and studying the feasibility of using immersive technology to develop performance…

  16. The Design, Development and Evaluation of a Virtual Reality Based Learning Environment

    ERIC Educational Resources Information Center

    Chen, Chwen Jen

    2006-01-01

    Many researchers and instructional designers increasingly recognise the benefits of utilising three dimensional virtual reality (VR) technology in instruction. In general, there are two types of VR system, the immersive system and the non-immersive system. This article focuses on the latter system that merely uses the conventional personal…

  17. Using Immersive Virtual Environments for Certification

    NASA Technical Reports Server (NTRS)

    Lutz, R.; Cruz-Neira, C.

    1998-01-01

    Immersive virtual environment (VE) technology has matured to the point where it can be utilized as a scientific and engineering problem-solving tool. In particular, VEs are starting to be used to design and evaluate safety-critical systems that involve human operators, such as flight and driving simulators, complex machinery training, and emergency rescue strategies.

  18. A haptic interface for virtual simulation of endoscopic surgery.

    PubMed

    Rosenberg, L B; Stredney, D

    1996-01-01

    Virtual reality can be described as a convincingly realistic and naturally interactive simulation in which the user is given a first-person illusion of being immersed within a computer-generated environment. While virtual reality systems offer great potential to reduce the cost and increase the quality of medical training, many technical challenges must be overcome before such simulation platforms offer effective alternatives to more traditional training means. A primary challenge in developing effective virtual reality systems is designing the human interface hardware that allows rich sensory information to be presented to users in natural ways. When simulating a given manual procedure, task-specific human interface requirements dictate task-specific human interface hardware. The following paper explores the design of human interface hardware that satisfies the task-specific requirements of virtual reality simulation of endoscopic surgical procedures. Design parameters were derived through direct cadaver studies and interviews with surgeons. The final hardware design is presented.

  19. A comparison of older adults' subjective experiences with virtual and real environments during dynamic balance activities.

    PubMed

    Proffitt, Rachel; Lange, Belinda; Chen, Christina; Winstein, Carolee

    2015-01-01

    The purpose of this study was to explore the subjective experience of older adults interacting with both virtual and real environments. Thirty healthy older adults engaged with real and virtual tasks of similar motor demands: reaching to a target in standing and stepping stance. Immersive tendencies and absorption scales were administered before the session. Game engagement and experience questionnaires were completed after each task, followed by a semistructured interview at the end of the testing session. Data were analyzed respectively using paired t tests and grounded theory methodology. Participants preferred the virtual task over the real task. They also reported an increase in presence and absorption with the virtual task, describing an external focus of attention. Findings will be used to inform future development of appropriate game-based balance training applications that could be embedded in the home or community settings as part of evidence-based fall prevention programs.
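
The paired design reported in this abstract (each participant experienced both the real and the virtual task) is exactly what a paired t test handles. Below is a stdlib-only sketch of the statistic; the ratings are invented for illustration, not the study's data.

```python
import math
from statistics import mean, stdev

def paired_t(x, y):
    """Paired t statistic: t = mean(d) / (stdev(d) / sqrt(n)), d = x - y."""
    d = [a - b for a, b in zip(x, y)]
    return mean(d) / (stdev(d) / math.sqrt(len(d)))

# Hypothetical presence ratings for 8 participants, virtual vs. real task.
virtual = [6, 7, 5, 6, 7, 6, 5, 7]
real    = [4, 5, 4, 5, 6, 4, 4, 5]

t = paired_t(virtual, real)
print(f"t({len(virtual) - 1}) = {t:.2f}")  # prints "t(7) = 7.94"
```

The resulting t is compared against the critical value for n - 1 degrees of freedom; in practice one would use a library routine such as `scipy.stats.ttest_rel` to obtain the p-value as well.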

  20. Simulation of Physical Experiments in Immersive Virtual Environments

    NASA Technical Reports Server (NTRS)

    Noor, Ahmed K.; Wasfy, Tamer M.

    2001-01-01

    An object-oriented, event-driven immersive virtual environment is described for the creation of virtual labs (VLs) for simulating physical experiments. Discussion focuses on a number of aspects of the VLs, including interface devices, software objects, and various applications. The VLs interface with output devices, including immersive stereoscopic screen(s) and stereo speakers, and a variety of input devices, including body tracking (head and hands), haptic gloves, wand, joystick, mouse, microphone, and keyboard. The VL incorporates the following types of primitive software objects: interface objects, support objects, geometric entities, and finite elements. Each object encapsulates a set of properties, methods, and events that define its behavior, appearance, and functions. A container object allows grouping of several objects. Applications of the VLs include viewing the results of the physical experiment, viewing a computer simulation of the physical experiment, simulation of the experiment's procedure, computational steering, and remote control of the physical experiment. In addition, the VL can be used as a risk-free (safe) environment for training. The implementation of virtual structures testing machines, virtual wind tunnels, and a virtual acoustic testing facility is described.

  1. Immersive Visual Analytics for Transformative Neutron Scattering Science

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Steed, Chad A; Daniel, Jamison R; Drouhard, Margaret

    The ORNL Spallation Neutron Source (SNS) provides the most intense pulsed neutron beams in the world for scientific research and development across a broad range of disciplines. SNS experiments produce large volumes of complex data that are analyzed by scientists with varying degrees of experience using 3D visualization and analysis systems. However, it is notoriously difficult to achieve proficiency with 3D visualizations. Because 3D representations are key to understanding the neutron scattering data, scientists are unable to analyze their data in a timely fashion, resulting in inefficient use of the limited and expensive SNS beam time. We believe a more intuitive interface for exploring neutron scattering data can be created by combining immersive virtual reality technology with high-performance data analytics and human interaction. In this paper, we present our initial investigations of immersive visualization concepts as well as our vision for an immersive visual analytics framework that could lower the barriers to 3D exploratory data analysis of neutron scattering data at the SNS.

  2. Interreality: A New Paradigm for E-health.

    PubMed

    Riva, Giuseppe

    2009-01-01

    "Interreality" is a personalized immersive e-therapy whose main novelty is a hybrid, closed-loop empowering experience bridging physical and virtual worlds. The main feature of interreality is a twofold link between the virtual and the real world: (a) behavior in the physical world influences the experience in the virtual one; (b) behavior in the virtual world influences the experience in the real one. This is achieved through: (1) 3D Shared Virtual Worlds: role-playing experiences in which one or more users interact with one another within a 3D world; (2) Bio and Activity Sensors (From the Real to the Virtual World): They are used to track the emotional/health/activity status of the user and to influence his/her experience in the virtual world (aspect, activity and access); (3) Mobile Internet Appliances (From the Virtual to the Real One): In interreality, the social and individual user activity in the virtual world has a direct link with the users' life through a mobile phone/digital assistant. The different technologies that are involved in the interreality vision and its clinical rationale are addressed and discussed.

  3. Formalizing and Promoting Collaboration in 3D Virtual Environments - A Blueprint for the Creation of Group Interaction Patterns

    NASA Astrophysics Data System (ADS)

    Schmeil, Andreas; Eppler, Martin J.

    Although virtual worlds and other types of multi-user 3D collaboration spaces have long been subjects of research and practical application, it remains unclear how best to benefit from meeting with colleagues and peers in a virtual environment with the aim of working together. Making use of the potential of virtual embodiment, i.e. being immersed in a space as a personal avatar, allows for innovative new forms of collaboration. In this paper, we present a framework that serves as a systematic formalization of collaboration elements in virtual environments. The framework is based on the semiotic distinctions among pragmatic, semantic and syntactic perspectives. It serves as a blueprint to guide users in designing, implementing, and executing virtual collaboration patterns tailored to their needs. We present two team and two community collaboration pattern examples resulting from the application of the framework: Virtual Meeting, Virtual Design Studio, Spatial Group Configuration, and Virtual Knowledge Fair. In conclusion, we also point out future research directions for this emerging domain.

  4. Applying Virtual Reality to commercial Edutainment

    NASA Technical Reports Server (NTRS)

    Grissom, F.; Goza, Sharon P.; Goza, S. Michael

    1994-01-01

    Virtual reality (VR), when defined as a computer-generated, immersive, three-dimensional graphics environment providing varying degrees of interactivity, remains an expensive, highly specialized application yet to find its way into the school, home, or business. As a novel approach to a theme-park-type attraction, though, its use can be justified. This paper describes how a virtual reality 'tour of the human digestive system' was created for the Omniplex Science Museum of Oklahoma City, Oklahoma. The customer's main objectives were: (1) to educate; (2) to entertain; (3) to draw visitors; and (4) to generate revenue. The 'Edutainment' system ultimately delivered met these goals. As more such systems come into existence, the resulting library of licensable programs will greatly reduce development costs for individual institutions.

  5. Virtual reality: a reality for future military pilotage?

    NASA Astrophysics Data System (ADS)

    McIntire, John P.; Martinsen, Gary L.; Marasco, Peter L.; Havig, Paul R.

    2009-05-01

    Virtual reality (VR) systems provide exciting new ways to interact with information and with the world. The visual VR environment can be synthetic (computer generated) or an indirect view of the real world using sensors and displays. Given the potential opportunities of a VR system, the question arises of what benefits or detriments a military pilot might incur by operating in such an environment. Immersive and compelling VR displays could be accomplished with an HMD (e.g., imagery on the visor), large area collimated displays, or by putting the imagery on an opaque canopy. But what issues arise when, instead of viewing the world directly, a pilot views a "virtual" image of the world? Is 20/20 visual acuity in a VR system good enough? To deliver this acuity over the entire visual field would require over 43 megapixels (MP) of display surface for an HMD or about 150 MP for an immersive CAVE system, either of which presents a serious challenge with current technology. Additionally, the same number of sensor pixels would be required to drive the displays at this resolution (along with formidable network architectures to relay this information), or massive computer clusters would be needed to create an entirely computer-generated virtual reality at this resolution. Can we presently implement such a system? What other visual requirements or engineering issues should be considered? As the technology evolves, many technological issues and human factors considerations need to be addressed before a pilot is placed within a virtual cockpit.
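    The megapixel figures quoted in this abstract can be sanity-checked with a back-of-the-envelope calculation: 20/20 acuity corresponds to resolving about one arcminute, i.e. roughly 60 pixels per degree. The field-of-view numbers below are assumptions chosen for illustration, not values taken from the paper:

```python
# Eye-limited display resolution estimate.
# 20/20 acuity resolves about 1 arcminute, i.e. roughly 60 pixels per degree.
PIXELS_PER_DEGREE = 60

def required_megapixels(h_fov_deg, v_fov_deg):
    """Pixels needed to cover a rectangular field of view at 1 px/arcmin."""
    horizontal = h_fov_deg * PIXELS_PER_DEGREE
    vertical = v_fov_deg * PIXELS_PER_DEGREE
    return horizontal * vertical / 1e6

# Hypothetical wide-FOV HMD: 120 degrees horizontal by 100 degrees vertical.
print(required_megapixels(120, 100))  # 43.2
```

    With an assumed 120° × 100° HMD field of view this lands near the paper's 43 MP figure; a CAVE wrapping most of the visual sphere needs several times more, consistent with the quoted 150 MP.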

  6. Visualizing Mars Using Virtual Reality: A State of the Art Mapping Technique Used on Mars Pathfinder

    NASA Technical Reports Server (NTRS)

    Stoker, C.; Zbinden, E.; Blackmon, T.; Nguyen, L.

    1999-01-01

    We describe an interactive terrain visualization system which rapidly generates and interactively displays photorealistic three-dimensional (3-D) models produced from stereo images. This product, first demonstrated on Mars Pathfinder, is interactive, 3-D, and can be viewed in an immersive display, which qualifies it for the name Virtual Reality (VR). The use of this technology on Mars Pathfinder was the first use of VR for geologic analysis. A primary benefit of using VR to display geologic information is that it provides an improved perception of depth and spatial layout of the remote site. The VR aspect of the display allows an operator to move freely in the environment, unconstrained by the physical limitations of the perspective from which the data were acquired. Virtual Reality offers a way to archive and retrieve information that is intuitively obvious. Combining VR models with stereo display systems can give the user a sense of presence at the remote location. The capability to interactively perform measurements from within the VR model offers unprecedented ease in performing operations that are normally time consuming and difficult using other techniques. Thus, Virtual Reality can be a powerful cartographic tool. Additional information is contained in the original extended abstract.
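    Building 3-D terrain from stereo images rests on stereo triangulation. The abstract does not describe the Pathfinder pipeline's internals, so the following is only the standard textbook disparity-to-depth relation (Z = f·B/d) with illustrative numbers:

```python
def depth_from_disparity(focal_px, baseline_m, disparity_px):
    """Standard stereo triangulation: Z = f * B / d.

    focal_px     -- camera focal length expressed in pixels
    baseline_m   -- separation between the two cameras, in meters
    disparity_px -- horizontal shift of a feature between the two images
    """
    if disparity_px <= 0:
        raise ValueError("disparity must be positive for a finite depth")
    return focal_px * baseline_m / disparity_px

# Illustrative values only: 1000 px focal length, 15 cm baseline,
# and a feature shifted 30 px between the left and right images.
print(depth_from_disparity(1000, 0.15, 30))  # 5.0 (meters)
```

    Applying this per matched pixel yields the point cloud from which a photorealistic terrain mesh can be textured.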

  7. A Learning Evaluation for an Immersive Virtual Laboratory for Technical Training Applied into a Welding Workshop

    ERIC Educational Resources Information Center

    Torres, Francisco; Neira Tovar, Leticia A.; del Rio, Marta Sylvia

    2017-01-01

    This study aims to explore the results of welding virtual training performance, designed using a learning model based on cognitive and usability techniques, applying an immersive concept focused on the person's attention. Moreover, it is also intended to demonstrate that there exists a moderating effect of performance improvement when the user experience is taken…

  8. Collaborative Science Learning in Three-Dimensional Immersive Virtual Worlds: Pre-Service Teachers' Experiences in Second Life

    ERIC Educational Resources Information Center

    Nussli, Natalie; Oh, Kevin; McCandless, Kevin

    2014-01-01

    The purpose of this mixed methods study was to help pre-service teachers experience and evaluate the potential of Second Life, a three-dimensional immersive virtual environment, for potential integration into their future teaching. By completing collaborative assignments in Second Life, nineteen pre-service general education teachers explored an…

  9. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Drouhard, Margaret MEG G; Steed, Chad A; Hahn, Steven E

    In this paper, we propose strategies and objectives for immersive data visualization with applications in materials science using the Oculus Rift virtual reality headset. We provide background on currently available analysis tools for neutron scattering data and other large-scale materials science projects. In the context of the current challenges facing scientists, we discuss immersive virtual reality visualization as a potentially powerful solution. We introduce a prototype immersive visualization system, developed in conjunction with materials scientists at the Spallation Neutron Source, which we have used to explore large crystal structures and neutron scattering data. Finally, we offer our perspective on the greatest challenges that must be addressed to build effective and intuitive virtual reality analysis tools that will be useful for scientists in a wide range of fields.

  10. 3D Modelling and Mapping for Virtual Exploration of Underwater Archaeology Assets

    NASA Astrophysics Data System (ADS)

    Liarokapis, F.; Kouřil, P.; Agrafiotis, P.; Demesticha, S.; Chmelík, J.; Skarlatos, D.

    2017-02-01

    This paper investigates immersive technologies to increase exploration time at an underwater archaeological site, both for the public and for researchers and scholars. The focus is on the Mazotos shipwreck site in Cyprus, which lies 44 meters underwater. The aim of this work is two-fold: (a) realistic modelling and mapping of the site and (b) an immersive virtual reality visit. For 3D modelling and mapping, optical data were used. The underwater exploration is composed of a variety of sea elements, including plants, fish, stones, and artefacts, which are randomly positioned. Users can experience an immersive virtual underwater visit to the Mazotos shipwreck site and obtain information about the shipwreck and its contents, raising their archaeological knowledge and cultural awareness.

  11. Foreign Language Vocabulary Development through Activities in an Online 3D Environment

    ERIC Educational Resources Information Center

    Milton, James; Jonsen, Sunniva; Hirst, Steven; Lindenburn, Sharn

    2012-01-01

    Online virtual 3D worlds offer the opportunity for users to interact in real time with native speakers of the language they are learning. In principle, this ought to be of great benefit to learners, mimicking the opportunity for immersion that real-life travel to a foreign country offers. We have very little research to show whether this is…

  12. CliniSpace: a multiperson 3D online immersive training environment accessible through a browser.

    PubMed

    Dev, Parvati; Heinrichs, W LeRoy; Youngblood, Patricia

    2011-01-01

    Immersive online medical environments, with dynamic virtual patients, have been shown to be effective for scenario-based learning (1). However, ease of use and ease of access have been barriers to their use. We used feedback from prior evaluation of these projects to design and develop CliniSpace. To improve usability, we retained the richness of prior virtual environments but modified the user interface. To improve access, we used a Software-as-a-Service (SaaS) approach to present a richly immersive 3D environment within a web browser.

  13. Visualization of reservoir simulation data with an immersive virtual reality system

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Williams, B.K.

    1996-10-01

    This paper discusses an investigation into the use of an immersive virtual reality (VR) system to visualize reservoir simulation output data. The hardware and software configurations of the test-immersive VR system are described and compared to a nonimmersive VR system and to an existing workstation screen-based visualization system. The structure of 3D reservoir simulation data and the actions to be performed on the data within the VR system are discussed. The subjective results of the investigation are then presented, followed by a discussion of possible future work.

  14. Immersive Virtual Reality Technologies as a New Platform for Science, Scholarship, and Education

    NASA Astrophysics Data System (ADS)

    Djorgovski, Stanislav G.; Hut, P.; McMillan, S.; Knop, R.; Vesperini, E.; Graham, M.; Portegies Zwart, S.; Farr, W.; Mahabal, A.; Donalek, C.; Longo, G.

    2010-01-01

    Immersive virtual reality (VR) and virtual worlds (VWs) are an emerging set of technologies which likely represent the next evolutionary step in the ways we use information technology to interact with the world of information and with other people, roles now generally fulfilled by the Web and other common Internet applications. Currently, these technologies are mainly accessed through various VWs, e.g., Second Life (SL), which are general platforms for a broad range of user activities. As an experiment in the utilization of these technologies for science, scholarship, education, and public outreach, we have formed the Meta-Institute for Computational Astrophysics (MICA; http://mica-vw.org), the first professional scientific organization based exclusively in VWs. The goals of MICA are: (1) exploration, development and promotion of VWs and VR technologies for professional research in astronomy and related fields; (2) providing and developing novel social networking venues and mechanisms for scientific collaboration and communications, including professional meetings, effective telepresence, etc.; (3) use of VWs and VR technologies for education and public outreach; and (4) exchange of ideas and joint efforts with other scientific disciplines in promoting these goals for science and scholarship in general. To this effect, we have a regular schedule of professional and public outreach events in SL, including technical seminars, workshops, a journal club, collaboration meetings, public lectures, etc. We find that these technologies are already remarkably effective as a telepresence platform for scientific and scholarly discussions and meetings. They can offer substantial savings of time and resources and eliminate much unnecessary travel. They are equally effective as a public outreach platform, reaching a world-wide audience. On the pure research front, we are currently exploring the use of these technologies as a venue for numerical simulations and their visualization, as well as the immersive and interactive visualization of highly-dimensional data sets.

  15. A formal anthropological view of motivation models of problematic MMO play: achievement, social, and immersion factors in the context of culture.

    PubMed

    Snodgrass, Jeffrey G; Dengah, H J Francois; Lacy, Michael G; Fagan, Jesse

    2013-04-01

    Yee (2006) found three motivational factors (achievement, social, and immersion) underlying play in massively multiplayer online role-playing games ("MMORPGs" or "MMOs" for short). Subsequent work has suggested that these factors foster problematic or addictive forms of play in online worlds. In the current study, we used an online survey of respondents (N = 252), constructed and also interpreted in reference to ethnography and interviews, to examine problematic play in World of Warcraft (WoW; Blizzard Entertainment, 2004-2013). We relied on tools from psychological anthropology to reconceptualize each of Yee's three motivational factors in order to test for the possible role of culture in problematic MMO play: (a) for achievement, we examined how "cultural consonance" with normative understandings of success might structure problematic forms of play; (b) for social, we analyzed the possibility that developing overvalued virtual relationships that are cut off from offline social interactions might further exacerbate problematic play; and (c) in relation to immersion, we examined how "dissociative" blurring of actual- and virtual-world identities and experiences might contribute to problematic patterns. Our results confirmed that, compared to Yee's original motivational factors, these culturally sensitive measures better predict problematic forms of play, pointing to the important role of sociocultural factors in structuring online play.

  16. Haptic, Virtual Interaction and Motor Imagery: Entertainment Tools and Psychophysiological Testing

    PubMed Central

    Invitto, Sara; Faggiano, Chiara; Sammarco, Silvia; De Luca, Valerio; De Paolis, Lucio T.

    2016-01-01

    In this work, the perception of affordances was analysed in terms of cognitive neuroscience during an interactive experience in a virtual reality environment. In particular, we chose a virtual reality scenario based on the Leap Motion controller: this sensor device captures the movements of the user's hand and fingers, which are reproduced on a computer screen by the appropriate software applications. For our experiment, we employed a sample of 10 subjects matched by age and sex and chosen among university students. The subjects took part in motor imagery training and an immersive affordance condition (a virtual training with Leap Motion and a haptic training with real objects). After each training session, the subjects performed a recognition task in order to investigate event-related potential (ERP) components. The results revealed significant differences in the attentional components during the Leap Motion training. During the Leap Motion session, latencies increased in the occipital lobes, which handle visual sensory processing; in contrast, latencies decreased in the frontal lobe, where the brain is mainly activated for attention and action planning. PMID:26999151

  18. Collaborative virtual environments art exhibition

    NASA Astrophysics Data System (ADS)

    Dolinsky, Margaret; Anstey, Josephine; Pape, Dave E.; Aguilera, Julieta C.; Kostis, Helen-Nicole; Tsoupikova, Daria

    2005-03-01

    This panel presentation will exhibit artwork developed in CAVEs and discuss how art methodologies enhance the science of VR through collaboration, interaction and aesthetics. Artists and scientists work alongside one another to expand scientific research and artistic expression and are motivated by exhibiting collaborative virtual environments. Looking towards the arts, such as painting and sculpture, computer graphics captures a visual tradition. Virtual reality expands this tradition to not only what we face, but to what surrounds us and even what responds to our body and its gestures. Art making that once was isolated to the static frame and an optimal point of view is now out and about, in fully immersive mode within CAVEs. Art knowledge is a guide to how the aesthetics of 2D and 3D worlds affect, transform, and influence the social, intellectual and physical condition of the human body through attention to psychology, spiritual thinking, education, and cognition. The psychological interacts with the physical in the virtual in such a way that each facilitates, enhances and extends the other, culminating in a "go together" world. Attention to sharing art experience across high-speed networks introduces a dimension of liveliness and aliveness when we "become virtual" in real time with others.

  19. How virtual reality works: illusions of vision in "real" and virtual environments

    NASA Astrophysics Data System (ADS)

    Stark, Lawrence W.

    1995-04-01

    Visual illusions abound in normal vision--illusions of clarity and completeness, of continuity in time and space, of presence and vivacity--and are part and parcel of the visual world in which we live. These illusions are discussed in terms of the human visual system, with its high-resolution fovea, moved from point to point in the visual scene by rapid saccadic eye movements (EMs). This sampling of visual information is supplemented by a low-resolution, wide peripheral field of view, especially sensitive to motion. Cognitive-spatial models controlling perception, imagery, and 'seeing' also control the EMs that shift the fovea in the Scanpath mode. These illusions provide for presence, the sense of being within an environment. They equally well lead to 'telepresence,' the sense of being within a virtual display, especially if the operator is intensely interacting through an eye-hand and head-eye human-machine interface that provides congruent visual and motor frames of reference. Interaction, immersion, and interest compel telepresence; intuitive functioning and engineered information flows can optimize human adaptation to the artificial new world of virtual reality as it expands into entertainment, simulation, telerobotics, scientific visualization, and other professional work.

  20. [Virtual reality in the treatment of mental disorders].

    PubMed

    Malbos, Eric; Boyer, Laurent; Lançon, Christophe

    2013-11-01

    Virtual reality is a medium allowing users to interact in real time with computerized virtual environments. The application of this immersive technology to cognitive behavioral therapies is increasingly exploited for the treatment of mental disorders. The present study is a review of the literature spanning from 1992 to 2012. It depicts the utility of this new tool for assessment and therapy through the various clinical studies carried out on subjects exhibiting diverse mental disorders. Most of the studies conducted attest to the significant efficacy of Virtual Reality Exposure Therapy (VRET) for the treatment of distinct mental disorders. Comparative studies of VRET with the treatment of reference (the in vivo exposure component of cognitive behavioral therapy) document an equal efficacy of the two methods and, in some cases, a superior therapeutic effect in favor of VRET. Even though larger-scale clinical experiments, extended follow-up, and studies of factors influencing presence are needed, virtual reality exposure represents an efficacious, confidential, affordable, flexible, and interactive therapeutic method whose application will progressively widen in the field of mental health. Copyright © 2013 Elsevier Masson SAS. All rights reserved.

  1. Cognitive factors associated with immersion in virtual environments

    NASA Technical Reports Server (NTRS)

    Psotka, Joseph; Davison, Sharon

    1993-01-01

    Immersion into the dataspace provided by a computer, and the feeling of really being there or 'presence', are commonly acknowledged as the uniquely important features of virtual reality environments. How immersed one feels appears to be determined by a complex set of physical components and affordances of the environment, together with psychological processes that are as yet poorly understood. Pimentel and Teixeira say that the experience of being immersed in a computer-generated world involves the same mental shift of 'suspending your disbelief for a period of time' as 'when you get wrapped up in a good novel or become absorbed in playing a computer game'. That sounds as if it could be right, but it would be good to get some evidence for these important conclusions. It might be even better to connect these statements with theoretical positions that try to do justice to complex cognitive processes. The basic precondition for understanding Virtual Reality (VR) is understanding the spatial representation systems that localize our bodies or egocenters in space. The effort to understand these cognitive processes is being driven with new energy by the pragmatic demands of successful virtual reality environments, but the literature is sparse and largely anecdotal.

  2. An immersive virtual peer for studying social influences on child cyclists' road-crossing behavior.

    PubMed

    Babu, Sabarish V; Grechkin, Timofey Y; Chihak, Benjamin; Ziemer, Christine; Kearney, Joseph K; Cremer, James F; Plumert, Jodie M

    2011-01-01

    The goal of our work is to develop a programmatically controlled peer to bicycle with a human subject for the purpose of studying how social interactions influence road-crossing behavior. The peer is controlled through a combination of reactive controllers that determine the gross motion of the virtual bicycle, action-based controllers that animate the virtual bicyclist and generate verbal behaviors, and a keyboard interface that allows an experimenter to initiate the virtual bicyclist's actions during the course of an experiment. The virtual bicyclist's repertoire of behaviors includes road following, riding alongside the human rider, stopping at intersections, and crossing intersections through specified gaps in traffic. The virtual cyclist engages the human subject through gaze, gesture, and verbal interactions. We describe the structure of the behavior code and report the results of a study examining how 10- and 12-year-old children interact with a peer cyclist that makes either risky or safe choices in selecting gaps in traffic. Results of our study revealed that children who rode with a risky peer were more likely to cross intermediate-sized gaps than children who rode with a safe peer. In addition, children were significantly less likely to stop at the last six intersections after riding with the risky rather than the safe peer during the first six intersections. The results of the study and children's reactions to the virtual peer indicate that our virtual peer framework is a promising platform for future behavioral studies of peer influences on children's bicycle riding behavior. © 2011 IEEE. Published by the IEEE Computer Society.
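    The layered control scheme this abstract describes (a reactive layer for gross motion plus experimenter-triggered discrete actions) could be sketched roughly as follows. Every class, method, and behavior name here is a hypothetical illustration, not the authors' actual behavior code:

```python
# Hypothetical sketch of layered control for a virtual riding peer:
# a reactive layer continuously adjusts gross motion, while an
# action-based layer switches discrete behaviors on experimenter input.
class VirtualPeer:
    ACTIONS = {"follow_road", "stop_at_intersection", "cross_gap"}

    def __init__(self):
        self.speed = 0.0
        self.behavior = "follow_road"

    def reactive_update(self, rider_speed):
        # Reactive layer: smoothly match the human rider's speed
        # so the peer rides alongside rather than jumping to it.
        self.speed += 0.1 * (rider_speed - self.speed)

    def trigger(self, action):
        # Action-based layer: discrete behaviors an experimenter
        # can initiate (e.g. from a keyboard interface).
        if action not in self.ACTIONS:
            raise ValueError(f"unknown action: {action}")
        self.behavior = action

peer = VirtualPeer()
peer.reactive_update(rider_speed=4.0)
peer.trigger("stop_at_intersection")
print(round(peer.speed, 2), peer.behavior)  # 0.4 stop_at_intersection
```

    In a real system the reactive update would run every simulation tick, with the active behavior shaping the target speed and the animated and verbal output.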

  3. The forensic holodeck: an immersive display for forensic crime scene reconstructions.

    PubMed

    Ebert, Lars C; Nguyen, Tuan T; Breitbeck, Robert; Braun, Marcel; Thali, Michael J; Ross, Steffen

    2014-12-01

    In forensic investigations, crime scene reconstructions are created based on a variety of three-dimensional image modalities. Although the data gathered are three-dimensional, their presentation on computer screens and paper is two-dimensional, which incurs a loss of information. By applying immersive virtual reality (VR) techniques, we propose a system that allows a crime scene to be viewed as if the investigator were present at the scene. We used a low-cost VR headset originally developed for computer gaming in our system. The headset offers a large viewing volume and tracks the user's head orientation in real-time, and an optical tracker is used for positional information. In addition, we created a crime scene reconstruction to demonstrate the system. In this article, we present a low-cost system that allows immersive, three-dimensional and interactive visualization of forensic incident scene reconstructions.

  4. The Effects of Actual Human Size Display and Stereoscopic Presentation on Users' Sense of Being Together with and of Psychological Immersion in a Virtual Character

    PubMed Central

    Ahn, Dohyun; Seo, Youngnam; Kim, Minkyung; Kwon, Joung Huem; Jung, Younbo; Ahn, Jungsun

    2014-01-01

    This study examined the role of display size and mode in increasing users' sense of being together with, and of their psychological immersion in, a virtual character. Using a high-resolution three-dimensional virtual character, this study employed a 2 × 2 (stereoscopic vs. monoscopic mode × actual human size vs. small display size) factorial design in an experiment with 144 participants randomly assigned to each condition. Findings showed that stereoscopic mode had a significant effect on both users' sense of being together and psychological immersion. However, display size affected only the sense of being together. Furthermore, display size was not found to moderate the effect of stereoscopic mode. PMID:24606057

  5. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Markidis, S.; Rizwan, U.

    The use of a virtual nuclear control room can be an effective and powerful tool for training personnel working in nuclear power plants. Operators could experience and simulate the functioning of the plant, even in critical situations, without being in a real power plant or running any risk. 3D models can be exported to Virtual Reality formats and then displayed in the Virtual Reality environment, providing an immersive 3D experience. However, two major limitations of this approach are that the 3D models exhibit static textures and are not fully interactive, and therefore cannot be used effectively in training personnel. In this paper we first describe a possible solution for embedding the output of a computer application in a 3D virtual scene, coupling real-world applications and VR systems. The VR system reported here grabs the output of an application running on an X server, creates a texture with the output, and then displays it on a screen or a wall in the virtual reality environment. We then propose a simple model for providing interaction between the user in the VR system and the running simulator. This approach is based on the use of an internet-based application that can be commanded from a laptop or tablet PC added to the virtual environment. (authors)

  6. A second life for eHealth: prospects for the use of 3-D virtual worlds in clinical psychology.

    PubMed

    Gorini, Alessandra; Gaggioli, Andrea; Vigna, Cinzia; Riva, Giuseppe

    2008-08-05

    The aim of the present paper is to describe the role played by three-dimensional (3-D) virtual worlds in eHealth applications, addressing some potential advantages and issues related to the use of this emerging medium in clinical practice. Due to the enormous diffusion of the World Wide Web (WWW), telepsychology, and telehealth in general, have become accepted and validated methods for the treatment of many different health care concerns. The introduction of the Web 2.0 has facilitated the development of new forms of collaborative interaction between multiple users based on 3-D virtual worlds. This paper describes the development and implementation of a form of tailored immersive e-therapy called p-health whose key factor is interreality, that is, the creation of a hybrid augmented experience merging physical and virtual worlds. We suggest that compared with conventional telehealth applications such as emails, chat, and videoconferences, the interaction between real and 3-D virtual worlds may convey greater feelings of presence, facilitate the clinical communication process, positively influence group processes and cohesiveness in group-based therapies, and foster higher levels of interpersonal trust between therapists and patients. However, challenges related to the potentially addictive nature of such virtual worlds and questions related to privacy and personal safety will also be discussed.

  7. Using immersive simulation for training first responders for mass casualty incidents.

    PubMed

    Wilkerson, William; Avstreih, Dan; Gruppen, Larry; Beier, Klaus-Peter; Woolliscroft, James

    2008-11-01

    A descriptive study was performed to better understand the possible utility of immersive virtual reality simulation for training first responders in a mass casualty event. Utilizing a virtual reality cave automatic virtual environment (CAVE) and high-fidelity human patient simulator (HPS), a group of experts modeled a football stadium that experienced a terrorist explosion during a football game. Avatars (virtual patients) were developed by expert consensus that demonstrated a spectrum of injuries ranging from death to minor lacerations. A group of paramedics was assessed by observation for decisions made and action taken. A critical action checklist was created and used for direct observation and viewing videotaped recordings. Of the 12 participants, only 35.7% identified the type of incident they encountered. None identified a secondary device that was easily visible. All participants were enthusiastic about the simulation and provided valuable comments and insights. Learner feedback and expert performance review suggests that immersive training in a virtual environment has the potential to be a powerful tool to train first responders for high-acuity, low-frequency events, such as a terrorist attack.

  8. Using virtual reality to analyze sports performance.

    PubMed

    Bideau, Benoit; Kulpa, Richard; Vignais, Nicolas; Brault, Sébastien; Multon, Franck; Craig, Cathy

    2010-01-01

    Improving performance in sports can be difficult because many biomechanical, physiological, and psychological factors come into play during competition. A better understanding of the perception-action loop employed by athletes is necessary. This requires isolating contributing factors to determine their role in player performance. Because of its inherent limitations, video playback doesn't permit such in-depth analysis. Interactive, immersive virtual reality (VR) can overcome these limitations and foster a better understanding of sports performance from a behavioral-neuroscience perspective. Two case studies using VR technology and a sophisticated animation engine demonstrate how to use information from visual displays to inform a player's future course of action.

  9. Development of a Virtual Museum Including a 4d Presentation of Building History in Virtual Reality

    NASA Astrophysics Data System (ADS)

    Kersten, T. P.; Tschirschwitz, F.; Deggim, S.

    2017-02-01

    In the last two decades the definition of the term "virtual museum" has changed due to rapid technological developments. Using today's available 3D technologies, a virtual museum is no longer just a presentation of collections on the Internet or a virtual tour of an exhibition using panoramic photography. On the one hand, a virtual museum should enhance a museum visitor's experience by providing access to additional materials for review and knowledge deepening, either before or after the real visit. On the other hand, a virtual museum should also be used as teaching material in the context of museum education. The Laboratory for Photogrammetry & Laser Scanning of the HafenCity University Hamburg has developed a virtual museum (VM) of the museum "Alt-Segeberger Bürgerhaus", a historic town house. The VM offers two options for visitors wishing to explore the museum without travelling to the city of Bad Segeberg, Schleswig-Holstein, Germany: (a) an interactive computer-based tour for visitors to explore the exhibition and to collect information of interest, or (b) an immersive 3D virtual reality experience using the HTC Vive Virtual Reality System.

  10. Body Space in Social Interactions: A Comparison of Reaching and Comfort Distance in Immersive Virtual Reality

    PubMed Central

    Iachini, Tina; Coello, Yann; Frassinetti, Francesca; Ruggiero, Gennaro

    2014-01-01

    Background Do peripersonal space for acting on objects and interpersonal space for interacting with conspecifics share common mechanisms and reflect the social valence of stimuli? To answer this question, we investigated whether these spaces refer to a similar or different physical distance. Methodology Participants provided reachability-distance (for potential action) and comfort-distance (for social processing) judgments towards human and non-human virtual stimuli while standing still (passive) or walking toward stimuli (active). Principal Findings Comfort-distance was larger than reachability-distance when participants were passive, but the two distances were similar when participants were active. Both spaces were modulated by the social valence of stimuli (reduction with virtual females vs. males, expansion with cylinder vs. robot) and by the gender of participants. Conclusions These findings reveal that peripersonal reaching space and interpersonal comfort space share a common motor nature and are sensitive, to different degrees, to social modulation. Therefore, social processing seems embodied and grounded in the body acting in space. PMID:25405344

  11. Fish in the matrix: motor learning in a virtual world.

    PubMed

    Engert, Florian

    2012-01-01

    One of the large remaining challenges in the field of zebrafish neuroscience is the establishment of techniques and preparations that permit the recording and perturbation of neural activity in animals that can interact meaningfully with the environment. Since it is very difficult to do this in freely behaving zebrafish, I describe here two alternative approaches that meet this goal via tethered preparations. The first uses head-fixation in agarose in combination with online imaging and analysis of tail motion. In the second method, paralyzed fish are suspended with suction pipettes in mid-water and nerve root recordings serve as indicators for intended locomotion. In both cases, fish can be immersed into a virtual environment and allowed to interact with this virtual world via real or fictive tail motions. The specific examples given in this review focus primarily on the role of visual feedback, but the general principles certainly extend to other modalities, including proprioception, hearing, balance, and somatosensation.
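
    The closed-loop logic shared by both preparations, in which real or fictive tail motion drives the visual world the fish sees, can be sketched in a few lines. The gain, drift, and vigor values below are illustrative assumptions for exposition, not parameters from the review.

```python
# Minimal sketch of closed-loop visual feedback for a tethered fish:
# measured swim vigor (from tail tracking or nerve-root recordings)
# counteracts a forward drift of the virtual world. All numeric values
# are illustrative, not taken from the review.

def simulate_closed_loop(vigor_trace, gain=1.0, drift=1.0, dt=0.01):
    """Integrate virtual position: forward drift minus gain * swim vigor.

    A gain near 1 mimics natural reafference; experimentally altering the
    gain changes the visual consequence of swimming, which is the kind of
    manipulation used to probe motor learning.
    """
    position = 0.0
    trajectory = []
    for vigor in vigor_trace:
        velocity = drift - gain * vigor  # net visual flow the fish sees
        position += velocity * dt
        trajectory.append(position)
    return trajectory
```

    With zero vigor the world drifts forward unopposed; with vigor matching the drift at unity gain, the virtual position holds still, i.e. the fish "keeps up" with the scene.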

  12. Fish in the matrix: motor learning in a virtual world

    PubMed Central

    Engert, Florian

    2013-01-01

    One of the large remaining challenges in the field of zebrafish neuroscience is the establishment of techniques and preparations that permit the recording and perturbation of neural activity in animals that can interact meaningfully with the environment. Since it is very difficult to do this in freely behaving zebrafish, I describe here two alternative approaches that meet this goal via tethered preparations. The first uses head-fixation in agarose in combination with online imaging and analysis of tail motion. In the second method, paralyzed fish are suspended with suction pipettes in mid-water and nerve root recordings serve as indicators for intended locomotion. In both cases, fish can be immersed into a virtual environment and allowed to interact with this virtual world via real or fictive tail motions. The specific examples given in this review focus primarily on the role of visual feedback, but the general principles certainly extend to other modalities, including proprioception, hearing, balance, and somatosensation. PMID:23355810

  13. A serious gaming/immersion environment to teach clinical cancer genetics.

    PubMed

    Nosek, Thomas M; Cohen, Mark; Matthews, Anne; Papp, Klara; Wolf, Nancy; Wrenn, Gregg; Sher, Andrew; Coulter, Kenneth; Martin, Jessica; Wiesner, Georgia L

    2007-01-01

    We are creating an interactive, simulated "Cancer Genetics Tower" for the self-paced learning of Clinical Cancer Genetics by medical students (go to: http://casemed.case.edu/cancergenetics). The environment uses gaming theory to engage students in achieving specific learning objectives. The first few levels contain virtual laboratories where students acquire the basic underpinnings of Cancer Genetics. The next levels apply these principles to clinical practice. A virtual attending physician and four virtual patients, available for questioning through virtual video conferencing, enrich each floor. The pinnacle clinical simulation challenges the learner to integrate all information and demonstrate mastery, thus "winning" the game. A pilot test of the program by 17 medical students yielded very favorable feedback; the students found the Tower a "great way to teach", it held their attention, and it made learning fun. A majority of the students preferred the Tower over other resources for learning Cancer Genetics.

  14. A Virtual Reality Simulator Prototype for Learning and Assessing Phaco-sculpting Skills

    NASA Astrophysics Data System (ADS)

    Choi, Kup-Sze

    This paper presents a virtual reality based simulator prototype for learning phacoemulsification in cataract surgery, with a focus on the skills required for making a cross-shaped trench in a cataractous lens with an ultrasound probe during the phaco-sculpting procedure. An immersive virtual environment is created with 3D models of the lens and surgical tools. A haptic device is used as the 3D user interface. Phaco-sculpting is simulated by interactively deleting the constituent tetrahedra of the lens model. Collisions between the virtual probe and the lens are identified efficiently by partitioning the space containing the lens hierarchically with an octree. The simulator can be programmed to collect real-time quantitative user data for reviewing and assessing a trainee's performance in an objective manner. A game-based learning environment can be created on top of the simulator by incorporating gaming elements based on the quantifiable performance metrics.
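
    The hierarchical space partitioning described in this abstract can be sketched with a point-based octree. The node capacity, depth limit, and the reduction of lens tetrahedra to centroid points are simplifying assumptions for illustration, not the prototype's actual data structures.

```python
# Octree sketch for probe-lens collision queries (illustrative: tetrahedra
# are represented by centroid points, and the probe tip by a sphere).

class Octree:
    def __init__(self, center, half, depth=0, max_depth=4, capacity=8):
        self.center, self.half = center, half
        self.depth, self.max_depth, self.capacity = depth, max_depth, capacity
        self.points = []      # tetrahedron centroids stored at this node
        self.children = None  # eight sub-octants once subdivided

    def _octant(self, p):
        cx, cy, cz = self.center
        return (p[0] > cx) | ((p[1] > cy) << 1) | ((p[2] > cz) << 2)

    def insert(self, p):
        if self.children is None:
            self.points.append(p)
            if len(self.points) > self.capacity and self.depth < self.max_depth:
                self._subdivide()
        else:
            self.children[self._octant(p)].insert(p)

    def _subdivide(self):
        h = self.half / 2
        cx, cy, cz = self.center
        self.children = [
            Octree((cx + (h if i & 1 else -h),
                    cy + (h if i & 2 else -h),
                    cz + (h if i & 4 else -h)),
                   h, self.depth + 1, self.max_depth, self.capacity)
            for i in range(8)
        ]
        for p in self.points:
            self.children[self._octant(p)].insert(p)
        self.points = []

    def query_sphere(self, c, r):
        """Return stored points within radius r of probe tip c."""
        # Conservative prune: skip octants the probe sphere cannot reach.
        if any(abs(c[i] - self.center[i]) > self.half + r for i in range(3)):
            return []
        hits = [p for p in self.points
                if sum((p[i] - c[i]) ** 2 for i in range(3)) <= r * r]
        if self.children:
            for ch in self.children:
                hits.extend(ch.query_sphere(c, r))
        return hits
```

    Per frame, the probe tip is queried against the tree and only the returned candidates need exact probe-tetrahedron tests before deletion, instead of testing every tetrahedron in the lens mesh.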

  15. CROSS DRIVE: A New Interactive and Immersive Approach for Exploring 3D Time-Dependent Mars Atmospheric Data in Distributed Teams

    NASA Astrophysics Data System (ADS)

    Gerndt, Andreas M.; Engelke, Wito; Giuranna, Marco; Vandaele, Ann C.; Neary, Lori; Aoki, Shohei; Kasaba, Yasumasa; Garcia, Arturo; Fernando, Terrence; Roberts, David; CROSS DRIVE Team

    2016-10-01

    Atmospheric phenomena on Mars can be highly dynamic and show daily and seasonal variations. Planetary-scale wavelike disturbances, for example, are frequently observed in Mars' polar winter atmosphere. Suggested sources of the wave activity include dynamical instabilities and quasi-stationary planetary waves, i.e. waves that arise predominantly via zonally asymmetric surface properties. For a comprehensive understanding of these phenomena, single altitude layers have to be analyzed carefully, and relations between different atmospheric quantities and interactions with the surface of Mars have to be considered. The CROSS DRIVE project addresses the presentation of these data in a global view by means of virtual reality techniques. Complex orbiter spectrometer data and observation data from Earth are combined with global circulation models and high-resolution terrain data and images available from Mars Express or MRO instruments. Scientists can interactively extract features from these datasets and change visualization parameters in real time to emphasize findings. Stereoscopic views allow for perception of the actual 3D behavior of Mars' atmosphere. A very important feature of the visualization system is the possibility to connect distributed workspaces, which enables discussions between distributed working groups. The workspace can scale from virtual reality systems to expert desktop applications to web-based project portals. If multiple virtual environments are connected, the 3D position of each individual user is captured and used to depict the scientist as an avatar in the virtual world. The appearance of the avatar can also scale, from simple annotations to complex avatars using tele-presence technology to reconstruct the users in 3D. Any change to the feature set (annotations, cutplanes, volume rendering, etc.) within the VR is immediately exchanged between all connected users, so that everybody is always aware of what is visible and under discussion. The discussion is supported by audio, and interaction is controlled by a moderator managing turn-taking presentations. A use-case execution proved successful and showed the potential of this immersive approach.

  16. A randomized, controlled trial of immersive virtual reality analgesia, during physical therapy for pediatric burns.

    PubMed

    Schmitt, Yuko S; Hoffman, Hunter G; Blough, David K; Patterson, David R; Jensen, Mark P; Soltani, Maryam; Carrougher, Gretchen J; Nakamura, Dana; Sharar, Sam R

    2011-02-01

    This randomized, controlled, within-subjects (crossover design) study examined the effects of immersive virtual reality as an adjunctive analgesic technique for hospitalized pediatric burn inpatients undergoing painful physical therapy. Fifty-four subjects (6-19 years old) performed range-of-motion exercises under a therapist's direction for 1-5 days. During each session, subjects spent equivalent time in both the virtual reality and the control conditions (treatment order randomized and counterbalanced). Graphic rating scale scores assessing the sensory, affective, and cognitive components of pain were obtained for each treatment condition. Secondary outcomes assessed subjects' perception of the virtual reality experience and maximum range-of-motion. Results showed that on study day one, subjects reported significant decreases (27-44%) in pain ratings during virtual reality. They also reported improved affect ("fun") during virtual reality. The analgesia and affect improvements were maintained with repeated virtual reality use over multiple therapy sessions. Maximum range-of-motion was not different between treatment conditions, but was significantly greater after the second treatment condition (regardless of treatment order). These results suggest that immersive virtual reality is an effective nonpharmacologic, adjunctive pain reduction technique in the pediatric burn population undergoing painful rehabilitation therapy. The magnitude of the analgesic effect is clinically meaningful and is maintained with repeated use. Copyright © 2010 Elsevier Ltd and ISBI. All rights reserved.

  17. A Randomized, Controlled Trial of Immersive Virtual Reality Analgesia during Physical Therapy for Pediatric Burn Injuries

    PubMed Central

    Schmitt, Yuko S.; Hoffman, Hunter G.; Blough, David K.; Patterson, David R.; Jensen, Mark P.; Soltani, Maryam; Carrougher, Gretchen J.; Nakamura, Dana; Sharar, Sam R.

    2010-01-01

    This randomized, controlled, within-subjects (crossover design) study examined the effects of immersive virtual reality as an adjunctive analgesic technique for hospitalized pediatric burn inpatients undergoing painful physical therapy. Fifty-four subjects (6–19 years old) performed range-of-motion exercises under a therapist’s direction for one to five days. During each session, subjects spent equivalent time in both the virtual reality and the control conditions (treatment order randomized and counterbalanced). Graphic rating scale scores assessing the sensory, affective, and cognitive components of pain were obtained for each treatment condition. Secondary outcomes assessed subjects’ perception of the virtual reality experience and maximum range-of-motion. Results showed that on study day one, subjects reported significant decreases (27–44%) in pain ratings during virtual reality. They also reported improved affect (“fun”) during virtual reality. The analgesia and affect improvements were maintained with repeated virtual reality use over multiple therapy sessions. Maximum range-of-motion was not different between treatment conditions, but was significantly greater after the second treatment condition (regardless of treatment order). These results suggest that immersive virtual reality is an effective nonpharmacologic, adjunctive pain reduction technique in the pediatric burn population undergoing painful rehabilitation therapy. The magnitude of the analgesic effect is clinically meaningful and is maintained with repeated use. PMID:20692769

  18. Immersive Virtual Reality to Improve Walking Abilities in Cerebral Palsy: A Pilot Study.

    PubMed

    Gagliardi, Chiara; Turconi, Anna Carla; Biffi, Emilia; Maghini, Cristina; Marelli, Alessia; Cesareo, Ambra; Diella, Eleonora; Panzeri, Daniele

    2018-04-27

    Immersive virtual reality (IVR) offers new possibilities for performing treatments in an ecological and interactive environment with multimodal online feedback. Sixteen school-aged children (mean age 11 ± 2.4 years) with bilateral CP (diplegia), attending mainstream schools, were recruited for a pilot study in a pre-post treatment experimental design. The intervention focused on walking competence and endurance and was performed with the Gait Real-time Analysis Interactive Lab (GRAIL), an innovative treadmill platform based on IVR. The participants underwent eighteen therapy sessions over 4 weeks. Functional evaluations, instrumental measures including gait analysis, and a parental questionnaire were used to assess the treatment effects. Walking pattern (stride length, left and right side, p = 0.001 and 0.003 respectively; walking speed, p = 0.001), endurance (6MWT, p = 0.026), gross motor abilities (GMFM-88, p = 0.041), and most kinematic and kinetic parameters significantly improved after the intervention. The changes were mainly predicted by age and cognitive abilities. The effect may be due to the capacity of IVR to foster integration of motor and perceptual competences beyond training of the walking ability itself, offering a chance of improvement even to older and already-treated children.

  19. Using immersive media and digital technology to communicate Earth Science

    NASA Astrophysics Data System (ADS)

    Kapur, Ravi

    2016-04-01

    A number of technologies in digital media and interactivity have advanced rapidly and are now converging to enable rich, multisensory experiences, creating opportunities for both digital art and science communication. Techniques used in full-dome film-making can now be deployed in virtual reality experiences; gaming technologies can be used to explore real data sets; and collaborative interactivity enables new forms of public artwork. This session will explore these converging trends through a number of emerging and forthcoming projects dealing with Earth science, climate change and planetary science.

  20. Designing the Self: The Transformation of the Relational Self-Concept through Social Encounters in a Virtual Immersive Environment

    ERIC Educational Resources Information Center

    Knutzen, K. Brant; Kennedy, David M.

    2012-01-01

    This article describes the findings of a 3-month study on how social encounters mediated by an online Virtual Immersive Environment (VIE) impacted on the relational self-concept of adolescents. The study gathered data from two groups of students as they took an Introduction to Design and Programming class. Students in group 1 undertook course…

  1. An Australian and New Zealand Scoping Study on the Use of 3D Immersive Virtual Worlds in Higher Education

    ERIC Educational Resources Information Center

    Dalgarno, Barney; Lee, Mark J. W.; Carlson, Lauren; Gregory, Sue; Tynan, Belinda

    2011-01-01

    This article describes the research design of, and reports selected findings from, a scoping study aimed at examining current and planned applications of 3D immersive virtual worlds at higher education institutions across Australia and New Zealand. The scoping study is the first of its kind in the region, intended to parallel and complement a…

  2. CAVE2: a hybrid reality environment for immersive simulation and information analysis

    NASA Astrophysics Data System (ADS)

    Febretti, Alessandro; Nishimoto, Arthur; Thigpen, Terrance; Talandis, Jonas; Long, Lance; Pirtle, J. D.; Peterka, Tom; Verlo, Alan; Brown, Maxine; Plepys, Dana; Sandin, Dan; Renambot, Luc; Johnson, Andrew; Leigh, Jason

    2013-03-01

    Hybrid Reality Environments represent a new kind of visualization space that blurs the line between virtual environments and high-resolution tiled display walls. This paper outlines the design and implementation of the CAVE2™ Hybrid Reality Environment. CAVE2 is the world's first near-seamless flat-panel-based, surround-screen immersive system. Unique to CAVE2 is that it enables users to view 2D and 3D information simultaneously, providing more flexibility for mixed-media applications. CAVE2 is a cylindrical system 24 feet in diameter and 8 feet tall, and consists of 72 near-seamless, off-axis-optimized passive stereo LCD panels, creating an approximately 320-degree panoramic environment for displaying information at 37 megapixels (in stereoscopic 3D) or 74 megapixels in 2D, at a horizontal visual acuity of 20/20. Custom LCD panels with shifted polarizers were built so that the images in the top and bottom rows of LCDs are optimized for vertical off-center viewing, allowing viewers to come closer to the displays while minimizing ghosting. CAVE2 is designed to support multiple operating modes. In the fully immersive mode, the entire room can be dedicated to one virtual simulation. In 2D mode, the room can operate like a traditional tiled display wall, enabling users to work with large numbers of documents at the same time. In the hybrid mode, a mixture of 2D and 3D applications can be supported simultaneously. The ability to treat immersive work spaces in this hybrid way has never been achieved before, and it leverages the special abilities of CAVE2 to enable researchers to interact seamlessly with large collections of 2D and 3D data. To realize this hybrid ability, we merged the Scalable Adaptive Graphics Environment (SAGE), a system for supporting 2D tiled displays, with Omegalib, a virtual reality middleware supporting OpenGL, OpenSceneGraph and VTK applications.
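
    The quoted pixel counts can be sanity-checked with simple arithmetic, assuming, hypothetically, that each of the 72 panels is a common 1366 × 768 LCD; the abstract does not state the per-panel resolution.

```python
# Sanity check of CAVE2's quoted pixel counts. The per-panel resolution
# (1366 x 768) is an assumption for illustration, not from the abstract.

PANELS = 72
W, H = 1366, 768

total_2d = PANELS * W * H     # all physical pixels usable in 2D mode
total_stereo = total_2d // 2  # passive stereo halves per-eye resolution

print(f"2D: {total_2d / 1e6:.1f} Mpx, stereo: {total_stereo / 1e6:.1f} Mpx")
```

    This yields about 75.5 megapixels in 2D and 37.8 megapixels per eye in passive stereo, in the same ballpark as the quoted 74 and 37 megapixel figures; bezel masking or display-driving constraints would presumably account for the small difference.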

  3. [A new concept in digestive surgery: the computer assisted surgical procedure, from virtual reality to telemanipulation].

    PubMed

    Marescaux, J; Clément, J M; Nord, M; Russier, Y; Tassetti, V; Mutter, D; Cotin, S; Ayache, N

    1997-11-01

    Surgical simulation increasingly appears to be an essential aspect of tomorrow's surgery. The development of a hepatic surgery simulator is an advanced concept calling for a new writing system that will transform the medical world: virtual reality. Virtual reality extends the perception of our five senses by representing more than the real state of things by means of computer science and robotics. It consists of three concepts: immersion, navigation and interaction. Three reasons have led us to develop this simulator. The first is to provide the surgeon with a comprehensive visualisation of the organ. The second is to allow for planning and surgical simulation that can be compared with the detailed flight plan of a commercial jet pilot. The third lies in the fact that virtual reality is an integral part of the concept of the computer-assisted surgical procedure. The project consists of a sophisticated simulator which has to meet five requirements: visual fidelity, interactivity, physical properties, physiological properties, and sensory input and output. In this report we describe how to obtain a realistic 3D model of the liver from two-dimensional (2D) medical images for anatomical and surgical training. The introduction of a tumor and the consequent planning and virtual resection are also described, as are force feedback and real-time interaction.

  4. Medical Student Bias and Care Recommendations for an Obese versus Non-Obese Virtual Patient

    PubMed Central

    Persky, Susan; Eccleston, Collette P.

    2010-01-01

    Objective This study examined the independent effect of a patient's weight on medical students' attitudes, beliefs, and interpersonal behavior toward the patient, in addition to the clinical recommendations they make for her care. Design Seventy-six clinical-level medical students were randomly assigned to interact with a digital, virtual female patient who was visibly either obese or non-obese. Methods Interactions with the patient took place in an immersive virtual clinical environment (i.e., virtual reality) which allowed standardization of all patient behaviors and characteristics except for weight. Visual contact behavior was automatically recorded during the interaction. Afterward, participants filled out a battery of self-report questionnaires. Results Analyses revealed more negative stereotyping, less anticipated patient adherence, worse perceived health, more responsibility attributed for potentially weight-related presenting complaints, and less visual contact directed toward the obese version of a virtual patient than the non-obese version of the patient. In contrast, there was no clear evidence of bias in clinical recommendations made for the patient's care. Conclusion Biases in attitudes, beliefs, and interpersonal behavior have important implications because they can influence the tone of clinical encounters and rapport in the patient-provider relationship, which can have important downstream consequences. Gaining a clear understanding of the nature and source of weight bias in the clinical encounter is an important first step toward development of strategies to address it. PMID:20820169

  5. Immersive Technologies and Language Learning

    ERIC Educational Resources Information Center

    Blyth, Carl

    2018-01-01

    This article briefly traces the historical conceptualization of linguistic and cultural immersion through technological applications, from the early days of locally networked computers to the cutting-edge technologies known as virtual reality and augmented reality. Next, the article explores the challenges of immersive technologies for the field…

  6. Validation of virtual reality as a tool to understand and prevent child pedestrian injury.

    PubMed

    Schwebel, David C; Gaines, Joanna; Severson, Joan

    2008-07-01

    In recent years, virtual reality has emerged as an innovative tool for health-related education and training. Among the many benefits of virtual reality is the opportunity for novice users to engage unsupervised in a safe environment when the real environment might be dangerous. Virtual environments are only useful for health-related research, however, if behavior in the virtual world validly matches behavior in the real world. This study was designed to test the validity of an immersive, interactive virtual pedestrian environment. A sample of 102 children and 74 adults was recruited to complete simulated road-crossings in both the virtual environment and the identical real environment. In both the child and adult samples, construct validity was demonstrated via significant correlations between behavior in the virtual and real worlds. Results also indicate construct validity through developmental differences in behavior; convergent validity by showing correlations between parent-reported child temperament and behavior in the virtual world; internal reliability of various measures of pedestrian safety in the virtual world; and face validity, as measured by users' self-reported perception of realism in the virtual world. We discuss issues of generalizability to other virtual environments, and the implications for application of virtual reality to understanding and preventing pediatric pedestrian injuries.

  7. INCREASING SAVING BEHAVIOR THROUGH AGE-PROGRESSED RENDERINGS OF THE FUTURE SELF.

    PubMed

    Hershfield, Hal E; Goldstein, Daniel G; Sharpe, William F; Fox, Jesse; Yeykelis, Leo; Carstensen, Laura L; Bailenson, Jeremy N

    2011-11-01

    Many people fail to save what they need for retirement (Munnell, Webb, and Golub-Sass 2009). Research on excessive discounting of the future suggests that removing the lure of immediate rewards by pre-committing to decisions, or elaborating the value of future rewards, can make decisions more future-oriented. In this article, we explore a third and complementary route, one that deals not with present and future rewards, but with present and future selves. In line with thinkers who have suggested that people may fail, through a lack of belief or imagination, to identify with their future selves (Parfit 1971; Schelling 1984), we propose that allowing people to interact with age-progressed renderings of themselves will cause them to allocate more resources toward the future. In four studies, participants interacted with realistic computer renderings of their future selves using immersive virtual reality hardware and interactive decision aids. In all cases, those who interacted with virtual future selves exhibited an increased tendency to accept later monetary rewards over immediate ones.

  8. Armagh Observatory - Historic Building Information Modelling for Virtual Learning in Building Conservation

    NASA Astrophysics Data System (ADS)

    Murphy, M.; Chenaux, A.; Keenaghan, G.; Gibson, V.; Butler, J.; Pybus, C.

    2017-08-01

    In this paper the recording and design of a Virtual Reality Immersive Model of Armagh Observatory is presented, which will replicate the historic buildings and landscape, with distant meridian markers and the positions of its principal historic instruments, within a model of the night sky showing the positions of bright stars. The virtual reality model can be used for educational purposes, allowing the instruments within the historic building model to be manipulated in 3D space to demonstrate how position measurements of stars were made in the 18th century. A description is given of current student and researcher activities concerning on-site recording and surveying and the virtual modelling of the buildings and landscape. This is followed by a design for a Virtual Reality Immersive Model of Armagh Observatory using a game engine and virtual learning platforms and concepts.

  9. An Analysis of VR Technology Used in Immersive Simulations with a Serious Game Perspective.

    PubMed

    Menin, Aline; Torchelsen, Rafael; Nedel, Luciana

    2018-03-01

    Using virtual environments (VEs) is a safer and more cost-effective alternative to executing dangerous tasks, such as training firefighters and industrial operators. Immersive virtual reality (VR) combined with game aspects has the potential to improve the user experience in the VE by increasing realism, engagement, and motivation. This article investigates the impact of VR technology in 46 immersive gamified simulations with serious purposes and classifies them under a taxonomy. Our findings suggest that immersive VR improves simulation outcomes, such as increasing learning gain and knowledge retention and improving clinical outcomes for rehabilitation. However, it also has limitations, such as motion sickness and restricted access to VR hardware. Our contributions are to provide a better understanding of the benefits and limitations of using VR in immersive simulations with serious purposes, to propose a taxonomy that classifies them, and to discuss whether methods and participant profiles influence results.

  10. How and Why Affective and Reactive Virtual Agents Will Bring New Insights on Social Cognitive Disorders in Schizophrenia? An Illustration with a Virtual Card Game Paradigm

    PubMed Central

    Oker, Ali; Prigent, Elise; Courgeon, Matthieu; Eyharabide, Victoria; Urbach, Mathieu; Bazin, Nadine; Amorim, Michel-Ange; Passerieux, Christine; Martin, Jean-Claude; Brunet-Gouet, Eric

    2015-01-01

    In recent decades, many studies have shown that schizophrenia is associated with severe social cognitive impairments affecting key components, such as the recognition of emotions, theory of mind, attributional style, and metacognition. Most studies investigated each construct separately, precluding analysis of the interactive and immersive nature of real-life situations. Specialized batteries of tests are under investigation to assess social cognition, which is now seen as a link between neurocognitive disorders and impaired functioning. However, this link accounts for a limited part of the variance in real-life functioning. To fill this gap, advances in virtual reality and affective computing have made it possible to carry out experimental investigations of naturalistic social cognition, in controlled conditions, with good reproducibility. This approach is illustrated with the description of a new paradigm based on an original virtual card game in which subjects interpret emotional displays from a female virtual agent and decipher her helping intentions. Independent variables concerning emotional expression, in terms of valence and intensity, were manipulated. We show how several useful dependent variables, ranging from classic experimental psychology data to metacognition or subjective experience records, may be extracted from a single experiment. Methodological issues concerning immersion in a simulated intersubjective situation are considered. The example of this new flexible experimental setting, with regard to the many constructs recognized in the social neurosciences, constitutes a rationale for focusing on this potential intermediate link between standardized tests and real-life functioning, and also for using it as an innovative medium for cognitive remediation. PMID:25870549

  11. How and why affective and reactive virtual agents will bring new insights on social cognitive disorders in schizophrenia? An illustration with a virtual card game paradigm.

    PubMed

    Oker, Ali; Prigent, Elise; Courgeon, Matthieu; Eyharabide, Victoria; Urbach, Mathieu; Bazin, Nadine; Amorim, Michel-Ange; Passerieux, Christine; Martin, Jean-Claude; Brunet-Gouet, Eric

    2015-01-01

    In recent decades, many studies have shown that schizophrenia is associated with severe social cognitive impairments affecting key components, such as the recognition of emotions, theory of mind, attributional style, and metacognition. Most studies investigated each construct separately, precluding analysis of the interactive and immersive nature of real-life situations. Specialized batteries of tests are under investigation to assess social cognition, which is now seen as a link between neurocognitive disorders and impaired functioning. However, this link accounts for a limited part of the variance in real-life functioning. To fill this gap, advances in virtual reality and affective computing have made it possible to carry out experimental investigations of naturalistic social cognition, in controlled conditions, with good reproducibility. This approach is illustrated with the description of a new paradigm based on an original virtual card game in which subjects interpret emotional displays from a female virtual agent and decipher her helping intentions. Independent variables concerning emotional expression, in terms of valence and intensity, were manipulated. We show how several useful dependent variables, ranging from classic experimental psychology data to metacognition or subjective experience records, may be extracted from a single experiment. Methodological issues concerning immersion in a simulated intersubjective situation are considered. The example of this new flexible experimental setting, with regard to the many constructs recognized in the social neurosciences, constitutes a rationale for focusing on this potential intermediate link between standardized tests and real-life functioning, and also for using it as an innovative medium for cognitive remediation.

  12. Walking in fully immersive virtual environments: an evaluation of potential adverse effects in older adults and individuals with Parkinson's disease.

    PubMed

    Kim, Aram; Darakjian, Nora; Finley, James M

    2017-02-21

    Virtual reality (VR) has recently been explored as a tool for neurorehabilitation to enable individuals with Parkinson's disease (PD) to practice challenging skills in a safe environment. Current technological advances have enabled the use of affordable, fully immersive head-mounted displays (HMDs) for potential therapeutic applications. However, while previous studies have used HMDs in individuals with PD, these were only used for short bouts of walking. Clinical applications of VR for gait training would likely involve an extended exposure to the virtual environment, which has the potential to cause individuals with PD to experience simulator-related adverse effects due to their age or pathology. Thus, our objective was to evaluate the safety of using an HMD for longer bouts of walking in fully immersive VR for older adults and individuals with PD. Thirty-three participants (11 healthy young adults, 11 healthy older adults, and 11 individuals with PD) were recruited for this study. Participants walked for 20 min while viewing a virtual city scene through an HMD (Oculus Rift DK2). Safety was evaluated using the mini-BESTest, measures of center of pressure (CoP) excursion, and questionnaires addressing symptoms of simulator sickness (SSQ) and measures of stress and arousal. Most participants successfully completed all trials without any discomfort. There were no significant changes for any of our groups in symptoms of simulator sickness or measures of static and dynamic balance after exposure to the virtual environment. Surprisingly, measures of stress decreased in all groups, while arousal increased in the PD group after exposure. Older adults and individuals with PD were able to successfully use immersive VR during walking without adverse effects. This provides systematic evidence supporting the safety of immersive VR for gait training in these populations.

  13. A comparison of older adults' subjective experience with virtual and real environments during dynamic balance activities

    PubMed Central

    Proffitt, Rachel; Lange, Belinda; Chen, Christina; Winstein, Carolee

    2014-01-01

    The purpose of this study was to explore the subjective experience of older adults interacting with both virtual and real environments. Thirty healthy older adults engaged with real and virtual tasks of similar motor demands: reaching to a target in standing and stepping stance. Immersive tendencies and absorption scales were administered before the session. Game engagement and experience questionnaires were completed after each task, followed by a semi-structured interview at the end of the testing session. Data were analyzed respectively using paired t-tests and grounded theory methodology. Participants preferred the virtual task over the real task. They also reported an increase in presence and absorption with the virtual task, describing an external focus of attention. Findings will be used to inform future development of appropriate game-based balance training applications that could be embedded in the home or community settings as part of evidence-based fall prevention programs. PMID:24334299

  14. Evaluating display fidelity and interaction fidelity in a virtual reality game.

    PubMed

    McMahan, Ryan P; Bowman, Doug A; Zielinski, David J; Brady, Rachael B

    2012-04-01

    In recent years, consumers have witnessed a technological revolution that has delivered more-realistic experiences in their own homes through high-definition, stereoscopic televisions and natural, gesture-based video game consoles. Although these experiences are more realistic, offering higher levels of fidelity, it is not clear how the increased display and interaction aspects of fidelity impact the user experience. Since immersive virtual reality (VR) allows us to achieve very high levels of fidelity, we designed and conducted a study that used a six-sided CAVE to evaluate display fidelity and interaction fidelity independently, at extremely high and low levels, for a VR first-person shooter (FPS) game. Our goal was to gain a better understanding of the effects of fidelity on the user in a complex, performance-intensive context. The results of our study indicate that both display and interaction fidelity significantly affect strategy and performance, as well as subjective judgments of presence, engagement, and usability. In particular, performance results were strongly in favor of two conditions: low-display, low-interaction fidelity (representative of traditional FPS games) and high-display, high-interaction fidelity (similar to the real world).

  15. A virtual experimenter to increase standardization for the investigation of placebo effects.

    PubMed

    Horing, Bjoern; Newsome, Nathan D; Enck, Paul; Babu, Sabarish V; Muth, Eric R

    2016-07-18

    Placebo effects are mediated by expectancy, which is highly influenced by psychosocial factors of a treatment context. These factors are difficult to standardize. Furthermore, dedicated placebo research often necessitates single-blind deceptive designs where biases are easily introduced. We propose a study protocol employing a virtual experimenter - a computer program designed to deliver treatment and instructions - for the purpose of standardization and reduction of biases when investigating placebo effects. To evaluate the virtual experimenter's efficacy in inducing placebo effects via expectancy manipulation, we suggest a partially blinded, deceptive design with a baseline/retest pain protocol (hand immersions in hot water bath). Between immersions, participants will receive an (actually inert) medication. Instructions pertaining to the medication will be delivered by one of three metaphors: The virtual experimenter, a human experimenter, and an audio/text presentation (predictor "Metaphor"). The second predictor includes falsely informing participants that the medication is an effective pain killer, or correctly informing them that it is, in fact, inert (predictor "Instruction"). Analysis will be performed with hierarchical linear modelling, with a sample size of N = 50. Results from two pilot studies are presented that indicate the viability of the pain protocol (N = 33), and of the virtual experimenter software and placebo manipulation (N = 48). It will be challenging to establish full comparability between all metaphors used for instruction delivery, and to account for participant differences in acceptance of their virtual interaction partner. Once established, the presence of placebo effects would suggest that the virtual experimenter exhibits sufficient cues to be perceived as a social agent. 
It could consequently provide a convenient platform to investigate effects of experimenter behavior, or other experimenter characteristics, e.g., sex, age, race/ethnicity, or professional status. More general applications are possible, for example in psychological research (such as bias research) or in virtual reality research. Potential applications also exist for standardizing clinical research by documenting and communicating instructions used in clinical trials.
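The 3 x 2 between-subjects design described above (predictor "Metaphor" with three levels, predictor "Instruction" with two, N = 50) can be sketched as a balanced random assignment of participants to cells. The condition names below are illustrative assumptions, not labels taken from the protocol:

```python
import itertools
import random

# Illustrative cell labels for the two predictors; names are assumptions,
# not taken from the study protocol.
metaphors = ["virtual_experimenter", "human_experimenter", "audio_text"]
instructions = ["told_painkiller", "told_inert"]
conditions = list(itertools.product(metaphors, instructions))  # 6 cells

def assign(n_participants: int, seed: int = 0):
    """Balanced random assignment of participants to the 6 design cells."""
    rng = random.Random(seed)
    # Repeat the cell list enough times to cover N, truncate, then shuffle
    # so cell counts differ by at most one.
    slots = (conditions * ((n_participants // len(conditions)) + 1))[:n_participants]
    rng.shuffle(slots)
    return slots

assignment = assign(50)
print(len(assignment), len(set(assignment)))  # 50 participants across 6 distinct cells
```

With N = 50 and 6 cells, each cell receives 8 or 9 participants; the hierarchical linear model named in the abstract would then be fit on the resulting grouped data.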

  16. Developing effective serious games: the effect of background sound on visual fidelity perception with varying texture resolution.

    PubMed

    Rojas, David; Kapralos, Bill; Cristancho, Sayra; Collins, Karen; Hogue, Andrew; Conati, Cristina; Dubrowski, Adam

    2012-01-01

    Despite the benefits associated with virtual learning environments and serious games, there are open, fundamental issues regarding simulation fidelity and multi-modal cue interaction and their effect on immersion, transfer of knowledge, and retention. Here we describe the results of a study that examined the effect of ambient (background) sound on the perception of visual fidelity (defined with respect to texture resolution). Results suggest that the perception of visual fidelity is dependent on ambient sound and, more specifically, that white noise can have detrimental effects on our perception of high-quality visuals. The results of this study will guide future studies that will ultimately aid in developing an understanding of the role that fidelity and multi-modal interactions play with respect to knowledge transfer and retention for users of virtual simulations and serious games.

  17. Embodying compassion: a virtual reality paradigm for overcoming excessive self-criticism.

    PubMed

    Falconer, Caroline J; Slater, Mel; Rovira, Aitor; King, John A; Gilbert, Paul; Antley, Angus; Brewin, Chris R

    2014-01-01

    Virtual reality has been successfully used to study and treat psychological disorders such as phobias and posttraumatic stress disorder but has rarely been applied to clinically-relevant emotions other than fear and anxiety. Self-criticism is a ubiquitous feature of psychopathology and can be treated by increasing levels of self-compassion. We exploited the known effects of identification with a virtual body to arrange for healthy female volunteers high in self-criticism to experience self-compassion from an embodied first-person perspective within immersive virtual reality. Whereas observation and practice of compassionate responses reduced self-criticism, the additional experience of embodiment also increased self-compassion and feelings of being safe. The results suggest potential new uses for immersive virtual reality in a range of clinical conditions.

  18. Embodying Compassion: A Virtual Reality Paradigm for Overcoming Excessive Self-Criticism

    PubMed Central

    Falconer, Caroline J.; Slater, Mel; Rovira, Aitor; King, John A.; Gilbert, Paul; Antley, Angus; Brewin, Chris R.

    2014-01-01

    Virtual reality has been successfully used to study and treat psychological disorders such as phobias and posttraumatic stress disorder but has rarely been applied to clinically-relevant emotions other than fear and anxiety. Self-criticism is a ubiquitous feature of psychopathology and can be treated by increasing levels of self-compassion. We exploited the known effects of identification with a virtual body to arrange for healthy female volunteers high in self-criticism to experience self-compassion from an embodied first-person perspective within immersive virtual reality. Whereas observation and practice of compassionate responses reduced self-criticism, the additional experience of embodiment also increased self-compassion and feelings of being safe. The results suggest potential new uses for immersive virtual reality in a range of clinical conditions. PMID:25389766

  19. Virtually numbed: immersive video gaming alters real-life experience.

    PubMed

    Weger, Ulrich W; Loughnan, Stephen

    2014-04-01

    As actors in a highly mechanized environment, we are citizens of a world populated not only by fellow humans, but also by virtual characters (avatars). Does immersive video gaming, during which the player takes on the mantle of an avatar, prompt people to adopt the coldness and rigidity associated with robotic behavior and desensitize them to real-life experience? In one study, we correlated participants' reported video-gaming behavior with their emotional rigidity (as indicated by the number of paperclips that they removed from ice-cold water). In a second experiment, we manipulated immersive and nonimmersive gaming behavior and then likewise measured the extent of the participants' emotional rigidity. Both studies yielded reliable effects, suggesting that immersion in a robotic viewpoint desensitizes people to real-life experiences in themselves and others.

  20. Knowledge Acquisition and Job Training for Advanced Technical Skills Using Immersive Virtual Environment

    NASA Astrophysics Data System (ADS)

    Watanuki, Keiichi; Kojima, Kazuyuki

    The environment in which Japanese industry has achieved great respect is changing tremendously due to the globalization of world economies, while Asian countries are undergoing economic and technical development as well as benefiting from the advances in information technology. For example, in the design of custom-made casting products, a designer who lacks knowledge of casting may not be able to produce a good design. In order to obtain a good design and manufacturing result, it is necessary to equip the designer and manufacturer with a support system related to casting design, or a so-called knowledge transfer and creation system. This paper proposes a new virtual reality based knowledge acquisition and job training system for casting design, which is composed of the explicit and tacit knowledge transfer systems using synchronized multimedia and the knowledge internalization system using portable virtual environment. In our proposed system, the education content is displayed in the immersive virtual environment, whereby a trainee may experience work in the virtual site operation. Provided that the trainee has gained explicit and tacit knowledge of casting through the multimedia-based knowledge transfer system, the immersive virtual environment catalyzes the internalization of knowledge and also enables the trainee to gain tacit knowledge before undergoing on-the-job training at a real-time operation site.

  1. A microbased shared virtual world prototype

    NASA Technical Reports Server (NTRS)

    Pitts, Gerald; Robinson, Mark; Strange, Steve

    1993-01-01

    Virtual reality (VR) allows sensory immersion and interaction with a computer-generated environment. The user adopts a physical interface with the computer, through Input/Output devices such as a head-mounted display, data glove, mouse, keyboard, or monitor, to experience an alternate universe. What this means is that the computer generates an environment which, in its ultimate extension, becomes indistinguishable from the real world. 'Imagine a wraparound television with three-dimensional programs, including three-dimensional sound, and solid objects that you can pick up and manipulate, even feel with your fingers and hands.... 'Imagine that you are the creator as well as the consumer of your artificial experience, with the power to use a gesture or word to remold the world you see and hear and feel. That part is not fiction... three-dimensional computer graphics, input/output devices, computer models that constitute a VR system make it possible, today, to immerse yourself in an artificial world and to reach in and reshape it.' Our research's goal was to propose a feasibility experiment in the construction of a networked virtual reality system, making use of current personal computer (PC) technology. The prototype was built using Borland C compiler, running on an IBM 486 33 MHz and a 386 33 MHz. Each game currently is represented as an IPX client on a non-dedicated Novell server. We initially posed the two questions: (1) Is there a need for networked virtual reality? (2) In what ways can the technology be made available to the most people possible?

  2. Investigation of tracking systems properties in CAVE-type virtual reality systems

    NASA Astrophysics Data System (ADS)

    Szymaniak, Magda; Mazikowski, Adam; Meironke, Michał

    2017-08-01

    In recent years, many scientific and industrial centers around the world have developed virtual reality systems or laboratories. One of the most advanced solutions is the Immersive 3D Visualization Lab (I3DVL), a CAVE-type (Cave Automatic Virtual Environment) laboratory. It contains two CAVE-type installations: a six-screen installation arranged in the form of a cube, and a four-screen installation, a simplified version of the former. The user's feeling of "immersion" and interaction with the virtual world depend on many factors, in particular on the accuracy of the user-tracking system. In this paper, properties of the tracking systems applied in I3DVL were investigated. Two parameters were selected for analysis: the accuracy of the tracking system and the range over which the tracking system detects markers within the space of the CAVE. Measurements of system accuracy were performed for the six-screen installation, equipped with four tracking cameras, along three axes: X, Y, and Z. Rotation around the Y axis was also analyzed. The measured tracking system shows good linear and rotational accuracy. The biggest issue was the range over which markers were monitored inside the CAVE: the tracking system lost sight of the markers in the corners of the installation. By comparison, for the simplified version of the CAVE (the four-screen installation), equipped with eight tracking cameras, this problem did not occur. The obtained results will allow for improvement of CAVE quality.
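The kind of accuracy evaluation described above can be sketched as comparing marker positions reported by the tracking system against known positions on a measurement grid. All coordinates and error offsets below are made-up illustrative values, not measurements from I3DVL:

```python
import math

# Hypothetical reference grid inside the CAVE (metres) and the positions
# the tracking system reports for the same markers. Values are illustrative.
reference = [(0.0, 1.5, 0.0), (1.0, 1.5, 0.0), (0.0, 1.5, 1.0), (1.0, 1.5, 1.0)]
reported  = [(0.002, 1.499, 0.003), (0.996, 1.502, 0.001),
             (0.001, 1.503, 0.998), (1.000, 1.498, 1.004)]

def rms_per_axis(ref, rep):
    """Root-mean-square tracking error along X, Y, Z separately."""
    n = len(ref)
    return tuple(
        math.sqrt(sum((rep[i][axis] - ref[i][axis]) ** 2 for i in range(n)) / n)
        for axis in range(3)
    )

def mean_3d_error(ref, rep):
    """Mean Euclidean distance between reported and true marker positions."""
    return sum(math.dist(r, p) for r, p in zip(ref, rep)) / len(ref)

print(rms_per_axis(reference, reported))
print(mean_3d_error(reference, reported))
```

Repeating this at grid points near the corners of the installation would also expose the marker-visibility losses the paper reports.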

  3. Cyber entertainment system using an immersive networked virtual environment

    NASA Astrophysics Data System (ADS)

    Ihara, Masayuki; Honda, Shinkuro; Kobayashi, Minoru; Ishibashi, Satoshi

    2002-05-01

    The authors are examining a cyber entertainment system that applies IPT (Immersive Projection Technology) displays to the entertainment field. This system enables users in remote locations to communicate with each other so that they feel as if they are together. Moreover, the system enables those users to experience a high degree of presence, owing to the provision of stereoscopic vision as well as a haptic interface and stereo sound. This paper introduces the system from the viewpoint of space sharing across the network and elucidates its operation using the theme of golf. The system is developed by integrating avatar control, an I/O device, communication links, virtual interaction, mixed reality, and physical simulations. Pairs of these environments are connected across the network, allowing the two players to compete. An avatar of each player is displayed on the other player's IPT display in the remote location and is driven by only two magnetic sensors. That is, in the proposed system, users do not need to wear a data suit with many sensors and are able to play golf without encumbrance.

  4. You Spin my Head Right Round: Threshold of Limited Immersion for Rotation Gains in Redirected Walking.

    PubMed

    Schmitz, Patric; Hildebrandt, Julian; Valdez, Andre Calero; Kobbelt, Leif; Ziefle, Martina

    2018-04-01

    In virtual environments, the space that can be explored by real walking is limited by the size of the tracked area. To enable unimpeded walking through large virtual spaces in small real-world surroundings, redirection techniques are used. These unnoticeably manipulate the user's virtual walking trajectory. It is important to know how strongly such techniques can be applied without the user noticing the manipulation, or becoming cybersick. Previously, this was estimated by measuring a detection threshold (DT) in highly controlled psychophysical studies, which experimentally isolate the effect but do not aim for perceived immersion in the context of VR applications. While these studies suggest that only relatively low degrees of manipulation are tolerable, we claim that, besides establishing detection thresholds, it is important to know when the user's immersion breaks. We hypothesize that the degree of unnoticed manipulation is significantly different from the detection threshold when the user is immersed in a task. We conducted three studies: a) to devise an experimental paradigm to measure the threshold of limited immersion (TLI), b) to measure the TLI for slowly decreasing and increasing rotation gains, and c) to establish a baseline of cybersickness for our experimental setup. For rotation gains greater than 1.0, we found that immersion breaks quite late after the gain is detectable. However, for gains less than 1.0, some users reported a break of immersion even before established detection thresholds were reached. Apparently, the developed metric measures an additional quality of user experience. This article contributes to the development of effective spatial compression methods by utilizing the break of immersion as a benchmark for redirection techniques.
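The rotation gains studied above can be sketched as a per-increment scaling of the user's real head rotation before it is applied to the virtual camera. The function name and values here are illustrative, not taken from the paper:

```python
def apply_rotation_gain(real_rotation_deg: float, gain: float) -> float:
    """Map a real head-rotation increment onto the virtual camera.

    A gain above 1.0 rotates the virtual scene faster than the head turns
    (compressing the real space needed for walking); a gain below 1.0
    rotates it slower. A gain of 1.0 is a one-to-one mapping.
    """
    return real_rotation_deg * gain

# With a gain of 1.25, a full 360-degree virtual turn needs only 288 degrees
# of real rotation, so a large virtual space can fit a smaller tracked area.
real_needed = 360.0 / 1.25
print(real_needed, apply_rotation_gain(real_needed, 1.25))  # 288.0 360.0
```

Redirected-walking controllers apply such gains continuously per frame; the detection threshold and the threshold of limited immersion discussed in the abstract bound how far the gain may stray from 1.0.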

  5. The Use of Information Operations (IO) in Immersive Virtual Environments (IVE)

    DTIC Science & Technology

    2010-06-01

    are motivated or persuaded when interacting with computing products rather than through them. [26] In 2003, Dr. B.J. Fogg, leader of the Stanford...comparable IO utility may be possible through the other computing technologies listed. Figure 6. Captology Focus. From [25] In his book, Dr. Fogg...Self-Representation on Behavior." Human Communication Research, no. 33, pp. 271–290, 2007. [26] B. J. Fogg. Persuasive Technology: Using Computers

  6. Contextual modulation of pain sensitivity utilising virtual environments

    PubMed Central

    Smith, Ashley; Carlow, Klancy; Biddulph, Tara; Murray, Brooke; Paton, Melissa; Harvie, Daniel S

    2017-01-01

    Background: Investigating psychological mechanisms that modulate pain, such as those that might be accessed by manipulation of context, is of great interest to researchers seeking to better understand and treat pain. The aim of this study was to better understand the interaction between pain sensitivity and contexts with inherent emotional and social salience, by exploiting modern immersive virtual reality (VR) technology. Methods: A within-subjects, randomised, double-blinded, repeated measures (RM) design was used. In total, 25 healthy participants were exposed to neutral, pleasant, threatening, socially positive and socially negative contexts, using an Oculus Rift DK2. Pressure pain thresholds (PPTs) were recorded in each context, as well as prior to and following the procedure. We also investigated whether trait anxiety and pain catastrophisation interacted with the relationship between the different contexts and pain. Results: Pressure pain sensitivity was not modulated by context (p = 0.48). Anxiety and pain catastrophisation were not significantly associated with PPTs, nor did they interact with the relationship between context and PPTs. Conclusion: Contrary to our hypothesis, socially and emotionally salient contexts did not influence pain thresholds. In light of other research, we suggest that pain outcomes might only be amenable to manipulation by contextual cues if those cues specifically manipulate the meaning of the pain-eliciting stimulus, rather than manipulating psychological state generally, as in the current study. Future research might exploit immersive VR technology to better explore the link between noxious stimuli and contexts that directly alter their threat value. PMID:28491299

  7. Effects of sensory cueing in virtual motor rehabilitation. A review.

    PubMed

    Palacios-Navarro, Guillermo; Albiol-Pérez, Sergio; García-Magariño García, Iván

    2016-04-01

    To critically identify studies that evaluate the effects of cueing in virtual motor rehabilitation in patients having different neurological disorders and to make recommendations for future studies. Data from MEDLINE®, IEEExplore, Science Direct, the Cochrane Library, and Web of Science were searched up to February 2015. We included studies that investigate the effects of cueing in virtual motor rehabilitation related to interventions for upper or lower extremities using auditory, visual, and tactile cues on motor performance in non-immersive, semi-immersive, or fully immersive virtual environments. These studies compared virtual cueing with an alternative or no intervention. Ten studies with a total of 153 patients were included in the review. All of them refer to the impact of cueing in virtual motor rehabilitation, regardless of the pathological condition. After selecting the articles, the following variables were extracted: year of publication, sample size, study design, type of cueing, intervention procedures, outcome measures, and main findings. The outcome evaluation was done at baseline and at the end of treatment in most of the studies. All studies except one showed improvements in some or all outcomes after intervention or, in some cases, in favor of the virtual rehabilitation group compared to the control group. Virtual cueing seems to be a promising approach to improve motor learning, providing a channel for non-pharmacological therapeutic intervention in different neurological disorders. However, further studies using larger and more homogeneous groups of patients are required to confirm these findings.

  8. A standardized set of 3-D objects for virtual reality research and applications.

    PubMed

    Peeters, David

    2018-06-01

    The use of immersive virtual reality as a research tool is rapidly increasing in numerous scientific disciplines. By combining ecological validity with strict experimental control, immersive virtual reality provides the potential to develop and test scientific theories in rich environments that closely resemble everyday settings. This article introduces the first standardized database of colored three-dimensional (3-D) objects that can be used in virtual reality and augmented reality research and applications. The 147 objects have been normed for name agreement, image agreement, familiarity, visual complexity, and corresponding lexical characteristics of the modal object names. The availability of standardized 3-D objects for virtual reality research is important, because reaching valid theoretical conclusions hinges critically on the use of well-controlled experimental stimuli. Sharing standardized 3-D objects across different virtual reality labs will allow for science to move forward more quickly.

  9. Virtual Exploration of Earth's Evolution

    NASA Astrophysics Data System (ADS)

    Anbar, A. D.; Bruce, G.; Semken, S. C.; Summons, R. E.; Buxner, S.; Horodyskyj, L.; Kotrc, B.; Swann, J.; Klug Boonstra, S. L.; Oliver, C.

    2014-12-01

    Traditional introductory STEM courses often reinforce misconceptions because the large scale of many classes forces a structured, lecture-centric model of teaching that emphasizes delivery of facts rather than exploration, inquiry, and scientific reasoning. This problem is especially acute in teaching about the co-evolution of Earth and life, where classroom learning and textbook teaching are far removed from the immersive and affective aspects of field-based science, and where the challenges of taking large numbers of students into the field make it difficult to expose them to the complex context of the geologic record. We are exploring the potential of digital technologies and online delivery to address this challenge, using immersive and engaging virtual environments that are more like games than like lectures, grounded in active learning, and deliverable at scale via the internet. The goal is to invert the traditional lecture-centric paradigm by placing lectures at the periphery and inquiry-driven, integrative virtual investigations at the center, and to do so at scale. To this end, we are applying a technology platform we devised, supported by NASA and the NSF, that integrates a variety of digital media in a format that we call an immersive virtual field trip (iVFT). In iVFTs, students engage directly with virtual representations of real field sites, with which they interact non-linearly at a variety of scales via game-like exploration while guided by an adaptive tutoring system. This platform has already been used to develop pilot iVFTs useful in teaching anthropology, archeology, ecology, and geoscience. With support from the Howard Hughes Medical Institute, we are now developing and evaluating a coherent suite of ~12 iVFTs that span the sweep of life's history on Earth, from the 3.8 Ga metasediments of West Greenland to ancient hominid sites in East Africa. These iVFTs will teach fundamental principles of geology and practices of scientific inquiry, and expose students to the evidence from which evolutionary and paleoenvironmental inferences are derived. In addition to making these iVFTs available to the geoscience community for EPO, we will evaluate the comparative effectiveness of iVFT and traditional lecture and lab approaches to achieving geoscience learning objectives.

  10. Implementation of 3d Tools and Immersive Experience Interaction for Supporting Learning in a Library-Archive Environment. Visions and Challenges

    NASA Astrophysics Data System (ADS)

    Angeletaki, A.; Carrozzino, M.; Johansen, S.

    2013-07-01

    In this paper we present an experimental environment of 3D books combined with a game application that has been developed by a collaboration project between the Norwegian University of Science and Technology in Trondheim, Norway, the NTNU University Library, and the Percro laboratory of Santa Anna University in Pisa, Italy. MUBIL is an international research project involving museums, libraries and ICT academy partners aiming to develop a consistent methodology enabling the use of Virtual Environments as a metaphor to present manuscript content through the paradigms of interaction and immersion, evaluating different possible alternatives. This paper presents the results of the application of two prototypes of books augmented with the use of XVR and IL technology. We explore immersive-reality design strategies in archive and library contexts for attracting new users. Our newly established Mubil-lab has invited school classes to test the books augmented with 3D models and other multimedia content in order to investigate whether the immersion in such environments can create wider engagement and support learning. The combination of the 3D-book metaphor and game designs allows the digital books to be handled through a tactile experience and to substitute for physical browsing. In this paper we present some preliminary results about the enrichment of the user experience in such environments.

  11. Cranial implant design using augmented reality immersive system.

    PubMed

    Ai, Zhuming; Evenhouse, Ray; Leigh, Jason; Charbel, Fady; Rasmussen, Mary

    2007-01-01

    Software tools that utilize haptics for sculpting precise fitting cranial implants are utilized in an augmented reality immersive system to create a virtual working environment for the modelers. The virtual environment is designed to mimic the traditional working environment as closely as possible, providing more functionality for the users. The implant design process uses patient CT data of a defective area. This volumetric data is displayed in an implant modeling tele-immersive augmented reality system where the modeler can build a patient specific implant that precisely fits the defect. To mimic the traditional sculpting workspace, the implant modeling augmented reality system includes stereo vision, viewer centered perspective, sense of touch, and collaboration. To achieve optimized performance, this system includes a dual-processor PC, fast volume rendering with three-dimensional texture mapping, the fast haptic rendering algorithm, and a multi-threading architecture. The system replaces the expensive and time consuming traditional sculpting steps such as physical sculpting, mold making, and defect stereolithography. This augmented reality system is part of a comprehensive tele-immersive system that includes a conference-room-sized system for tele-immersive small group consultation and an inexpensive, easily deployable networked desktop virtual reality system for surgical consultation, evaluation and collaboration. This system has been used to design patient-specific cranial implants with precise fit.

  12. A strategic map for high-impact virtual experience design

    NASA Astrophysics Data System (ADS)

    Faste, Haakon; Bergamasco, Massimo

    2009-02-01

    We have employed methodologies of human centered design to inspire and guide the engineering of a definitive low-cost aesthetic multimodal experience intended to stimulate cultural growth. Using a combination of design research, trend analysis and the programming of immersive virtual 3D worlds, over 250 innovative concepts have been brainstormed, prototyped, evaluated and refined. These concepts have been used to create a strategic map for the development of high-impact virtual art experiences, the most promising of which have been incorporated into a multimodal environment programmed in the online interactive 3D platform XVR. A group of test users have evaluated the experience as it has evolved, using a multimodal interface with stereo vision, 3D audio and haptic feedback. This paper discusses the process, content, results, and impact on our engineering laboratory that this research has produced.

  13. Nomad devices for interactions in immersive virtual environments

    NASA Astrophysics Data System (ADS)

    George, Paul; Kemeny, Andras; Merienne, Frédéric; Chardonnet, Jean-Rémy; Thouvenin, Indira Mouttapa; Posselt, Javier; Icart, Emmanuel

    2013-03-01

    Renault is currently setting up a new CAVE™, a five-wall rear-projected virtual reality room with a combined 3D resolution of 100 Mpixels, distributed over sixteen 4K projectors and two 2K projectors, as well as an additional 3D HD collaborative powerwall. Renault's CAVE™ aims at meeting the needs of the various vehicle conception steps [1]. Starting from vehicle Design, through the subsequent Engineering steps, Ergonomic evaluation, and perceived quality control, Renault has built up a list of use cases and carried out an early software evaluation in the four-sided CAVE™ of Institute Image, called MOVE. One goal of the project is to study interactions in a CAVE™, especially with nomad devices such as the iPhone or iPad, to manipulate virtual objects and to develop visualization possibilities. Inspired by current uses of nomad devices (multi-touch gestures, the iPhone UI look and feel, and AR applications), we have implemented an early feature set taking advantage of these popular input devices. In this paper, we present its performance through measurement data collected on our test platform, a four-sided homemade low-cost virtual reality room powered by ultra-short-range and standard HD home projectors.
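    The multi-touch manipulation mentioned above rests on simple gesture arithmetic. As an illustration only (not the Renault system's implementation), a two-finger pinch can be mapped to a uniform scale factor for a virtual object; all names below are hypothetical:

    ```python
    import math

    def pinch_scale(p0, p1, q0, q1):
        """Scale factor implied by a two-finger pinch gesture.

        p0/p1 are the two touch points when the gesture starts,
        q0/q1 the same fingers in the current frame; the object's
        scale is multiplied by the ratio of finger separations.
        (The uniform-scale mapping is an illustrative assumption.)
        """
        start = math.dist(p0, p1)
        now = math.dist(q0, q1)
        if start == 0:
            return 1.0  # degenerate gesture: leave scale unchanged
        return now / start
    ```

    Rotation and translation gestures follow the same pattern, using the angle and the midpoint of the finger pair, respectively.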

  14. Immersive virtual reality platform for medical training: a "killer-application".

    PubMed

    2000-01-01

    The Medical Readiness Trainer (MRT) integrates fully immersive Virtual Reality (VR), highly advanced medical simulation technologies, and medical data to enable unprecedented medical education and training. The flexibility offered by the MRT environment makes it a practical teaching tool today and, in the near future, will make it an ideal vehicle for facilitating the transition to the next level of medical practice, i.e., telepresence and next-generation Internet-based collaborative learning.

  15. NASA's Hybrid Reality Lab: One Giant Leap for Full Dive

    NASA Technical Reports Server (NTRS)

    Delgado, Francisco J.; Noyes, Matthew

    2017-01-01

    This presentation demonstrates how NASA is using consumer VR headsets, game engine technology, and NVIDIA GPUs to create highly immersive future training systems augmented with extremely realistic haptic feedback, sound, and additional sensory information, and how these can be used to improve the engineering workflow. Included in this presentation are an environment simulation of the ISS, where users can interact with virtual objects, handrails, and tracked physical objects while inside VR; the integration of consumer VR headsets with the Active Response Gravity Offload System; and a space habitat architectural evaluation tool. Attendees will learn how the best elements of real and virtual worlds can be combined into a hybrid reality environment with tangible engineering and scientific applications.

  16. A Second Life for eHealth: Prospects for the Use of 3-D Virtual Worlds in Clinical Psychology

    PubMed Central

    Gaggioli, Andrea; Vigna, Cinzia; Riva, Giuseppe

    2008-01-01

    The aim of the present paper is to describe the role played by three-dimensional (3-D) virtual worlds in eHealth applications, addressing some potential advantages and issues related to the use of this emerging medium in clinical practice. Due to the enormous diffusion of the World Wide Web (WWW), telepsychology, and telehealth in general, have become accepted and validated methods for the treatment of many different health care concerns. The introduction of the Web 2.0 has facilitated the development of new forms of collaborative interaction between multiple users based on 3-D virtual worlds. This paper describes the development and implementation of a form of tailored immersive e-therapy called p-health whose key factor is interreality, that is, the creation of a hybrid augmented experience merging physical and virtual worlds. We suggest that compared with conventional telehealth applications such as emails, chat, and videoconferences, the interaction between real and 3-D virtual worlds may convey greater feelings of presence, facilitate the clinical communication process, positively influence group processes and cohesiveness in group-based therapies, and foster higher levels of interpersonal trust between therapists and patients. However, challenges related to the potentially addictive nature of such virtual worlds and questions related to privacy and personal safety will also be discussed. PMID:18678557

  17. Virtual Reality: Emerging Applications and Future Directions

    ERIC Educational Resources Information Center

    Ludlow, Barbara L.

    2015-01-01

    Virtual reality is an emerging technology that has resulted in rapid expansion in the development of virtual immersive environments for use as educational simulations in schools, colleges and universities. This article presents an overview of virtual reality, describes a number of applications currently being used by special educators for…

  18. A Practical Guide, with Theoretical Underpinnings, for Creating Effective Virtual Reality Learning Environments

    ERIC Educational Resources Information Center

    O'Connor, Eileen A.; Domingo, Jelia

    2017-01-01

    With the advent of open source virtual environments, the associated cost reductions, and the more flexible options, avatar-based virtual reality environments are within reach of educators. By using and repurposing readily available virtual environments, instructors can bring engaging, community-building, and immersive learning opportunities to…

  19. An innovative virtual reality training tool for orthognathic surgery.

    PubMed

    Pulijala, Y; Ma, M; Pears, M; Peebles, D; Ayoub, A

    2018-02-01

    Virtual reality (VR) surgery using Oculus Rift and Leap Motion devices is a multi-sensory, holistic surgical training experience. A multimedia combination including 360° videos, three-dimensional interaction, and stereoscopic videos in VR has been developed to enable trainees to experience a realistic surgery environment. The innovation allows trainees to interact with the individual components of the maxillofacial anatomy and apply surgical instruments while watching close-up stereoscopic three-dimensional videos of the surgery. In this study, a novel training tool for Le Fort I osteotomy based on immersive virtual reality (iVR) was developed and validated. Seven consultant oral and maxillofacial surgeons evaluated the application for face and content validity. Using a structured assessment process, the surgeons commented on the content of the developed training tool, its realism and usability, and the applicability of VR surgery for orthognathic surgical training. The results confirmed the clinical applicability of VR for delivering training in orthognathic surgery. Modifications were suggested to improve the user experience and interactions with the surgical instruments. This training tool is ready for testing with surgical trainees.

  20. Virtual race transformation reverses racial in-group bias.

    PubMed

    Hasler, Béatrice S; Spanlang, Bernhard; Slater, Mel

    2017-01-01

    People generally show greater preference for members of their own racial group compared to racial out-group members. This type of 'in-group bias' is evident in mimicry behaviors. We tend to automatically mimic the behaviors of in-group members, and this behavior is associated with interpersonal sensitivity and empathy. However, mimicry is reduced when interacting with out-group members. Although race is considered an unchangeable trait, it is possible using embodiment in immersive virtual reality to engender the illusion in people of having a body of a different race. Previous research has used this technique to show that after a short period of embodiment of White people in a Black virtual body their implicit racial bias against Black people diminishes. Here we show that this technique powerfully enhances mimicry. We carried out an experiment with 32 White (Caucasian) female participants. Half were embodied in a White virtual body and the remainder in a Black virtual body. Each interacted in two different sessions with a White and a Black virtual character, in counterbalanced order. The results show that dyads with the same virtual body skin color expressed greater mimicry than those of different color. Importantly, this effect occurred depending on the virtual body's race, not participants' actual racial group. When embodied in a Black virtual body, White participants treat Black as their novel in-group and Whites become their novel out-group. This reversed in-group bias effect was obtained regardless of participants' level of implicit racial bias. We discuss the theoretical and practical implications of this surprising psychological phenomenon.

  1. Virtual race transformation reverses racial in-group bias

    PubMed Central

    Hasler, Béatrice S.; Spanlang, Bernhard

    2017-01-01

    People generally show greater preference for members of their own racial group compared to racial out-group members. This type of ‘in-group bias’ is evident in mimicry behaviors. We tend to automatically mimic the behaviors of in-group members, and this behavior is associated with interpersonal sensitivity and empathy. However, mimicry is reduced when interacting with out-group members. Although race is considered an unchangeable trait, it is possible using embodiment in immersive virtual reality to engender the illusion in people of having a body of a different race. Previous research has used this technique to show that after a short period of embodiment of White people in a Black virtual body their implicit racial bias against Black people diminishes. Here we show that this technique powerfully enhances mimicry. We carried out an experiment with 32 White (Caucasian) female participants. Half were embodied in a White virtual body and the remainder in a Black virtual body. Each interacted in two different sessions with a White and a Black virtual character, in counterbalanced order. The results show that dyads with the same virtual body skin color expressed greater mimicry than those of different color. Importantly, this effect occurred depending on the virtual body’s race, not participants’ actual racial group. When embodied in a Black virtual body, White participants treat Black as their novel in-group and Whites become their novel out-group. This reversed in-group bias effect was obtained regardless of participants’ level of implicit racial bias. We discuss the theoretical and practical implications of this surprising psychological phenomenon. PMID:28437469

  2. Use of Immersive Simulations to Enhance Graduate Student Learning: Implications for Educational Leadership Programs

    ERIC Educational Resources Information Center

    Voelkel, Robert H.; Johnson, Christie W.; Gilbert, Kristen A.

    2016-01-01

    The purpose of this article is to present how one university incorporates immersive simulations through platforms which employ avatars to enhance graduate student understanding and learning in educational leadership programs. While using simulations and immersive virtual environments continues to grow, the literature suggests limited evidence of…

  3. Objective and subjective quality assessment of geometry compression of reconstructed 3D humans in a 3D virtual room

    NASA Astrophysics Data System (ADS)

    Mekuria, Rufael; Cesar, Pablo; Doumanis, Ioannis; Frisiello, Antonella

    2015-09-01

    Compression of 3D object-based video is relevant for 3D immersive applications. Nevertheless, the perceptual aspects of the degradation introduced by codecs for meshes and point clouds are not well understood. In this paper we evaluate the subjective and objective degradations introduced by such codecs in a state-of-the-art 3D immersive virtual room. In the 3D immersive virtual room, users are captured with multiple cameras, and their surfaces are reconstructed as photorealistic colored/textured 3D meshes or point clouds. To test the perceptual effect of compression and transmission, we render degraded versions at different frame rates in different contexts (near/far) in the scene. A quantitative subjective study with 16 users shows that decoded surfaces with negligible distortion compared to the original reconstructions can be achieved in the 3D virtual room. In addition, a qualitative task-based analysis in a full prototype field trial shows increased presence, emotion, and user and state recognition for the reconstructed 3D human representation compared to animated computer avatars.

  4. STRIVE: Stress Resilience In Virtual Environments: a pre-deployment VR system for training emotional coping skills and assessing chronic and acute stress responses.

    PubMed

    Rizzo, Albert; Buckwalter, J Galen; John, Bruce; Newman, Brad; Parsons, Thomas; Kenny, Patrick; Williams, Josh

    2012-01-01

    The incidence of posttraumatic stress disorder (PTSD) in returning OEF/OIF military personnel is creating a significant healthcare challenge. This has served to motivate research on how to better develop and disseminate evidence-based treatments for PTSD. One emerging form of treatment for combat-related PTSD that has shown promise involves the delivery of exposure therapy using immersive Virtual Reality (VR). Initial outcomes from open clinical trials have been positive, and fully randomized controlled trials are currently in progress to further validate this approach. Based on our research group's initial positive outcomes using VR to emotionally engage and successfully treat persons undergoing exposure therapy for PTSD, we have begun developing a similar VR-based approach to deliver stress resilience training to military service members prior to their initial deployment. The Stress Resilience In Virtual Environments (STRIVE) project aims to create a set of combat simulations (derived from our existing Virtual Iraq/Afghanistan exposure therapy system) that are part of a multi-episode narrative experience. Users can be immersed within challenging combat contexts and interact with virtual characters within these episodes as part of an experiential learning approach for training a range of psychoeducational and cognitive-behavioral emotional coping strategies believed to enhance stress resilience. The STRIVE project aims to present this approach to service members prior to deployment as part of a program designed to better prepare military personnel for the types of emotional challenges that are inherent in the combat environment. During these virtual training experiences users are monitored physiologically as part of a larger investigation into the biomarkers of the stress response. One such construct, Allostatic Load, is being directly investigated via physiological and neuro-hormonal analysis of specimen collections taken immediately before and after engagement in the STRIVE virtual experience.

  5. Virtual reality for freely moving animals.

    PubMed

    Stowers, John R; Hofbauer, Maximilian; Bastien, Renaud; Griessner, Johannes; Higgins, Peter; Farooqui, Sarfarazhussain; Fischer, Ruth M; Nowikovsky, Karin; Haubensak, Wulf; Couzin, Iain D; Tessmar-Raible, Kristin; Straw, Andrew D

    2017-10-01

    Standard animal behavior paradigms incompletely mimic nature and thus limit our understanding of behavior and brain function. Virtual reality (VR) can help, but it poses challenges: typical VR systems require movement restrictions that disrupt sensorimotor experience, causing neuronal and behavioral alterations. We report the development of FreemoVR, a VR system for freely moving animals. We validate immersive VR for mice, flies, and zebrafish. FreemoVR allows instant, disruption-free environmental reconfigurations and interactions between real organisms and computer-controlled agents. Using the FreemoVR platform, we established a height-aversion assay in mice and studied visuomotor effects in Drosophila and zebrafish. Furthermore, by photorealistically mimicking zebrafish we discovered that effective social influence depends on a prospective leader balancing its internally preferred directional choice with social interaction. FreemoVR technology facilitates detailed investigations into neural function and behavior through the precise manipulation of sensorimotor feedback loops in unrestrained animals.
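    The leadership result above, a prospective leader balancing its preferred direction against social interaction, can be caricatured as a weighted heading update. The weighting scheme and names below are illustrative assumptions, not FreemoVR's actual model:

    ```python
    import math

    def next_heading(preferred: float, social: float, w: float = 0.6) -> float:
        """Blend a preferred heading with a socially suggested heading (radians).

        Headings are blended as unit vectors so that angular wrap-around is
        handled correctly; `w` is the (assumed) weight on the internally
        preferred direction, 1 - w the weight on social interaction.
        """
        x = w * math.cos(preferred) + (1 - w) * math.cos(social)
        y = w * math.sin(preferred) + (1 - w) * math.sin(social)
        return math.atan2(y, x)
    ```

    With w near 1 the agent ignores its neighbors; with w near 0 it follows them blindly; effective leadership in the study corresponded to an intermediate balance.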

  6. Virtual reality in anxiety disorders: the past and the future.

    PubMed

    Gorini, Alessandra; Riva, Giuseppe

    2008-02-01

    One of the most effective treatments for anxiety is exposure therapy: a person is exposed to specific feared situations or objects that trigger anxiety. This exposure process may be done through actual exposure, with visualization, by imagination, or using virtual reality (VR), which provides users with computer-simulated environments with and within which they can interact. VR is made possible by the capability of computers to synthesize a 3D graphical environment from numerical data. Furthermore, because input devices sense the subject's reactions and motions, the computer can modify the synthetic environment accordingly, creating the illusion of interacting with, and thus being immersed within, the environment. Starting in 1995, different experimental studies have been conducted to investigate the effect of VR exposure in the treatment of subclinical fears and anxiety disorders. This review discusses their outcomes and provides guidelines for the use of VR exposure in the treatment of anxious patients.

  7. Comparing two types of navigational interfaces for Virtual Reality.

    PubMed

    Teixeira, Luís; Vilar, Elisângela; Duarte, Emília; Rebelo, Francisco; da Silva, Fernando Moreira

    2012-01-01

    Previous studies suggest significant differences between navigating virtual environments in a life-like walking manner (i.e., using treadmills or walk-in-place techniques) and virtual navigation (i.e., flying while actually standing). The latter option, which usually involves hand-centric devices (e.g., joysticks), is the most common in Virtual Reality-based studies, mostly due to lower costs and lesser space and technology demands. Recently, however, new interaction devices originally conceived for videogames have become available, offering interesting potential for research. This study aimed to explore the potential of the Nintendo Wii Balance Board as a navigation interface in a Virtual Environment presented in an immersive Virtual Reality system. Comparing participants' performance while engaged in a simulated emergency egress allows the adequacy of such an alternative navigation interface to be determined on the basis of empirical results. Forty university students participated in this study. Results show that participants were more efficient when performing navigation tasks with the Joystick than with the Balance Board. However, there were no significant differences in behavioral compliance with exit signs. Therefore, this study suggests that, at least for tasks similar to the one studied, the Balance Board has good potential as a navigation interface for Virtual Reality systems.

  8. Accessible virtual reality therapy using portable media devices.

    PubMed

    Bruck, Susan; Watters, Paul A

    2010-01-01

    Simulated immersive environments displayed on large screens are a valuable therapeutic asset in the treatment of a range of psychological disorders. Permanent environments are expensive to build and maintain, require specialized clinician training and technical support, and often have limited accessibility for clients. Ideally, virtual reality exposure therapy (VRET) could be made accessible to the broader community by using inexpensive hardware with specifically designed software. This study tested whether viewing a handheld non-immersive media device causes nausea and other cybersickness responses. Using a repeated-measures design, we found that nausea, general discomfort, eyestrain, blurred vision, and increased salivation all significantly increased in response to exposure to a handheld non-immersive media device.

  9. Molecular Dynamics Visualization (MDV): Stereoscopic 3D Display of Biomolecular Structure and Interactions Using the Unity Game Engine.

    PubMed

    Wiebrands, Michael; Malajczuk, Chris J; Woods, Andrew J; Rohl, Andrew L; Mancera, Ricardo L

    2018-06-21

    Molecular graphics systems are visualization tools which, upon integration into a 3D immersive environment, provide a unique virtual reality experience for research and teaching of biomolecular structure, function and interactions. We have developed a molecular structure and dynamics application, the Molecular Dynamics Visualization tool, that uses the Unity game engine combined with large scale, multi-user, stereoscopic visualization systems to deliver an immersive display experience, particularly with a large cylindrical projection display. The application is structured to separate the biomolecular modeling and visualization systems. The biomolecular model loading and analysis system was developed as a stand-alone C# library and provides the foundation for the custom visualization system built in Unity. All visual models displayed within the tool are generated using Unity-based procedural mesh building routines. A 3D user interface was built to allow seamless dynamic interaction with the model while being viewed in 3D space. Biomolecular structure analysis and display capabilities are exemplified with a range of complex systems involving cell membranes, protein folding and lipid droplets.

  10. Effects of Exercise in Immersive Virtual Environments on Cortical Neural Oscillations and Mental State

    PubMed Central

    Vogt, Tobias; Herpers, Rainer; Askew, Christopher D.; Scherfgen, David; Strüder, Heiko K.; Schneider, Stefan

    2015-01-01

    Virtual reality environments are increasingly being used to encourage individuals to exercise more regularly, including as part of treatment for those with mental health or neurological disorders. The success of virtual environments likely depends on whether a sense of presence can be established, whereby participants become fully immersed in the virtual environment. Exposure to virtual environments is associated with physiological responses, including cortical activation changes. Whether the addition of real exercise within a virtual environment alters the sense of presence, or the accompanying physiological changes, is not known. In a randomized and controlled study design, moderate-intensity Exercise (i.e., self-paced cycling) and No-Exercise (i.e., automatic propulsion) trials were performed within three levels of virtual environment exposure. Each trial was 5 minutes in duration and was followed by post-trial assessments of heart rate, perceived sense of presence, EEG, and mental state. Changes in psychological strain and physical state were generally mirrored by neural activation patterns. Furthermore, these changes indicated that exercise augments the demands of virtual environment exposures, and this likely contributed to an enhanced sense of presence. PMID:26366305

  11. Effects of Exercise in Immersive Virtual Environments on Cortical Neural Oscillations and Mental State.

    PubMed

    Vogt, Tobias; Herpers, Rainer; Askew, Christopher D; Scherfgen, David; Strüder, Heiko K; Schneider, Stefan

    2015-01-01

    Virtual reality environments are increasingly being used to encourage individuals to exercise more regularly, including as part of treatment for those with mental health or neurological disorders. The success of virtual environments likely depends on whether a sense of presence can be established, whereby participants become fully immersed in the virtual environment. Exposure to virtual environments is associated with physiological responses, including cortical activation changes. Whether the addition of real exercise within a virtual environment alters the sense of presence, or the accompanying physiological changes, is not known. In a randomized and controlled study design, moderate-intensity Exercise (i.e., self-paced cycling) and No-Exercise (i.e., automatic propulsion) trials were performed within three levels of virtual environment exposure. Each trial was 5 minutes in duration and was followed by post-trial assessments of heart rate, perceived sense of presence, EEG, and mental state. Changes in psychological strain and physical state were generally mirrored by neural activation patterns. Furthermore, these changes indicated that exercise augments the demands of virtual environment exposures, and this likely contributed to an enhanced sense of presence.

  12. Virtual Enterprise: Transforming Entrepreneurship Education

    ERIC Educational Resources Information Center

    Borgese, Anthony

    2011-01-01

    Entrepreneurship education is ripe for utilizing experiential learning methods. Experiential methods are best learned when there is constant immersion into the subject matter. One such transformative learning methodology is Virtual Enterprise (VE). Virtual Enterprise is a multi-faceted, experiential learning methodology disseminated by the City…

  13. Enhancing Pre-Service Teachers' Awareness to Pupils' Test-Anxiety with 3D Immersive Simulation

    ERIC Educational Resources Information Center

    Passig, David; Moshe, Ronit

    2008-01-01

    This study investigated whether participating in a 3D immersive virtual reality world simulating the experience of test-anxiety would affect preservice teachers' awareness to the phenomenon. Ninety subjects participated in this study, and were divided into three groups. The experimental group experienced a 3D immersive simulation which made…

  14. Motivational interviewing workshop in a virtual world: learning as avatars.

    PubMed

    Shershneva, Marianna; Kim, Ji-Hye; Kear, Cynthia; Heyden, Robin; Heyden, Neil; Lee, Jay; Mitchell, Suzanne

    2014-04-01

    Limited research has been done to understand outcomes of continuing medical education offered in three-dimensional, immersive virtual worlds. We studied a case of a virtual world workshop on motivational interviewing (MI) applied to smoking cessation counseling and its educational impact. To facilitate content development and evaluation, we specified desired MI competencies. The workshop consisted of three sessions, which included lectures, practice with standardized patients, and chat interactions. Data were collected from 13 primary care physicians and residents through workshop observation, and pre- and 3-month post-workshop telephone/Skype interviews and interactions with standardized patients. Interactions with standardized patients were assessed by an expert using a validated MI tool and by standardized patients using a tool developed for this study. For 11 participants who attended two or three sessions, we conducted paired-samples t tests comparing mean differences between the competency scores before and after the workshop. Expert assessment showed significant improvement on six of seven MI competencies. All participants reported learning new knowledge and skills, and nine described incorporating new learning into their clinical practice. Practicing MI with standardized patients and/or observing others' practice appeared to be the most helpful workshop component. The evaluated workshop had positive impact on participants' competencies and practice as related to MI applied to smoking cessation counseling. Our findings support further exploration of three-dimensional virtual worlds as learning environments for continuing medical education.

  15. Innovating Training through Immersive Environments: Generation Y, Exploratory Learning, and Serious Games

    NASA Technical Reports Server (NTRS)

    Gendron, Gerald

    2012-01-01

    Over the next decade, those entering Service and Joint Staff positions within the military will come from a different generation than the current leadership. They will come from Generation Y and have differing preferences for learning. Immersive learning environments like serious games and virtual world initiatives can complement traditional training methods to provide a better overall training program for staffs. Generation Y members desire learning methods which are relevant and interactive, regardless of whether they are delivered over the internet or in person. This paper focuses on a project undertaken to assess alternative training methods to teach special operations staffs. It provides a summary of the needs analysis used to consider alternatives and to better posture the Department of Defense for future training development.

  16. Circumplex Model of Affect: A Measure of Pleasure and Arousal During Virtual Reality Distraction Analgesia.

    PubMed

    Sharar, Sam R; Alamdari, Ava; Hoffer, Christine; Hoffman, Hunter G; Jensen, Mark P; Patterson, David R

    2016-06-01

    Immersive virtual reality (VR) distraction provides clinically effective pain relief and increases subjective reports of "fun" in medical settings of procedural pain. The goal of this study was to better describe the variable of "fun" associated with VR distraction analgesia using the circumplex model (pleasure/arousal) of affect. Seventy-four healthy volunteers (mean age, 29 years; 37 females) received a standardized, 18-minute, multimodal pain sequence (alternating thermal heat and electrical stimulation to distal extremities) while receiving immersive, interactive VR distraction. Subjects rated both their subjective pain intensity and fun using 0-10 Graphic Rating Scales, as well as the pleasantness of their emotional valence and their state of arousal on 9-point scales. Compared with pain stimulation in the control (baseline, no VR) condition, immersive VR distraction significantly reduced subjective pain intensity (P < 0.001). During VR distraction, compared with those reporting negative affect, subjects reporting positive affect did so more frequently (41 percent versus 9 percent), as well as reporting both greater pain reduction (22 percent versus 1 percent) and fun scores (7.0 ± 1.9 versus 2.4 ± 1.4). Several factors (lower anxiety, greater fun, greater presence in the VR environment, and positive emotional valence) were associated with subjective analgesia during VR distraction. Immersive VR distraction reduces subjective pain intensity induced by multimodal experimental nociception. Subjects who report less anxiety, more fun, more VR presence, and more positive emotional valence during VR distraction are more likely to report subjective pain reduction. These findings indicate VR distraction analgesia may be mediated through anxiolytic, attentional, and/or affective mechanisms.
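    The circumplex model used in this study locates affect on two axes, pleasure (valence) and arousal. A minimal sketch of that mapping, assuming the study's 9-point scales run 1-9 with a neutral midpoint of 5 (the quadrant labels are standard circumplex terms, not taken from the abstract):

    ```python
    def affect_quadrant(pleasure: float, arousal: float, midpoint: float = 5.0) -> str:
        """Classify a (pleasure, arousal) rating pair into a circumplex quadrant.

        Ratings at or above the midpoint count as "high"; the 1-9 scale and
        the quadrant names are illustrative assumptions.
        """
        if pleasure >= midpoint:
            return "excited" if arousal >= midpoint else "relaxed"
        return "distressed" if arousal >= midpoint else "depressed"
    ```

    Under this mapping, the high-fun VR condition described above (pleasant, activated) falls in the "excited" quadrant, while procedural-pain anxiety (unpleasant, activated) falls in "distressed".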

  17. Circumplex Model of Affect: A Measure of Pleasure and Arousal During Virtual Reality Distraction Analgesia

    PubMed Central

    Alamdari, Ava; Hoffer, Christine; Hoffman, Hunter G.; Jensen, Mark P.; Patterson, David R.

    2016-01-01

    Objective: Immersive virtual reality (VR) distraction provides clinically effective pain relief and increases subjective reports of “fun” in medical settings of procedural pain. The goal of this study was to better describe the variable of “fun” associated with VR distraction analgesia using the circumplex model (pleasure/arousal) of affect. Materials and Methods: Seventy-four healthy volunteers (mean age, 29 years; 37 females) received a standardized, 18-minute, multimodal pain sequence (alternating thermal heat and electrical stimulation to distal extremities) while receiving immersive, interactive VR distraction. Subjects rated both their subjective pain intensity and fun using 0–10 Graphic Rating Scales, as well as the pleasantness of their emotional valence and their state of arousal on 9-point scales. Results: Compared with pain stimulation in the control (baseline, no VR) condition, immersive VR distraction significantly reduced subjective pain intensity (P < 0.001). During VR distraction, compared with those reporting negative affect, subjects reporting positive affect did so more frequently (41 percent versus 9 percent), as well as reporting both greater pain reduction (22 percent versus 1 percent) and fun scores (7.0 ± 1.9 versus 2.4 ± 1.4). Several factors—lower anxiety, greater fun, greater presence in the VR environment, and positive emotional valence—were associated with subjective analgesia during VR distraction. Conclusions: Immersive VR distraction reduces subjective pain intensity induced by multimodal experimental nociception. Subjects who report less anxiety, more fun, more VR presence, and more positive emotional valence during VR distraction are more likely to report subjective pain reduction. These findings indicate VR distraction analgesia may be mediated through anxiolytic, attentional, and/or affective mechanisms. PMID:27171578

  18. VERS: a virtual environment for reconstructive surgery planning

    NASA Astrophysics Data System (ADS)

    Montgomery, Kevin N.

    1997-05-01

    The virtual environment for reconstructive surgery (VERS) project at the NASA Ames Biocomputation Center is applying virtual reality technology to aid surgeons in planning surgeries. We are working with a craniofacial surgeon at Stanford to assemble and visualize the bone structure of patients requiring reconstructive surgery as a result of developmental abnormalities or trauma. This project is an extension of our previous work in 3D reconstruction, mesh generation, and immersive visualization. The VR system, consisting of an SGI Onyx RE2, FakeSpace BOOM and ImmersiveWorkbench, Virtual Technologies CyberGlove, and Ascension Technologies tracker, is still in development but has already been used to visualize defects preoperatively. In the near future it will be used to plan the surgery more fully and to compute the projected result on soft tissue structure. This paper presents the work in progress and details the production of a high-performance, collaborative, and networked virtual environment.

  19. Training wheelchair navigation in immersive virtual environments for patients with spinal cord injury - end-user input to design an effective system.

    PubMed

    Nunnerley, Joanne; Gupta, Swati; Snell, Deborah; King, Marcus

    2017-05-01

    A user-centred design was used to develop and test the feasibility of an immersive 3D virtual reality wheelchair training tool for people with spinal cord injury (SCI). A Wheelchair Training System was designed and modelled using the Oculus Rift headset and a Dynamic Control wheelchair joystick. The system was tested by clinicians and expert wheelchair users with SCI. Data from focus groups and individual interviews were analysed using a general inductive approach to thematic analysis. Four themes emerged: Realistic System, which described the advantages of a realistic virtual environment; Wheelchair Training System, which described participants' thoughts on the wheelchair training applications; Overcoming Resistance to Technology, the obstacles to introducing technology within the clinical setting; and Working outside the Rehabilitation Bubble, which described the protective hospital environment. The Oculus Rift Wheelchair Training System has the potential to provide a virtual rehabilitation setting in which wheelchair users could learn valuable community wheelchair skills in a safe environment. Nausea appears to be a side effect of the system, which will need to be resolved before this can be a viable clinical tool. Implications for Rehabilitation: Immersive virtual reality shows promising benefit for wheelchair training in a rehabilitation setting. Early engagement with consumers can improve product development.

  20. Female artists and the VR crucible: expanding the aesthetic vocabulary

    NASA Astrophysics Data System (ADS)

    Morie, Jacquelyn Ford

    2012-03-01

    Virtual Reality was a technological wonder in its early days, and it was widely held to be a domain where men were the main practitioners. However, a survey done in 2007 of VR Artworks (Immersive Virtual Environments or VEs) showed that women have actually created the majority of artistic immersive works. This argues against the popular idea that the field has been totally dominated by men. While men have made great contributions in advancing the field, especially technologically, it appears most artistic works emerge from a decidedly feminine approach. Such an approach seems well suited to immersive environments as it incorporates aspects of inclusion, wholeness, and a blending of the body and the spirit. Female attention to holistic concerns fits the gestalt approach needed to create in a fully functional yet open-ended virtual world, which focuses not so much on producing a finished object (like a text or a sculpture) but rather on creating a possibility for becoming, like bringing a child into the world. Immersive VEs are not objective works of art to be hung on a wall and critiqued. They are vehicles for experience, vessels to live within for a piece of time.

  1. Enrichment of Human-Computer Interaction in Brain-Computer Interfaces via Virtual Environments

    PubMed Central

    Víctor Rodrigo, Mercado-García

    2017-01-01

    Tridimensional representations stimulate cognitive processes that are the core and foundation of human-computer interaction (HCI). Those cognitive processes take place while a user navigates and explores a virtual environment (VE) and are mainly related to spatial memory storage, attention, and perception. VEs have many distinctive features (e.g., involvement, immersion, and presence) that can significantly improve HCI in highly demanding and interactive systems such as brain-computer interfaces (BCI). A BCI is a nonmuscular communication channel that attempts to reestablish the interaction between an individual and his/her environment. Although BCI research started in the sixties, this technology is not yet efficient or reliable for everyone at any time. Over the past few years, researchers have argued that the main BCI flaws could be associated with HCI issues. The evidence presented thus far shows that VEs can (1) set out working environmental conditions, (2) maximize the efficiency of BCI control panels, (3) implement navigation systems based not only on user intentions but also on user emotions, and (4) regulate user mental state to increase the differentiation between control and noncontrol modalities. PMID:29317861

  2. INCREASING SAVING BEHAVIOR THROUGH AGE-PROGRESSED RENDERINGS OF THE FUTURE SELF

    PubMed Central

    HERSHFIELD, HAL E.; GOLDSTEIN, DANIEL G.; SHARPE, WILLIAM F.; FOX, JESSE; YEYKELIS, LEO; CARSTENSEN, LAURA L.; BAILENSON, JEREMY N.

    2014-01-01

    Many people fail to save what they need to for retirement (Munnell, Webb, and Golub-Sass 2009). Research on excessive discounting of the future suggests that removing the lure of immediate rewards by pre-committing to decisions, or elaborating the value of future rewards can both make decisions more future-oriented. In this article, we explore a third and complementary route, one that deals not with present and future rewards, but with present and future selves. In line with thinkers who have suggested that people may fail, through a lack of belief or imagination, to identify with their future selves (Parfit 1971; Schelling 1984), we propose that allowing people to interact with age-progressed renderings of themselves will cause them to allocate more resources toward the future. In four studies, participants interacted with realistic computer renderings of their future selves using immersive virtual reality hardware and interactive decision aids. In all cases, those who interacted with virtual future selves exhibited an increased tendency to accept later monetary rewards over immediate ones. PMID:24634544

  3. Immersion of virtual reality for rehabilitation - Review.

    PubMed

    Rose, Tyler; Nam, Chang S; Chen, Karen B

    2018-05-01

    Virtual reality (VR) shows promise in healthcare applications because it presents patients with an immersive, often entertaining, approach to accomplishing the goal of improved performance. Eighteen studies were reviewed to understand human performance and health outcomes after utilizing VR rehabilitation systems. We aimed to understand: (1) the influence of immersion on VR performance and health outcomes; (2) the relationship between enjoyment and potential patient adherence to a VR rehabilitation routine; and (3) the influence of haptic feedback on performance in VR. Performance measures including postural stability, navigation task performance, and joint mobility showed varying relations to immersion. Limited data did not allow a solid conclusion between enjoyment and adherence, but patient enjoyment and willingness to participate were reported in care plans that incorporate VR. Finally, different haptic devices such as gloves and controllers provided both strengths and weaknesses in areas such as movement velocity, movement accuracy, and path efficiency. Copyright © 2018 Elsevier Ltd. All rights reserved.

  4. Virtual Reality Hysteroscopy

    PubMed

    Levy

    1996-08-01

    New interactive computer technologies are having a significant influence on medical education, training, and practice. The newest innovation in computer technology, virtual reality, allows an individual to be immersed in a dynamic computer-generated, three-dimensional environment and can provide realistic simulations of surgical procedures. A new virtual reality hysteroscope passes through a sensing device that synchronizes movements with a three-dimensional model of a uterus. Force feedback is incorporated into this model, so the user actually experiences the collision of an instrument against the uterine wall or the sensation of the resistance or drag of a resectoscope as it cuts through a myoma in a virtual environment. A variety of intrauterine pathologies and procedures are simulated, including hyperplasia, cancer, resection of a uterine septum, polyp, or myoma, and endometrial ablation. This technology will be incorporated into comprehensive training programs that will objectively assess hand-eye coordination and procedural skills. It is possible that by incorporating virtual reality into hysteroscopic training programs, a decrease in the learning curve and the number of complications presently associated with the procedures may be realized. Prospective studies are required to assess these potential benefits.

  5. Building a Virtual Environment for Diabetes Self-Management Education and Support

    PubMed Central

    Johnson, Constance; Feenan, Kevin; Setliff, Glenn; Pereira, Katherine; Hassell, Nancy; Beresford, Henry F.; Epps, Shelly; Nicollerat, Janet; Tatum, William; Feinglos, Mark; Vorderstrasse, Allison

    2015-01-01

    The authors developed an immersive diabetes community to provide diabetes self-management education and support for adults with type 2 diabetes. In this article the authors describe the procedures used to develop this virtual environment (VE). Second Life Impacts Diabetes Education & Self-Management (SLIDES), the VE for our diabetes community, was built in Second Life. Social Cognitive Theory, behavioral principles, and key aspects of virtual environments related to usability were applied in the development of this VE. Collaboration between researchers, clinicians, and information technology (IT) specialists occurred throughout the development process. An interactive community was successfully built and utilized to provide diabetes self-management education and support. VEs for health applications may be innovative and enticing, yet substantial effort, expertise, and usability factors must be considered in the development of these environments for health care consumers. PMID:25699133

  6. Perception of Graphical Virtual Environments by Blind Users via Sensory Substitution

    PubMed Central

    Maidenbaum, Shachar; Buchs, Galit; Abboud, Sami; Lavi-Rotbain, Ori; Amedi, Amir

    2016-01-01

    Graphical virtual environments are currently far from accessible to blind users as their content is mostly visual. This is especially unfortunate as these environments hold great potential for this population for purposes such as safe orientation, education, and entertainment. Previous tools have increased accessibility but there is still a long way to go. Visual-to-audio Sensory-Substitution-Devices (SSDs) can increase accessibility generically by sonifying on-screen content regardless of the specific environment, and offer increased accessibility without the use of expensive dedicated peripherals like electrode/vibrator arrays. Using SSDs in virtual environments draws on similar skills as using them in the real world, enabling both training on the device and training on environments virtually before real-world visits. This could enable more complex, standardized, and autonomous SSD training and new insights into multisensory interaction and the visually-deprived brain. However, whether congenitally blind users, who have never experienced virtual environments, will be able to use this information for successful perception and interaction within them is currently unclear. We tested this using the EyeMusic SSD, which conveys whole-scene visual information, to perform virtual tasks otherwise impossible without vision. Congenitally blind users had to navigate virtual environments and find doors, differentiate between them based on their features (Experiment 1, Task 1) and surroundings (Experiment 1, Task 2), and walk through them; these tasks were accomplished with a 95% and 97% success rate, respectively. We further explored the reactions of congenitally blind users during their first interaction with a more complex virtual environment than in the previous tasks: walking down a virtual street, recognizing different features of houses and trees, navigating to cross-walks, etc. Users reacted enthusiastically and reported feeling immersed within the environment. 
They highlighted the potential usefulness of such environments for understanding what visual scenes are supposed to look like and their potential for complex training and suggested many future environments they wished to experience. PMID:26882473

  7. Perception of Graphical Virtual Environments by Blind Users via Sensory Substitution.

    PubMed

    Maidenbaum, Shachar; Buchs, Galit; Abboud, Sami; Lavi-Rotbain, Ori; Amedi, Amir

    2016-01-01

    Graphical virtual environments are currently far from accessible to blind users as their content is mostly visual. This is especially unfortunate as these environments hold great potential for this population for purposes such as safe orientation, education, and entertainment. Previous tools have increased accessibility but there is still a long way to go. Visual-to-audio Sensory-Substitution-Devices (SSDs) can increase accessibility generically by sonifying on-screen content regardless of the specific environment, and offer increased accessibility without the use of expensive dedicated peripherals like electrode/vibrator arrays. Using SSDs in virtual environments draws on similar skills as using them in the real world, enabling both training on the device and training on environments virtually before real-world visits. This could enable more complex, standardized, and autonomous SSD training and new insights into multisensory interaction and the visually-deprived brain. However, whether congenitally blind users, who have never experienced virtual environments, will be able to use this information for successful perception and interaction within them is currently unclear. We tested this using the EyeMusic SSD, which conveys whole-scene visual information, to perform virtual tasks otherwise impossible without vision. Congenitally blind users had to navigate virtual environments and find doors, differentiate between them based on their features (Experiment 1, Task 1) and surroundings (Experiment 1, Task 2), and walk through them; these tasks were accomplished with a 95% and 97% success rate, respectively. We further explored the reactions of congenitally blind users during their first interaction with a more complex virtual environment than in the previous tasks: walking down a virtual street, recognizing different features of houses and trees, navigating to cross-walks, etc. Users reacted enthusiastically and reported feeling immersed within the environment. 
They highlighted the potential usefulness of such environments for understanding what visual scenes are supposed to look like and their potential for complex training and suggested many future environments they wished to experience.

  8. Using Virtual Worlds to Identify Multidimensional Student Engagement in High School Foreign Language Learning Classrooms

    ERIC Educational Resources Information Center

    Jacob, Laura Beth

    2012-01-01

    Virtual world environments have evolved from object-oriented, text-based online games to complex three-dimensional immersive social spaces where the lines between reality and computer-generated begin to blur. Educators use virtual worlds to create engaging three-dimensional learning spaces for students, but the impact of virtual worlds in…

  9. Faculty Perspectives of Faculty Persona in a Virtual World

    ERIC Educational Resources Information Center

    Blackmon, Stephanie J.

    2013-01-01

    Immersive virtual worlds provide a new way to deliver online courses or parts of online and face-to-face courses. There is a growing body of research on online learning, and the data on virtual worlds is also increasing. However, literature concerning professors' experiences with specific aspects of virtual worlds is limited. For example,…

  10. Will Anything Useful Come Out of Virtual Reality? Examination of a Naval Application

    DTIC Science & Technology

    1993-05-01

    The term virtual reality can encompass varying meanings, but some generally accepted attributes of a virtual environment are that it is immersive... technology, but at present there are few practical applications which are utilizing the broad range of virtual reality technology. This paper will discuss an... Operability, operator functions, Virtual reality, Man-machine interface, Decision aids/decision making, Decision support. ASW.

  11. The effect of degree of immersion upon learning performance in virtual reality simulations for medical education.

    PubMed

    Gutiérrez, Fátima; Pierce, Jennifer; Vergara, Víctor M; Coulter, Robert; Saland, Linda; Caudell, Thomas P; Goldsmith, Timothy E; Alverson, Dale C

    2007-01-01

    Simulations are being used in education and training to enhance understanding, improve performance, and assess competence. However, it is important to measure the performance of these simulations as learning and training tools. This study examined and compared knowledge acquisition using a knowledge structure design. The subjects were first-year medical students at The University of New Mexico School of Medicine. One group used a fully immersed virtual reality (VR) environment using a head-mounted display (HMD) and another group used a partially immersed (computer screen) VR environment. The study aims were to determine whether there were significant differences between the two groups as measured by changes in knowledge structure before and after the VR simulation experience. The results showed that both groups benefited from the VR simulation training, as measured by the significantly increased similarity to the expert knowledge network after the training experience. However, the immersed group showed a significantly higher gain than the partially immersed group. This study demonstrated a positive effect of VR simulation on learning as reflected by improvements in knowledge structure, and an enhanced effect of full immersion using an HMD vs. a screen-based VR system.

  12. Chemistry in Second Life

    PubMed Central

    Lang, Andrew SID; Bradley, Jean-Claude

    2009-01-01

    This review will focus on the current level of chemistry research, education, and visualization possible within the multi-user virtual environment of Second Life. We discuss how Second Life has been used as a platform for the interactive and collaborative visualization of data from molecules and proteins to spectra and experimental data. We then review how these visualizations can be scripted for immersive educational activities and real-life collaborative research. We also discuss the benefits of the social networking affordances of Second Life for both chemists and chemistry students. PMID:19852781

  13. A Theoretically Driven Investigation of the Efficacy of an Immersive Interactive Avatar Rich Virtual Environment in Pre-deployment Nursing Knowledge and Teamwork Skills Training

    DTIC Science & Technology

    2013-05-01

    ... pedagogy, and instructional quality. Measures-of-effectiveness data are minimal and often have not been gathered in a rigorous manner. To be clear... instructional pedagogy and instructional quality between the programs offered. Efficacy studies beyond student satisfaction scores have not been done in a...

  14. Chemistry in second life.

    PubMed

    Lang, Andrew S I D; Bradley, Jean-Claude

    2009-10-23

    This review will focus on the current level of chemistry research, education, and visualization possible within the multi-user virtual environment of Second Life. We discuss how Second Life has been used as a platform for the interactive and collaborative visualization of data from molecules and proteins to spectra and experimental data. We then review how these visualizations can be scripted for immersive educational activities and real-life collaborative research. We also discuss the benefits of the social networking affordances of Second Life for both chemists and chemistry students.

  15. Sensorimotor Learning during a Marksmanship Task in Immersive Virtual Reality

    PubMed Central

    Rao, Hrishikesh M.; Khanna, Rajan; Zielinski, David J.; Lu, Yvonne; Clements, Jillian M.; Potter, Nicholas D.; Sommer, Marc A.; Kopper, Regis; Appelbaum, Lawrence G.

    2018-01-01

    Sensorimotor learning refers to improvements that occur through practice in the performance of sensory-guided motor behaviors. Leveraging novel technical capabilities of an immersive virtual environment, we probed the component kinematic processes that mediate sensorimotor learning. Twenty naïve subjects performed a simulated marksmanship task modeled after Olympic Trap Shooting standards. We measured movement kinematics and shooting performance as participants practiced 350 trials while receiving trial-by-trial feedback about shooting success. Spatiotemporal analysis of motion tracking elucidated the ballistic and refinement phases of hand movements. We found systematic changes in movement kinematics that accompanied improvements in shot accuracy during training, though reaction and response times did not change over blocks. In particular, we observed longer, slower, and more precise ballistic movements that replaced effort spent on corrections and refinement. Collectively, these results leverage developments in immersive virtual reality technology to quantify and compare the kinematics of movement during early learning of full-body sensorimotor orienting. PMID:29467693

  16. Man, mind, and machine: the past and future of virtual reality simulation in neurologic surgery.

    PubMed

    Robison, R Aaron; Liu, Charles Y; Apuzzo, Michael L J

    2011-11-01

    To review virtual reality in neurosurgery, including the history of simulation and virtual reality and some of the current implementations; to examine some of the technical challenges involved; and to propose a potential paradigm for the development of virtual reality in neurosurgery going forward. A search was made on PubMed using key words surgical simulation, virtual reality, haptics, collision detection, and volumetric modeling to assess the current status of virtual reality in neurosurgery. Based on previous results, investigators extrapolated the possible integration of existing efforts and potential future directions. Simulation has a rich history in surgical training, and there are numerous currently existing applications and systems that involve virtual reality. All existing applications are limited to specific task-oriented functions and typically sacrifice visual realism for real-time interactivity or vice versa, owing to numerous technical challenges in rendering a virtual space in real time, including graphic and tissue modeling, collision detection, and direction of the haptic interface. With ongoing technical advancements in computer hardware and graphic and physical rendering, incremental or modular development of a fully immersive, multipurpose virtual reality neurosurgical simulator is feasible. The use of virtual reality in neurosurgery is predicted to change the nature of neurosurgical education, and to play an increased role in surgical rehearsal and the continuing education and credentialing of surgical practitioners. Copyright © 2011 Elsevier Inc. All rights reserved.

  17. Virtual reality cerebral aneurysm clipping simulation with real-time haptic feedback.

    PubMed

    Alaraj, Ali; Luciano, Cristian J; Bailey, Daniel P; Elsenousi, Abdussalam; Roitberg, Ben Z; Bernardo, Antonio; Banerjee, P Pat; Charbel, Fady T

    2015-03-01

    With the decrease in the number of cerebral aneurysms treated surgically and the increase of complexity of those treated surgically, there is a need for simulation-based tools to teach future neurosurgeons the operative techniques of aneurysm clipping. To develop and evaluate the usefulness of a new haptic-based virtual reality simulator in the training of neurosurgical residents. A real-time sensory haptic feedback virtual reality aneurysm clipping simulator was developed using the ImmersiveTouch platform. A prototype middle cerebral artery aneurysm simulation was created from a computed tomographic angiogram. Aneurysm and vessel volume deformation and haptic feedback are provided in a 3-dimensional immersive virtual reality environment. Intraoperative aneurysm rupture was also simulated. Seventeen neurosurgery residents from 3 residency programs tested the simulator and provided feedback on its usefulness and resemblance to real aneurysm clipping surgery. Residents thought that the simulation would be useful in preparing for real-life surgery. About two-thirds of the residents thought that the 3-dimensional immersive anatomic details provided a close resemblance to real operative anatomy and accurate guidance for deciding surgical approaches. They thought the simulation was useful for preoperative surgical rehearsal and neurosurgical training. A third of the residents thought that the technology in its current form provided realistic haptic feedback for aneurysm surgery. Neurosurgical residents thought that the novel immersive VR simulator is helpful in their training, especially because they do not get a chance to perform aneurysm clippings until late in their residency programs.

  18. University of Kentucky

    Science.gov Websites

    University website offering an immersive, 3D virtual tour of the University of Kentucky campus.

  19. NASA Virtual Glovebox: An Immersive Virtual Desktop Environment for Training Astronauts in Life Science Experiments

    NASA Technical Reports Server (NTRS)

    Twombly, I. Alexander; Smith, Jeffrey; Bruyns, Cynthia; Montgomery, Kevin; Boyle, Richard

    2003-01-01

    The International Space Station will soon provide an unparalleled research facility for studying the near- and longer-term effects of microgravity on living systems. Using the Space Station Glovebox Facility - a compact, fully contained reach-in environment - astronauts will conduct technically challenging life sciences experiments. Virtual environment technologies are being developed at NASA Ames Research Center to help realize the scientific potential of this unique resource by facilitating the experimental hardware and protocol designs and by assisting the astronauts in training. The Virtual GloveboX (VGX) integrates high-fidelity graphics, force-feedback devices and real-time computer simulation engines to achieve an immersive training environment. Here, we describe the prototype VGX system, the distributed processing architecture used in the simulation environment, and modifications to the visualization pipeline required to accommodate the display configuration.

  20. Using Immersive Virtual Reality for Electrical Substation Training

    ERIC Educational Resources Information Center

    Tanaka, Eduardo H.; Paludo, Juliana A.; Cordeiro, Carlúcio S.; Domingues, Leonardo R.; Gadbem, Edgar V.; Euflausino, Adriana

    2015-01-01

    Usually, distribution electricians are called upon to solve technical problems found in electrical substations. In this project, we apply problem-based learning to a training program for electricians, with the help of a virtual reality environment that simulates a real substation. Using this virtual substation, users may safely practice maneuvers…

  1. Exploring the Utility of a Virtual Performance Assessment

    ERIC Educational Resources Information Center

    Clarke-Midura, Jody; Code, Jillianne; Zap, Nick; Dede, Chris

    2011-01-01

    With funding from the Institute of Education Sciences (IES), the Virtual Performance Assessment project at the Harvard Graduate School of Education is developing and studying the feasibility of immersive virtual performance assessments (VPAs) to assess scientific inquiry of middle school students as a standardized component of an accountability…

  2. Learning through Place-Making: Virtual Environments and Future Literacies

    ERIC Educational Resources Information Center

    Berry, Maryanne Susan

    2010-01-01

    This study examines a project through which elementary school and high school students collaborated with university Architecture/New Media students in building models of virtual, immersive libraries. It presents the project in the context of multiple and cross-disciplinary fields currently investigating the use of virtual and immersive…

  3. Immersive 3D geovisualisation in higher education

    NASA Astrophysics Data System (ADS)

    Philips, Andrea; Walz, Ariane; Bergner, Andreas; Graeff, Thomas; Heistermann, Maik; Kienzler, Sarah; Korup, Oliver; Lipp, Torsten; Schwanghart, Wolfgang; Zeilinger, Gerold

    2014-05-01

    Through geovisualisation we explore spatial data, we analyse it with respect to specific questions, we synthesise results, and we present and communicate them to a specific audience (MacEachren & Kraak 1997). After centuries of paper maps, the means to represent and visualise our physical environment and its abstract qualities have changed dramatically since the 1990s - and so have the methods for using geovisualisation in teaching. Whereas some people might still consider the traditional classroom the ideal setting for teaching and learning geographic relationships and their mapping, we used a 3D CAVE (computer-animated virtual environment) as the environment for a problem-oriented learning project called "GEOSimulator". Focussing on this project, we empirically investigated whether a technological advance such as the CAVE makes 3D visualisation, including 3D geovisualisation, an important tool not only for businesses (Abulrub et al. 2012) and for the public (Wissen et al. 2008), but also for educational purposes, for which it had hardly been used yet. The 3D CAVE is a three-sided visualisation platform that allows for immersive and stereoscopic visualisation of observed and simulated spatial data. We examined the benefits of immersive 3D visualisation for geographic research and education and synthesised three fundamental technology-based visual aspects: First, the conception and comprehension of space and location does not need to be generated, but is instantaneously and intuitively present through stereoscopy. Second, optical immersion into virtual reality strengthens this spatial perception, which is particularly important for complex 3D geometries. And third, a significant benefit is interactivity, which is enhanced through immersion and allows for multi-discursive and dynamic data exploration and knowledge transfer. 
Based on our problem-oriented learning project, which concentrates on a case study of flood risk management at the Wilde Weisseritz in Germany, a river that contributed significantly to the hundred-year flood in Dresden in 2002, we empirically evaluated the usefulness of this immersive 3D technology for learning success. Results show that immersive 3D geovisualisation has educational and content-related advantages over 2D geovisualisations through the benefits mentioned above. This innovative way of geovisualisation is thus not only entertaining and motivating for students, but can also be constructive for research studies by, for instance, facilitating the study of complex environments or decision-making processes.

  4. Exploring Learner Acceptance of the Use of Virtual Reality in Medical Education: A Case Study of Desktop and Projection-Based Display Systems

    ERIC Educational Resources Information Center

    Huang, Hsiu-Mei; Liaw, Shu-Sheng; Lai, Chung-Min

    2016-01-01

    Advanced technologies have been widely applied in medical education, including human-patient simulators, immersive virtual reality Cave Automatic Virtual Environment systems, and video conferencing. Evaluating learner acceptance of such virtual reality (VR) learning environments is a critical issue for ensuring that such technologies are used to…

  5. Taking Science Online: Evaluating Presence and Immersion through a Laboratory Experience in a Virtual Learning Environment for Entomology Students

    ERIC Educational Resources Information Center

    Annetta, Leonard; Klesath, Marta; Meyer, John

    2009-01-01

    A 3-D virtual field trip was integrated into an online college entomology course and developed as a trial for the possible incorporation of future virtual environments to supplement online higher education laboratories. This article provides an explanation of the rationale behind creating the virtual experience, the Bug Farm; the method and…

  6. History Educators and the Challenge of Immersive Pasts: A Critical Review of Virtual Reality "Tools" and History Pedagogy

    ERIC Educational Resources Information Center

    Allison, John

    2008-01-01

    This paper will undertake a critical review of the impact of virtual reality tools on the teaching of history. Virtual reality is useful in several different ways. History educators, elementary and secondary school teachers and professors, can all profit from the digital environment. Challenges arise quickly however. Virtual reality technologies…

  7. The Fidelity of ’Feel’: Emotional Affordance in Virtual Environments

    DTIC Science & Technology

    2005-07-01

    The Fidelity of “Feel”: Emotional Affordance in Virtual Environments Jacquelyn Ford Morie, Josh Williams, Aimee Dozois, Donat-Pierre Luigi... environment but also the participant. We do this with the focus on what emotional affordances this manipulation will provide. Our first evaluation scenario...emotionally affective VEs. Keywords: Immersive Environments, Virtual Environments, VEs, Virtual Reality, emotion, affordance, fidelity, presence

  8. Direct manipulation of virtual objects

    NASA Astrophysics Data System (ADS)

    Nguyen, Long K.

    Interacting with a Virtual Environment (VE) generally requires the user to correctly perceive the relative position and orientation of virtual objects. For applications requiring interaction in personal space, the user may also need to accurately judge the position of the virtual object relative to that of a real object, for example, a virtual button and the user's real hand. This is difficult since VEs generally only provide a subset of the cues experienced in the real world. Complicating matters further, VEs presented by currently available visual displays may be inaccurate or distorted due to technological limitations. Fundamental physiological and psychological aspects of vision as they pertain to the task of object manipulation were thoroughly reviewed. Other sensory modalities -- proprioception, haptics, and audition -- and their cross-interactions with each other and with vision are briefly discussed. Visual display technologies, the primary component of any VE, were canvassed and compared. Current applications and research were gathered and categorized by different VE types and object interaction techniques. While object interaction research abounds in the literature, pockets of research gaps remain. Direct, dexterous, manual interaction with virtual objects in Mixed Reality (MR), where the real, seen hand accurately and effectively interacts with virtual objects, has not yet been fully quantified. An experimental test bed was designed to provide the highest accuracy attainable for salient visual cues in personal space. Optical alignment and user calibration were carefully performed. The test bed accommodated the full continuum of VE types and sensory modalities for comprehensive comparison studies. Experimental designs included two sets, each measuring depth perception and object interaction. The first set addressed the extreme end points of the Reality-Virtuality (R-V) continuum -- Immersive Virtual Environment (IVE) and Reality Environment (RE). 
This validated, linked, and extended several previous research findings, using one common test bed and participant pool. The results provided a proven method and solid reference points for further research. The second set of experiments leveraged the first to explore the full R-V spectrum and included additional, relevant sensory modalities. It consisted of two full-factorial experiments providing for rich data and key insights into the effect of each type of environment and each modality on accuracy and timeliness of virtual object interaction. The empirical results clearly showed that mean depth perception error in personal space was less than four millimeters whether the stimuli presented were real, virtual, or mixed. Likewise, mean error for the simple task of pushing a button was less than four millimeters whether the button was real or virtual. Mean task completion time was less than one second. Key to the high accuracy and quick task performance time observed was the correct presentation of the visual cues, including occlusion, stereoscopy, accommodation, and convergence. With performance results already near optimal level with accurate visual cues presented, adding proprioception, audio, and haptic cues did not significantly improve performance. Recommendations for future research include enhancement of the visual display and further experiments with more complex tasks and additional control variables.

  9. Virtual Solar Energy Center: A Case Study of the Use of Advanced Visualization Techniques for the Comprehension of Complex Engineering Products and Processes

    NASA Astrophysics Data System (ADS)

    Ritter, Kenneth August, III

    Industry has a continuing need to train its workforce on recent engineering developments, but many engineering products and processes are hard to explain because of limitations of size, visibility, time scale, cost, and safety. The product or process might be difficult to see because it is either very large or very small, because it is enclosed within an opaque container, or because it happens very fast or very slowly. Some engineering products and processes are also costly or unsafe to use for training purposes, and sometimes the domain expert is not physically available at the training location. All these limitations can potentially be addressed using advanced visualization techniques such as virtual reality. This dissertation describes the development of an immersive virtual reality application using the Six Sigma DMADV process to explain the main equipment and processes used in a concentrating solar power plant. The virtual solar energy center (VEC) application was initially developed and tested in a Cave Automatic Virtual Environment (CAVE) during 2013 and 2014. The software programs used for development were SolidWorks, 3ds Max Design, and Unity 3D. Current hardware and software technologies that could complement this research were analyzed. The NVIDIA GRID Visual Computing Appliance (VCA) was chosen as the rendering solution for animating complex CAD models in this application. The MiddleVR software toolkit was selected for VR interactions and CAVE display. A non-immersive 3D version of the VEC application was tested and shown to be an effective training tool in late 2015. An immersive networked version of the VEC allows the user to receive live instruction from a trainer projected via depth-camera imagery from a remote location. Four comparative analysis studies were performed. These studies used the average normalized gain from pre- and post-test scores to determine the effectiveness of the various training methods. 
With the DMADV approach, solutions were identified and verified during each iteration of development, which saved valuable time and improved each revision of the application; the final version received 88% positive responses and was as effective as the other methods assessed.
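The "average normalized gain" used in such comparative training studies is conventionally Hake's gain, g = (post − pre) / (max − pre), averaged over participants. A minimal sketch on hypothetical scores (the data below are illustrative, not taken from the dissertation), assuming a 0-100 test scale:

```python
def normalized_gain(pre, post, max_score=100.0):
    """Hake's normalized gain for one participant: the fraction
    of the possible improvement that was actually achieved."""
    return (post - pre) / (max_score - pre)

def average_normalized_gain(pre_scores, post_scores, max_score=100.0):
    """Mean of the per-participant normalized gains."""
    gains = [normalized_gain(pre, post, max_score)
             for pre, post in zip(pre_scores, post_scores)]
    return sum(gains) / len(gains)

# Hypothetical pre/post test scores for four trainees
pre = [40.0, 50.0, 60.0, 20.0]
post = [70.0, 75.0, 80.0, 60.0]
print(average_normalized_gain(pre, post))  # prints 0.5
```

Averaging the per-participant gains (rather than computing one gain from class means) keeps each trainee's starting point from dominating the comparison between training methods.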

  10. Evaluating Multiple Levels of an Interaction Fidelity Continuum on Performance and Learning in Near-Field Training Simulations.

    PubMed

    Bhargava, Ayush; Bertrand, Jeffrey W; Gramopadhye, Anand K; Madathil, Kapil C; Babu, Sabarish V

    2018-04-01

    With the costs of head-mounted displays (HMDs) and tracking technology decreasing rapidly, virtual reality applications are being widely adopted for education and training. Hardware advancements have enabled replication of real-world interactions in virtual environments to a large extent, paving the way for commercial-grade applications that provide a safe, risk-free training environment at a fraction of the cost. But this also mandates the need to develop more intrinsic interaction techniques and to evaluate them empirically and comprehensively. Although a body of previous research examines the benefits of selected levels of interaction fidelity on performance, few studies have investigated the constituent components of fidelity along an Interaction Fidelity Continuum (IFC) with several system instances and their respective effects on performance and learning in the context of a real-world skills training application. Our work describes a large between-subjects investigation conducted over several years that utilizes bimanual interaction metaphors at six discrete levels of interaction fidelity to teach basic precision metrology concepts in a near-field spatial interaction task in VR. A combined analysis performed on the data compares and contrasts the six conditions and their overall effects on performance and learning outcomes, eliciting patterns in the results between the discrete application points on the IFC. For some performance variables, results indicate that simpler, restrictive interaction metaphors and the highest-fidelity metaphors perform better than medium-fidelity interaction metaphors. In light of these results, a set of general guidelines is provided for developers of spatial interaction metaphors in immersive virtual environments for precise fine-motor skills training simulations.

  11. Effects of virtual reality immersion and audiovisual distraction techniques for patients with pruritus

    PubMed Central

    Leibovici, Vera; Magora, Florella; Cohen, Sarale; Ingber, Arieh

    2009-01-01

    BACKGROUND: Virtual reality immersion (VRI), an advanced computer-generated technique, decreased subjective reports of pain in experimental and procedural medical therapies. Furthermore, VRI significantly reduced pain-related brain activity as measured by functional magnetic resonance imaging. Resemblance between anatomical and neuroendocrine pathways of pain and pruritus may prove VRI to be a suitable adjunct for basic and clinical studies of the complex aspects of pruritus. OBJECTIVES: To compare effects of VRI with audiovisual distraction (AVD) techniques for attenuation of pruritus in patients with atopic dermatitis and psoriasis vulgaris. METHODS: Twenty-four patients suffering from chronic pruritus – 16 due to atopic dermatitis and eight due to psoriasis vulgaris – were randomly assigned to play an interactive computer game using a special visor or a computer screen. Pruritus intensity was self-rated before, during and 10 min after exposure using a visual analogue scale ranging from 0 to 10. The interviewer rated observed scratching on a three-point scale during each distraction program. RESULTS: Student’s t tests were significant for reduction of pruritus intensity before and during VRI and AVD (P=0.0002 and P=0.01, respectively) and were significant only between ratings before and after VRI (P=0.017). Scratching was mostly absent or mild during both programs. CONCLUSIONS: VRI and AVD techniques demonstrated the ability to diminish itching sensations temporarily. Further studies on the immediate and late effects of interactive computer distraction techniques to interrupt itching episodes will open potential paths for future pruritus research. PMID:19714267
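The paired Student's t-test reported above compares each patient's pruritus ratings before and during distraction. A minimal sketch of the statistic on hypothetical 0-10 visual-analogue-scale data (illustrative values, not the study's data), computed from the per-patient differences:

```python
import math
import statistics

def paired_t_statistic(before, after):
    """Paired t statistic: t = mean(d) / (sd(d) / sqrt(n)),
    where d are the per-subject differences."""
    d = [b - a for b, a in zip(before, after)]
    n = len(d)
    return statistics.mean(d) / (statistics.stdev(d) / math.sqrt(n))

# Hypothetical VAS pruritus ratings for five patients
before = [7, 8, 6, 9, 5]   # before distraction
during = [4, 6, 3, 5, 4]   # during distraction
t = paired_t_statistic(before, during)
print(round(t, 2))  # prints 5.1; compare to a t-distribution with n-1 = 4 df
```

The pairing matters here: each patient serves as their own control, so between-patient variability in baseline itch intensity does not inflate the error term.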

  12. Object Creation and Human Factors Evaluation for Virtual Environments

    NASA Technical Reports Server (NTRS)

    Lindsey, Patricia F.

    1998-01-01

    The main objective of this project is to provide test objects for the simulated environments utilized by the recently established Army/NASA Virtual Innovations Lab (ANVIL) at Marshall Space Flight Center, Huntsville, AL. The objective of the ANVIL lab is to provide virtual reality (VR) models and environments and to provide visualization and manipulation methods for training and testing purposes. Visualization equipment used in the ANVIL lab includes head-mounted and boom-mounted immersive virtual reality display devices. Objects in the environment are manipulated using a data glove, hand controller, or mouse. These simulated objects are solid or surfaced three-dimensional models. They may be viewed or manipulated from any location within the environment and may be viewed on-screen or via immersive VR. The objects are created using various CAD modeling packages and are converted into the virtual environment using dVise. This enables the object or environment to be viewed from any angle or distance for training or testing purposes.

  13. The effectiveness of virtual reality distraction for pain reduction: a systematic review.

    PubMed

    Malloy, Kevin M; Milling, Leonard S

    2010-12-01

    Virtual reality technology enables people to become immersed in a computer-simulated, three-dimensional environment. This article provides a comprehensive review of controlled research on the effectiveness of virtual reality (VR) distraction for reducing pain. To be included in the review, studies were required to use a between-subjects or mixed model design in which VR distraction was compared with a control condition or an alternative intervention in relieving pain. An exhaustive search identified 11 studies satisfying these criteria. VR distraction was shown to be effective for reducing experimental pain, as well as the discomfort associated with burn injury care. Studies of needle-related pain provided less consistent findings. Use of more sophisticated virtual reality technology capable of fully immersing the individual in a virtual environment was associated with greater relief. Overall, controlled research suggests that VR distraction may be a useful tool for clinicians who work with a variety of pain problems. Copyright © 2010 Elsevier Ltd. All rights reserved.

  14. Immersive telepresence system using high-resolution omnidirectional movies and a locomotion interface

    NASA Astrophysics Data System (ADS)

    Ikeda, Sei; Sato, Tomokazu; Kanbara, Masayuki; Yokoya, Naokazu

    2004-05-01

    Technology that enables users to experience a remote site virtually is called telepresence. Telepresence systems using real-environment images are expected to be used in fields such as entertainment, medicine, and education. This paper describes a novel telepresence system which enables users to walk through a photorealistic virtualized environment by actually walking. To realize such a system, a wide-angle, high-resolution movie is projected on an immersive multi-screen display to present the virtualized environment to users, and a treadmill is controlled according to the user's detected locomotion. In this study, we use an omnidirectional multi-camera system to acquire images of a real outdoor scene. The proposed system provides users with a rich sense of walking in a remote site.

  15. Motivational Interviewing Workshop in a Virtual World: Learning as Avatars

    PubMed Central

    Shershneva, Marianna; Kim, Ji-Hye; Kear, Cynthia; Heyden, Robin; Heyden, Neil; Lee, Jay; Mitchell, Suzanne

    2015-01-01

    Background Limited research has been done to understand outcomes of continuing medical education offered in three-dimensional, immersive virtual worlds. Objectives We studied a case of a virtual world workshop on motivational interviewing (MI) applied to smoking cessation counseling and its educational impact. Methods To facilitate content development and evaluation, we specified desired MI competencies. The workshop consisted of three sessions, which included lectures, practice with standardized patients, and chat interactions. Data were collected from 13 primary care physicians and residents through workshop observation, and pre- and three-month post-workshop telephone/Skype interviews and interactions with standardized patients. Interactions with standardized patients were assessed by an expert using a validated MI tool, and by standardized patients using a tool developed for this study. For 11 participants who attended two or three sessions, we conducted paired-samples t-tests comparing mean differences between the competency scores pre- and post-event. Results Expert assessment showed significant improvement on six of seven MI competencies (p< .05). All participants reported learning new knowledge and skills, and nine described incorporating new learning into their clinical practice. Practicing MI with standardized patients and/or observing others' practice appeared to be the most helpful workshop component. Conclusions The evaluated workshop had positive impact on participants' competencies and practice as related to MI applied to smoking cessation counseling. Our findings support further exploration of three-dimensional virtual worlds as learning environments for continuing medical education. PMID:24788420

  16. Learning Experience with Virtual Worlds

    ERIC Educational Resources Information Center

    Wagner, Christian

    2008-01-01

    Virtual worlds create a new opportunity to enrich the educational experience through media-rich immersive learning. Virtual worlds have gained notoriety in games such as World of Warcraft (WoW), which has become the most successful online game ever, and in "general purpose" worlds, such as Second Life (SL), whose participation levels (more than 10…

  17. The Pixelated Professor: Faculty in Immersive Virtual Worlds

    ERIC Educational Resources Information Center

    Blackmon, Stephanie

    2015-01-01

    Online environments, particularly virtual worlds, can sometimes complicate issues of self expression. For example, the faculty member who loves punk rock has an opportunity, through hairstyle and attire choices in the virtual world, to share that part of herself with students. However, deciding to share that part of the self can depend on a number…

  18. Teaching Literature in Virtual Worlds: Immersive Learning in English Studies

    ERIC Educational Resources Information Center

    Webb, Allen, Ed.

    2011-01-01

    What are the realities and possibilities of utilizing on-line virtual worlds as teaching tools for specific literary works? Through engaging and surprising stories from classrooms where virtual worlds are in use, this book invites readers to understand and participate in this emerging and valuable pedagogy. It examines the experience of high…

  19. An Investigation into Cooperative Learning in a Virtual World Using Problem-Based Learning

    ERIC Educational Resources Information Center

    Parson, Vanessa; Bignell, Simon

    2017-01-01

    Three-dimensional multi-user virtual environments (MUVEs) have the potential to provide experiential learning qualitatively similar to that found in the real world. MUVEs offer a pedagogically-driven immersive learning opportunity for educationalists that is cost-effective and enjoyable. A family of digital virtual avatars was created within…

  20. Pre-Service Teachers Designing Virtual World Learning Environments

    ERIC Educational Resources Information Center

    Jacka, Lisa; Booth, Kate

    2012-01-01

    Integrating Information and Communication Technologies in the classroom has been an important part of pre-service teacher education for over a decade. The advent of virtual worlds provides the pre-service teacher with an opportunity to study teaching and learning in a highly immersive 3D computer-based environment. Virtual worlds also provide a place…

  1. Embodying self-compassion within virtual reality and its effects on patients with depression.

    PubMed

    Falconer, Caroline J; Rovira, Aitor; King, John A; Gilbert, Paul; Antley, Angus; Fearon, Pasco; Ralph, Neil; Slater, Mel; Brewin, Chris R

    2016-01-01

    Background: Self-criticism is a ubiquitous feature of psychopathology and can be combatted by increasing levels of self-compassion. However, some patients are resistant to self-compassion. Aims: To investigate whether the effects of self-identification with virtual bodies within immersive virtual reality could be exploited to increase self-compassion in patients with depression. Method: We developed an 8-minute scenario in which 15 patients practised delivering compassion in one virtual body and then experienced receiving it from themselves in another virtual body. Results: In an open trial, three repetitions of this scenario led to significant reductions in depression severity and self-criticism, as well as to a significant increase in self-compassion, from baseline to 4-week follow-up. Four patients showed clinically significant improvement. Conclusions: The results indicate that interventions using immersive virtual reality may have considerable clinical potential and that further development of these methods preparatory to a controlled trial is now warranted. Declaration of interest: None. © The Royal College of Psychiatrists 2016. This is an open access article distributed under the terms of the Creative Commons Attribution (CC BY) licence.

  2. Framing the magic

    NASA Astrophysics Data System (ADS)

    Tsoupikova, Daria

    2006-02-01

    This paper explores how the aesthetics of the virtual world affect, transform, and enhance the immersive emotional experience of the user. What we see and what we do upon entering the virtual environment influences our feelings, mental state, physiological changes and sensibility. To create a unique virtual experience, a key design component is the beauty of the virtual world, grounded in the aesthetics of graphical objects such as textures, models, animation, and special effects. The aesthetic potency of the images that comprise the virtual environment can make the immersive experience much stronger and more compelling. The aesthetic qualities of the virtual world, as borne out through images and graphics, can influence the user's state of mind. Particular changes and effects on the user can be induced through the application of techniques derived from research in psychology, anthropology, biology, color theory, education, art therapy, music, and art history. Many contemporary artists and developers draw inspiration from their experience with traditional arts such as painting, sculpture, design, architecture and music. This knowledge helps them create higher-quality images and stereo graphics in the virtual world. Understanding the close relation between the aesthetic quality of the virtual environment and the resulting human perception is the key to developing an impressive virtual experience.

  3. Mapping Social Interactions: The Science of Proxemics.

    PubMed

    McCall, Cade

    Interpersonal distance and gaze provide a wealth of information during face-to-face social interactions. These "proxemic" behaviors offer a window into everyday social cognition by revealing interactants' affective states (e.g., interpersonal attitudes) and cognitive responses (e.g., social attention). Here we provide a brief overview of the social psychological literature in this domain. We focus on new techniques for experimentally manipulating and measuring proxemics, including the use of immersive virtual environments and digital motion capture. We also discuss ways in which these approaches can be integrated with psychophysiological and neuroimaging techniques. Throughout, we argue that contemporary proxemics research provides psychology and neuroscience with a means to study social cognition and behavior as they naturally emerge and unfold in vivo.

  4. The effect of user's perceived presence and promotion focus on usability for interacting in virtual environments.

    PubMed

    Sun, Huey-Min; Li, Shang-Phone; Zhu, Yu-Qian; Hsiao, Bo

    2015-09-01

    Technological advances in human-computer interaction have attracted increasing research attention, especially in the field of virtual reality (VR). Prior research has focused on examining the effects of VR on various outcomes, for example, learning and health. However, which factors affect those final outcomes? That is, what kind of VR system design will achieve higher usability? This question remains largely unanswered. Furthermore, when we look at VR system deployment through a human-computer interaction (HCI) lens, does the user's attitude play a role in achieving the final outcome? This study aims to understand the effect of immersion and involvement, as well as users' regulatory focus, on usability for a somatosensory VR learning system. It hypothesized that regulatory focus and presence can effectively enhance users' perceived usability. Survey data from 78 students in Taiwan indicated that promotion focus is positively related to users' perceived efficiency, whereas involvement and promotion focus are positively related to users' perceived effectiveness. Promotion focus also predicts user satisfaction and overall usability perception. Copyright © 2015 Elsevier Ltd and The Ergonomics Society. All rights reserved.

  5. Envisioning the future of home care: applications of immersive virtual reality.

    PubMed

    Brennan, Patricia Flatley; Arnott Smith, Catherine; Ponto, Kevin; Radwin, Robert; Kreutz, Kendra

    2013-01-01

    Accelerating the design of technologies to support health in the home requires (1) a better understanding of how the household context shapes consumer health behaviors and (2) the opportunity for engineers, designers, and health professionals to systematically study the home environment. We developed the Living Environments Laboratory (LEL) with a fully immersive, six-sided virtual reality CAVE to enable recreation of a broad range of household environments. We have successfully developed a virtual apartment, including a kitchen, living space, and bathroom. Over 2000 people have visited the LEL CAVE. Participants use an electronic wand to activate common household affordances such as opening a refrigerator door or lifting a cup. Challenges currently being explored include creating natural gestures to interface with virtual objects, developing robust, simple procedures to capture actual living environments and render them in a 3D visualization, and devising systematic, stable terminologies to characterize home environments.

  6. Enhancements to VTK enabling Scientific Visualization in Immersive Environments

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    O'Leary, Patrick; Jhaveri, Sankhesh; Chaudhary, Aashish

    Modern scientific, engineering and medical computational simulations, as well as experimental and observational data sensing/measuring devices, produce enormous amounts of data. While statistical analysis provides insight into this data, scientific visualization is tactically important for scientific discovery, product design and data analysis. These benefits are impeded, however, when scientific visualization algorithms are implemented from scratch, a time-consuming and redundant process in immersive application development. This process can greatly benefit from leveraging the state-of-the-art open-source Visualization Toolkit (VTK) and its community. Over the past two (almost three) decades, integrating VTK with a virtual reality (VR) environment has been attempted with only varying degrees of success. In this paper, we demonstrate two new approaches to simplify this amalgamation of an immersive interface with visualization rendering from VTK. In addition, we cover several enhancements to VTK that provide near real-time updates and efficient interaction. Finally, we demonstrate the combination of VTK with both the Vrui and OpenVR immersive environments in example applications.

  7. Cue-exposure software for the treatment of bulimia nervosa and binge eating disorder.

    PubMed

    Gutiérrez-Maldonado, José; Pla-Sanjuanelo, Joana; Ferrer-García, Marta

    2016-11-01

    Cue-exposure therapy (CET) has proven its efficacy in treating patients with bulimia nervosa and binge eating disorder who are resistant to standard treatment. Furthermore, incorporating virtual reality (VR) technology is increasingly considered a valid exposure method that may help to increase the efficacy of standard treatments in a variety of eating disorders. Although immersive displays improve the beneficial effects, expensive technology is not always necessary. We aimed to assess whether exposure to food related virtual environments could decrease food craving in a non-clinical sample. In addition, we specifically compared the effects of two VR systems (one non-immersive and one immersive) during CET. We therefore applied a one-session CET to 113 undergraduate students. Decreased food craving was found during exposure to both VR environments compared with pre-treatment levels, supporting the efficacy of VR-CET in reducing food craving. We found no significant differences in craving between immersive and non-immersive systems. Low-cost non-immersive systems applied through 3D laptops can improve the accessibility of this technique. By reducing the costs and improving the usability, VR-CET on 3D laptops may become a viable option that can be readily applied in a greater range of clinical contexts.

  8. The Responses of Medical General Practitioners to Unreasonable Patient Demand for Antibiotics - A Study of Medical Ethics Using Immersive Virtual Reality

    PubMed Central

    Pan, Xueni; Slater, Mel; Beacco, Alejandro; Navarro, Xavi; Bellido Rivas, Anna I.; Swapp, David; Hale, Joanna; Forbes, Paul Alexander George; Denvir, Catrina; de C. Hamilton, Antonia F.; Delacroix, Sylvie

    2016-01-01

    Background: Dealing with insistent patient demand for antibiotics is an all too common part of a General Practitioner’s daily routine. This study explores the extent to which portable Immersive Virtual Reality technology can help us gain an accurate understanding of the factors that influence a doctor’s response to the ethical challenge underlying such tenacious requests for antibiotics (given the threat posed by growing anti-bacterial resistance worldwide). It also considers the potential of such technology to train doctors to face such dilemmas. Experiment: Twelve experienced GPs and nine trainees were confronted with an increasingly angry demand by a woman to prescribe antibiotics to her mother in the face of inconclusive evidence that such antibiotic prescription is necessary. The daughter and mother were virtual characters displayed in immersive virtual reality. The specific purposes of the study were twofold: first, whether experienced GPs would be more resistant to patient demands than the trainees, and second, to investigate whether medical doctors would take the virtual situation seriously. Results: Eight out of the 9 trainees prescribed the antibiotics, whereas 7 out of the 12 GPs did so. On the basis of a Bayesian analysis, these results yield reasonable statistical evidence in favor of the notion that experienced GPs are more likely to withstand the pressure to prescribe antibiotics than trainee doctors, thus answering our first question positively. As for the second question, a post experience questionnaire assessing the participants’ level of presence (together with participants’ feedback and body language) suggested that overall participants did tend towards the illusion of being in the consultation room depicted in the virtual reality and that the virtual consultation taking place was really happening. PMID:26889676
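One simple form such a Bayesian comparison of proportions can take (a hedged sketch; the paper does not publish its exact model) is a beta-binomial analysis of the two prescription rates, 8/9 trainees versus 7/12 GPs, estimating the posterior probability that trainees are more likely to prescribe:

```python
import random

def prob_p1_greater(successes1, n1, successes2, n2, draws=100_000, seed=0):
    """Monte Carlo estimate of P(p1 > p2) under independent
    uniform Beta(1, 1) priors on the two unknown proportions."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(draws):
        # Posterior for a binomial rate under a Beta(1, 1) prior
        # is Beta(1 + successes, 1 + failures).
        p1 = rng.betavariate(1 + successes1, 1 + n1 - successes1)
        p2 = rng.betavariate(1 + successes2, 1 + n2 - successes2)
        if p1 > p2:
            hits += 1
    return hits / draws

# 8 of 9 trainees vs 7 of 12 experienced GPs prescribed antibiotics
print(prob_p1_greater(8, 9, 7, 12))
```

With these counts the estimate comes out well above one half, which is the kind of "reasonable statistical evidence" the abstract describes: suggestive, but tempered by the small samples.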

  9. The Responses of Medical General Practitioners to Unreasonable Patient Demand for Antibiotics--A Study of Medical Ethics Using Immersive Virtual Reality.

    PubMed

    Pan, Xueni; Slater, Mel; Beacco, Alejandro; Navarro, Xavi; Bellido Rivas, Anna I; Swapp, David; Hale, Joanna; Forbes, Paul Alexander George; Denvir, Catrina; Hamilton, Antonia F de C; Delacroix, Sylvie

    2016-01-01

Dealing with insistent patient demand for antibiotics is an all too common part of a General Practitioner's daily routine. This study explores the extent to which portable Immersive Virtual Reality technology can help us gain an accurate understanding of the factors that influence a doctor's response to the ethical challenge underlying such tenacious requests for antibiotics (given the threat posed by growing anti-bacterial resistance worldwide). It also considers the potential of such technology to train doctors to face such dilemmas. Twelve experienced GPs and nine trainees were confronted with an increasingly angry demand by a woman to prescribe antibiotics to her mother in the face of inconclusive evidence that such antibiotic prescription is necessary. The daughter and mother were virtual characters displayed in immersive virtual reality. The specific purposes of the study were twofold: first, to investigate whether experienced GPs would be more resistant to patient demands than the trainees, and second, whether medical doctors would take the virtual situation seriously. Eight out of the 9 trainees prescribed the antibiotics, whereas 7 out of the 12 GPs did so. On the basis of a Bayesian analysis, these results yield reasonable statistical evidence in favor of the notion that experienced GPs are more likely to withstand the pressure to prescribe antibiotics than trainee doctors, thus answering our first question positively. As for the second question, a post-experience questionnaire assessing the participants' level of presence (together with participants' feedback and body language) suggested that overall participants did tend towards the illusion of being in the consultation room depicted in the virtual reality and that the virtual consultation was really happening.
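
    The abstract reports a Bayesian analysis of 8/9 trainees versus 7/12 experienced GPs prescribing, without specifying the model. A common approach for comparing two proportions is a Beta-Binomial posterior; the sketch below is illustrative only (not the authors' analysis), assuming uniform Beta(1, 1) priors and Monte Carlo sampling, with all function names hypothetical.

    ```python
    import random

    def prob_first_rate_higher(k1, n1, k2, n2, samples=100_000, seed=42):
        """Monte Carlo estimate of P(rate1 > rate2) under independent
        Beta(1, 1) priors updated with binomial data (Beta-Binomial model)."""
        rng = random.Random(seed)
        wins = 0
        for _ in range(samples):
            p1 = rng.betavariate(1 + k1, 1 + n1 - k1)  # posterior draw, group 1
            p2 = rng.betavariate(1 + k2, 1 + n2 - k2)  # posterior draw, group 2
            if p1 > p2:
                wins += 1
        return wins / samples

    # 8 of 9 trainees vs. 7 of 12 experienced GPs prescribed antibiotics;
    # probability that the trainees' prescription rate exceeds the GPs'
    print(prob_first_rate_higher(8, 9, 7, 12))
    ```

    Under these assumptions the posterior probability comes out well above 0.5, consistent with the "reasonable statistical evidence" the abstract describes.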

  10. Science Education Using a Computer Model-Virtual Puget Sound

    NASA Astrophysics Data System (ADS)

    Fruland, R.; Winn, W.; Oppenheimer, P.; Stahr, F.; Sarason, C.

    2002-12-01

We created an interactive learning environment based on an oceanographic computer model of Puget Sound, Virtual Puget Sound (VPS), as an alternative to traditional teaching methods. Students immersed in this navigable 3-D virtual environment observed tidal movements and salinity changes, and performed tracer and buoyancy experiments. Scientific concepts were embedded in a goal-based scenario to locate a new sewage outfall in Puget Sound. Traditional science teaching methods focus on distilled representations of agreed-upon knowledge removed from real-world context and scientific debate. Our strategy leverages students' natural interest in their environment, provides meaningful context and engages students in scientific debate and knowledge creation. Results show that VPS provides a powerful learning environment, but highlight the need for research on how to most effectively represent concepts and organize interactions to support scientific inquiry and understanding. Research is also needed to ensure that new technologies and visualizations do not foster misconceptions, including the impression that the model represents reality rather than being a useful tool. In this presentation we review results from prior work with VPS and outline new work for a modeling partnership recently formed with funding from the National Ocean Partnership Program (NOPP).

  11. Immersive Virtual Reality for Pediatric Pain.

    PubMed

    Won, Andrea Stevenson; Bailey, Jakki; Bailenson, Jeremy; Tataru, Christine; Yoon, Isabel A; Golianu, Brenda

    2017-06-23

    Children must often endure painful procedures as part of their treatment for various medical conditions. Those with chronic pain endure frequent or constant discomfort in their daily lives, sometimes severely limiting their physical capacities. With the advent of affordable consumer-grade equipment, clinicians have access to a promising and engaging intervention for pediatric pain, both acute and chronic. In addition to providing relief from acute and procedural pain, virtual reality (VR) may also help to provide a corrective psychological and physiological environment to facilitate rehabilitation for pediatric patients suffering from chronic pain. The special qualities of VR such as presence, interactivity, customization, social interaction, and embodiment allow it to be accepted by children and adolescents and incorporated successfully into their existing medical therapies. However, the powerful and transformative nature of many VR experiences may also pose some risks and should be utilized with caution. In this paper, we review recent literature in pediatric virtual reality for procedural pain and anxiety, acute and chronic pain, and some rehabilitation applications. We also discuss the practical considerations of using VR in pediatric care, and offer specific suggestions and information for clinicians wishing to adopt these engaging therapies into their daily clinical practice.

  12. Mesoscopic Rigid Body Modelling of the Extracellular Matrix Self-Assembly.

    PubMed

    Wong, Hua; Prévoteau-Jonquet, Jessica; Baud, Stéphanie; Dauchez, Manuel; Belloy, Nicolas

    2018-06-11

The extracellular matrix (ECM) plays an important role in supporting tissues and organs. It even has a functional role in morphogenesis and differentiation by acting as a source of active molecules (matrikines). Many diseases are linked to dysfunction of ECM components and fragments or changes in their structures. As such it is a prime target for drugs. Because of technological limitations for observations at mesoscopic scales, the precise structural organisation of the ECM is not well-known, with sparse or fuzzy experimental observables. Based on the Unity3D game and physics engines, along with rigid body dynamics, we propose a virtual sandbox to model large biological molecules as dynamic chains of rigid bodies interacting together to gain insight into ECM components' behaviour in the mesoscopic range. We have preliminary results showing how parameters such as fibre flexibility or the nature and number of interactions between molecules can induce different structures in the basement membrane. Using the Unity3D game engine and virtual reality headset coupled with haptic controllers, we immerse the user inside the corresponding simulation. Untrained users are able to navigate a complex virtual sandbox crowded with large biomolecules models in a matter of seconds.
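
    The record models molecules as dynamic chains of rigid bodies in Unity3D's physics engine. As an illustration only (not the authors' code), the core idea, letting segments move freely while keeping link lengths fixed, can be sketched with position-based dynamics, the constraint-projection scheme widely used in game physics; all names and parameters here are hypothetical.

    ```python
    import math

    def simulate_chain(n_links=5, rest_len=1.0, steps=100, dt=0.01, iters=15):
        """Toy position-based dynamics: a hanging chain of rigid links.

        Each step: Verlet-integrate free motion under gravity, then
        repeatedly project distance constraints so every link keeps
        (approximately) its rest length. The first joint is pinned."""
        pos = [(i * rest_len, 0.0) for i in range(n_links + 1)]  # joint positions
        prev = list(pos)                                          # previous positions
        gy = -9.81                                                # gravity (y-axis)
        for _ in range(steps):
            nxt = []
            for (x, y), (px, py) in zip(pos, prev):
                # Verlet: new = 2*current - previous + acceleration * dt^2
                nxt.append((2 * x - px, 2 * y - py + gy * dt * dt))
            prev, pos = pos, nxt
            for _ in range(iters):
                pos[0] = (0.0, 0.0)  # pin the first joint
                for i in range(n_links):
                    (x1, y1), (x2, y2) = pos[i], pos[i + 1]
                    dx, dy = x2 - x1, y2 - y1
                    d = math.hypot(dx, dy) or 1e-9
                    corr = 0.5 * (d - rest_len) / d  # split correction between ends
                    pos[i] = (x1 + dx * corr, y1 + dy * corr)
                    pos[i + 1] = (x2 - dx * corr, y2 - dy * corr)
            pos[0] = (0.0, 0.0)
        return pos

    chain = simulate_chain()
    ```

    After the simulated swing, every link length stays close to its rest length, which is the "rigid chain" property the sandbox relies on; fibre flexibility would then be tuned by adding angular constraints between neighbouring links.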

  13. Immersive Virtual Reality for Pediatric Pain

    PubMed Central

    Won, Andrea Stevenson; Bailey, Jakki; Bailenson, Jeremy; Tataru, Christine; Yoon, Isabel A.; Golianu, Brenda

    2017-01-01

    Children must often endure painful procedures as part of their treatment for various medical conditions. Those with chronic pain endure frequent or constant discomfort in their daily lives, sometimes severely limiting their physical capacities. With the advent of affordable consumer-grade equipment, clinicians have access to a promising and engaging intervention for pediatric pain, both acute and chronic. In addition to providing relief from acute and procedural pain, virtual reality (VR) may also help to provide a corrective psychological and physiological environment to facilitate rehabilitation for pediatric patients suffering from chronic pain. The special qualities of VR such as presence, interactivity, customization, social interaction, and embodiment allow it to be accepted by children and adolescents and incorporated successfully into their existing medical therapies. However, the powerful and transformative nature of many VR experiences may also pose some risks and should be utilized with caution. In this paper, we review recent literature in pediatric virtual reality for procedural pain and anxiety, acute and chronic pain, and some rehabilitation applications. We also discuss the practical considerations of using VR in pediatric care, and offer specific suggestions and information for clinicians wishing to adopt these engaging therapies into their daily clinical practice. PMID:28644422

  14. Virtual reality for pain and anxiety management in children

    PubMed Central

    Arane, Karen; Behboudi, Amir; Goldman, Ran D.

    2017-01-01

Question: Pain and anxiety are common in children who need procedures such as administering vaccines or drawing blood. Recent reports have described the use of virtual reality (VR) as a method of distraction during such procedures. How does VR work in reducing pain and anxiety in pediatric patients and what are the potential uses for it? Answer: Recent studies explored using VR with pediatric patients undergoing procedures ranging from vaccinations and intravenous injections to laceration repair and dressing changes for burn wounds. Interacting with immersive VR might divert attention, leading to a slower response to incoming pain signals. Preliminary results have shown that VR is effective, either alone or in combination with standard care, in reducing the pain and anxiety patients experience compared with standard care or other distraction methods. PMID:29237632

  15. Experiments in mixed reality

    NASA Astrophysics Data System (ADS)

    Krum, David M.; Sadek, Ramy; Kohli, Luv; Olson, Logan; Bolas, Mark

    2010-01-01

    As part of the Institute for Creative Technologies and the School of Cinematic Arts at the University of Southern California, the Mixed Reality lab develops technologies and techniques for presenting realistic immersive training experiences. Such experiences typically place users within a complex ecology of social actors, physical objects, and collections of intents, motivations, relationships, and other psychological constructs. Currently, it remains infeasible to completely synthesize the interactivity and sensory signatures of such ecologies. For this reason, the lab advocates mixed reality methods for training and conducts experiments exploring such methods. Currently, the lab focuses on understanding and exploiting the elasticity of human perception with respect to representational differences between real and virtual environments. This paper presents an overview of three projects: techniques for redirected walking, displays for the representation of virtual humans, and audio processing to increase stress.

  16. Immersive, interactive virtual field trips promote learning

    NASA Astrophysics Data System (ADS)

    Bruce, G.; Mead, C.; Buxner, S.; Taylor, W.; Semken, S. C.; Anbar, A. D.; Sundstrom, J.

    2016-12-01

We are assessing the educational effectiveness of a new type of immersive virtual field trip (iVFT) that we are developing, grounded in active, inquiry-based learning, and accessible via web browsers. To this end, we collected data from five high school AP biology classes (n = 153) that were assigned an iVFT lesson focused on life and environment during the Ediacaran time period, 550 million years ago. Students explore a series of fossil beds using high resolution imagery and video acquired during a field expedition to the Nilpena site in the Flinders Ranges, South Australia. They first encounter an immersive spherical image, which orients them to the area. Then, they identify fossils in the iVFT, using a dichotomous key. Finally, they explore an interactive simulation of this ancient ecosystem. The average time spent on the experience was approximately two hours. The learning objective is for students to be able to describe the Ediacaran ecosystem preserved in the rocks at Nilpena. To assess this outcome, we administered identical pre- and post-lesson quizzes to students. Results showed a statistically significant improvement on the six-item quiz with a normalized gain of 0.96 (pre-lesson mean: 2.4, post-lesson mean: 5.9, p < .001). All but three students demonstrated an increase in score or maintained a perfect score. The pre-lesson scores are close to what would be expected from guessing, so these results represent a substantial growth in understanding. These findings encourage the use of iVFT-based learning experiences in education (an evolving suite is publicly available at http://vft.asu.edu). In the future, we will explore in more detail which aspects of the experience provide the greatest educational benefit, and their effectiveness in teaching scientific reasoning skills in addition to content knowledge. To answer these questions, we will supplement content-based questions with mixed-methods data including interviews.
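
    The normalized gain quoted in the abstract follows Hake's formula, g = (post - pre) / (max - pre): the fraction of the possible improvement that was actually achieved. A quick check with the reported rounded means, assuming the six-item quiz has a maximum score of 6:

    ```python
    def normalized_gain(pre_mean, post_mean, max_score):
        """Hake's normalized gain: achieved improvement divided by
        the maximum improvement possible from the pre-test score."""
        return (post_mean - pre_mean) / (max_score - pre_mean)

    g = normalized_gain(2.4, 5.9, 6)
    print(round(g, 2))  # 0.97 from the rounded means; the reported 0.96
                        # presumably reflects unrounded class means
    ```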

  17. Offenders become the victim in virtual reality: impact of changing perspective in domestic violence.

    PubMed

    Seinfeld, S; Arroyo-Palacios, J; Iruretagoyena, G; Hortensius, R; Zapata, L E; Borland, D; de Gelder, B; Slater, M; Sanchez-Vives, M V

    2018-02-09

    The role of empathy and perspective-taking in preventing aggressive behaviors has been highlighted in several theoretical models. In this study, we used immersive virtual reality to induce a full body ownership illusion that allows offenders to be in the body of a victim of domestic abuse. A group of male domestic violence offenders and a control group without a history of violence experienced a virtual scene of abuse in first-person perspective. During the virtual encounter, the participants' real bodies were replaced with a life-sized virtual female body that moved synchronously with their own real movements. Participants' emotion recognition skills were assessed before and after the virtual experience. Our results revealed that offenders have a significantly lower ability to recognize fear in female faces compared to controls, with a bias towards classifying fearful faces as happy. After being embodied in a female victim, offenders improved their ability to recognize fearful female faces and reduced their bias towards recognizing fearful faces as happy. For the first time, we demonstrate that changing the perspective of an aggressive population through immersive virtual reality can modify socio-perceptual processes such as emotion recognition, thought to underlie this specific form of aggressive behaviors.

  18. Defense applications of the CAVE (CAVE automatic virtual environment)

    NASA Astrophysics Data System (ADS)

    Isabelle, Scott K.; Gilkey, Robert H.; Kenyon, Robert V.; Valentino, George; Flach, John M.; Spenny, Curtis H.; Anderson, Timothy R.

    1997-07-01

The CAVE is a multi-person, room-sized, high-resolution, 3D video and auditory environment, which can be used to present very immersive virtual environment experiences. This paper describes the CAVE technology and the capability of the CAVE system as originally developed at the Electronics Visualization Laboratory of the University of Illinois at Chicago and as more recently implemented by Wright State University (WSU) in the Armstrong Laboratory at Wright-Patterson Air Force Base (WPAFB). One planned use of the WSU/WPAFB CAVE is research addressing the appropriate design of display and control interfaces for controlling uninhabited aerial vehicles. The WSU/WPAFB CAVE has a number of features that make it well-suited to this work: (1) 360 degrees surround, plus floor, high resolution visual displays, (2) virtual spatialized audio, (3) the ability to integrate real and virtual objects, and (4) rapid and flexible reconfiguration. However, even though the CAVE is likely to have broad utility for military applications, it does have certain limitations that may make it less well-suited to applications that require 'natural' haptic feedback, vestibular stimulation, or an ability to interact with close detailed objects.

  19. Augmenting the thermal flux experiment: A mixed reality approach with the HoloLens

    NASA Astrophysics Data System (ADS)

    Strzys, M. P.; Kapp, S.; Thees, M.; Kuhn, J.; Lukowicz, P.; Knierim, P.; Schmidt, A.

    2017-09-01

In the field of Virtual Reality (VR) and Augmented Reality (AR), technologies have made huge progress in recent years and have also reached the field of education. The virtuality continuum, ranging from pure virtuality on one side to the real world on the other, has been successfully covered by the use of immersive technologies like head-mounted displays, which allow one to embed virtual objects into the real surroundings, leading to a Mixed Reality (MR) experience. In such an environment, digital and real objects do not only coexist, but moreover are also able to interact with each other in real time. These concepts can be used to merge human perception of reality with digitally visualized sensor data, thereby making the invisible visible. As a first example, in this paper we introduce, alongside the basic idea of this column, an MR experiment in thermodynamics for a laboratory course for freshman students in physics or other science and engineering subjects that uses physical data from mobile devices for analyzing and displaying physical phenomena to students.

  20. A software system for evaluation and training of spatial reasoning and neuroanatomical knowledge in a virtual environment.

    PubMed

    Armstrong, Ryan; de Ribaupierre, Sandrine; Eagleson, Roy

    2014-04-01

This paper describes the design and development of a software tool for the evaluation and training of surgical residents using an interactive, immersive, virtual environment. Our objective was to develop a tool to evaluate user spatial reasoning skills and knowledge in a neuroanatomical context, as well as to augment their performance through interactivity. In the visualization, manually segmented anatomical surface images of MRI scans of the brain were rendered using a stereo display to improve depth cues. A magnetically tracked wand was used as a 3D input device for localization tasks within the brain. The movement of the wand was made to correspond to movement of a spherical cursor within the rendered scene, providing a reference for localization. Users can be tested on their ability to localize structures within the 3D scene, and their ability to place anatomical features at the appropriate locations within the rendering.

  1. Can walking motions improve visually induced rotational self-motion illusions in virtual reality?

    PubMed

    Riecke, Bernhard E; Freiberg, Jacob B; Grechkin, Timofey Y

    2015-02-04

Illusions of self-motion (vection) can provide compelling sensations of moving through virtual environments without the need for complex motion simulators or large tracked physical walking spaces. Here we explore the interaction between biomechanical cues (stepping along a rotating circular treadmill) and visual cues (viewing simulated self-rotation) for providing stationary users a compelling sensation of rotational self-motion (circular vection). When tested individually, biomechanical and visual cues were similarly effective in eliciting self-motion illusions. However, in combination they yielded significantly more intense self-motion illusions. These findings provide the first compelling evidence that walking motions can be used to significantly enhance visually induced rotational self-motion perception in virtual environments (and vice versa) without having to provide for physical self-motion or motion platforms. This is noteworthy, as linear treadmills have been found to actually impair visually induced translational self-motion perception (Ash, Palmisano, Apthorp, & Allison, 2013). Given the predominant focus on linear walking interfaces for virtual-reality locomotion, our findings suggest that investigating circular and curvilinear walking interfaces offers a promising direction for future research and development and can help to enhance self-motion illusions, presence and immersion in virtual-reality systems.

  2. Use of 3D techniques for virtual production

    NASA Astrophysics Data System (ADS)

    Grau, Oliver; Price, Marc C.; Thomas, Graham A.

    2000-12-01

Virtual production for broadcast is currently mainly used in the form of virtual studios, where the resulting media is a sequence of 2D images. With the steady increase of 3D computing power in home PCs and the technical progress in 3D display technology, the content industry is looking for new kinds of program material, which makes use of 3D technology. The applications range from the analysis of sport scenes and 3DTV up to the creation of fully immersive content. In a virtual studio a camera films one or more actors in a controlled environment. The pictures of the actors can be segmented very accurately in real time using chroma keying techniques. The isolated silhouette can be integrated into a new synthetic virtual environment using a studio mixer. The resulting shape description of the actors is 2D so far. For the realization of more sophisticated optical interactions of the actors with the virtual environment, such as occlusions and shadows, an object-based 3D description of scenes is needed. However, the requirements of shape accuracy, and the kind of representation, differ in accordance with the application. This contribution gives an overview of requirements and approaches for the generation of an object-based 3D description in various applications studied by the BBC R and D department. An enhanced Virtual Studio for 3D programs is proposed that covers a range of applications for virtual production.

  3. Leveraging Google Geo Tools for Interactive STEM Education: Insights from the GEODE Project

    NASA Astrophysics Data System (ADS)

    Dordevic, M.; Whitmeyer, S. J.; De Paor, D. G.; Karabinos, P.; Burgin, S.; Coba, F.; Bentley, C.; St John, K. K.

    2016-12-01

    Web-based imagery and geospatial tools have transformed our ability to immerse students in global virtual environments. Google's suite of geospatial tools, such as Google Earth (± Engine), Google Maps, and Street View, allow developers and instructors to create interactive and immersive environments, where students can investigate and resolve common misconceptions in STEM concepts and natural processes. The GEODE (.net) project is developing digital resources to enhance STEM education. These include virtual field experiences (VFEs), such as an interactive visualization of the breakup of the Pangaea supercontinent, a "Grand Tour of the Terrestrial Planets," and GigaPan-based VFEs of sites like the Canadian Rockies. Web-based challenges, such as EarthQuiz (.net) and the "Fold Analysis Challenge," incorporate scaffolded investigations of geoscience concepts. EarthQuiz features web-hosted imagery, such as Street View, Photo Spheres, GigaPans, and Satellite View, as the basis for guided inquiry. In the Fold Analysis Challenge, upper-level undergraduates use Google Earth to evaluate a doubly-plunging fold at Sheep Mountain, WY. GEODE.net also features: "Reasons for the Seasons"—a Google Earth-based visualization that addresses misconceptions that abound amongst students, teachers, and the public, many of whom believe that seasonality is caused by large variations in Earth's distance from the Sun; "Plate Euler Pole Finder," which helps students understand rotational motion of tectonic plates on the globe; and "Exploring Marine Sediments Using Google Earth," an exercise that uses empirical data to explore the surficial distribution of marine sediments in the modern ocean. The GEODE research team includes the authors and: Heather Almquist, Cinzia Cervato, Gene Cooper, Helen Crompton, Terry Pavlis, Jen Piatek, Bill Richards, Jeff Ryan, Ron Schott, Barb Tewksbury, and their students and collaborating colleagues. 
We are supported by NSF DUE 1323419 and a Google Geo Curriculum Award.

  4. Ames Lab 101: C6: Virtual Engineering

    ScienceCinema

    McCorkle, Doug

    2018-01-01

    Ames Laboratory scientist Doug McCorkle explains the importance of virtual engineering and talks about the C6. The C6 is a three-dimensional, fully-immersive synthetic environment residing in the center atrium of Iowa State University's Howe Hall.

  5. Virtual reality training for health-care professionals.

    PubMed

    Mantovani, Fabrizia; Castelnuovo, Gianluca; Gaggioli, Andrea; Riva, Giuseppe

    2003-08-01

Emerging changes in health-care delivery are having a significant impact on the structure of health-care professionals' education. Today it is recognized that medical knowledge doubles every 6-8 years, with new medical procedures emerging every day. While the half-life of medical information is so short, the average physician practices 30 years and the average nurse 40 years. Continuing education thus represents an important challenge to face. Recent advances in educational technology are offering an increasing number of innovative learning tools. Among these, Virtual Reality represents a promising area with high potential of enhancing the training of health-care professionals. Virtual Reality Training can provide a rich, interactive, engaging educational context, thus supporting experiential learning-by-doing; it can, in fact, help raise interest and motivation in trainees and effectively support skills acquisition and transfer, since the learning process can be settled within an experiential framework. Current virtual training applications for health-care differ widely in both their technological/multimedia sophistication and the types of skills trained, varying for example from telesurgical applications to interactive simulations of human body and brain, to virtual worlds for emergency training. Other interesting applications include the development of immersive 3D environments for training psychiatrists and psychologists in the treatment of mental disorders. This paper has the main aim of discussing the rationale and main benefits for the use of virtual reality in health-care education and training. Significant research and projects carried out in this field will also be presented, followed by discussion on key issues concerning current limitations and future development directions.

  6. Perceptual Fidelity vs. Engineering Compromises In Virtual Acoustic Displays

    NASA Technical Reports Server (NTRS)

    Wenzel, Elizabeth M.; Ahumada, Albert (Technical Monitor)

    1997-01-01

Immersive, three-dimensional displays are increasingly becoming a goal of advanced human-machine interfaces. While the technology for achieving truly useful multisensory environments is still being developed, techniques for generating three-dimensional sound are now both sophisticated and practical enough to be applied to acoustic displays. The ultimate goal of virtual acoustics is to simulate the complex acoustic field experienced by a listener freely moving around within an environment. Of course, such complexity, freedom of movement and interactivity is not always possible in a "true" virtual environment, much less in lower-fidelity multimedia systems. However, many of the perceptual and engineering constraints (and frustrations) that researchers, engineers and listeners have experienced in virtual audio are relevant to multimedia. In fact, some of the problems that have been studied will be even more of an issue for lower fidelity systems that are attempting to address the requirements of a huge, diverse and ultimately unknown audience. Examples include individual differences in head-related transfer functions, a lack of real interactivity (head-tracking) in many multimedia displays, and perceptual degradation due to low sampling rates and/or low-bit compression. This paper discusses some of the engineering constraints faced during implementation of virtual acoustic environments and the perceptual consequences of these constraints. Specific examples are given for NASA applications such as telerobotic control, aeronautical displays, and shuttle launch communications. An attempt will also be made to relate these issues to low-fidelity implementations such as the internet.

  7. Perceptual Fidelity Versus Engineering Compromises in Virtual Acoustic Displays

    NASA Technical Reports Server (NTRS)

    Wenzel, Elizabeth M.; Ellis, Stephen R. (Technical Monitor); Frey, Mary Anne (Technical Monitor); Schneider, Victor S. (Technical Monitor)

    1997-01-01

Immersive, three-dimensional displays are increasingly becoming a goal of advanced human-machine interfaces. While the technology for achieving truly useful multisensory environments is still being developed, techniques for generating three-dimensional sound are now both sophisticated and practical enough to be applied to acoustic displays. The ultimate goal of virtual acoustics is to simulate the complex acoustic field experienced by a listener freely moving around within an environment. Of course, such complexity, freedom of movement and interactivity is not always possible in a 'true' virtual environment, much less in lower-fidelity multimedia systems. However, many of the perceptual and engineering constraints (and frustrations) that researchers, engineers and listeners have experienced in virtual audio are relevant to multimedia. In fact, some of the problems that have been studied will be even more of an issue for lower fidelity systems that are attempting to address the requirements of a huge, diverse and ultimately unknown audience. Examples include individual differences in head-related transfer functions, a lack of real interactivity (head-tracking) in many multimedia displays, and perceptual degradation due to low sampling rates and/or low-bit compression. This paper discusses some of the engineering constraints faced during implementation of virtual acoustic environments and the perceptual consequences of these constraints. Specific examples are given for NASA applications such as telerobotic control, aeronautical displays, and shuttle launch communications. An attempt will also be made to relate these issues to low-fidelity implementations such as the internet.

  8. Tools virtualization for command and control systems

    NASA Astrophysics Data System (ADS)

    Piszczek, Marek; Maciejewski, Marcin; Pomianek, Mateusz; Szustakowski, Mieczysław

    2017-10-01

Information management is an inseparable part of the command process. As a result, the person making decisions at the command post interacts with data-providing devices in various ways. The tools virtualization process can introduce a number of significant modifications in the design of solutions for management and command. The general idea involves replacing the user interface of physical devices with a digital representation (so-called virtual instruments). A more advanced level of system "digitalization" is to use mixed reality environments. In solutions using Augmented Reality (AR), a customized HMI is displayed to the operator as he approaches each device. Devices are identified by image recognition of photo codes. Visualization is achieved by an (optical) see-through head-mounted display (HMD), and control can be performed, for example, by means of a handheld touch panel. Using an immersive virtual environment, the command center can be digitally reconstructed: a workstation then requires only a VR system (HMD) and access to the information network, and the operator can interact with devices just as he would in the real world (for example, with virtual hands). Thanks to procedures such as central-vision analysis and eye tracking, MR systems offer the additional benefit of reduced system data-throughput requirements, because at any moment rendering can focus on a single device. Experiments carried out using the Moverio BT-200 and SteamVR systems, and the results of experimental application testing, clearly indicate the ability to create a fully functional information system with the use of mixed reality technology.

  9. Contextual EFL Learning in a 3D Virtual Environment

    ERIC Educational Resources Information Center

    Lan, Yu-Ju

    2015-01-01

    The purposes of the current study are to develop virtually immersive EFL learning contexts for EFL learners in Taiwan to preview and review English materials beyond the regular English class schedule. A two-iteration action research study lasting one semester was conducted to evaluate the effects of virtual contexts on learners' EFL learning. 132…

  10. The Benefits and Barriers of Using Virtual Worlds to Engage Healthcare Professionals on Distance Learning Programmes

    ERIC Educational Resources Information Center

    Hack, Catherine Jane

    2016-01-01

    Using the delivery of a large postgraduate distance learning module in bioethics to health professionals as an illustrative example, the type of learning activity that could be enhanced through delivery in an immersive virtual world (IVW) was explored. Several activities were repurposed from the "traditional" virtual learning environment…

  11. Development, Implementation, and Assessment of General Chemistry Lab Experiments Performed in the Virtual World of Second Life

    ERIC Educational Resources Information Center

    Winkelmann, Kurt; Keeney-Kennicutt, Wendy; Fowler, Debra; Macik, Maria

    2017-01-01

    Virtual worlds are a potential medium for teaching college-level chemistry laboratory courses. To determine the feasibility of conducting chemistry experiments in such an environment, undergraduate students performed two experiments in the immersive virtual world of Second Life (SL) as part of their regular General Chemistry 2 laboratory course.…

  12. Working Collaboratively in Virtual Learning Environments: Using Second Life with Korean High School Students in History Class

    ERIC Educational Resources Information Center

    Kim, Mi Hwa

    2013-01-01

    The purpose of this experimental study was to investigate the impact of the use of a virtual environment for learning Korean history on high school students' learning outcomes and attitudes toward virtual worlds (collaboration, engagement, general use of SL [Second Life], and immersion). In addition, this experiment examined the relationships…

  13. Students' First Impression of Second Life: A Case from the United Arab Emirates

    ERIC Educational Resources Information Center

    Abdallah, Salam; Douglas, Jamal

    2010-01-01

    Emerging 3D virtual worlds such as Second Life can offer students opportunities to enhance learning using rich collaborative asynchronous media. Virtual worlds are believed to impact the future of higher education, and therefore universities across the world are immersing themselves inside virtual worlds to establish a unique learning and…

  14. Architectures for Developing Multiuser, Immersive Learning Scenarios

    ERIC Educational Resources Information Center

    Nadolski, Rob J.; Hummel, Hans G. K.; Slootmaker, Aad; van der Vegt, Wim

    2012-01-01

    Multiuser immersive learning scenarios hold strong potential for lifelong learning as they can support the acquisition of higher order skills in an effective, efficient, and attractive way. Existing virtual worlds, game development platforms, and game engines only partly cater for the proliferation of such learning scenarios as they are often…

  15. Exploring the Relationship Between Distributed Training, Integrated Learning Environments, and Immersive Training Environments

    DTIC Science & Technology

    2007-01-01

    educating and training (O’Keefe IV & McIntyre III, 2006). Topics vary widely from standard educational topics such as teaching kids physics, mechanics...Winn, W., & Yu, R. (1997). The Impact of Three Dimensional Immersive Virtual Environments on Modern Pedagogy: Global Change, VR and Learning

  16. Three-Dimensional User Interfaces for Immersive Virtual Reality

    NASA Technical Reports Server (NTRS)

    vanDam, Andries

    1997-01-01

    The focus of this grant was to experiment with novel user interfaces for immersive Virtual Reality (VR) systems, and thus to advance the state of the art of user interface technology for this domain. Our primary test application was a scientific visualization application for viewing Computational Fluid Dynamics (CFD) datasets. This technology has been transferred to NASA via periodic status reports and papers relating to this grant that have been published in conference proceedings. This final report summarizes the research completed over the past year, and extends last year's final report of the first three years of the grant.

  17. Immersive Virtual Environment Technology to Supplement Environmental Perception, Preference and Behavior Research: A Review with Applications

    PubMed Central

    Smith, Jordan W.

    2015-01-01

    Immersive virtual environment (IVE) technology offers a wide range of potential benefits to research focused on understanding how individuals perceive and respond to built and natural environments. In an effort to broaden awareness and use of IVE technology in perception, preference and behavior research, this review paper describes how IVE technology can be used to complement more traditional methods commonly applied in public health research. The paper also describes a relatively simple workflow for creating and displaying 360° virtual environments of built and natural settings and presents two freely-available and customizable applications that scientists from a variety of disciplines, including public health, can use to advance their research into human preferences, perceptions and behaviors related to built and natural settings. PMID:26378565
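
The display step of the 360° workflow the review describes relies on the equirectangular mapping between viewing directions and panorama pixels. One common convention is sketched below; axis conventions vary between tools, so treat this as an assumption rather than the paper's exact pipeline:

```python
import math

def equirect_uv(x, y, z):
    """Map a unit viewing direction (x, y, z) to texture coordinates
    (u, v) in [0, 1]^2 on an equirectangular panorama. u wraps with
    longitude; v runs from top (v=0) to bottom (v=1). Assumes a
    right-handed frame with -z as the forward direction (one common
    convention only)."""
    u = 0.5 + math.atan2(x, -z) / (2 * math.pi)
    v = 0.5 - math.asin(max(-1.0, min(1.0, y))) / math.pi
    return u, v
```

A viewer applies this per fragment of an inward-facing sphere (or lets the rendering engine do so), which is why the workflow can stay "relatively simple": the only asset is a single equirectangular image per scene.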

  18. Immersive Virtual Environment Technology to Supplement Environmental Perception, Preference and Behavior Research: A Review with Applications.

    PubMed

    Smith, Jordan W

    2015-09-11

    Immersive virtual environment (IVE) technology offers a wide range of potential benefits to research focused on understanding how individuals perceive and respond to built and natural environments. In an effort to broaden awareness and use of IVE technology in perception, preference and behavior research, this review paper describes how IVE technology can be used to complement more traditional methods commonly applied in public health research. The paper also describes a relatively simple workflow for creating and displaying 360° virtual environments of built and natural settings and presents two freely-available and customizable applications that scientists from a variety of disciplines, including public health, can use to advance their research into human preferences, perceptions and behaviors related to built and natural settings.

  19. Evaluation of knowledge transfer in an immersive virtual learning environment for the transportation community : [tech summary].

    DOT National Transportation Integrated Search

    2014-05-01

    Immersive Virtual Learning Environments (IVLEs) are extensively used in training, but few rigorous scientific investigations regarding the transfer of learning have been conducted. Measurement of learning transfer through evaluative methods is key...

  20. Truck driver fatigue assessment using a virtual reality system.

    DOT National Transportation Integrated Search

    2016-10-17

    In this study, a fully immersive Virtual Reality (VR) based driving simulator was developed to serve as a proof-of-concept that VR can be utilized to assess the level of fatigue (or drowsiness) truck drivers typically experience during real...

  1. Skill training in multimodal virtual environments.

    PubMed

    Gopher, Daniel

    2012-01-01

    Multimodal, immersive, virtual reality (VR) techniques open new perspectives for perceptual-motor skill trainers. They also introduce new risks and dangers. This paper describes the benefits and pitfalls of multimodal training and the cognitive building blocks of multimodal VR training simulators.

  2. Brave New World

    ERIC Educational Resources Information Center

    Panettieri, Joseph C.

    2007-01-01

    Across the globe, progressive universities are embracing any number of MUVEs (multi-user virtual environments), 3D environments, and "immersive" virtual reality tools. And within the next few months, several universities are expected to test so-called "telepresence" videoconferencing systems from Cisco Systems and other leading…

  3. ARENA - A Collaborative Immersive Environment for Virtual Fieldwork

    NASA Astrophysics Data System (ADS)

    Kwasnitschka, T.

    2012-12-01

    Whenever a geoscientific study area is not readily accessible, as is the case on the deep seafloor, it is difficult to apply traditional but effective methods of fieldwork, which often require the physical presence of the observer. The Artificial Research Environment for Networked Analysis (ARENA), developed at GEOMAR | Helmholtz Centre for Ocean Research Kiel within the Cluster of Excellence "The Future Ocean", provides a backend solution to robotic research on the seafloor by means of an immersive simulation environment for marine research: a hemispherical screen of 6 m diameter covering the entire lower hemisphere surrounds a group of up to four researchers at once. A variety of open-source (e.g. Microsoft Research WorldWide Telescope) and commercial software platforms allow interaction with, e.g., in-situ recorded video, vector maps, terrain, textured geometry, point-cloud and volumetric data in four dimensions. Data can be put into a holistic, georeferenced context and viewed on scales stretching from centimeters to global. Several input devices, from joysticks to gestures and vocalized commands, allow interaction with the simulation, depending on individual preference. Annotations added to the dataset during a simulation session support the subsequent quantitative evaluation. Both the special simulator design, which makes data perception a group experience, and the ability to connect remote instances or scaled-down versions of ARENA over the Internet are significant advantages over established immersive simulation environments.

  4. Interfacing modeling suite Physics Of Eclipsing Binaries 2.0 with a Virtual Reality Platform

    NASA Astrophysics Data System (ADS)

    Harriett, Edward; Conroy, Kyle; Prša, Andrej; Klassner, Frank

    2018-01-01

    To explore alternate methods for modeling eclipsing binary stars, we extend PHOEBE's (PHysics Of Eclipsing BinariEs) capabilities into a virtual reality (VR) environment to create an immersive and interactive experience for users. The application used is Vizard, a Python-scripted VR development platform for environments such as the Cave Automatic Virtual Environment (CAVE) and off-the-shelf VR headsets. Vizard allows all modeling to be precomputed without compromising functionality or usability. The system requires five arguments to be precomputed using PHOEBE's Python front-end: the effective temperature, flux, relative intensity, vertex coordinates, and orbits; the user can opt to expose other PHOEBE features within the simulation as well. Here we present the method for making these data observables accessible in real time. An Oculus Rift will be available for a live showcase of VR renderings of various PHOEBE binary systems, including detached and contact binary stars.
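
The precompute-then-look-up pattern described here (observables computed offline with PHOEBE's Python front-end, then indexed by orbital phase inside the VR render loop) can be sketched as follows. The class and method names are illustrative assumptions, not PHOEBE's or Vizard's actual API:

```python
import bisect

class PrecomputedBinary:
    """Holds precomputed observables keyed by orbital phase so a VR
    render loop can look them up in real time without recomputation."""

    def __init__(self, phases, frames):
        # phases: sorted phases in [0, 1); frames: per-phase payloads,
        # e.g. dicts of flux, intensity, vertex coordinates, orbits.
        self.phases = phases
        self.frames = frames

    def at(self, phase):
        """Return the first precomputed frame at or after the given
        phase, wrapping around past phase 1.0."""
        phase %= 1.0
        i = bisect.bisect_left(self.phases, phase) % len(self.phases)
        return self.frames[i]
```

The render loop then only converts simulation time to phase and calls `at()`, keeping per-frame cost constant regardless of how expensive the offline PHOEBE computation was.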

  5. Beaming into the Rat World: Enabling Real-Time Interaction between Rat and Human Each at Their Own Scale

    PubMed Central

    Normand, Jean-Marie; Sanchez-Vives, Maria V.; Waechter, Christian; Giannopoulos, Elias; Grosswindhager, Bernhard; Spanlang, Bernhard; Guger, Christoph; Klinker, Gudrun; Srinivasan, Mandayam A.; Slater, Mel

    2012-01-01

    Immersive virtual reality (IVR) typically generates the illusion in participants that they are in the displayed virtual scene where they can experience and interact in events as if they were really happening. Teleoperator (TO) systems place people at a remote physical destination embodied as a robotic device, and where typically participants have the sensation of being at the destination, with the ability to interact with entities there. In this paper, we show how to combine IVR and TO to allow a new class of application. The participant in the IVR is represented in the destination by a physical robot (TO) and simultaneously the remote place and entities within it are represented to the participant in the IVR. Hence, the IVR participant has a normal virtual reality experience, but where his or her actions and behaviour control the remote robot and can therefore have physical consequences. Here, we show how such a system can be deployed to allow a human and a rat to operate together, but the human interacting with the rat on a human scale, and the rat interacting with the human on the rat scale. The human is represented in a rat arena by a small robot that is slaved to the human’s movements, whereas the tracked rat is represented to the human in the virtual reality by a humanoid avatar. We describe the system and also a study that was designed to test whether humans can successfully play a game with the rat. The results show that the system functioned well and that the humans were able to interact with the rat to fulfil the tasks of the game. This system opens up the possibility of new applications in the life sciences involving participant observation of and interaction with animals but at human scale. PMID:23118987

  6. Beaming into the rat world: enabling real-time interaction between rat and human each at their own scale.

    PubMed

    Normand, Jean-Marie; Sanchez-Vives, Maria V; Waechter, Christian; Giannopoulos, Elias; Grosswindhager, Bernhard; Spanlang, Bernhard; Guger, Christoph; Klinker, Gudrun; Srinivasan, Mandayam A; Slater, Mel

    2012-01-01

    Immersive virtual reality (IVR) typically generates the illusion in participants that they are in the displayed virtual scene where they can experience and interact in events as if they were really happening. Teleoperator (TO) systems place people at a remote physical destination embodied as a robotic device, and where typically participants have the sensation of being at the destination, with the ability to interact with entities there. In this paper, we show how to combine IVR and TO to allow a new class of application. The participant in the IVR is represented in the destination by a physical robot (TO) and simultaneously the remote place and entities within it are represented to the participant in the IVR. Hence, the IVR participant has a normal virtual reality experience, but where his or her actions and behaviour control the remote robot and can therefore have physical consequences. Here, we show how such a system can be deployed to allow a human and a rat to operate together, but the human interacting with the rat on a human scale, and the rat interacting with the human on the rat scale. The human is represented in a rat arena by a small robot that is slaved to the human's movements, whereas the tracked rat is represented to the human in the virtual reality by a humanoid avatar. We describe the system and also a study that was designed to test whether humans can successfully play a game with the rat. The results show that the system functioned well and that the humans were able to interact with the rat to fulfil the tasks of the game. This system opens up the possibility of new applications in the life sciences involving participant observation of and interaction with animals but at human scale.

  7. Validation of an immersive virtual reality system for training near and far space neglect in individuals with stroke: a pilot study.

    PubMed

    Yasuda, Kazuhiro; Muroi, Daisuke; Ohira, Masahiro; Iwata, Hiroyasu

    2017-10-01

    Unilateral spatial neglect (USN) is defined as impaired ability to attend and see on one side, and when present, it interferes seriously with daily life. These symptoms can exist for near and far spaces combined or independently, and it is important to provide effective intervention for near and far space neglect. The purpose of this pilot study was to propose an immersive virtual reality (VR) rehabilitation program using a head-mounted display that is able to train both near and far space neglect, and to validate the immediate effect of the VR program in both near and far space neglect. Ten USN patients underwent the VR program with a pre-post design and no control. In the virtual environment, we developed visual searching and reaching tasks using an immersive VR system. Behavioral inattention test (BIT) scores obtained pre- and immediate post-VR program were compared. BIT scores obtained pre- and post-VR program revealed that far space neglect but not near space neglect improved promptly after the VR program. This effect for far space neglect was observed in the cancelation task, but not in the line bisection task. Positive effects of the immersive VR program for far space neglect are suggested by the results of the present pilot study. However, further studies with rigorous designs are needed to validate its clinical effectiveness.

  8. The effect of visual-vestibulosomatosensory conflict induced by virtual reality on postural stability in humans.

    PubMed

    Nishiike, Suetaka; Okazaki, Suzuyo; Watanabe, Hiroshi; Akizuki, Hironori; Imai, Takao; Uno, Atsuhiko; Kitahara, Tadashi; Horii, Arata; Takeda, Noriaki; Inohara, Hidenori

    2013-01-01

    In this study, we examined the effects of sensory inputs of visual-vestibulosomatosensory conflict induced by virtual reality (VR) on subjective dizziness, posture stability and visual dependency on postural control in humans. Eleven healthy young volunteers were immersed in two different VR conditions. In the control condition, subjects walked voluntarily with the background images of interactive computer graphics proportionally synchronized to their walking pace. In the visual-vestibulosomatosensory conflict condition, subjects kept still, but the background images that subjects experienced in the control condition were presented. The scores of both Graybiel's and Hamilton's criteria, postural instability and Romberg ratio were measured before and after the two conditions. After immersion in the conflict condition, both subjective dizziness and objective postural instability were significantly increased, and Romberg ratio, an index of the visual dependency on postural control, was slightly decreased. These findings suggest that sensory inputs of visual-vestibulosomatosensory conflict induced by VR induced motion sickness, resulting in subjective dizziness and postural instability. They also suggest that adaptation to the conflict condition decreases the contribution of visual inputs to postural control, with re-weighting toward vestibulosomatosensory inputs. VR may be used as a rehabilitation tool for dizzy patients through its ability to induce sensory re-weighting of postural control.
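
The Romberg ratio used as the visual-dependency index here is conventionally an eyes-closed sway measure divided by the eyes-open measure, so a decrease indicates reduced reliance on vision. A minimal sketch under that conventional definition (the measure names are illustrative):

```python
def romberg_ratio(sway_eyes_closed, sway_eyes_open):
    """Eyes-closed sway divided by eyes-open sway. A ratio > 1 means
    sway worsens without vision (visual dependence); a post-condition
    decrease suggests re-weighting toward vestibulosomatosensory
    inputs, as in the study above."""
    return sway_eyes_closed / sway_eyes_open

# e.g. sway path length in cm over a fixed-duration standing trial
assert romberg_ratio(30.0, 20.0) == 1.5
```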

  9. Psychological predictors of problematic involvement in massively multiplayer online role-playing games: illustration in a sample of male cybercafé players.

    PubMed

    Billieux, Joël; Chanal, Julien; Khazaal, Yasser; Rochat, Lucien; Gay, Philippe; Zullino, Daniele; Van der Linden, Martial

    2011-01-01

    Massively Multiplayer Online Role-Playing Games (MMORPGs) are video games in which a large number of players interact with one another in a persistent virtual world. MMORPGs can become problematic and result in negative outcomes in daily living (e.g. loss of control over gaming behaviors, compromised social and individual quality of life). The aim of the present study is to investigate psychological predictors of problematic involvement in MMORPGs. Fifty-four males who played MMORPGs regularly were recruited in cybercafés and screened using the UPPS Impulsive Behavior Scale (which assesses 4 facets of impulsivity) and the Motivation to Play Online Questionnaire (which assesses personal motives to play online). Negative consequences due to excessive time spent on the Internet were assessed with the Internet Addiction Test. Multiple regression analysis showed that problematic use of MMORPGs is significantly predicted by: (1) high urgency (b = 0.45), and (2) a motivation to play for immersion (b = 0.35). This study showed that, for certain individuals (who are characterized by a proneness to act rashly in emotional contexts and are motivated to play in order to be immersed in a virtual world), involvement in MMORPGs can become problematic and engender tangible negative consequences in daily life. Copyright © 2011 S. Karger AG, Basel.

  10. What to expect from immersive virtual environment exposure: influences of gender, body mass index, and past experience.

    PubMed

    Stanney, Kay M; Hale, Kelly S; Nahmens, Isabelina; Kennedy, Robert S

    2003-01-01

    For those interested in using head-coupled PC-based immersive virtual environment (VE) technology to train, entertain, or inform, it is essential to understand the effects this technology has on its users. This study investigated potential adverse effects, including the sickness associated with exposure and extreme responses (emesis, flashbacks). Participants were exposed to a VE for 15 to 60 min, with either complete or streamlined navigational control and simple or complex scenes, after which time measures of sickness were obtained. More than 80% of participants experienced nausea, oculomotor disturbances, and/or disorientation, with disorientation potentially lasting > 24 hr. Of the participants, 12.9% prematurely ended their exposure because of adverse effects; of these, 9.2% experienced an emetic response, whereas only 1.2% of all participants experienced emesis. The results indicate that designers may be able to reduce these rates by limiting exposure duration and reducing the degrees of freedom of the user's navigational control. Results from gender, body mass, and past experience comparisons indicated it may be possible to identify those who will experience adverse effects attributable to exposure and warn such individuals. Applications for this research include military, entertainment, and any other interactive systems for which designers seek to avoid adverse effects associated with exposure.

  11. A Storm's Approach; Hurricane Shelter Training in a Digital Age

    NASA Technical Reports Server (NTRS)

    Boyarsky, Andrew; Burden, David; Gronstedt, Anders; Jinman, Andrew

    2012-01-01

    New York City's Office of Emergency Management (OEM) originally ran hundreds of classroom-based courses, where they brought together civil servants to learn how to run a Hurricane Shelter (HS). This approach was found to be costly and time-consuming, and it lacked any sense of an impending disaster and need for emergency response. In partnership with the City University of New York School of Professional Studies, the Gronstedt Group and Daden Limited, the OEM wanted to create a simulation that overcame these issues, providing users with a more immersive and realistic approach at a lower cost. The HS simulation was built in the virtual world Second Life (SL). Virtual worlds are a genre of online communities that often take the form of computer-based simulated environments, through which users can interact with one another and use or create objects. Using this technology allowed managers to apply their knowledge in both classroom and remote learning environments. The shelter simulation is operational 24/7, guiding users through a 4 1/2-hour narrative from start to finish. This paper will describe the rationale for the project, the technical approach taken - particularly the use of a web-based authoring tool to create and manage the immersive simulation - and the results from operational use.

  12. Generation IV Nuclear Energy Systems Construction Cost Reductions through the Use of Virtual Environments - Task 4 Report: Virtual Mockup Maintenance Task Evaluation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Timothy Shaw; Anthony Baratta; Vaughn Whisker

    2005-02-28

    Task 4 report of a 3-year DOE NERI-sponsored effort evaluating immersive virtual reality (CAVE) technology for design review, construction planning, and maintenance planning and training for next-generation nuclear power plants. The program covers development of full-scale virtual mockups generated from 3D CAD data presented in a CAVE visualization facility. This report focuses on using full-scale virtual mockups for nuclear power plant training applications.

  13. Virtual Viewing Time: The Relationship between Presence and Sexual Interest in Androphilic and Gynephilic Men

    PubMed Central

    Fromberger, Peter; Meyer, Sabrina; Kempf, Christina; Jordan, Kirsten; Müller, Jürgen L.

    2015-01-01

    Virtual Reality (VR) has successfully been used in the research of human behavior for more than twenty years. The main advantage of VR is its capability to induce a high sense of presence. This results in emotions and behavior which are very close to those shown in real situations. In the context of sex research, only a few studies have used high-immersive VR so far. The ones that did can be found mostly in the field of forensic psychology. Nevertheless, the relationship between presence and sexual interest still remains unclear. The present study is the first to examine the advantages of high-immersive VR in comparison to a conventional standard desktop system regarding their capability to measure sexual interest. 25 gynephilic and 20 androphilic healthy men underwent three experimental conditions, which differed in their ability to induce a sense of presence. In each condition, participants were asked to rate ten male and ten female virtual human characters regarding their sexual attractiveness. Without their knowledge, the subjects’ viewing time was assessed throughout the rating. Subjects were then asked to rate the sense of presence they had experienced as well as their perceived realism of the characters. Results suggested that stereoscopic viewing can significantly enhance the subjective sexual attractiveness of sexually relevant characters. Furthermore, in all three conditions participants looked significantly longer at sexually relevant virtual characters than at sexually non-relevant ones. The high-immersion condition provided the best discriminant validity. From a statistical point of view, however, the sense of presence had no significant influence on the discriminant validity of the viewing time task. The study showed that high-immersive virtual environments enhance realism ratings as well as ratings of sexual attractiveness of three-dimensional human stimuli in comparison to standard desktop systems. Results also show that viewing time seems to be influenced neither by sexual attractiveness nor by realism of stimuli. This indicates how important task-specific mechanisms of the viewing time effect are. PMID:25992790

  14. The (human) science of medical virtual learning environments.

    PubMed

    Stone, Robert J

    2011-01-27

    The uptake of virtual simulation technologies in both military and civilian surgical contexts has been both slow and patchy. The failure of the virtual reality community in the 1990s and early 2000s to deliver affordable and accessible training systems stems not only from an obsessive quest to develop the 'ultimate' in so-called 'immersive' hardware solutions, from head-mounted displays to large-scale projection theatres, but also from a comprehensive lack of attention to the needs of the end users. While many still perceive the science of simulation to be defined by technological advances, such as computing power, specialized graphics hardware, advanced interactive controllers, displays and so on, the true science underpinning simulation--the science that helps to guarantee the transfer of skills from the simulated to the real--is that of human factors, a well-established discipline that focuses on the abilities and limitations of the end user when designing interactive systems, as opposed to the more commercially explicit components of technology. Based on three surgical simulation case studies, the importance of a human factors approach to the design of appropriate simulation content and interactive hardware for medical simulation is illustrated. The studies demonstrate that it is unnecessary to pursue real-world fidelity in all instances in order to achieve psychological fidelity--the degree to which the simulated tasks reproduce and foster knowledge, skills and behaviours that can be reliably transferred to real-world training applications.

  15. Interactive floating windows: a new technique for stereoscopic video games

    NASA Astrophysics Data System (ADS)

    Zerebecki, Chris; Stanfield, Brodie; Tawadrous, Mina; Buckstein, Daniel; Hogue, Andrew; Kapralos, Bill

    2012-03-01

    The film industry has a long history of creating compelling experiences in stereoscopic 3D. Recently, the video game as an artistic medium has matured into an effective way to tell engaging and immersive stories. Given the current push to bring stereoscopic 3D technology into the consumer market, there is considerable interest in developing stereoscopic 3D video games. Game developers have largely ignored the need to design their games specifically for stereoscopic 3D and have instead relied on automatic conversion and driver technology. Game developers need to evaluate solutions used in other media, such as film, to correct perceptual problems such as window violations, and to modify or create new solutions that work within an interactive framework. In this paper we extend the dynamic floating window technique into the interactive domain, enabling the player to position a virtual window in space. By interactively changing the position, size, and 3D rotation of the virtual window, objects can be made to 'break the mask', dramatically enhancing the stereoscopic effect. By demonstrating that solutions from the film industry can be extended into the interactive space, we hope to initiate further discussion in the game development community on strengthening story-telling mechanisms in stereoscopic 3D games.
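
At its simplest, the interactive floating window amounts to testing whether an object's screen-space position falls outside the repositioned, resized, rotated window quad, i.e. whether it "breaks the mask". A 2D sketch under that simplification (in-plane rotation only), with all names illustrative rather than taken from the paper:

```python
from dataclasses import dataclass
import math

@dataclass
class FloatingWindow:
    cx: float = 0.0     # window centre in normalized screen coords
    cy: float = 0.0
    w: float = 1.6      # window quad width
    h: float = 0.9      # window quad height
    angle: float = 0.0  # in-plane rotation, radians

    def breaks_mask(self, px, py):
        """True if a screen-space point lies outside the window quad,
        i.e. an object drawn there would 'break the mask'. Rotates the
        point into the window's frame, then does an axis-aligned
        bounds test."""
        dx, dy = px - self.cx, py - self.cy
        c, s = math.cos(-self.angle), math.sin(-self.angle)
        lx, ly = c * dx - s * dy, s * dx + c * dy
        return abs(lx) > self.w / 2 or abs(ly) > self.h / 2
```

Letting the player drive `cx`, `cy`, `w`, `h`, and `angle` each frame is what turns the film industry's static floating window into the interactive version the paper proposes; a full 3D rotation of the window plane would extend this test with a projection step.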

  16. Virtual reality as a tool for cross-cultural communication: an example from military team training

    NASA Astrophysics Data System (ADS)

    Downes-Martin, Stephen; Long, Mark; Alexander, Joanna R.

    1992-06-01

    A major problem with communication across cultures, whether professional or national, is that simple language translation is often insufficient to communicate the concepts. This is especially true when the communicators come from highly specialized fields of knowledge or from national cultures with long histories of divergence. The problem becomes critical when the goal of the communication is international negotiation dealing with such high-risk items as arms negotiation or trade wars. Virtual reality technology has considerable potential for facilitating communication across cultures by immersing the communicators within multiple visual representations of the concepts and providing control over those representations. Military distributed team training provides a model of virtual reality suitable for cross-cultural communication such as negotiation. In both team training and negotiation, the participants must cooperate, agree on a set of goals, and achieve mastery over the concepts being negotiated. Team training technologies suitable for supporting cross-cultural negotiation already exist (branch wargaming, computer image generation and visualization, distributed simulation) and have developed along different lines than traditional virtual reality technology. Team training de-emphasizes the realism of physiological interfaces between the human and the virtual reality, and emphasizes the interaction of humans with each other and with intelligent simulated agents within the virtual reality. This approach to virtual reality is suggested as being more fruitful for future work.

  17. Mobile devices, Virtual Reality, Augmented Reality, and Digital Geoscience Education.

    NASA Astrophysics Data System (ADS)

    Crompton, H.; De Paor, D. G.; Whitmeyer, S. J.; Bentley, C.

    2016-12-01

    Mobile devices are playing an increasing role in geoscience education. Affordances include instructor-student communication and class management in large classrooms, virtual and augmented reality applications, digital mapping, and crowd-sourcing. Mobile technologies have spawned the subfield of mobile learning, or m-learning, which is defined as learning across multiple contexts through social and content interactions. Geoscientists have traditionally engaged in non-digital mobile learning via fieldwork, but digital devices are greatly extending the possibilities, especially for non-traditional students. Smartphones and tablets are the most common devices, but smart glasses such as Pivothead enable live streaming of a first-person view (see, for example, https://youtu.be/gWrDaYP5w58). Virtual reality headsets such as Google Cardboard create an immersive virtual field experience, and digital imagery such as GigaPan and Structure from Motion enables instructors and/or students to create virtual specimens and outcrops that are sharable across the globe. Whereas virtual reality (VR) replaces the real world with a virtual representation, augmented reality (AR) overlays digital data on the live scene visible to the user in real time. We have previously reported on our use of the AR application FreshAiR for geoscientific "egg hunts." The popularity of Pokémon Go demonstrates the potential of AR for mobile learning in the geosciences.

  18. What about the Firewall? Creating Virtual Worlds in a Public Primary School Using Sim-on-a-Stick

    ERIC Educational Resources Information Center

    Jacka, Lisa; Booth, Kate

    2012-01-01

    Virtual worlds are highly immersive, engaging and popular computer-mediated environments being explored by children and adults. Why then aren't more teachers using virtual worlds in the classroom with primary and secondary school students? Reasons often cited are the learning required to master the technology, low-end graphics cards, poor…

  19. Synchronizing Self and Object Movement: How Child and Adult Cyclists Intercept Moving Gaps in a Virtual Environment

    ERIC Educational Resources Information Center

    Chihak, Benjamin J.; Plumert, Jodie M.; Ziemer, Christine J.; Babu, Sabarish; Grechkin, Timofey; Cremer, James F.; Kearney, Joseph K.

    2010-01-01

    Two experiments examined how 10- and 12-year-old children and adults intercept moving gaps while bicycling in an immersive virtual environment. Participants rode an actual bicycle along a virtual roadway. At 12 test intersections, participants attempted to pass through a gap between 2 moving, car-sized blocks without stopping. The blocks were…

  20. Possibilities and Determinants of Using Low-Cost Devices in Virtual Education Applications

    ERIC Educational Resources Information Center

    Bun, Pawel Kazimierz; Wichniarek, Radoslaw; Górski, Filip; Grajewski, Damian; Zawadzki, Przemyslaw; Hamrol, Adam

    2017-01-01

    Virtual reality (VR) may be used as an innovative educational tool. However, in order to fully exploit its potential, it is essential to achieve the effect of immersion. To more completely submerge the user in a virtual environment, it is necessary to ensure that the user's actions are directly translated into the image generated by the…

  1. Can Virtual Science Foster Real Skills? A Study of Inquiry Skills in a Virtual World

    ERIC Educational Resources Information Center

    Dodds, Heather E.

    2013-01-01

    Online education has grown into a part of the educational market answering the demand for learning at the learner's choice of time and place. Inquiry skills such as observing, questioning, collecting data, and devising fair experiments are an essential element of 21st-century online science coursework. Virtual immersive worlds such as Second Life…

  2. Augmenting the Thermal Flux Experiment: A Mixed Reality Approach with the HoloLens

    ERIC Educational Resources Information Center

    Strzys, M. P.; Kapp, S.; Thees, M.; Kuhn, J.; Lukowicz, P.; Knierim, P.; Schmidt, A.

    2017-01-01

    Virtual Reality (VR) and Augmented Reality (AR) technologies have made huge progress in recent years and have also reached the field of education. The virtuality continuum, ranging from pure virtuality on one side to the real world on the other, has been successfully covered by the use of immersive technologies like head-mounted…

  3. Virtual Worlds; Real Learning: Design Principles for Engaging Immersive Environments

    NASA Technical Reports Server (NTRS)

    Wu (u. Sjarpm)

    2012-01-01

    The EMDT master's program at Full Sail University embarked on a small project to use a virtual environment to teach graduate students. The property used for this project has evolved over several iterations and has yielded some basic design principles and pedagogy for virtual spaces. As a result, students are emerging from the program with a better grasp of future possibilities.

  4. Emerging CAE technologies and their role in Future Ambient Intelligence Environments

    NASA Astrophysics Data System (ADS)

    Noor, Ahmed K.

    2011-03-01

    Dramatic improvements are on the horizon in Computer Aided Engineering (CAE) and various simulation technologies. The improvements are due, in part, to developments in a number of leading-edge technologies and their synergistic combinations/convergence. The technologies include ubiquitous, cloud, and petascale computing; ultra-high-bandwidth networks and pervasive wireless communication; knowledge-based engineering; networked immersive virtual environments and virtual worlds; novel human-computer interfaces; and powerful game engines and facilities. This paper describes these frontier and emerging simulation technologies and their role in future virtual product creation and learning/training environments. The environments will be ambient intelligence environments, incorporating a synergistic combination of novel agent-supported visual simulations (with cognitive learning and understanding abilities); immersive 3D virtual world facilities; development chain management systems and facilities (incorporating a synergistic combination of intelligent engineering and management tools); nontraditional methods; intelligent, multimodal and human-like interfaces; and mobile wireless devices. The virtual product creation environment will significantly enhance productivity and will stimulate creativity and innovation in future global virtual collaborative enterprises. The facilities in the learning/training environment will provide timely, engaging, personalized/collaborative and tailored visual learning.

  5. Embodying self-compassion within virtual reality and its effects on patients with depression

    PubMed Central

    Falconer, Caroline J.; Rovira, Aitor; King, John A.; Gilbert, Paul; Antley, Angus; Fearon, Pasco; Ralph, Neil; Slater, Mel

    2016-01-01

    Background: Self-criticism is a ubiquitous feature of psychopathology and can be combatted by increasing levels of self-compassion. However, some patients are resistant to self-compassion. Aims: To investigate whether the effects of self-identification with virtual bodies within immersive virtual reality could be exploited to increase self-compassion in patients with depression. Method: We developed an 8-minute scenario in which 15 patients practised delivering compassion in one virtual body and then experienced receiving it from themselves in another virtual body. Results: In an open trial, three repetitions of this scenario led to significant reductions in depression severity and self-criticism, as well as to a significant increase in self-compassion, from baseline to 4-week follow-up. Four patients showed clinically significant improvement. Conclusions: The results indicate that interventions using immersive virtual reality may have considerable clinical potential and that further development of these methods preparatory to a controlled trial is now warranted. Declaration of interest: None. Copyright and usage: © The Royal College of Psychiatrists 2016. This is an open access article distributed under the terms of the Creative Commons Attribution (CC BY) licence. PMID:27703757

  6. Virtualized Traffic: reconstructing traffic flows from discrete spatiotemporal data.

    PubMed

    Sewall, Jason; van den Berg, Jur; Lin, Ming C; Manocha, Dinesh

    2011-01-01

    We present a novel concept, Virtualized Traffic, to reconstruct and visualize continuous traffic flows from discrete spatiotemporal data provided by traffic sensors or generated artificially to enhance a sense of immersion in a dynamic virtual world. Given the positions of each car at two recorded locations on a highway and the corresponding time instances, our approach can reconstruct the traffic flows (i.e., the dynamic motions of multiple cars over time) between the two locations along the highway for immersive visualization of virtual cities or other environments. Our algorithm is applicable to high-density traffic on highways with an arbitrary number of lanes and takes into account the geometric, kinematic, and dynamic constraints on the cars. Our method reconstructs the car motion that automatically minimizes the number of lane changes, respects safety distance to other cars, and computes the acceleration necessary to obtain a smooth traffic flow subject to the given constraints. Furthermore, our framework can process a continuous stream of input data in real time, enabling the users to view virtualized traffic events in a virtual world as they occur. We demonstrate our reconstruction technique with both synthetic and real-world input. © 2011 IEEE. Published by the IEEE Computer Society.
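The boundary-matching idea in this abstract, reconstructing a smooth per-car motion that agrees with both sensor readings, can be illustrated with a toy one-dimensional version. This cubic-Hermite sketch is a deliberate simplification, not the paper's multi-lane, constraint-aware algorithm:

```python
import numpy as np

def hermite_trajectory(t0, x0, v0, t1, x1, v1, n=5):
    """Reconstruct positions between two sensor readings.

    Each reading is a (time, position, speed) triple; the cubic Hermite
    curve is the lowest-order polynomial whose position and speed agree
    with both readings, which yields a continuous acceleration profile
    in between. Returns `n` positions sampled uniformly in [t0, t1].
    """
    h = t1 - t0
    ts = np.linspace(t0, t1, n)
    s = (ts - t0) / h  # normalized time in [0, 1]
    # Standard cubic Hermite basis functions
    h00 = 2 * s**3 - 3 * s**2 + 1
    h10 = s**3 - 2 * s**2 + s
    h01 = -2 * s**3 + 3 * s**2
    h11 = s**3 - s**2
    return h00 * x0 + h10 * h * v0 + h01 * x1 + h11 * h * v1
```

For a car entering at 10 m/s and leaving 100 m downstream at the same speed 10 s later, the reconstruction reduces to uniform motion, as expected.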

  7. IceCube Polar Virtual Reality exhibit: immersive learning for learners of all ages

    NASA Astrophysics Data System (ADS)

    Madsen, J.; Bravo Gallart, S.; Chase, A.; Dougherty, P.; Gagnon, D.; Pronto, K.; Rush, M.; Tredinnick, R.

    2017-12-01

    The IceCube Polar Virtual Reality project is an innovative, interactive exhibit that explains the operation and science of a flagship experiment in polar research, the IceCube Neutrino Observatory. The exhibit allows users to travel from the South Pole, where the detector is located, to the furthest reaches of the universe, learning how the detection of high-energy neutrinos has opened a new view to the universe. This novel exhibit combines a multitouch tabletop display system and commercially available virtual reality (VR) head-mounted displays to enable informal STEM learning of polar research. The exhibit, launched in early November 2017 during the Wisconsin Science Festival in Madison, WI, will study how immersive VR can enhance informal STEM learning. The foundation of this project is built upon a strong collaborative effort between the Living Environments Laboratory (LEL), the Wisconsin IceCube Particle Astrophysics Center (WIPAC), and the Field Day Laboratory groups from the University of Wisconsin-Madison campus. The project is funded through an NSF Advancing Informal STEM Learning (AISL) grant, under a special call for engaging students and the public in polar research. This exploratory pathways project seeks to build expertise to allow future extensions. The plan is to submit a subsequent AISL Broad Implementation proposal to add more 3D environments for other Antarctic research topics and locations in the future. We will describe the current implementation of the project and discuss the challenges and opportunities of working with an interdisciplinary team of scientists and technology and education researchers. We will also present preliminary assessment results, which seek to answer questions such as: Did users gain a better understanding of IceCube research from interacting with the exhibit? Do both technologies (touch table and VR headset) provide the same level of engagement? Is one technology better suited for specific learning outcomes?

  8. Knowledge-Based, Interactive, Custom Anatomical Scene Creation for Medical Education: The Biolucida System

    PubMed Central

    Warren, Wayne; Brinkley, James F.

    2005-01-01

    Few biomedical subjects of study are as resource-intensive to teach as gross anatomy. Medical education stands to benefit greatly from applications which deliver virtual representations of human anatomical structures. While many applications have been created to achieve this goal, their utility to the student is limited because of a lack of interactivity or customizability by expert authors. Here we describe the first version of the Biolucida system, which allows an expert anatomist author to create knowledge-based, customized, and fully interactive scenes and lessons for students of human macroscopic anatomy. Implemented in Java and VRML, Biolucida allows the sharing of these instructional 3D environments over the internet. The system simplifies the process of authoring immersive content while preserving its flexibility and expressivity. PMID:16779148

  9. Knowledge-based, interactive, custom anatomical scene creation for medical education: the Biolucida system.

    PubMed

    Warren, Wayne; Brinkley, James F

    2005-01-01

    Few biomedical subjects of study are as resource-intensive to teach as gross anatomy. Medical education stands to benefit greatly from applications which deliver virtual representations of human anatomical structures. While many applications have been created to achieve this goal, their utility to the student is limited because of a lack of interactivity or customizability by expert authors. Here we describe the first version of the Biolucida system, which allows an expert anatomist author to create knowledge-based, customized, and fully interactive scenes and lessons for students of human macroscopic anatomy. Implemented in Java and VRML, Biolucida allows the sharing of these instructional 3D environments over the internet. The system simplifies the process of authoring immersive content while preserving its flexibility and expressivity.

  10. Evaluation of smartphone-based interaction techniques in a CAVE in the context of immersive digital project review

    NASA Astrophysics Data System (ADS)

    George, Paul; Kemeny, Andras; Colombet, Florent; Merienne, Frédéric; Chardonnet, Jean-Rémy; Thouvenin, Indira Mouttapa

    2014-02-01

    Immersive digital project reviews use virtual reality (VR) as a tool for discussion between the various stakeholders of a project. In the automotive industry, the digital car prototype model is the common thread that binds them. It is used during immersive digital project reviews between designers, engineers, ergonomists, etc. The digital mockup is also used to assess future car architecture, habitability or perceived-quality requirements, with the aim of reducing the use of physical mockups to optimize cost, delay and quality. Among the difficulties identified by users, handling the mockup is a major one. Inspired by current uses of nomad devices (multi-touch gestures, the iPhone UI look and feel, and AR applications), we designed a navigation technique taking advantage of these popular input devices: space scrolling allows moving around the mockup. In this paper, we present the results of a study on the usability and acceptability of the proposed smartphone-based interaction metaphor compared to a traditional technique, and we provide indications of the most efficient choices for different use cases. The study was carried out in a traditional 4-sided CAVE, and its purpose is to assess a chosen set of interaction techniques to be implemented in Renault's new 5-sided 4K x 4K wall high-performance CAVE. The proposed metaphor using nomad devices is well accepted by novice VR users, and future implementation should allow efficient industrial use. Nomad devices are an easy and user-friendly alternative to existing traditional control devices such as the joystick.

  11. Virtual environment display for a 3D audio room simulation

    NASA Astrophysics Data System (ADS)

    Chapin, William L.; Foster, Scott

    1992-06-01

    Recent developments in virtual 3D audio and synthetic aural environments have produced a complex acoustical room simulation. The acoustical simulation models a room with walls, ceiling, and floor of selected sound reflecting/absorbing characteristics and unlimited independent localizable sound sources. This non-visual acoustic simulation, implemented with 4 audio ConvolvotronsTM by Crystal River Engineering and coupled to the listener with a Polhemus IsotrakTM tracking the listener's head position and orientation, and stereo headphones returning binaural sound, is quite compelling to most listeners with eyes closed. This immersive effect should be reinforced when properly integrated into a full, multi-sensory virtual environment presentation. This paper discusses the design of an interactive, visual virtual environment, complementing the acoustic model and specified to: 1) allow the listener to freely move about the space, a room of manipulable size, shape, and audio character, while interactively relocating the sound sources; 2) reinforce the listener's feeling of telepresence in the acoustical environment with visual and proprioceptive sensations; 3) enhance the audio with the graphic and interactive components, rather than overwhelm or reduce it; and 4) serve as a research testbed and technology transfer demonstration. The hardware/software design of two demonstration systems, one installed and one portable, is discussed through the development of four iterative configurations. The installed system implements a head-coupled, wide-angle, stereo-optic tracker/viewer and multi-computer simulation control. The portable demonstration system implements a head-mounted wide-angle, stereo-optic display, separate head and pointer electro-magnetic position trackers, a heterogeneous parallel graphics processing system, and object-oriented C++ program code.
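The head-tracked binaural rendering described above ultimately reduces to computing direction-dependent cues for each source relative to the listener's head. As a minimal illustration of the simplest such cue (not the Convolvotron's full head-related transfer function convolution), Woodworth's classic spherical-head estimate of interaural time difference can be written as:

```python
import numpy as np

def interaural_time_difference(azimuth_deg, head_radius=0.0875, c=343.0):
    """Woodworth's spherical-head ITD estimate, in seconds.

    0 degrees is straight ahead; positive azimuth is toward one ear.
    `head_radius` (m) and the speed of sound `c` (m/s) use typical
    values. This models only the time-of-arrival cue; real binaural
    renderers also apply level and spectral (HRTF) cues.
    """
    theta = np.radians(azimuth_deg)
    return head_radius / c * (theta + np.sin(theta))
```

A source directly to the side yields an ITD of roughly 0.65 ms, consistent with the commonly cited maximum for a human head.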

  12. Immersive Learning Technologies: Realism and Online Authentic Learning

    ERIC Educational Resources Information Center

    Herrington, Jan; Reeves, Thomas C.; Oliver, Ron

    2007-01-01

    The development of immersive learning technologies in the form of virtual reality and advanced computer applications has meant that realistic creations of simulated environments are now possible. Such simulations have been used to great effect in training in the military, air force, and in medical training. But how realistic do problems need to be…

  13. The Role of Immersive Media in Online Education

    ERIC Educational Resources Information Center

    Bronack, Stephen C.

    2011-01-01

    An increasing number of educators are integrating immersive media into core course offerings. Virtual worlds, serious games, simulations, and augmented reality are enabling students and instructors to connect with content and with one another in novel ways. As a result, many are investigating the new affordances these media provide and the impact…

  14. Virtual reality for pain and anxiety management in children.

    PubMed

    Arane, Karen; Behboudi, Amir; Goldman, Ran D

    2017-12-01

    Question: Pain and anxiety are common in children who need procedures such as administering vaccines or drawing blood. Recent reports have described the use of virtual reality (VR) as a method of distraction during such procedures. How does VR work in reducing pain and anxiety in pediatric patients, and what are the potential uses for it? Answer: Recent studies explored using VR with pediatric patients undergoing procedures ranging from vaccinations and intravenous injections to laceration repair and dressing changes for burn wounds. Interacting with immersive VR might divert attention, leading to a slower response to incoming pain signals. Preliminary results have shown that VR is effective, either alone or in combination with standard care, in reducing the pain and anxiety patients experience compared with standard care or other distraction methods. Copyright © the College of Family Physicians of Canada.

  15. Implementing Artificial Intelligence Behaviors in a Virtual World

    NASA Technical Reports Server (NTRS)

    Krisler, Brian; Thome, Michael

    2012-01-01

    In this paper, we present a look at the current state of the art in human-computer interface technologies, including intelligent interactive agents, natural speech interaction and gesture-based interfaces. We describe our use of these technologies to implement a cost-effective, immersive experience in a public region in Second Life. We provision our artificial agent as a German Shepherd Dog avatar, with an external rules engine controlling its behavior and movement. To interact with the avatar, we implemented a natural language and gesture system allowing human avatars to use speech and physical gestures rather than interacting via a keyboard and mouse. The result is a system that allows multiple humans to interact naturally with AI avatars by playing games such as fetch with a flying disk and even practicing obedience exercises using voice and gesture: a natural-seeming day in the park.

  16. Virtual reality in the assessment, understanding, and treatment of mental health disorders.

    PubMed

    Freeman, D; Reeve, S; Robinson, A; Ehlers, A; Clark, D; Spanlang, B; Slater, M

    2017-10-01

    Mental health problems are inseparable from the environment. With virtual reality (VR), computer-generated interactive environments, individuals can repeatedly experience their problematic situations and be taught, via evidence-based psychological treatments, how to overcome difficulties. VR is moving out of specialist laboratories. Our central aim was to describe the potential of VR in mental health, including a consideration of the first 20 years of applications. A systematic review of empirical studies was conducted. In all, 285 studies were identified, with 86 concerning assessment, 45 theory development, and 154 treatment. The main disorders researched were anxiety (n = 192), schizophrenia (n = 44), substance-related disorders (n = 22) and eating disorders (n = 18). There are pioneering early studies, but the methodological quality of studies was generally low. The gaps in meaningful applications to mental health are extensive. The most established finding is that VR exposure-based treatments can reduce anxiety disorders, but there are numerous research and treatment avenues of promise. VR was found to be a much-misused term, often applied to non-interactive and non-immersive technologies. We conclude that VR has the potential to transform the assessment, understanding and treatment of mental health problems. The treatment possibilities will only be realized if - with the user experience at the heart of design - the best immersive VR technology is combined with targeted translational interventions. The capability of VR to simulate reality could greatly increase access to psychological therapies, while treatment outcomes could be enhanced by the technology's ability to create new realities. VR may merit the level of attention given to neuroimaging.

  17. Manipulating the fidelity of lower extremity visual feedback to identify obstacle negotiation strategies in immersive virtual reality.

    PubMed

    Kim, Aram; Zhou, Zixuan; Kretch, Kari S; Finley, James M

    2017-07-01

    The ability to successfully navigate obstacles in our environment requires integration of visual information about the environment with estimates of our body's state. Previous studies have used partial occlusion of the visual field to explore how information about the body and impending obstacles is integrated to mediate a successful clearance strategy. However, because these manipulations often remove information about both the body and the obstacle, it remains to be seen how information about the lower extremities alone is utilized during obstacle crossing. Here, we used an immersive virtual reality (VR) interface to explore how visual feedback of the lower extremities influences obstacle crossing performance. Participants wore a head-mounted display while walking on a treadmill and were instructed to step over obstacles in a virtual corridor in four different feedback trials. The trials involved: (1) no visual feedback of the lower extremities, (2) an endpoint-only model, (3) a link-segment model, and (4) a volumetric multi-segment model. We found that, compared to no model, the volumetric model improved success rate, led to more consistent placement of the trailing foot before crossing and the leading foot after crossing, and brought the leading foot closer to the obstacle after crossing. This knowledge is critical for the design of obstacle negotiation tasks in immersive virtual environments, as it may provide information about the fidelity necessary to reproduce ecologically valid practice environments.

  18. A Public Database of Immersive VR Videos with Corresponding Ratings of Arousal, Valence, and Correlations between Head Movements and Self Report Measures.

    PubMed

    Li, Benjamin J; Bailenson, Jeremy N; Pines, Adam; Greenleaf, Walter J; Williams, Leanne M

    2017-01-01

    Virtual reality (VR) has been proposed as a methodological tool to study the basic science of psychology and other fields. One key advantage of VR is that sharing of virtual content can lead to more robust replication and representative sampling. A database of standardized content will help fulfill this vision. There are two objectives to this study. First, we seek to establish and allow public access to a database of immersive VR video clips that can act as a potential resource for studies on emotion induction using virtual reality. Second, given the large sample size of participants needed to get reliable valence and arousal ratings for our videos, we were able to explore the possible links between the head movements of the observer and the emotions he or she feels while viewing immersive VR. To accomplish our goals, we sourced and tested 73 immersive VR clips, which participants rated on valence and arousal dimensions using self-assessment manikins. We also tracked participants' rotational head movements as they watched the clips, allowing us to correlate head movements and affect. Based on past research, we predicted relationships between the standard deviation of head yaw and valence and arousal ratings. Results showed that the stimuli varied reasonably well along the dimensions of valence and arousal, with a slight underrepresentation of clips that are of negative valence and highly arousing. The standard deviation of yaw positively correlated with valence, while a significant positive relationship was found between head pitch and arousal. The immersive VR clips tested are available online as supplemental material.
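The movement-affect analysis described above (the standard deviation of head yaw per clip, correlated with ratings) can be sketched directly. This is an illustrative reconstruction with hypothetical data, not the study's actual pipeline; note also that treating yaw as a linear rather than circular variable is a simplification:

```python
import numpy as np

def yaw_dispersion(yaw_deg):
    """Per-clip head-movement summary: std of yaw samples, in degrees.

    Simplification: yaw is a circular quantity; a plain standard
    deviation is only reasonable for head movements well under 360.
    """
    return float(np.std(np.asarray(yaw_deg, float)))

def pearson_r(x, y):
    """Pearson correlation between per-clip movement and mean ratings."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    xc, yc = x - x.mean(), y - y.mean()
    return float((xc @ yc) / np.sqrt((xc @ xc) * (yc @ yc)))

# Hypothetical usage: one dispersion value and one mean valence per clip
dispersions = [yaw_dispersion(clip) for clip in ([0, 5, -5], [0, 20, -20])]
```

With real data, `pearson_r(dispersions, mean_valence_ratings)` would give the reported yaw-valence correlation.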

  19. A Public Database of Immersive VR Videos with Corresponding Ratings of Arousal, Valence, and Correlations between Head Movements and Self Report Measures

    PubMed Central

    Li, Benjamin J.; Bailenson, Jeremy N.; Pines, Adam; Greenleaf, Walter J.; Williams, Leanne M.

    2017-01-01

    Virtual reality (VR) has been proposed as a methodological tool to study the basic science of psychology and other fields. One key advantage of VR is that sharing of virtual content can lead to more robust replication and representative sampling. A database of standardized content will help fulfill this vision. There are two objectives to this study. First, we seek to establish and allow public access to a database of immersive VR video clips that can act as a potential resource for studies on emotion induction using virtual reality. Second, given the large sample size of participants needed to get reliable valence and arousal ratings for our videos, we were able to explore the possible links between the head movements of the observer and the emotions he or she feels while viewing immersive VR. To accomplish our goals, we sourced and tested 73 immersive VR clips, which participants rated on valence and arousal dimensions using self-assessment manikins. We also tracked participants' rotational head movements as they watched the clips, allowing us to correlate head movements and affect. Based on past research, we predicted relationships between the standard deviation of head yaw and valence and arousal ratings. Results showed that the stimuli varied reasonably well along the dimensions of valence and arousal, with a slight underrepresentation of clips that are of negative valence and highly arousing. The standard deviation of yaw positively correlated with valence, while a significant positive relationship was found between head pitch and arousal. The immersive VR clips tested are available online as supplemental material. PMID:29259571

  20. Designers workbench: toward real-time immersive modeling

    NASA Astrophysics Data System (ADS)

    Kuester, Falko; Duchaineau, Mark A.; Hamann, Bernd; Joy, Kenneth I.; Ma, Kwan-Liu

    2000-05-01

    This paper introduces the Designers Workbench, a semi-immersive virtual environment for two-handed modeling, sculpting and analysis tasks. The paper outlines the fundamental tools, design metaphors and hardware components required for an intuitive real-time modeling system. As companies focus on streamlining productivity to cope with global competition, the migration to computer-aided design (CAD), computer-aided manufacturing, and computer-aided engineering systems has established a new backbone of modern industrial product development. However, traditionally a product design frequently originates from a clay model that, after digitization, forms the basis for the numerical description of CAD primitives. The Designers Workbench aims at closing this technology, or 'digital', gap experienced by design and CAD engineers by transforming the classical design paradigm into its fully integrated digital and virtual analog, allowing collaborative development in a semi-immersive virtual environment. This project emphasizes two key components from the classical product design cycle: freeform modeling and analysis. In the freeform modeling stage, content creation in the form of two-handed sculpting of arbitrary objects using polygonal, volumetric or mathematically defined primitives is emphasized, whereas the analysis component provides the tools required for pre- and post-processing steps for finite element analysis tasks applied to the created models.

  1. The Virtual Tablet: Virtual Reality as a Control System

    NASA Technical Reports Server (NTRS)

    Chronister, Andrew

    2016-01-01

    In the field of human-computer interaction, Augmented Reality (AR) and Virtual Reality (VR) have been rapidly growing areas of interest and concerted development effort thanks to both private and public research. At NASA, a number of groups have explored the possibilities afforded by AR and VR technology, among which is the IT Advanced Concepts Lab (ITACL). Within ITACL, the AVR (Augmented/Virtual Reality) Lab focuses on VR technology specifically for its use in command and control. Previous work in the AVR Lab includes the Natural User Interface (NUI) project and the Virtual Control Panel (VCP) project, which created virtual three-dimensional interfaces that users could interact with while wearing a VR headset, thanks to body- and hand-tracking technology. The Virtual Tablet (VT) project attempts to improve on these previous efforts by incorporating a physical surrogate which is mirrored in the virtual environment, mitigating two issues discovered in those efforts: the difficulty of visually determining the interface location and the lack of tactile feedback. The physical surrogate takes the form of a handheld sheet of acrylic glass with several infrared-reflective markers and a sensor package attached. Using the sensor package to track orientation and a motion-capture system to track the marker positions, a model of the surrogate is placed in the virtual environment at a position which corresponds with the real-world location relative to the user's VR Head Mounted Display (HMD). A set of control mechanisms is then projected onto the surface of the surrogate such that to the user, immersed in VR, the control interface appears to be attached to the object they are holding. The VT project was taken from an early stage where the sensor package, motion-capture system, and physical surrogate had been constructed or tested individually but not yet combined or incorporated into the virtual environment.
My contribution was to combine the pieces of hardware, write software to incorporate each piece of position or orientation data into a coherent description of the object's location in space, place the virtual analogue accordingly, and project the control interface onto it, resulting in a functioning object which has both a physical and a virtual presence. Additionally, the virtual environment was enhanced with two live video feeds from cameras mounted on the robotic device being used as an example target of the virtual interface. The working VT allows users to naturally interact with a control interface with little to no training and without the issues found in previous efforts.
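
    The fusion step described above — combining IMU orientation with motion-capture marker positions into one pose for the virtual surrogate — can be sketched as follows. This is an illustrative sketch only: the function names, the quaternion convention (w, x, y, z), and the centroid-based position estimate are assumptions, not code from the actual VT project.

    ```python
    def quat_to_matrix(q):
        """Convert a unit quaternion (w, x, y, z) to a 3x3 rotation matrix."""
        w, x, y, z = q
        return [
            [1 - 2*(y*y + z*z), 2*(x*y - w*z),     2*(x*z + w*y)],
            [2*(x*y + w*z),     1 - 2*(x*x + z*z), 2*(y*z - w*x)],
            [2*(x*z - w*y),     2*(y*z + w*x),     1 - 2*(x*x + y*y)],
        ]

    def marker_centroid(markers):
        """Estimate the surrogate's position as the centroid of its tracked markers."""
        n = len(markers)
        return [sum(m[i] for m in markers) / n for i in range(3)]

    def surrogate_pose(imu_quat, markers):
        """Fuse IMU orientation with mocap-derived position into a (rotation, translation) pose."""
        return quat_to_matrix(imu_quat), marker_centroid(markers)
    ```

    Each frame, a pose like this would position the virtual model of the acrylic surrogate so it coincides with the physical sheet relative to the user's HMD.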

  2. Effect of Virtual Reality Exposure and Aural Stimuli on Eye Contact, Directional Focus, and Focus of Attention of Novice Wind Band Conductors

    ERIC Educational Resources Information Center

    Orman, Evelyn K.

    2016-01-01

    This study examined the effects of virtual reality immersion with audio on eye contact, directional focus and focus of attention for novice wind band conductors. Participants (N = 34) included a control group (n = 12) and two virtual reality groups with (n = 10) and without (n = 12) head tracking. Participants completed conducting/score study…

  3. EXPLORING ENVIRONMENTAL DATA IN A HIGHLY IMMERSIVE VIRTUAL REALITY ENVIRONMENT

    EPA Science Inventory

Geography inherently fills a 3D space, and yet we struggle to display it using primarily 2D display devices. Virtual environments offer a more realistically-dimensioned display space, and this is being realized in the expanding area of research on 3D Geographic Infor...

  4. Touching proteins with virtual bare hands - Visualizing protein-drug complexes and their dynamics in self-made virtual reality using gaming hardware

    NASA Astrophysics Data System (ADS)

    Ratamero, Erick Martins; Bellini, Dom; Dowson, Christopher G.; Römer, Rudolf A.

    2018-06-01

The ability to precisely visualize the atomic geometry of the interactions between a drug and its protein target in structural models is critical in predicting the correct modifications in previously identified inhibitors to create more effective next generation drugs. It is currently common practice among medicinal chemists while attempting the above to access the information contained in three-dimensional structures by using two-dimensional projections, which can preclude disclosure of useful features. A more accessible and intuitive visualization of the three-dimensional configuration of the atomic geometry in the models can be achieved through the implementation of immersive virtual reality (VR). While bespoke commercial VR suites are available, in this work, we present a freely available software pipeline for visualising protein structures through VR. New consumer hardware, such as the HTC Vive and the Oculus Rift utilized in this study, are available at reasonable prices. As an instructive example, we have combined VR visualization with fast algorithms for simulating intramolecular motions of protein flexibility, in an effort to further improve structure-led drug design by exposing molecular interactions that might be hidden in the less informative static models. This is a paradigmatic test case scenario for many similar applications in computer-aided molecular studies and design.

  5. Touching proteins with virtual bare hands : Visualizing protein-drug complexes and their dynamics in self-made virtual reality using gaming hardware.

    PubMed

    Ratamero, Erick Martins; Bellini, Dom; Dowson, Christopher G; Römer, Rudolf A

    2018-06-07

The ability to precisely visualize the atomic geometry of the interactions between a drug and its protein target in structural models is critical in predicting the correct modifications in previously identified inhibitors to create more effective next generation drugs. It is currently common practice among medicinal chemists while attempting the above to access the information contained in three-dimensional structures by using two-dimensional projections, which can preclude disclosure of useful features. A more accessible and intuitive visualization of the three-dimensional configuration of the atomic geometry in the models can be achieved through the implementation of immersive virtual reality (VR). While bespoke commercial VR suites are available, in this work, we present a freely available software pipeline for visualising protein structures through VR. New consumer hardware, such as the HTC Vive and the Oculus Rift utilized in this study, are available at reasonable prices. As an instructive example, we have combined VR visualization with fast algorithms for simulating intramolecular motions of protein flexibility, in an effort to further improve structure-led drug design by exposing molecular interactions that might be hidden in the less informative static models. This is a paradigmatic test case scenario for many similar applications in computer-aided molecular studies and design.

  6. Seeing an Embodied Virtual Hand is Analgesic Contingent on Colocation.

    PubMed

    Nierula, Birgit; Martini, Matteo; Matamala-Gomez, Marta; Slater, Mel; Sanchez-Vives, Maria V

    2017-06-01

Seeing one's own body has been reported to have analgesic properties. Analgesia has also been described when seeing an embodied virtual body colocated with the real one. However, there is controversy regarding whether this effect holds true when seeing an illusory-owned body part, such as during the rubber-hand illusion. A critical difference between these paradigms is the distance between the real and surrogate body part. Colocation of the real and surrogate arm is possible in an immersive virtual environment, but not during illusory ownership of a rubber arm. The present study aimed at testing whether the distance between a real and a virtual arm can explain such differences in terms of pain modulation. Using a paradigm of embodiment of a virtual body allowed us to evaluate heat pain thresholds at colocation and at a 30-cm distance between the real and the virtual arm. We observed a significantly higher heat pain threshold at colocation than at a 30-cm distance. The analgesic effects of seeing a virtual colocated arm were eliminated by increasing the distance between the real and the virtual arm, which explains why seeing an illusorily owned rubber arm does not consistently result in analgesia. These findings are relevant for the use of virtual reality in pain management. Looking at a virtual body has analgesic properties similar to looking at one's real body. We identify the importance of colocation between a real and a surrogate body for this to occur and thereby resolve a scientific controversy. This information is useful for exploiting immersive virtual reality in pain management.

  7. The Importance of Postural Cues for Determining Eye Height in Immersive Virtual Reality

    PubMed Central

    Leyrer, Markus; Linkenauger, Sally A.; Bülthoff, Heinrich H.; Mohler, Betty J.

    2015-01-01

    In human perception, the ability to determine eye height is essential, because eye height is used to scale heights of objects, velocities, affordances and distances, all of which allow for successful environmental interaction. It is well understood that eye height is fundamental to determine many of these percepts. Yet, how eye height itself is provided is still largely unknown. While the information potentially specifying eye height in the real world is naturally coincident in an environment with a regular ground surface, these sources of information can be easily divergent in similar and common virtual reality scenarios. Thus, we conducted virtual reality experiments where we manipulated the virtual eye height in a distance perception task to investigate how eye height might be determined in such a scenario. We found that humans rely more on their postural cues for determining their eye height if there is a conflict between visual and postural information and little opportunity for perceptual-motor calibration is provided. This is demonstrated by the predictable variations in their distance estimates. Our results suggest that the eye height in such circumstances is informed by postural cues when estimating egocentric distances in virtual reality and consequently, does not depend on an internalized value for eye height. PMID:25993274

  8. X3DOM as Carrier of the Virtual Heritage

    NASA Astrophysics Data System (ADS)

    Jung, Y.; Behr, J.; Graf, H.

    2011-09-01

Virtual Museums (VM) are a new model of communication that aims at creating a personalized, immersive, and interactive way to enhance our understanding of the world around us. The term "VM" is a shortcut that comprehends various types of digital creations. At the future internet level, the de-facto standard carrier for communicating virtual heritage is the browser front-end presenting the content and assets of museums. A major driving technology for the documentation and presentation of heritage-driven media is real-time 3D content, thus imposing new strategies for web inclusion. 3D content must become a first-class web media that can be created, modified, and shared in the same way as text, images, audio and video are handled on the web right now. A new integration model based on a DOM integration into the web browsers' architecture opens up new possibilities for declarative 3D content on the web and paves the way for new application scenarios for the virtual heritage at the future internet level. With special regard to the X3DOM project as an enabling technology for declarative 3D in HTML, this paper describes application scenarios and analyses its technological requirements for an efficient presentation and manipulation of virtual heritage assets on the web.

  9. The importance of postural cues for determining eye height in immersive virtual reality.

    PubMed

    Leyrer, Markus; Linkenauger, Sally A; Bülthoff, Heinrich H; Mohler, Betty J

    2015-01-01

    In human perception, the ability to determine eye height is essential, because eye height is used to scale heights of objects, velocities, affordances and distances, all of which allow for successful environmental interaction. It is well understood that eye height is fundamental to determine many of these percepts. Yet, how eye height itself is provided is still largely unknown. While the information potentially specifying eye height in the real world is naturally coincident in an environment with a regular ground surface, these sources of information can be easily divergent in similar and common virtual reality scenarios. Thus, we conducted virtual reality experiments where we manipulated the virtual eye height in a distance perception task to investigate how eye height might be determined in such a scenario. We found that humans rely more on their postural cues for determining their eye height if there is a conflict between visual and postural information and little opportunity for perceptual-motor calibration is provided. This is demonstrated by the predictable variations in their distance estimates. Our results suggest that the eye height in such circumstances is informed by postural cues when estimating egocentric distances in virtual reality and consequently, does not depend on an internalized value for eye height.

  10. Hands-on Learning in the Virtual World

    ERIC Educational Resources Information Center

    Branson, John; Thomson, Diane

    2013-01-01

    The U.S. military has long understood the value of immersive simulations in education. Before the Navy entrusts a ship to a crew, crew members must first practice and demonstrate their competency in a fully immersive, simulated environment. Why not teach students in the same way? K-12 educators in Pennsylvania, USA, recently did just that when…

  11. Engagement with Electronic Screen Media among Students with Autism Spectrum Disorders

    ERIC Educational Resources Information Center

    Mineo, Beth A.; Ziegler, William; Gill, Susan; Salkin, Donna

    2009-01-01

    This study investigated the relative engagement potential of four types of electronic screen media (ESM): animated video, video of self, video of a familiar person engaged with an immersive virtual reality (VR) game, and immersion of self in the VR game. Forty-two students with autism, varying in age and expressive communication ability, were…

  12. Affordances and Limitations of Immersive Participatory Augmented Reality Simulations for Teaching and Learning

    ERIC Educational Resources Information Center

    Dunleavy, Matt; Dede, Chris; Mitchell, Rebecca

    2009-01-01

    The purpose of this study was to document how teachers and students describe and comprehend the ways in which participating in an augmented reality (AR) simulation aids or hinders teaching and learning. Like the multi-user virtual environment (MUVE) interface that underlies Internet games, AR is a good medium for immersive collaborative…

  13. Development of an audio-based virtual gaming environment to assist with navigation skills in the blind.

    PubMed

    Connors, Erin C; Yazzolino, Lindsay A; Sánchez, Jaime; Merabet, Lotfi B

    2013-03-27

    Audio-based Environment Simulator (AbES) is virtual environment software designed to improve real world navigation skills in the blind. Using only audio based cues and set within the context of a video game metaphor, users gather relevant spatial information regarding a building's layout. This allows the user to develop an accurate spatial cognitive map of a large-scale three-dimensional space that can be manipulated for the purposes of a real indoor navigation task. After game play, participants are then assessed on their ability to navigate within the target physical building represented in the game. Preliminary results suggest that early blind users were able to acquire relevant information regarding the spatial layout of a previously unfamiliar building as indexed by their performance on a series of navigation tasks. These tasks included path finding through the virtual and physical building, as well as a series of drop off tasks. We find that the immersive and highly interactive nature of the AbES software appears to greatly engage the blind user to actively explore the virtual environment. Applications of this approach may extend to larger populations of visually impaired individuals.

  14. Finite Element Methods for real-time Haptic Feedback of Soft-Tissue Models in Virtual Reality Simulators

    NASA Technical Reports Server (NTRS)

    Frank, Andreas O.; Twombly, I. Alexander; Barth, Timothy J.; Smith, Jeffrey D.; Dalton, Bonnie P. (Technical Monitor)

    2001-01-01

    We have applied the linear elastic finite element method to compute haptic force feedback and domain deformations of soft tissue models for use in virtual reality simulators. Our results show that, for virtual object models of high-resolution 3D data (>10,000 nodes), haptic real time computations (>500 Hz) are not currently possible using traditional methods. Current research efforts are focused in the following areas: 1) efficient implementation of fully adaptive multi-resolution methods and 2) multi-resolution methods with specialized basis functions to capture the singularity at the haptic interface (point loading). To achieve real time computations, we propose parallel processing of a Jacobi preconditioned conjugate gradient method applied to a reduced system of equations resulting from surface domain decomposition. This can effectively be achieved using reconfigurable computing systems such as field programmable gate arrays (FPGA), thereby providing a flexible solution that allows for new FPGA implementations as improved algorithms become available. The resulting soft tissue simulation system would meet NASA Virtual Glovebox requirements and, at the same time, provide a generalized simulation engine for any immersive environment application, such as biomedical/surgical procedures or interactive scientific applications.
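
    The solver the abstract proposes — a Jacobi-preconditioned conjugate gradient method applied to the reduced system K u = f — can be sketched in a few lines. This is a minimal pure-Python illustration on a tiny symmetric positive-definite example, not the parallel FPGA implementation the paper envisions; function and parameter names are assumed.

    ```python
    def jacobi_pcg(K, f, tol=1e-10, max_iter=100):
        """Solve K u = f for an SPD matrix K using conjugate gradients
        with a Jacobi (inverse-diagonal) preconditioner."""
        n = len(f)
        u = [0.0] * n
        r = f[:]                                   # residual r = f - K u (u = 0 initially)
        M_inv = [1.0 / K[i][i] for i in range(n)]  # Jacobi preconditioner: 1 / diag(K)
        z = [M_inv[i] * r[i] for i in range(n)]    # preconditioned residual
        p = z[:]                                   # initial search direction
        rz = sum(r[i] * z[i] for i in range(n))
        for _ in range(max_iter):
            Kp = [sum(K[i][j] * p[j] for j in range(n)) for i in range(n)]
            alpha = rz / sum(p[i] * Kp[i] for i in range(n))
            u = [u[i] + alpha * p[i] for i in range(n)]
            r = [r[i] - alpha * Kp[i] for i in range(n)]
            if sum(ri * ri for ri in r) ** 0.5 < tol:
                break
            z = [M_inv[i] * r[i] for i in range(n)]
            rz_new = sum(r[i] * z[i] for i in range(n))
            p = [z[i] + (rz_new / rz) * p[i] for i in range(n)]
            rz = rz_new
        return u
    ```

    In the haptic setting, each iteration's dense matrix-vector product is the dominant cost, which is why the abstract proposes surface domain decomposition (to shrink n) and parallel hardware (to amortize the product) to reach >500 Hz update rates.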

  15. Knowledge and Valorization of Historical Sites Through 3d Documentation and Modeling

    NASA Astrophysics Data System (ADS)

    Farella, E.; Menna, F.; Nocerino, E.; Morabito, D.; Remondino, F.; Campi, M.

    2016-06-01

The paper presents the first results of an interdisciplinary project related to the 3D documentation, dissemination, valorization and digital access of archaeological sites. Besides the mere 3D documentation aim, the project has two goals: (i) to easily explore and share via web the references and results of the interdisciplinary work, including the interpretative process and the final reconstruction of the remains; (ii) to promote and valorize archaeological areas using reality-based 3D data and Virtual Reality devices. This method has been verified on the ruins of the archaeological site of Pausilypon, a maritime villa of the Roman period (Naples, Italy). Using Unity3D, the virtual tour of the heritage site was integrated and enriched with the surveyed 3D data, text documents, CAAD reconstruction hypotheses, drawings, photos, etc. In this way, starting from the actual appearance of the ruins (panoramic images) and passing through the 3D digital surveying models and other historical information, the user is able to access virtual contents and reconstructed scenarios, all in a single virtual, interactive and immersive environment. These contents and scenarios allow users to derive documentation and geometrical information, understand the site, perform analyses, follow interpretative processes, communicate historical information and valorize the heritage location.

  16. The use of virtual reality in memory rehabilitation: current findings and future directions.

    PubMed

    Brooks, B M; Rose, F D

    2003-01-01

    There is considerable potential for using virtual reality (VR) in memory rehabilitation which is only just beginning to be realized. PC-based virtual environments are probably better suited for this purpose than more immersive virtual environments because they are relatively inexpensive and portable, and less frightening to patients. Those exploratory studies that have so far been performed indicate that VR involvement would be usefully directed towards improving assessments of memory impairments and in memory remediation using reorganization techniques. In memory assessment, the use of VR could provide more comprehensive, ecologically-valid, and controlled evaluations of prospective, incidental, and spatial memory in a rehabilitation setting than is possible using standardized assessment tests. The additional knowledge gained from these assessments could more effectively direct rehabilitation towards specific impairments of individual patients. In memory remediation, VR training has been found to promote procedural learning in people with memory impairments, and this learning has been found to transfer to improved real-world performance. Future research should investigate ways in which the procedural knowledge gained during VR interaction can be adapted to offset the many disabilities which result from different forms of memory impairment.

  17. An Augmented Reality Nanomanipulator for Learning Nanophysics: The "NanoLearner" Platform

    NASA Astrophysics Data System (ADS)

    Marchi, Florence; Marliere, Sylvain; Florens, Jean Loup; Luciani, Annie; Chevrier, Joel

The work focuses on the description and evaluation of an augmented reality nanomanipulator, called the "NanoLearner" platform, used as an educational tool in practical nanophysics classes. Through virtual reality associated with multisensory renderings, students are immersed in the nanoworld, where they can interact in real time with a sample surface or an object using senses such as hearing, sight and touch. The role of each sensorial rendering in the understanding and control of the "approach-retract" interaction was determined through statistical studies conducted during the practical sessions. Finally, we present two extensions of this innovative tool: investigating nano effects in living organisms, and giving the general public access to a natural understanding of nanophenomena.

  18. Expeditious illustration of layer-cake models on and above a tactile surface

    NASA Astrophysics Data System (ADS)

    Lopes, Daniel Simões; Mendes, Daniel; Sousa, Maurício; Jorge, Joaquim

    2016-05-01

Too often, 3D geological concepts are illustrated and visualized by sketching in 2D media, which may limit how well initial concepts can be drawn. Here, the potential of expeditious geological modeling through hand gestures is explored. A spatial interaction system was developed to enable rapid modeling, editing, and exploration of 3D layer-cake objects. User interactions are acquired with motion capture and touch screen technologies. Virtual immersion is guaranteed by using stereoscopic technology. The novelty consists of performing expeditious modeling of coarse geological features with only a limited set of hand gestures. Results from usability studies show that the proposed system is more efficient when compared to a windows-icon-menu-pointer modeling application.

  19. 3DUI assisted lower and upper member therapy.

    PubMed

    Uribe-Quevedo, Alvaro; Perez-Gutierrez, Byron

    2012-01-01

3DUIs are becoming very popular among researchers, developers and users, as they allow more immersive and interactive experiences by taking advantage of human dexterity. The features offered by these interfaces outside the gaming environment have allowed the development of applications in the medical area, enhancing the user experience and aiding the therapy process in controlled and monitored environments. Using mainstream videogame 3DUIs based on inertial and image sensors available on the market, this work presents the development of a virtual environment navigated through captured lower-limb gestures to assist motion during therapy.

  20. Cognitive Presence and Effect of Immersion in Virtual Learning Environment

    ERIC Educational Resources Information Center

    Katernyak, Ihor; Loboda, Viktoriya

    2016-01-01

    This paper presents the approach to successful application of two knowledge management techniques--community of practice and eLearning, in order to create and manage a competence-developing virtual learning environment. It explains how "4A" model of involving practitioners in eLearning process (through attention, actualization,…

  1. Evaluation of Virtual Reality Training Using Affect

    ERIC Educational Resources Information Center

    Tichon, Jennifer

    2012-01-01

    Training designed to support and strengthen higher-order mental abilities now often involves immersion in Virtual Reality (VR) where dangerous real world scenarios can be safely replicated. However, despite the growing popularity of VR to train cognitive skills such as decision-making and situation awareness, methods for evaluating their use rely…

  2. Undergraduate Student Self-Efficacy and Perceptions of Virtual World Learning Experience

    ERIC Educational Resources Information Center

    Stanton, Lorraine May

    2017-01-01

    Virtual worlds are innovative teaching and learning methods that can provide immersive and engaging learning experiences (Lu, 2010). Though they have potential benefits, students sometimes experience a steep learning curve and discomfort with the technology (Warburton, 2009). This study explored how students in two American Studies classes using…

  3. Virtual Environments and Autism: A Developmental Psychopathological Approach

    ERIC Educational Resources Information Center

    Rajendran, G.

    2013-01-01

    Individuals with autism spectrum disorders supposedly have an affinity with information and communication technology (ICT), making it an ideally suited media for this population. Virtual environments (VEs)--both two-dimensional and immersive--represent a particular kind of ICT that might be of special benefit. Specifically, this paper discusses…

  4. Intelligent Tutors in Immersive Virtual Environments

    ERIC Educational Resources Information Center

    Yan, Peng; Slator, Brian M.; Vender, Bradley; Jin, Wei; Kariluoma, Matti; Borchert, Otto; Hokanson, Guy; Aggarwal, Vaibhav; Cosmano, Bob; Cox, Kathleen T.; Pilch, André; Marry, Andrew

    2013-01-01

    Research into virtual role-based learning has progressed over the past decade. Modern issues include gauging the difficulty of designing a goal system capable of meeting the requirements of students with different knowledge levels, and the reasonability and possibility of taking advantage of the well-designed formula and techniques served in other…

  5. Exploring Moral Action Using Immersive Virtual Reality

    DTIC Science & Technology

    2016-10-01

    the Obedience. in The Bar experimental scenario is in the context of sexual harassment and has two phases, all in immersive virtual reality. In...a paper for submission to a high-impact journal (depending of course on the final results). 4. Conclusions The original proposal set out the

  6. Similarities and differences between eating disorders and obese patients in a virtual environment for normalizing eating patterns.

    PubMed

    Perpiñá, Conxa; Roncero, María

    2016-05-01

Virtual reality has demonstrated promising results in the treatment of eating disorders (ED); however, few studies have examined its usefulness in treating obesity. The aim of this study was to compare ED and obese patients on their reality judgment of a virtual environment (VE) designed to normalize their eating pattern. A second objective was to study which variables predicted the reality of the experience of eating a virtual forbidden-fattening food. ED patients, obese patients, and a non-clinical group (N=62) experienced a non-immersive VE, and then completed reality judgment and presence measures. All participants rated the VE with similar scores for quality, interaction, engagement, and ecological validity; however, ED patients obtained the highest scores on emotional involvement, attention, reality judgment/presence, and negative effects. The obese group gave the lowest scores to reality judgment/presence, satisfaction and sense of physical space, and they held an intermediate position in the attribution of reality to virtually eating a "fattening" food. The palatability of a virtual food was predicted by attention capturing and belonging to the obese group, while the attribution of reality to the virtual eating was predicted by engagement and belonging to the ED group. This study offers preliminary results about the differential impact on ED and obese patients of the exposure to virtual food, and about the need to implement a VE that can be useful as a virtual lab for studying eating behavior and treating obesity.

  7. Assessment of radiation awareness training in immersive virtual environments

    NASA Astrophysics Data System (ADS)

    Whisker, Vaughn E., III

    The prospect of new nuclear power plant orders in the near future and the graying of the current workforce create a need to train new personnel faster and better. Immersive virtual reality (VR) may offer a solution to the training challenge. VR technology presented in a CAVE Automatic Virtual Environment (CAVE) provides a high-fidelity, one-to-one scale environment where areas of the power plant can be recreated and virtual radiation environments can be simulated, making it possible to safely expose workers to virtual radiation in the context of the actual work environment. The use of virtual reality for training is supported by many educational theories; constructivism and discovery learning, in particular. Educational theory describes the importance of matching the training to the task. Plant access training and radiation worker training, common forms of training in the nuclear industry, rely on computer-based training methods in most cases, which effectively transfer declarative knowledge, but are poor at transferring skills. If an activity were to be added, the training would provide personnel with the opportunity to develop skills and apply their knowledge so they could be more effective when working in the radiation environment. An experiment was developed to test immersive virtual reality's suitability for training radiation awareness. Using a mixed methodology of quantitative and qualitative measures, the subjects' performances before and after training were assessed. First, subjects completed a pre-test to measure their knowledge prior to completing any training. Next they completed unsupervised computer-based training, which consisted of a PowerPoint presentation and a PDF document. 
After completing a brief orientation activity in the virtual environment, one group of participants received supplemental radiation awareness training in a simulated radiation environment presented in the CAVE, while a second group, the control group, moved directly to the assessment phase of the experiment. The CAVE supplied an activity-based training environment where learners were able to use a virtual survey meter to explore the properties of radiation sources and the effects of time and distance on radiation exposure. Once the training stage had ended, the subjects completed an assessment activity where they were asked to complete four tasks in a simulated radiation environment in the CAVE, which was designed to provide a more authentic assessment than simply testing understanding using a quiz. After the practicum, the subjects completed a post-test. Survey information was also collected to assist the researcher with interpretation of the collected data. Response to the training was measured by completion time, radiation exposure received, successful completion of the four tasks in the practicum, and scores on the post-test. These results were combined to create a radiation awareness score. In addition, observational data was collected as the subjects completed the tasks. The radiation awareness scores of the control group and the group that received supplemental training in the virtual environment were compared. T-tests showed that the effect of the supplemental training was not significant; however, calculation of the effect size showed a small-to-medium effect of the training. The CAVE group received significantly less radiation exposure during the assessment activity, and they completed the activities on an average of one minute faster. These results indicate that the training was effective, primarily for instilling radiation sensitivity. Observational data collected during the assessment supports this conclusion. 
The training environment provided by immersive virtual reality recreated a radiation environment where learners could apply knowledge they had been taught by computer-based training. Activity-based training has been shown to be a more effective way to transfer skills because of the similarity between the training environment and the application environment. Virtual reality enables the training environment to look and feel like the application environment. Because of this, and supported by the results of this experiment, radiation awareness training in an immersive virtual environment should be considered by the nuclear industry.

  8. Temporally coherent 4D video segmentation for teleconferencing

    NASA Astrophysics Data System (ADS)

    Ehmann, Jana; Guleryuz, Onur G.

    2013-09-01

We develop an algorithm for 4-D (RGB+Depth) video segmentation targeting immersive teleconferencing applications on emerging mobile devices. Our algorithm extracts users from their environments and places them onto virtual backgrounds, similar to green-screening. The virtual backgrounds increase immersion and interactivity, relieving the users of the system from distractions caused by disparate environments. Commodity depth sensors, while providing useful information for segmentation, result in noisy depth maps with a large number of missing depth values. By combining depth and RGB information, our work significantly improves the otherwise very coarse segmentation. Further imposing temporal coherence yields compositions where the foregrounds seamlessly blend with the virtual backgrounds with minimal flicker and other artifacts. We achieve said improvements by correcting the missing information in depth maps before fast RGB-based segmentation, which operates in conjunction with temporal coherence. Simulation results indicate the efficacy of the proposed system in video conferencing scenarios.
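
    The first stage described above — repairing missing depth values before depth-assisted foreground extraction — can be sketched on a single scanline. This is an illustrative sketch under assumed names and a deliberately simple nearest-valid-neighbour fill; the paper's actual hole-filling and its RGB and temporal-coherence stages are not reproduced here.

    ```python
    def fill_missing_depth(row):
        """Fill None entries in a depth scanline with the nearest valid value.

        Two passes: left-to-right propagates the last valid depth forward,
        right-to-left fills any leading holes from the right.
        """
        filled = row[:]
        last = None
        for i, d in enumerate(filled):            # left-to-right pass
            if d is None:
                filled[i] = last
            else:
                last = d
        last = None
        for i in range(len(filled) - 1, -1, -1):  # right-to-left pass
            if filled[i] is None:
                filled[i] = last
            else:
                last = filled[i]
        return filled

    def segment_foreground(row, threshold):
        """Mark pixels nearer than `threshold` (e.g. the user) as foreground."""
        return [d is not None and d < threshold for d in fill_missing_depth(row)]
    ```

    The resulting coarse depth mask would then be refined with RGB cues and smoothed across frames, as the abstract describes, to suppress flicker at the foreground boundary.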

  9. New tools for sculpting cranial implants in a shared haptic augmented reality environment.

    PubMed

    Ai, Zhuming; Evenhouse, Ray; Leigh, Jason; Charbel, Fady; Rasmussen, Mary

    2006-01-01

    New volumetric tools were developed for the design and fabrication of high quality cranial implants from patient CT data. These virtual tools replace time consuming physical sculpting, mold making and casting steps. The implant is designed by medical professionals in tele-immersive collaboration. Virtual clay is added in the virtual defect area on the CT data using the adding tool. With force feedback the modeler can feel the edge of the defect and fill only the space where no bone is present. A carving tool and a smoothing tool are then used to sculpt and refine the implant. To make a physical evaluation, the skull with simulated defect and the implant are fabricated via stereolithography to allow neurosurgeons to evaluate the quality of the implant. Initial tests demonstrate a very high quality fit. These new haptic volumetric sculpting tools are a critical component of a comprehensive tele-immersive system.

  10. Height, social comparison, and paranoia: An immersive virtual reality experimental study

    PubMed Central

    Freeman, Daniel; Evans, Nicole; Lister, Rachel; Antley, Angus; Dunn, Graham; Slater, Mel

    2014-01-01

    Mistrust of others may build upon perceptions of the self as vulnerable, consistent with an association of paranoia with perceived lower social rank. Height is a marker of social status and authority. Therefore we tested the effect of manipulating height, as a proxy for social rank, on paranoia. Height was manipulated within an immersive virtual reality simulation. Sixty females who reported paranoia experienced a virtual reality train ride twice: at their normal and reduced height. Paranoia and social comparison were assessed. Reducing a person's height resulted in more negative views of the self in comparison with other people and increased levels of paranoia. The increase in paranoia was fully mediated by changes in social comparison. The study provides the first demonstration that reducing height in a social situation increases the occurrence of paranoia. The findings indicate that negative social comparison is a cause of mistrust. PMID:24924485

  11. Second Life in Higher Education: Assessing the Potential for and the Barriers to Deploying Virtual Worlds in Learning and Teaching

    ERIC Educational Resources Information Center

    Warburton, Steven

    2009-01-01

    "Second Life" (SL) is currently the most mature and popular multi-user virtual world platform being used in education. Through an in-depth examination of SL, this article explores its potential and the barriers that multi-user virtual environments present to educators wanting to use immersive 3-D spaces in their teaching. The context is set by…

  12. Virtual Reality in Neurointervention.

    PubMed

    Ong, Chin Siang; Deib, Gerard; Yesantharao, Pooja; Qiao, Ye; Pakpoor, Jina; Hibino, Narutoshi; Hui, Ferdinand; Garcia, Juan R

    2018-06-01

    Virtual reality (VR) allows users to experience realistic, immersive 3D virtual environments with the depth perception and binocular field of view of real 3D settings. Newer VR technology has now allowed for interaction with 3D objects within these virtual environments through the use of VR controllers. This technical note describes our preliminary experience with VR as an adjunct tool to traditional angiographic imaging in the preprocedural workup of a patient with a complex pseudoaneurysm. Angiographic MRI data was imported and segmented to create 3D meshes of bilateral carotid vasculature. The 3D meshes were then projected into VR space, allowing the operator to inspect the carotid vasculature using a 3D VR headset as well as interact with the pseudoaneurysm (handling, rotation, magnification, and sectioning) using two VR controllers. 3D segmentation of a complex pseudoaneurysm in the distal cervical segment of the right internal carotid artery was successfully performed and projected into VR. Conventional and VR visualization modes were equally effective in identifying and classifying the pathology. VR visualization allowed the operators to manipulate the dataset to achieve a greater understanding of the anatomy of the parent vessel, the angioarchitecture of the pseudoaneurysm, and the surface contours of all visualized structures. This preliminary study demonstrates the feasibility of utilizing VR for preprocedural evaluation in patients with anatomically complex neurovascular disorders. This novel visualization approach may serve as a valuable adjunct tool in deciding patient-specific treatment plans and selection of devices prior to intervention.
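The interactions this record describes (handling, rotation, magnification of the projected mesh) reduce to applying 4x4 homogeneous transforms to the mesh vertices. A minimal sketch of that idea, not the authors' software:

```python
import numpy as np

def rotation_z(theta):
    """4x4 homogeneous rotation about the z axis."""
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c, -s, 0, 0],
                     [s,  c, 0, 0],
                     [0,  0, 1, 0],
                     [0,  0, 0, 1]])

def scale(factor):
    """4x4 homogeneous uniform scaling (magnification)."""
    m = np.eye(4)
    m[:3, :3] *= factor
    return m

def transform(vertices, matrix):
    """Apply a 4x4 transform to an (N, 3) array of mesh vertices."""
    homo = np.hstack([vertices, np.ones((len(vertices), 1))])
    return (homo @ matrix.T)[:, :3]

# Hypothetical vessel-mesh vertex, magnified 2x then rotated 90 degrees.
v = np.array([[1.0, 0.0, 0.0]])
m = rotation_z(np.pi / 2) @ scale(2.0)
print(np.round(transform(v, m), 3))  # → [[0. 2. 0.]]
```

Composing the matrices once and applying the product to all vertices is what makes interactive manipulation of large segmented meshes cheap.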

  13. Smartphone applications for immersive virtual reality therapy for internet addiction and internet gaming disorder.

    PubMed

    Zhang, Melvyn W B; Ho, Roger C M

    2017-01-01

There have been rapid advances in technology over the past decade, and virtual reality technology is increasingly utilized as a healthcare intervention in many disciplines, including Medicine, Surgery and Psychiatry. In Psychiatry, most current interventions involving virtual reality technology are limited to anxiety disorders. With the advances in technology, Internet addiction and Internet gaming disorder are increasingly prevalent. To date, these disorders are still treated using conventional psychotherapy methods such as cognitive behavioural therapy. However, a growing body of research combines various other therapies with cognitive behavioural therapy, possibly in an attempt to reduce drop-out rates and to make such interventions more relevant to the targeted group of addicts, who are mostly adolescents. A prior study in Korea has demonstrated efficacy of virtual reality therapy comparable to that of cognitive behavioural therapy; however, the intervention requires specialized screens and devices. The objective of the current article is thus to highlight how smartphone applications could be designed and utilized for immersive virtual reality treatment, alongside low-cost wearables.

  14. An exploratory fNIRS study with immersive virtual reality: a new method for technical implementation.

    PubMed

    Seraglia, Bruno; Gamberini, Luciano; Priftis, Konstantinos; Scatturin, Pietro; Martinelli, Massimiliano; Cutini, Simone

    2011-01-01

For over two decades, Virtual Reality (VR) has been a useful tool in several fields, from medical and psychological treatments to industrial and military applications. Only in recent years have researchers begun to study the neural correlates that subtend VR experiences. Although functional Magnetic Resonance Imaging (fMRI) is the most commonly used technique, it suffers from several limitations and problems. Here we present a methodology that involves the use of a new and growing brain imaging technique, functional Near-Infrared Spectroscopy (fNIRS), while participants experience immersive VR. In order to allow proper application of the fNIRS probes, a custom-made VR helmet was created. To test the adapted helmet, a virtual version of the line bisection task was used. Participants could bisect lines in virtual peripersonal or extrapersonal space by manipulating a Nintendo Wiimote® controller to move a virtual laser pointer. Although no neural correlates of the dissociation between peripersonal and extrapersonal space were found, significant hemodynamic activity with respect to baseline was present in the right parietal and occipital areas. Both advantages and disadvantages of the presented methodology are discussed.

  15. How 3D immersive visualization is changing medical diagnostics

    NASA Astrophysics Data System (ADS)

    Koning, Anton H. J.

    2011-03-01

Originally, the only way to look inside the human body without opening it up was by means of two-dimensional (2D) images obtained using X-ray equipment. The fact that human anatomy is inherently three dimensional leads to ambiguities in interpretation and problems of occlusion. Three-dimensional (3D) imaging modalities such as CT, MRI and 3D ultrasound remove these drawbacks and are now part of routine medical care. While most hospitals 'have gone digital', meaning that the images are no longer printed on film, the images are still viewed on 2D screens. Viewed this way, however, valuable depth information is lost, and some interactions become unnecessarily complex or even unfeasible. Using a virtual reality (VR) system to present volumetric data means that depth information is presented to the viewer and 3D interaction is made possible. At the Erasmus MC we have developed V-Scope, an immersive volume visualization system for visualizing a variety of (bio-)medical volumetric datasets, ranging from 3D ultrasound, via CT and MRI, to confocal microscopy, OPT and 3D electron-microscopy data. In this talk we will address the advantages of such a system for both medical diagnostics and (bio)medical research.

  16. Development of a low-cost virtual reality workstation for training and education

    NASA Technical Reports Server (NTRS)

    Phillips, James A.

    1996-01-01

    Virtual Reality (VR) is a set of breakthrough technologies that allow a human being to enter and fully experience a 3-dimensional, computer simulated environment. A true virtual reality experience meets three criteria: (1) it involves 3-dimensional computer graphics; (2) it includes real-time feedback and response to user actions; and (3) it must provide a sense of immersion. Good examples of a virtual reality simulator are the flight simulators used by all branches of the military to train pilots for combat in high performance jet fighters. The fidelity of such simulators is extremely high -- but so is the price tag, typically millions of dollars. Virtual reality teaching and training methods are manifestly effective, but the high cost of VR technology has limited its practical application to fields with big budgets, such as military combat simulation, commercial pilot training, and certain projects within the space program. However, in the last year there has been a revolution in the cost of VR technology. The speed of inexpensive personal computers has increased dramatically, especially with the introduction of the Pentium processor and the PCI bus for IBM-compatibles, and the cost of high-quality virtual reality peripherals has plummeted. The result is that many public schools, colleges, and universities can afford a PC-based workstation capable of running immersive virtual reality applications. My goal this summer was to assemble and evaluate such a system.

  17. Virtual Reality Used to Serve the Glenn Engineering Community

    NASA Technical Reports Server (NTRS)

    Carney, Dorothy V.

    2001-01-01

    There are a variety of innovative new visualization tools available to scientists and engineers for the display and analysis of their models. At the NASA Glenn Research Center, we have an ImmersaDesk, a large, single-panel, semi-immersive display device. This versatile unit can interactively display three-dimensional images in visual stereo. Our challenge is to make this virtual reality platform accessible and useful to researchers. An example of a successful application of this computer technology is the display of blade out simulations. NASA Glenn structural dynamicists, Dr. Kelly Carney and Dr. Charles Lawrence, funded by the Ultra Safe Propulsion Project under Base R&T, are researching blade outs, when turbine engines lose a fan blade during operation. Key objectives of this research include minimizing danger to the aircraft via effective blade containment, predicting destructive loads due to the imbalance following a blade loss, and identifying safe, cost-effective designs and materials for future engines.

  18. VIRTUAL REALITY CUE EXPOSURE THERAPY FOR THE TREATMENT OF TOBACCO DEPENDENCE

    PubMed Central

    Culbertson, Christopher S.; Shulenberger, Stephanie; De La Garza, Richard; Newton, Thomas F.; Brody, Arthur L.

    2012-01-01

    Researchers and clinicians have recently begun using Virtual Reality (VR) to create immersive and interactive cue exposure paradigms. The current study aimed to assess the effectiveness of individual cue exposure therapy (CET), using smoking-related VR cues (smoking-VR) as a smoking cessation treatment compared to a placebo-VR (neutral cue) treatment. The sample consisted of healthy treatment-seeking cigarette smokers, who underwent bi-weekly cognitive behavioral group therapy (CBT) plus either smoking-VR CET or placebo-VR CET (random assignment). Smoking-VR CET participants had a higher quit rate than placebo-VR CET participants (P = 0.015). Smoking-VR CET treated participants also reported smoking significantly fewer cigarettes per day at the end of treatment than placebo-VR CET treated participants (P = 0.034). These data indicate that smoking-related VR CET may prove useful in enhancing the efficacy of CBT treatment for tobacco dependence. PMID:25342999

  19. Repeated Use of Immersive Virtual Reality Therapy to Control Pain during Wound Dressing Changes in Pediatric and Adult Burn Patients

    PubMed Central

    Faber, Albertus W.; Patterson, David R.; Bremer, Marco

    2012-01-01

Objective The current study explored whether immersive virtual reality continues to reduce pain (via distraction) during more than one wound care session per patient. Patients: Thirty-six patients aged 8 to 57 years (mean age 27.7 years), with an average of 8.4% total body surface area burned (range 0.25 to 25.5% TBSA), received bandage changes and wound cleaning. Methods Each patient received one baseline wound cleaning/debridement session with no VR (control condition) followed by one or more (up to seven) subsequent wound care sessions during VR. After each wound care session (one session per day), worst pain intensity was measured using a Visual Analogue Thermometer (VAT), the dependent variable. Using a within-subjects design, worst pain intensity on the VAT during wound care with no VR (baseline, Day 0) was compared to pain during wound care while using immersive virtual reality (up to seven days of wound care during VR). Results Compared to pain during the no-VR baseline (Day 0), pain ratings during wound debridement were statistically lower when patients were in virtual reality on Days 1, 2 and 3; although not significant beyond Day 3, the pattern of results from Days 4, 5, and 6 is consistent with the notion that VR continues to reduce pain when used repeatedly. Conclusions Results from the present study suggest that VR continues to be effective when used for three (or possibly more) treatments during severe burn wound debridement. PMID:23970314
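The within-subjects comparison in this record (each patient's no-VR baseline pain vs. their in-VR pain) corresponds to a paired t-test on per-subject differences. A minimal sketch with hypothetical VAT scores, not the study's data:

```python
import math
import statistics

def paired_t(baseline, treatment):
    """Paired t statistic on per-subject differences (baseline - treatment)."""
    diffs = [b - t for b, t in zip(baseline, treatment)]
    n = len(diffs)
    mean_d = statistics.fmean(diffs)
    sd_d = statistics.stdev(diffs)
    return mean_d / (sd_d / math.sqrt(n))

# Hypothetical worst-pain VAT scores (0-10 scale) for 8 patients.
no_vr = [8.0, 7.5, 9.0, 6.5, 8.5, 7.0, 9.5, 8.0]  # baseline, Day 0
vr    = [5.5, 6.0, 7.0, 5.0, 6.5, 6.0, 7.5, 6.5]  # during VR
print(round(paired_t(no_vr, vr), 2))  # → 10.69
```

Pairing each patient with their own baseline removes between-patient variation in pain tolerance, which is why the design can detect an effect with only 36 patients.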

  20. Introducing an Avatar Acceptance Model: Student Intention to Use 3D Immersive Learning Tools in an Online Learning Classroom

    ERIC Educational Resources Information Center

    Kemp, Jeremy William

    2011-01-01

    This quantitative survey study examines the willingness of online students to adopt an immersive virtual environment as a classroom tool and compares this with their feelings about more traditional learning modes including our ANGEL learning management system and the Elluminate live Web conferencing tool. I surveyed 1,108 graduate students in…
