Immersive Training Systems: Virtual Reality and Education and Training.
ERIC Educational Resources Information Center
Psotka, Joseph
1995-01-01
Describes virtual reality (VR) technology and VR research on education and training. Focuses on immersion as the key added value of VR, analyzing the cognitive variables connected to immersion, how immersion is generated in synthetic environments, and its benefits. Discusses the value of tracked, immersive visual displays over nonimmersive simulations. Contains 78…
LVC interaction within a mixed-reality training system
NASA Astrophysics Data System (ADS)
Pollock, Brice; Winer, Eliot; Gilbert, Stephen; de la Cruz, Julio
2012-03-01
The United States military is increasingly pursuing advanced live, virtual, and constructive (LVC) training systems for reduced cost, greater training flexibility, and decreased training times. By combining the advantages of realistic training environments and virtual worlds, mixed reality LVC training systems can enable live and virtual trainees to interact as if co-located. However, LVC interaction in these systems often requires constructing immersive environments, developing hardware for live-virtual interaction, tracking in occluded environments, and an architecture that supports real-time transfer of entity information across many systems. This paper discusses a system that overcomes these challenges to enable LVC interaction in a reconfigurable, mixed reality environment. The system was developed and tested in the Veldt, an immersive, reconfigurable, mixed reality LVC training facility for the dismounted warfighter at ISU that also serves as a test bed for cutting-edge technology aimed at future U.S. Army battlefield requirements. Trainees interact physically in the Veldt and virtually through commercial and custom game engines. Evaluation involving military-trained personnel found the system to be effective, immersive, and useful for developing the critical decision-making skills necessary for the battlefield. Procedural terrain modeling, model-matching database techniques, and a central communication server process all live and virtual entity data from system components to create a cohesive virtual world across all distributed simulators and game engines in real time. The result is rare LVC interaction within multiple physical and virtual immersive environments for training, in real time, across many distributed systems.
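The abstract above describes a central communication server that relays live and virtual entity data to all distributed simulators and game engines. The following minimal Python sketch is a hypothetical illustration of that broadcast-relay pattern only, not the Veldt's actual implementation, message format, or port; it assumes one JSON entity-state object per line over TCP.

    # Hypothetical relay server: each connected simulator sends entity-state
    # updates; the server rebroadcasts every update to all other participants.
    import asyncio
    import json

    clients = set()  # stream writers for every connected simulator or game engine

    async def handle_simulator(reader, writer):
        clients.add(writer)
        try:
            while True:
                line = await reader.readline()      # one JSON object per line
                if not line:
                    break                           # simulator disconnected
                entity = json.loads(line)           # e.g. {"id": 7, "pos": [x, y, z]}
                payload = (json.dumps(entity) + "\n").encode()
                for other in list(clients):
                    if other is not writer:         # echo to every other participant
                        other.write(payload)
                        await other.drain()
        finally:
            clients.discard(writer)
            writer.close()

    async def main():
        server = await asyncio.start_server(handle_simulator, "0.0.0.0", 9000)
        async with server:
            await server.serve_forever()

    if __name__ == "__main__":
        asyncio.run(main())

In production LVC architectures this role is typically filled by a DIS or HLA gateway rather than a raw socket relay; the sketch only shows the data-flow idea.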
ERIC Educational Resources Information Center
Nussli, Natalie; Oh, Kevin
2014-01-01
The overarching question guiding this review is: what are the key components of effective teacher training in virtual schooling, with a focus on three-dimensional (3D) immersive virtual worlds (IVWs)? The process of identifying the essential components of effective teacher training in the use of 3D IVWs will be described step by step. First,…
Productive confusions: learning from simulations of pandemic virus outbreaks in Second Life
NASA Astrophysics Data System (ADS)
Cárdenas, Micha; Greci, Laura S.; Hurst, Samantha; Garman, Karen; Hoffman, Helene; Huang, Ricky; Gates, Michael; Kho, Kristen; Mehrmand, Elle; Porteous, Todd; Calvitti, Alan; Higginbotham, Erin; Agha, Zia
2011-03-01
Users of immersive virtual reality environments have reported a wide variety of side effects and aftereffects, including confusion between characteristics of the real and virtual worlds. Perhaps this side effect of confusing the virtual and the real can be turned around to explore the possibilities for immersion with minimal technological support in virtual world group training simulations. This paper describes observations from my time working as an artist/researcher with the UCSD School of Medicine (SoM) and the Veterans Administration San Diego Healthcare System (VASDHS) to develop trainings for nurses, doctors, and Hospital Incident Command staff that simulate pandemic virus outbreaks. By examining moments of slippage between realities, both into and out of the virtual environment, and moments of confusion of the boundaries between real and virtual, we can better understand methods for creating immersion. I will use the mixing of realities as a transversal line of inquiry, borrowing from virtual reality studies, game studies, and anthropological studies to better understand the mechanisms of immersion in virtual worlds. Focusing on drills conducted in Second Life, I will examine moments of training to learn the software interface, moments within the drill, and interviews after the drill.
Evaluating the Effects of Immersive Embodied Interaction on Cognition in Virtual Reality
NASA Astrophysics Data System (ADS)
Parmar, Dhaval
Virtual reality is on the cusp of becoming mainstream household technology, as technologies such as head-mounted displays, trackers, and interaction devices become affordable and easily available. Virtual reality (VR) has immense potential to enhance the fields of education and training, and its power can be used to spark interest and enthusiasm among learners. It is, therefore, imperative to evaluate the risks and benefits that immersive virtual reality poses to the field of education. Research suggests that learning is an embodied process. Learning depends on grounded aspects of the body, including action, perception, and interactions with the environment. This research aims to study whether immersive embodiment by means of virtual reality facilitates embodied cognition. A pedagogical VR solution that takes advantage of embodied cognition can lead to enhanced learning benefits. Towards achieving this goal, this research presents a linear continuum for immersive embodied interaction within virtual reality. This research evaluates the effects of three levels of immersive embodied interactions on cognitive thinking, presence, usability, and satisfaction among users in the fields of science, technology, engineering, and mathematics (STEM) education. Results from the presented experiments show that immersive virtual reality is highly effective for knowledge acquisition and retention, and greatly enhances user satisfaction, interest, and enthusiasm. Users experience high levels of presence and are profoundly engaged in the learning activities within the immersive virtual environments. The studies presented in this research evaluate pedagogical VR software to train and motivate students in STEM education, and provide an empirical analysis comparing desktop VR (DVR), immersive VR (IVR), and immersive embodied VR (IEVR) conditions for learning. This research also proposes a fully immersive embodied interaction metaphor (IEIVR) for learning computational concepts as a future direction, and presents the challenges faced in implementing the IEIVR metaphor due to extended periods of immersion. Results from the conducted studies help in formulating guidelines for virtual reality and education researchers working in STEM education and training, and for educators and curriculum developers seeking to improve student engagement in the STEM fields.
ERIC Educational Resources Information Center
Torres, Francisco; Neira Tovar, Leticia A.; del Rio, Marta Sylvia
2017-01-01
This study aims to explore the results of welding virtual training performance, designed using a learning model based on cognitive and usability techniques and applying an immersive concept focused on the person's attention. It is also intended to demonstrate that there exists a moderating effect on performance improvement when the user experience is taken…
Using immersive simulation for training first responders for mass casualty incidents.
Wilkerson, William; Avstreih, Dan; Gruppen, Larry; Beier, Klaus-Peter; Woolliscroft, James
2008-11-01
A descriptive study was performed to better understand the possible utility of immersive virtual reality simulation for training first responders in a mass casualty event. Utilizing a virtual reality Cave Automatic Virtual Environment (CAVE) and a high-fidelity human patient simulator (HPS), a group of experts modeled a football stadium that experienced a terrorist explosion during a football game. Avatars (virtual patients) were developed by expert consensus and demonstrated a spectrum of injuries ranging from death to minor lacerations. A group of paramedics was assessed by observation for decisions made and actions taken. A critical action checklist was created and used for direct observation and viewing of videotaped recordings. Of the 12 participants, only 35.7% identified the type of incident they encountered. None identified a secondary device that was easily visible. All participants were enthusiastic about the simulation and provided valuable comments and insights. Learner feedback and expert performance review suggest that immersive training in a virtual environment has the potential to be a powerful tool to train first responders for high-acuity, low-frequency events, such as a terrorist attack.
Nunnerley, Joanne; Gupta, Swati; Snell, Deborah; King, Marcus
2017-05-01
A user-centred design was used to develop and test the feasibility of an immersive 3D virtual reality wheelchair training tool for people with spinal cord injury (SCI). A Wheelchair Training System was designed and modelled using the Oculus Rift headset and a Dynamic Control wheelchair joystick. The system was tested by clinicians and expert wheelchair users with SCI. Data from focus groups and individual interviews were analysed using a general inductive approach to thematic analysis. Four themes emerged: Realistic System, which described the advantages of a realistic virtual environment; Wheelchair Training System, which described participants' thoughts on the wheelchair training applications; Overcoming Resistance to Technology, the obstacles to introducing technology within the clinical setting; and Working outside the Rehabilitation Bubble, which described the protective hospital environment. The Oculus Rift Wheelchair Training System has the potential to provide a virtual rehabilitation setting which could allow wheelchair users to learn valuable community wheelchair use in a safe environment. Nausea appears to be a side effect of the system, which will need to be resolved before this can be a viable clinical tool. Implications for Rehabilitation: Immersive virtual reality shows promising benefit for wheelchair training in a rehabilitation setting. Early engagement with consumers can improve product development.
DOT National Transportation Integrated Search
2014-05-01
Immersive Virtual Learning Environments (IVLEs) are extensively used in training, but few rigorous scientific investigations regarding the transfer of learning have been conducted. Measurement of learning transfer through evaluative methods is key ...
Assessment of radiation awareness training in immersive virtual environments
NASA Astrophysics Data System (ADS)
Whisker, Vaughn E., III
The prospect of new nuclear power plant orders in the near future and the graying of the current workforce create a need to train new personnel faster and better. Immersive virtual reality (VR) may offer a solution to the training challenge. VR technology presented in a CAVE Automatic Virtual Environment (CAVE) provides a high-fidelity, one-to-one scale environment where areas of the power plant can be recreated and virtual radiation environments can be simulated, making it possible to safely expose workers to virtual radiation in the context of the actual work environment. The use of virtual reality for training is supported by many educational theories; constructivism and discovery learning, in particular. Educational theory describes the importance of matching the training to the task. Plant access training and radiation worker training, common forms of training in the nuclear industry, rely on computer-based training methods in most cases, which effectively transfer declarative knowledge, but are poor at transferring skills. If an activity were to be added, the training would provide personnel with the opportunity to develop skills and apply their knowledge so they could be more effective when working in the radiation environment. An experiment was developed to test immersive virtual reality's suitability for training radiation awareness. Using a mixed methodology of quantitative and qualitative measures, the subjects' performances before and after training were assessed. First, subjects completed a pre-test to measure their knowledge prior to completing any training. Next they completed unsupervised computer-based training, which consisted of a PowerPoint presentation and a PDF document. After completing a brief orientation activity in the virtual environment, one group of participants received supplemental radiation awareness training in a simulated radiation environment presented in the CAVE, while a second group, the control group, moved directly to the assessment phase of the experiment. The CAVE supplied an activity-based training environment where learners were able to use a virtual survey meter to explore the properties of radiation sources and the effects of time and distance on radiation exposure. Once the training stage had ended, the subjects completed an assessment activity where they were asked to complete four tasks in a simulated radiation environment in the CAVE, which was designed to provide a more authentic assessment than simply testing understanding using a quiz. After the practicum, the subjects completed a post-test. Survey information was also collected to assist the researcher with interpretation of the collected data. Response to the training was measured by completion time, radiation exposure received, successful completion of the four tasks in the practicum, and scores on the post-test. These results were combined to create a radiation awareness score. In addition, observational data was collected as the subjects completed the tasks. The radiation awareness scores of the control group and the group that received supplemental training in the virtual environment were compared. T-tests showed that the effect of the supplemental training was not significant; however, calculation of the effect size showed a small-to-medium effect of the training. The CAVE group received significantly less radiation exposure during the assessment activity, and they completed the activities on an average of one minute faster. 
These results indicate that the training was effective, primarily for instilling radiation sensitivity. Observational data collected during the assessment support this conclusion. The training environment provided by immersive virtual reality recreated a radiation environment where learners could apply knowledge they had been taught through computer-based training. Activity-based training has been shown to be a more effective way to transfer skills because of the similarity between the training environment and the application environment, and virtual reality enables the training environment to look and feel like the application environment. For these reasons, and supported by the results of this experiment, radiation awareness training in an immersive virtual environment should be considered by the nuclear industry.
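To make the reported statistics concrete, the short Python example below (invented scores, not the study's data) shows the kind of comparison described: an independent-samples t-test between a CAVE group and a control group plus a Cohen's d effect size, which is how a "small-to-medium effect" would be quantified.

    # Illustrative only: hypothetical radiation-awareness scores for two groups.
    import numpy as np
    from scipy import stats

    cave_group = np.array([78, 85, 80, 90, 88, 76, 84, 91], dtype=float)
    control    = np.array([75, 82, 79, 83, 80, 74, 81, 85], dtype=float)

    t, p = stats.ttest_ind(cave_group, control, equal_var=False)  # Welch's t-test

    # Cohen's d with a pooled standard deviation
    n1, n2 = len(cave_group), len(control)
    pooled_sd = np.sqrt(((n1 - 1) * cave_group.var(ddof=1) +
                         (n2 - 1) * control.var(ddof=1)) / (n1 + n2 - 2))
    d = (cave_group.mean() - control.mean()) / pooled_sd

    print(f"t = {t:.2f}, p = {p:.3f}, Cohen's d = {d:.2f}")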
Skill training in multimodal virtual environments.
Gopher, Daniel
2012-01-01
Multimodal, immersive virtual reality (VR) techniques open new perspectives for perceptual-motor skill trainers. They also introduce new risks and dangers. This paper describes the benefits and pitfalls of multimodal training and the cognitive building blocks of multimodal VR training simulators.
2007-01-01
educating and training (O'Keefe IV & McIntyre III, 2006). Topics vary widely from standard educational topics such as teaching kids physics, mechanics...Winn, W., & Yu, R. (1997). The Impact of Three Dimensional Immersive Virtual Environments on Modern Pedagogy: Global Change, VR and Learning
Immersive virtual reality platform for medical training: a "killer-application".
2000-01-01
The Medical Readiness Trainer (MRT) integrates fully immersive Virtual Reality (VR), highly advanced medical simulation technologies, and medical data to enable unprecedented medical education and training. The flexibility offered by the MRT environment serves as a practical teaching tool today and, in the near future, will serve as an ideal vehicle for facilitating the transition to the next level of medical practice, i.e., telepresence and next-generation Internet-based collaborative learning.
Using Immersive Virtual Reality for Electrical Substation Training
ERIC Educational Resources Information Center
Tanaka, Eduardo H.; Paludo, Juliana A.; Cordeiro, Carlúcio S.; Domingues, Leonardo R.; Gadbem, Edgar V.; Euflausino, Adriana
2015-01-01
Usually, distribution electricians are called upon to solve technical problems found in electrical substations. In this project, we apply problem-based learning to a training program for electricians, with the help of a virtual reality environment that simulates a real substation. Using this virtual substation, users may safely practice maneuvers…
Gutiérrez, Fátima; Pierce, Jennifer; Vergara, Víctor M; Coulter, Robert; Saland, Linda; Caudell, Thomas P; Goldsmith, Timothy E; Alverson, Dale C
2007-01-01
Simulations are being used in education and training to enhance understanding, improve performance, and assess competence. However, it is important to measure the performance of these simulations as learning and training tools. This study examined and compared knowledge acquisition using a knowledge structure design. The subjects were first-year medical students at The University of New Mexico School of Medicine. One group used a fully immersed virtual reality (VR) environment with a head-mounted display (HMD), and another group used a partially immersed (computer screen) VR environment. The study aims were to determine whether there were significant differences between the two groups as measured by changes in knowledge structure before and after the VR simulation experience. The results showed that both groups benefited from the VR simulation training, as measured by the significantly increased similarity to the expert knowledge network after the training experience. However, the immersed group showed a significantly higher gain than the partially immersed group. This study demonstrated a positive effect of VR simulation on learning, as reflected by improvements in knowledge structure, and an enhanced effect of full immersion using an HMD versus a screen-based VR system.
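The "similarity to the expert knowledge network" idea can be illustrated with a small, generic Python sketch: each knowledge structure is treated as a set of links between concept pairs and scored by its overlap with the expert network (a Jaccard index). The concepts, links, and scoring choice are assumptions for illustration, not necessarily the study's actual scaling method or data.

    # Generic illustration of comparing a learner's concept network to an expert's.
    def links(pairs):
        return {frozenset(p) for p in pairs}      # undirected concept-pair links

    expert = links([("hypoxia", "cyanosis"), ("cyanosis", "oxygen"),
                    ("hypoxia", "tachycardia"), ("oxygen", "saturation")])

    student_before = links([("hypoxia", "oxygen"), ("cyanosis", "oxygen")])
    student_after  = links([("hypoxia", "cyanosis"), ("cyanosis", "oxygen"),
                            ("hypoxia", "tachycardia")])

    def similarity(a, b):
        return len(a & b) / len(a | b)            # Jaccard overlap of link sets

    print("before training:", round(similarity(student_before, expert), 2))
    print("after training: ", round(similarity(student_after, expert), 2))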
Using Immersive Virtual Environments for Certification
NASA Technical Reports Server (NTRS)
Lutz, R.; Cruz-Neira, C.
1998-01-01
Immersive virtual environment (VE) technology has matured to the point where it can be utilized as a scientific and engineering problem-solving tool. In particular, VEs are starting to be used to design and evaluate safety-critical systems that involve human operators, such as flight and driving simulators, complex machinery training, and emergency rescue strategies.
NASA Astrophysics Data System (ADS)
Watanuki, Keiichi; Kojima, Kazuyuki
The environment in which Japanese industry has achieved great respect is changing tremendously due to the globalization of world economies, while Asian countries are undergoing economic and technical development and benefiting from advances in information technology. For example, in the design of custom-made casting products, a designer who lacks knowledge of casting may not be able to produce a good design. In order to obtain a good design and manufacturing result, it is necessary to equip the designer and manufacturer with a support system related to casting design, or a so-called knowledge transfer and creation system. This paper proposes a new virtual reality based knowledge acquisition and job training system for casting design, which is composed of explicit and tacit knowledge transfer systems using synchronized multimedia and a knowledge internalization system using a portable virtual environment. In our proposed system, the education content is displayed in the immersive virtual environment, whereby a trainee may experience work in a virtual site operation. Provided that the trainee has gained explicit and tacit knowledge of casting through the multimedia-based knowledge transfer system, the immersive virtual environment catalyzes the internalization of knowledge and enables the trainee to gain tacit knowledge before undergoing on-the-job training at a real operation site.
NASA Technical Reports Server (NTRS)
Twombly, I. Alexander; Smith, Jeffrey; Bruyns, Cynthia; Montgomery, Kevin; Boyle, Richard
2003-01-01
The International Space Station will soon provide an unparalleled research facility for studying the near- and longer-term effects of microgravity on living systems. Using the Space Station Glovebox Facility - a compact, fully contained reach-in environment - astronauts will conduct technically challenging life sciences experiments. Virtual environment technologies are being developed at NASA Ames Research Center to help realize the scientific potential of this unique resource by facilitating the experimental hardware and protocol designs and by assisting the astronauts in training. The Virtual GloveboX (VGX) integrates high-fidelity graphics, force-feedback devices and real-time computer simulation engines to achieve an immersive training environment. Here, we describe the prototype VGX system, the distributed processing architecture used in the simulation environment, and modifications to the visualization pipeline required to accommodate the display configuration.
Evaluation of Virtual Reality Training Using Affect
ERIC Educational Resources Information Center
Tichon, Jennifer
2012-01-01
Training designed to support and strengthen higher-order mental abilities now often involves immersion in Virtual Reality (VR) where dangerous real world scenarios can be safely replicated. However, despite the growing popularity of VR to train cognitive skills such as decision-making and situation awareness, methods for evaluating their use rely…
The Student Experience With Varying Immersion Levels of Virtual Reality Simulation.
Farra, Sharon L; Smith, Sherrill J; Ulrich, Deborah L
With increasing use of virtual reality simulation (VRS) in nursing education and given the vast array of technologies available, a variety of levels of immersion and experiences can be provided to students. This study explored two different levels of immersive VRS capability. Study participants included baccalaureate nursing students from three universities across four campuses. Students were trained in the skill of decontamination using traditional methods or with VRS options of mouse and keyboard or head-mounted display technology. Results of focus group interviews reflect the student experience and satisfaction with two different immersive levels of VRS.
Immersive Learning Technologies: Realism and Online Authentic Learning
ERIC Educational Resources Information Center
Herrington, Jan; Reeves, Thomas C.; Oliver, Ron
2007-01-01
The development of immersive learning technologies in the form of virtual reality and advanced computer applications has meant that realistic creations of simulated environments are now possible. Such simulations have been used to great effect in training in the military, air force, and in medical training. But how realistic do problems need to be…
Poeschl, Sandra; Doering, Nicola
2013-01-01
Virtual training applications with high levels of immersion or fidelity (for example, for social phobia treatment) produce high levels of presence and therefore belong to the most successful virtual reality developments. Whereas display and interaction fidelity (as sub-dimensions of immersion) and their influence on presence are well researched, the realism of the displayed simulation depends on the specific application and is therefore difficult to measure. We propose to measure simulation realism by using a self-report questionnaire. The German VR Simulation Realism Scale for VR training applications was developed based on a translation of scene realism items from the Witmer-Singer Presence Questionnaire. Items for realism of virtual humans (for example, for social phobia training applications) were supplemented. A sample of N = 151 students rated simulation realism of a Fear of Public Speaking application. Four factors were derived by item analysis and principal component analysis (varimax rotation), representing Scene Realism, Audience Behavior, Audience Appearance, and Sound Realism. The scale developed can be used as a starting point for future research and measurement of simulation realism for applications including virtual humans.
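As an illustration of the analysis reported (four components with a varimax rotation), the sketch below runs a four-factor extraction on simulated questionnaire ratings using scikit-learn, whose FactorAnalysis supports a varimax rotation in version 0.24 and later. The data, item count, and choice of FactorAnalysis rather than the authors' exact procedure are assumptions.

    # Simulated 7-point ratings; not the study's data or item wording.
    import numpy as np
    from sklearn.decomposition import FactorAnalysis

    rng = np.random.default_rng(0)
    n_respondents, n_items = 151, 12
    ratings = rng.integers(1, 8, size=(n_respondents, n_items)).astype(float)

    fa = FactorAnalysis(n_components=4, rotation="varimax")
    fa.fit(ratings)

    loadings = fa.components_.T               # rows: items, columns: factors
    for i, row in enumerate(loadings, start=1):
        print(f"item {i:2d}:", np.round(row, 2))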
Foreign language learning in immersive virtual environments
NASA Astrophysics Data System (ADS)
Chang, Benjamin; Sheldon, Lee; Si, Mei; Hand, Anton
2012-03-01
Virtual reality has long been used for training simulations in fields from medicine to welding to vehicular operation, but simulations involving more complex cognitive skills present new design challenges. Foreign language learning, for example, is increasingly vital in the global economy, but computer-assisted education is still in its early stages. Immersive virtual reality is a promising avenue for language learning as a way of dynamically creating believable scenes for conversational training and role-play simulation. Visual immersion alone, however, only provides a starting point. We suggest that the addition of social interactions and motivated engagement through narrative gameplay can lead to truly effective language learning in virtual environments. In this paper, we describe the development of a novel application for teaching Mandarin using CAVE-like VR, physical props, human actors and intelligent virtual agents, all within a semester-long multiplayer mystery game. Students travel (virtually) to China on a class field trip, which soon becomes complicated with intrigue and mystery surrounding the lost manuscript of an early Chinese literary classic. Virtual reality environments such as the Forbidden City and a Beijing teahouse provide the setting for learning language, cultural traditions, and social customs, as well as the discovery of clues through conversation in Mandarin with characters in the game.
A succinct overview of virtual reality technology use in Alzheimer's disease.
García-Betances, Rebeca I; Arredondo Waldmeyer, María Teresa; Fico, Giuseppe; Cabrera-Umpiérrez, María Fernanda
2015-01-01
We provide a brief review and appraisal of recent and current virtual reality (VR) technology for Alzheimer's disease (AD) applications. We categorize them according to their intended purpose (e.g., diagnosis, patient cognitive training, caregivers' education, etc.), focus feature (e.g., spatial impairment, memory deficit, etc.), methodology employed (e.g., tasks, games, etc.), immersion level, and passive or active interaction. Critical assessment indicates that most of them do not yet take full advantage of virtual environments with high levels of immersion and interaction. Many still rely on conventional 2D graphic displays to create non-immersive or semi-immersive VR scenarios. Important improvements are needed to make VR a better and more versatile assessment and training tool for AD. The use of the latest display technologies available, such as emerging head-mounted displays and 3D smart TV technologies, together with realistic multi-sensorial interaction devices, and neuro-physiological feedback capacity, are some of the most beneficial improvements this mini-review suggests. Additionally, it would be desirable that such VR applications for AD be easily and affordably transferable to in-home and nursing home environments.
Advanced Maintenance Simulation by Means of Hand-Based Haptic Interfaces
NASA Astrophysics Data System (ADS)
Nappi, Michele; Paolino, Luca; Ricciardi, Stefano; Sebillo, Monica; Vitiello, Giuliana
The aerospace industry has been involved in virtual simulation for design and testing since the birth of virtual reality. Today this industry is showing a growing interest in the development of haptic-based maintenance training applications, which represent the most advanced way to simulate maintenance and repair tasks within a virtual environment by means of a visual-haptic approach. The goal is to allow the trainee to experience the service procedures not only as a workflow reproduced at a visual level but also in terms of the kinaesthetic feedback involved in the manipulation of tools and components. This study, conducted in collaboration with aerospace industry specialists, is aimed at the development of an immersive virtual environment capable of immersing trainees in a virtual environment where mechanics and technicians can perform maintenance simulation or training tasks by directly manipulating 3D virtual models of aircraft parts while perceiving force feedback through the haptic interface. The proposed system is based on ViRstperson, a virtual reality engine under development at the Italian Center for Aerospace Research (CIRA) to support engineering and technical activities such as design-time maintenance procedure validation and maintenance training. This engine has been extended to support haptic-based interaction, enabling a more complete level of interaction, also in terms of impedance control, and thus fostering the development of haptic knowledge in the user. The user's "sense of touch" within the immersive virtual environment is simulated through an Immersion CyberForce® hand-based force-feedback device. Preliminary testing of the proposed system seems encouraging.
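A generic illustration of the simplest force-feedback computation such visual-haptic systems rely on (not the ViRstperson or CyberForce implementation): a spring-damper contact model that pushes back when the virtual fingertip penetrates a surface. The stiffness and damping constants are assumed values.

    import numpy as np

    K = 800.0   # surface stiffness, N/m (assumed value)
    B = 2.5     # damping, N*s/m (assumed value)

    def contact_force(penetration_depth, penetration_velocity, surface_normal):
        """Return the feedback force (N) for a point contact along the surface normal."""
        if penetration_depth <= 0.0:
            return np.zeros(3)                       # no contact, no force
        magnitude = K * penetration_depth + B * penetration_velocity
        return max(magnitude, 0.0) * np.asarray(surface_normal, dtype=float)

    # Example: fingertip 2 mm inside the part, still moving inward at 1 cm/s
    print(contact_force(0.002, 0.01, [0.0, 0.0, 1.0]))   # roughly [0, 0, 1.6] N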
The stress and workload of virtual reality training: the effects of presence, immersion and flow.
Lackey, S J; Salcedo, J N; Szalma, J L; Hancock, P A
2016-08-01
The present investigation evaluated the effects of virtual reality (VR) training on the performance, perceived workload and stress response to a live training exercise in a sample of Soldiers. We also examined the relationship between perceptions of that same VR, as measured by engagement, immersion, presence, flow, perceived utility and ease of use, and the performance, workload and stress reported on the live training task. To a degree, these latter relationships were moderated by task performance, as measured by binary (Go/No-Go) ratings. Participants who reported positive VR experiences also tended to experience lower stress and lower workload when performing the live version of the task. Thus, VR training regimens may be efficacious for mitigating the stress and workload associated with criterion tasks, thereby reducing the ultimate likelihood of real-world performance failure. Practitioner Summary: VR provides opportunities for training in artificial worlds composed of highly realistic features. Our virtual room-clearing scenario facilitated the integration of Training and Readiness objectives and satisfied training doctrine obligations in a compelling, engaging experience for both novice and experienced trainees.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Timothy Shaw; Anthony Baratta; Vaughn Whisker
2005-02-28
Task 4 report of a 3-year DOE NERI-sponsored effort evaluating immersive virtual reality (CAVE) technology for design review, construction planning, and maintenance planning and training for next-generation nuclear power plants. The program covers development of full-scale virtual mockups generated from 3D CAD data and presented in a CAVE visualization facility. This report focuses on using full-scale virtual mockups for nuclear power plant training applications.
CliniSpace: a multiperson 3D online immersive training environment accessible through a browser.
Dev, Parvati; Heinrichs, W LeRoy; Youngblood, Patricia
2011-01-01
Immersive online medical environments, with dynamic virtual patients, have been shown to be effective for scenario-based learning (1). However, ease of use and ease of access have been barriers to their use. We used feedback from prior evaluation of these projects to design and develop CliniSpace. To improve usability, we retained the richness of prior virtual environments but modified the user interface. To improve access, we used a Software-as-a-Service (SaaS) approach to present a richly immersive 3D environment within a web browser.
Virtual reality cerebral aneurysm clipping simulation with real-time haptic feedback.
Alaraj, Ali; Luciano, Cristian J; Bailey, Daniel P; Elsenousi, Abdussalam; Roitberg, Ben Z; Bernardo, Antonio; Banerjee, P Pat; Charbel, Fady T
2015-03-01
With the decrease in the number of cerebral aneurysms treated surgically and the increase of complexity of those treated surgically, there is a need for simulation-based tools to teach future neurosurgeons the operative techniques of aneurysm clipping. To develop and evaluate the usefulness of a new haptic-based virtual reality simulator in the training of neurosurgical residents. A real-time sensory haptic feedback virtual reality aneurysm clipping simulator was developed using the ImmersiveTouch platform. A prototype middle cerebral artery aneurysm simulation was created from a computed tomographic angiogram. Aneurysm and vessel volume deformation and haptic feedback are provided in a 3-dimensional immersive virtual reality environment. Intraoperative aneurysm rupture was also simulated. Seventeen neurosurgery residents from 3 residency programs tested the simulator and provided feedback on its usefulness and resemblance to real aneurysm clipping surgery. Residents thought that the simulation would be useful in preparing for real-life surgery. About two-thirds of the residents thought that the 3-dimensional immersive anatomic details provided a close resemblance to real operative anatomy and accurate guidance for deciding surgical approaches. They thought the simulation was useful for preoperative surgical rehearsal and neurosurgical training. A third of the residents thought that the technology in its current form provided realistic haptic feedback for aneurysm surgery. Neurosurgical residents thought that the novel immersive VR simulator is helpful in their training, especially because they do not get a chance to perform aneurysm clippings until late in their residency programs.
Hybrid Reality Lab Capabilities - Video 2
NASA Technical Reports Server (NTRS)
Delgado, Francisco J.; Noyes, Matthew
2016-01-01
Our Hybrid Reality and Advanced Operations Lab is developing incredibly realistic and immersive systems that could be used to provide training, support engineering analysis, and augment data collection for various human performance metrics at NASA. To get a better understanding of what Hybrid Reality is, let's go through the two most commonly known types of immersive realities: Virtual Reality and Augmented Reality. Virtual Reality creates immersive scenes that are completely made up of digital information. This technology has been used to train astronauts at NASA, during teleoperation of remote assets (arms, rovers, robots, etc.), and in other activities. One challenge with Virtual Reality is that if you are using it for real-time applications (like landing an airplane), the information used to create the virtual scenes can be old (i.e., visualized long after physical objects moved in the scene) and not accurate enough to land the airplane safely. This is where Augmented Reality comes in. Augmented Reality takes real-time environment information (from a camera or a see-through window) and places digitally created information into the scene so that it matches the video/glass information. Augmented Reality enhances real environment information collected with a live sensor or viewport (e.g., camera, window, etc.) with the information-rich visualization provided by Virtual Reality. Hybrid Reality takes Augmented Reality even further, creating a higher level of immersion where interactivity can take place. Hybrid Reality takes Virtual Reality objects and a trackable, physical representation of those objects, places them in the same coordinate system, and allows people to interact with both representations (virtual and physical) simultaneously. After a short period of adjustment, individuals begin to interact with all the objects in the scene as if they were real-life objects. The ability to physically touch and interact with digitally created objects that have the same shape, size, and location as their physical counterparts in the virtual reality environment can be a game changer when it comes to training, planning, engineering analysis, science, entertainment, etc. Our project is developing such capabilities for various types of environments. The video outlined with this abstract is a representation of an ISS Hybrid Reality experience. In the video you can see various Hybrid Reality elements that provide immersion beyond standard Virtual Reality or Augmented Reality.
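One concrete step behind placing a physical object and its virtual twin "in the same coordinate system" is estimating the rigid transform between tracked physical markers and the corresponding points on the virtual model. The sketch below does this with a standard Kabsch/SVD alignment; the marker coordinates are invented and the method is a generic illustration, not the lab's actual pipeline.

    import numpy as np

    def rigid_align(physical_pts, virtual_pts):
        """Return R (3x3) and t (3,) so that R @ physical + t approximates virtual."""
        P, V = np.asarray(physical_pts, float), np.asarray(virtual_pts, float)
        cp, cv = P.mean(axis=0), V.mean(axis=0)
        H = (P - cp).T @ (V - cv)                       # cross-covariance of centered points
        U, _, Vt = np.linalg.svd(H)
        d = np.sign(np.linalg.det(Vt.T @ U.T))          # guard against reflections
        R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
        t = cv - R @ cp
        return R, t

    physical = [[0, 0, 0], [0.1, 0, 0], [0, 0.1, 0], [0, 0, 0.1]]   # tracker frame (m)
    virtual  = [[1, 2, 0], [1, 2.1, 0], [0.9, 2, 0], [1, 2, 0.1]]   # virtual scene frame (m)
    R, t = rigid_align(physical, virtual)
    print(np.round((R @ np.array(physical).T + t[:, None]).T, 3))   # should match `virtual`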
Huber, Tobias; Paschold, Markus; Hansen, Christian; Lang, Hauke; Kneist, Werner
2018-06-01
Immersive virtual reality (VR) laparoscopy simulation connects VR simulation with head-mounted displays to increase presence during VR training. The goal of the present study was the comparison of 2 different surroundings according to performance and users' preference. With a custom immersive virtual reality laparoscopy simulator, an artificially created VR operating room (AVR) and a highly immersive VR operating room (IVR) were compared. Participants (n = 30) performed 3 tasks (peg transfer, fine dissection, and cholecystectomy) in AVR and IVR in a crossover study design. No overall difference in virtual laparoscopic performance was obtained when comparing results from AVR with IVR. Most participants preferred the IVR surrounding (n = 24). Experienced participants (n = 10) performed significantly better than novices (n = 10) in all tasks regardless of the surrounding ( P < .05). Participants with limited experience (n = 10) showed differing results. Presence, immersion, and exhilaration were significantly higher in IVR. Two thirds assumed that IVR would have a positive influence on their laparoscopic simulator use. This first study comparing AVR and IVR did not reveal differences in virtual laparoscopic performance. IVR is considered the more realistic surrounding and is therefore preferred by the participants.
Ekstrand, Chelsea; Jamal, Ali; Nguyen, Ron; Kudryk, Annalise; Mann, Jennifer; Mendez, Ivar
2018-02-23
Spatial 3-dimensional understanding of the brain is essential to learning neuroanatomy, and 3-dimensional learning techniques have been proposed as tools to enhance neuroanatomy training. The aim of this study was to examine the impact of immersive virtual-reality neuroanatomy training and compare it to traditional paper-based methods. In this randomized controlled study, participants consisted of first- or second-year medical students from the University of Saskatchewan recruited via email and posters displayed throughout the medical school. Participants were randomly assigned to the virtual-reality group or the paper-based group and studied the spatial relations between neural structures for 12 minutes after performing a neuroanatomy baseline test, with both test and control questions. A postintervention test was administered immediately after the study period and 5-9 days later. Satisfaction measures were obtained. Of the 66 participants randomly assigned to the study groups, 64 were included in the final analysis, 31 in the virtual-reality group and 33 in the paper-based group. The 2 groups performed comparably on the baseline questions and showed significant performance improvement on the test questions following study. There were no significant differences between groups for the control questions, the postintervention test questions or the 7-day postintervention test questions. Satisfaction survey results indicated that neurophobia was decreased. Results from this study provide evidence that training in neuroanatomy in an immersive and interactive virtual-reality environment may be an effective neuroanatomy learning tool that warrants further study. They also suggest that integration of virtual-reality into neuroanatomy training may improve knowledge retention, increase study motivation and decrease neurophobia. Copyright 2018, Joule Inc. or its licensors.
Development of a low-cost virtual reality workstation for training and education
NASA Technical Reports Server (NTRS)
Phillips, James A.
1996-01-01
Virtual Reality (VR) is a set of breakthrough technologies that allow a human being to enter and fully experience a 3-dimensional, computer simulated environment. A true virtual reality experience meets three criteria: (1) it involves 3-dimensional computer graphics; (2) it includes real-time feedback and response to user actions; and (3) it must provide a sense of immersion. Good examples of a virtual reality simulator are the flight simulators used by all branches of the military to train pilots for combat in high performance jet fighters. The fidelity of such simulators is extremely high -- but so is the price tag, typically millions of dollars. Virtual reality teaching and training methods are manifestly effective, but the high cost of VR technology has limited its practical application to fields with big budgets, such as military combat simulation, commercial pilot training, and certain projects within the space program. However, in the last year there has been a revolution in the cost of VR technology. The speed of inexpensive personal computers has increased dramatically, especially with the introduction of the Pentium processor and the PCI bus for IBM-compatibles, and the cost of high-quality virtual reality peripherals has plummeted. The result is that many public schools, colleges, and universities can afford a PC-based workstation capable of running immersive virtual reality applications. My goal this summer was to assemble and evaluate such a system.
ERIC Educational Resources Information Center
Hadipriono, Fabian C.; And Others
An interactive training model called SAVR (Safety in Construction Using Virtual Reality) was developed to train construction students, novice engineers, and construction workers to prevent falls from scaffolding. The model was implemented in a graphics supercomputer, the ONYX Reality Engine2. The SAVR model provides trainees with an immersive,…
Kim, Aram; Darakjian, Nora; Finley, James M
2017-02-21
Virtual reality (VR) has recently been explored as a tool for neurorehabilitation to enable individuals with Parkinson's disease (PD) to practice challenging skills in a safe environment. Current technological advances have enabled the use of affordable, fully immersive head-mounted displays (HMDs) for potential therapeutic applications. However, while previous studies have used HMDs in individuals with PD, these were only used for short bouts of walking. Clinical applications of VR for gait training would likely involve extended exposure to the virtual environment, which has the potential to cause individuals with PD to experience simulator-related adverse effects due to their age or pathology. Thus, our objective was to evaluate the safety of using an HMD for longer bouts of walking in fully immersive VR for older adults and individuals with PD. Thirty-three participants (11 healthy young adults, 11 healthy older adults, and 11 individuals with PD) were recruited for this study. Participants walked for 20 min while viewing a virtual city scene through an HMD (Oculus Rift DK2). Safety was evaluated using the mini-BESTest, measures of center of pressure (CoP) excursion, and questionnaires addressing symptoms of simulator sickness (SSQ) and measures of stress and arousal. Most participants successfully completed all trials without any discomfort. There were no significant changes for any of our groups in symptoms of simulator sickness or measures of static and dynamic balance after exposure to the virtual environment. Surprisingly, measures of stress decreased in all groups, while the PD group also showed an increased level of arousal after exposure. Older adults and individuals with PD were able to successfully use immersive VR during walking without adverse effects. This provides systematic evidence supporting the safety of immersive VR for gait training in these populations.
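The center-of-pressure (CoP) excursion measures mentioned above can be illustrated with a short sketch using synthetic sway data and an assumed sampling rate: total CoP path length and per-axis range are two common summaries of standing balance.

    import numpy as np

    rng = np.random.default_rng(1)
    fs = 100.0                                      # sampling rate, Hz (assumed)
    # 30 s of synthetic CoP sway in mm, two axes (e.g., anterior-posterior, medio-lateral)
    cop = np.cumsum(rng.normal(0, 0.3, size=(int(30 * fs), 2)), axis=0)

    steps = np.diff(cop, axis=0)
    path_length = np.linalg.norm(steps, axis=1).sum()        # total excursion, mm
    ap_range, ml_range = cop.max(axis=0) - cop.min(axis=0)   # range per axis, mm

    print(f"path length: {path_length:.1f} mm  AP range: {ap_range:.1f} mm  ML range: {ml_range:.1f} mm")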
Object Creation and Human Factors Evaluation for Virtual Environments
NASA Technical Reports Server (NTRS)
Lindsey, Patricia F.
1998-01-01
The main objective of this project is to provide test objects for simulated environments utilized by the recently established Army/NASA Virtual Innovations Lab (ANVIL) at Marshall Space Flight Center, Huntsville, AL. The objective of the ANVIL lab is to provide virtual reality (VR) models and environments and to provide visualization and manipulation methods for the purpose of training and testing. Visualization equipment used in the ANVIL lab includes head-mounted and boom-mounted immersive virtual reality display devices. Objects in the environment are manipulated using a data glove, hand controller, or mouse. These simulated objects are solid or surfaced three-dimensional models. They may be viewed or manipulated from any location within the environment and may be viewed on-screen or via immersive VR. The objects are created using various CAD modeling packages and are converted into the virtual environment using dVise. This enables the object or environment to be viewed from any angle or distance for training or testing purposes.
Learning Relative Motion Concepts in Immersive and Non-immersive Virtual Environments
NASA Astrophysics Data System (ADS)
Kozhevnikov, Michael; Gurlitt, Johannes; Kozhevnikov, Maria
2013-12-01
The focus of the current study is to understand which unique features of an immersive virtual reality environment have the potential to improve learning relative motion concepts. Thirty-seven undergraduate students learned relative motion concepts using computer simulation either in immersive virtual environment (IVE) or non-immersive desktop virtual environment (DVE) conditions. Our results show that after the simulation activities, both IVE and DVE groups exhibited a significant shift toward a scientific understanding in their conceptual models and epistemological beliefs about the nature of relative motion, and also a significant improvement on relative motion problem-solving tests. In addition, we analyzed students' performance on one-dimensional and two-dimensional questions in the relative motion problem-solving test separately and found that after training in the simulation, the IVE group performed significantly better than the DVE group on solving two-dimensional relative motion problems. We suggest that egocentric encoding of the scene in IVE (where the learner constitutes a part of a scene they are immersed in), as compared to allocentric encoding on a computer screen in DVE (where the learner is looking at the scene from "outside"), is more beneficial than DVE for studying more complex (two-dimensional) relative motion problems. Overall, our findings suggest that such aspects of virtual realities as immersivity, first-hand experience, and the possibility of changing different frames of reference can facilitate understanding abstract scientific phenomena and help in displacing intuitive misconceptions with more accurate mental models.
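As a reminder of the physics underlying these relative motion problems, the small worked example below (invented values) computes the velocity of one object as seen from a frame moving with another, which is the core of both the one-dimensional and two-dimensional test questions.

    import numpy as np

    v_boat  = np.array([3.0, 4.0])     # boat velocity in the ground frame, m/s
    v_river = np.array([2.0, 0.0])     # river current in the ground frame, m/s

    v_boat_rel_river = v_boat - v_river            # boat velocity seen from the water
    speed   = np.linalg.norm(v_boat_rel_river)
    heading = np.degrees(np.arctan2(v_boat_rel_river[1], v_boat_rel_river[0]))

    print(v_boat_rel_river, f"speed = {speed:.2f} m/s, heading = {heading:.1f} deg")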
Virtual reality training improves balance function.
Mao, Yurong; Chen, Peiming; Li, Le; Huang, Dongfeng
2014-09-01
Virtual reality is a new technology that simulates a three-dimensional virtual world on a computer and enables the generation of visual, audio, and haptic feedback for the full immersion of users. Users can interact with and observe objects in three-dimensional visual space without limitation. At present, virtual reality training has been widely used in rehabilitation therapy for balance dysfunction. This paper summarizes articles suggesting that virtual reality training can improve balance dysfunction in patients after neurological diseases. When patients perform virtual reality training, the prefrontal and parietal cortical areas and other motor cortical networks are activated. These activations may be involved in the reconstruction of neurons in the cerebral cortex. Growing evidence from clinical studies reveals that virtual reality training improves the neurological function of patients with spinal cord injury, cerebral palsy and other neurological impairments. These findings suggest that virtual reality training can activate the cerebral cortex and improve the spatial orientation capacity of patients, thus facilitating the cortex to control balance and increase motion function.
Evaluation of the Virtual Squad Training System
2010-01-01
The Virtual Squad Training System (VSTS) is a network of nine individual immersive simulators with Helmet-Mounted...Displays (HMDs), and a command station for controlling computer generated entities. The VSTS includes both tethered and wearable simulators. The VSTS was...affected Soldiers' ratings of the VSTS. Simulator sickness incidence was low compared to previous evaluations of antecedent systems using HMDs
Immersive virtual reality used as a platform for perioperative training for surgical residents.
Witzke, D B; Hoskins, J D; Mastrangelo, M J; Witzke, W O; Chu, U B; Pande, S; Park, A E
2001-01-01
Perioperative preparations such as operating room setup, patient and equipment positioning, and operating port placement are essential to operative success in minimally invasive surgery. We developed an immersive virtual reality-based training system (REMIS) to provide residents (and other health professionals) with training and evaluation in these perioperative skills. Our program uses the qualities of immersive VR that are available today for inclusion in an ongoing training curriculum for surgical residents. The current application consists of a primary platform for patient positioning for a laparoscopic cholecystectomy. Having completed this module, we can create many different simulated problems for other procedures. As part of the simulation, we have devised a computer-driven real-time data collection system to help us evaluate trainees and provide feedback during the simulation. The REMIS program trains and evaluates surgical residents and obviates the need to use expensive operating room and surgeon time. It also allows residents to train on their own schedule and does not put patients at increased risk. The method is standardized, allows for repetition if needed, evaluates individual performance, shows the possible complications of incorrect choices, provides training in a 3-D environment, and has the capability of being used for various scenarios and professions.
Simulation of Physical Experiments in Immersive Virtual Environments
NASA Technical Reports Server (NTRS)
Noor, Ahmed K.; Wasfy, Tamer M.
2001-01-01
An object-oriented, event-driven immersive virtual environment is described for the creation of virtual labs (VLs) for simulating physical experiments. Discussion focuses on a number of aspects of the VLs, including interface devices, software objects, and various applications. The VLs interface with output devices, including immersive stereoscopic screen(s) and stereo speakers, and a variety of input devices, including body tracking (head and hands), haptic gloves, wand, joystick, mouse, microphone, and keyboard. The VL incorporates the following types of primitive software objects: interface objects, support objects, geometric entities, and finite elements. Each object encapsulates a set of properties, methods, and events that define its behavior, appearance, and functions. A container object allows grouping of several objects. Applications of the VLs include viewing the results of the physical experiment, viewing a computer simulation of the physical experiment, simulation of the experiment's procedure, computational steering, and remote control of the physical experiment. In addition, the VL can be used as a risk-free (safe) environment for training. The implementation of virtual structures testing machines, virtual wind tunnels, and a virtual acoustic testing facility is described.
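A minimal, hypothetical Python sketch of the object model outlined above: each primitive object encapsulates properties, methods, and events, and a container object groups several objects and propagates events to them. The class and event names are illustrative, not the VL's actual API.

    class VLObject:
        def __init__(self, name, **properties):
            self.name = name
            self.properties = properties
            self._handlers = {}                      # event name -> list of callbacks

        def on(self, event, callback):               # register an event handler
            self._handlers.setdefault(event, []).append(callback)

        def fire(self, event, **data):               # dispatch an event to handlers
            for cb in self._handlers.get(event, []):
                cb(self, **data)

    class GeometricEntity(VLObject):
        pass

    class Container(VLObject):
        def __init__(self, name, children=()):
            super().__init__(name)
            self.children = list(children)

        def fire(self, event, **data):               # events propagate to children
            super().fire(event, **data)
            for child in self.children:
                child.fire(event, **data)

    # Usage: a wand "pick" event changes a specimen's highlight property
    specimen = GeometricEntity("test_specimen", highlighted=False)
    specimen.on("pick", lambda obj, **d: obj.properties.update(highlighted=True))
    lab = Container("virtual_lab", [specimen])
    lab.fire("pick")
    print(specimen.properties)                       # {'highlighted': True}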
Immersive Simulation Training for the Dismounted Soldier
2007-02-01
users can be immersed in the same simulation, each perceiving it from a personal point of view. VR is used in some electronic games, in amusement-park...not take virtual simulations seriously because they are similar to video games, and might, therefore, practice actions that are inappropriate in the...of video and PC-based games for training is that the potential trainees are interested in and experienced with such games. They, therefore, need little
An Analysis of VR Technology Used in Immersive Simulations with a Serious Game Perspective.
Menin, Aline; Torchelsen, Rafael; Nedel, Luciana
2018-03-01
Using virtual environments (VEs) is a safer and more cost-effective alternative to executing dangerous tasks, such as training firefighters and industrial operators. Immersive virtual reality (VR) combined with game aspects has the potential to improve the user experience in the VE by increasing realism, engagement, and motivation. This article investigates the impact of VR technology on 46 immersive gamified simulations with serious purposes and classifies them according to a taxonomy. Our findings suggest that immersive VR improves simulation outcomes, such as increasing learning gain and knowledge retention and improving clinical outcomes for rehabilitation. However, it also has limitations such as motion sickness and restricted access to VR hardware. Our contributions are to provide a better understanding of the benefits and limitations of using VR in immersive simulations with serious purposes, to propose a taxonomy that classifies them, and to discuss whether methods and participant profiles influence results.
Desktop-VR system for preflight 3D navigation training
NASA Astrophysics Data System (ADS)
Aoki, Hirofumi; Oman, Charles M.; Buckland, Daniel A.; Natapoff, Alan
Crews who inhabit spacecraft with complex 3D architecture frequently report inflight disorientation and navigation problems. Preflight virtual reality (VR) training may reduce those risks. Although immersive VR techniques may better support spatial orientation training in a local environment, a non-immersive desktop (DT) system may be more convenient for navigation training in "building scale" spaces, especially if the two methods achieve comparable results. In this study, trainees' orientation and navigation performance during simulated space station emergency egress tasks was compared while using immersive head-mounted display (HMD) and DT-VR systems. Analyses showed no differences in pointing angular error or egress time among the groups. The HMD group was significantly faster than the DT group when pointing from the destination to the start location and from the start toward a different destination. However, this may be attributed to differences in the input device used (a head-tracker for the HMD group vs. a keyboard touchpad or a gamepad in the DT group). All other 3D navigation performance measures were similar using the immersive and non-immersive VR systems, suggesting that the simpler desktop VR system may be useful for astronaut 3D navigation training.
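One of the study's outcome measures is pointing angular error. Assuming pointed directions and locations are available as 3D vectors, a common way to compute such an error is the angle between the pointed direction and the true bearing from the pointing location to the target; the short sketch below is illustrative, not the authors' code.

```python
# Illustrative computation of a pointing angular error: the angle between
# the direction a trainee pointed and the true direction to the target.
import numpy as np


def pointing_angular_error(pointed_dir, location, target):
    """Angle in degrees between the pointed direction and the true bearing."""
    pointed = np.asarray(pointed_dir, dtype=float)
    true_dir = np.asarray(target, dtype=float) - np.asarray(location, dtype=float)
    cos_angle = np.dot(pointed, true_dir) / (np.linalg.norm(pointed) * np.linalg.norm(true_dir))
    return np.degrees(np.arccos(np.clip(cos_angle, -1.0, 1.0)))


# Example: trainee points straight ahead, but the target hatch is up and to the right.
print(round(pointing_angular_error([1, 0, 0], [0, 0, 0], [2, 1, 1]), 1))  # ~35.3 degrees
```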
Accessible virtual reality therapy using portable media devices.
Bruck, Susan; Watters, Paul A
2010-01-01
Simulated immersive environments displayed on large screens are a valuable therapeutic asset in the treatment of a range of psychological disorders. Permanent environments are expensive to build and maintain, require specialized clinician training and technical support, and often have limited accessibility for clients. Ideally, virtual reality exposure therapy (VRET) could be made accessible to the broader community by using inexpensive hardware with specifically designed software. This study tested whether viewing a handheld non-immersive media device causes nausea and other cybersickness responses. Using a repeated-measures design, we found that nausea, general discomfort, eyestrain, blurred vision, and salivation significantly increased in response to handheld non-immersive media device exposure.
Emerging CAE technologies and their role in Future Ambient Intelligence Environments
NASA Astrophysics Data System (ADS)
Noor, Ahmed K.
2011-03-01
Dramatic improvements are on the horizon in Computer Aided Engineering (CAE) and various simulation technologies. The improvements are due, in part, to developments in a number of leading-edge technologies and their synergistic combinations/convergence. The technologies include ubiquitous, cloud, and petascale computing; ultra high-bandwidth networks and pervasive wireless communication; knowledge-based engineering; networked immersive virtual environments and virtual worlds; novel human-computer interfaces; and powerful game engines and facilities. This paper describes the frontiers and emerging simulation technologies and their role in future virtual product creation and learning/training environments. The environments will be ambient intelligence environments, incorporating a synergistic combination of novel agent-supported visual simulations (with cognitive learning and understanding abilities); immersive 3D virtual world facilities; development chain management systems and facilities (incorporating a synergistic combination of intelligent engineering and management tools); nontraditional methods; intelligent, multimodal and human-like interfaces; and mobile wireless devices. The virtual product creation environment will significantly enhance productivity and stimulate creativity and innovation in future global virtual collaborative enterprises. The facilities in the learning/training environment will provide timely, engaging, personalized/collaborative, and tailored visual learning.
Gomez, Jocelyn; Hoffman, Hunter G; Bistricky, Steven L; Gonzalez, Miriam; Rosenberg, Laura; Sampaio, Mariana; Garcia-Palacios, Azucena; Navarro-Haro, Maria V; Alhalabi, Wadee; Rosenberg, Marta; Meyer, Walter J; Linehan, Marsha M
2017-01-01
Sustaining a burn injury increases an individual's risk of developing psychological problems such as generalized anxiety, negative emotions, depression, acute stress disorder, or post-traumatic stress disorder. Despite the growing use of Dialectical Behavioral Therapy® (DBT®) by clinical psychologists, to date, there are no published studies using standard DBT® or DBT® skills learning for severe burn patients. The current study explored the feasibility and clinical potential of using Immersive Virtual Reality (VR) enhanced DBT® mindfulness skills training to reduce negative emotions and increase positive emotions of a patient with severe burn injuries. The participant was a hospitalized (in house) 21-year-old Spanish speaking Latino male patient being treated for a large (>35% TBSA) severe flame burn injury. Methods: The patient looked into a pair of Oculus Rift DK2 virtual reality goggles to perceive the computer-generated virtual reality illusion of floating down a river, with rocks, boulders, trees, mountains, and clouds, while listening to DBT® mindfulness training audios during 4 VR sessions over a 1 month period. Study measures were administered before and after each VR session. Results: As predicted, the patient reported increased positive emotions and decreased negative emotions. The patient also accepted the VR mindfulness treatment technique. He reported the sessions helped him become more comfortable with his emotions and he wanted to keep using mindfulness after returning home. Conclusions: Dialectical Behavioral Therapy is an empirically validated treatment approach that has proved effective with non-burn patient populations for treating many of the psychological problems experienced by severe burn patients. The current case study explored for the first time, the use of immersive virtual reality enhanced DBT® mindfulness skills training with a burn patient. The patient reported reductions in negative emotions and increases in positive emotions, after VR DBT® mindfulness skills training. Immersive Virtual Reality is becoming widely available to mainstream consumers, and thus has the potential to make this treatment available to a much wider number of patient populations, including severe burn patients. Additional development, and controlled studies are needed.
Virtual Reality Cerebral Aneurysm Clipping Simulation With Real-time Haptic Feedback
Alaraj, Ali; Luciano, Cristian J.; Bailey, Daniel P.; Elsenousi, Abdussalam; Roitberg, Ben Z.; Bernardo, Antonio; Banerjee, P. Pat; Charbel, Fady T.
2014-01-01
Background With the decrease in the number of cerebral aneurysms treated surgically and the increase of complexity of those treated surgically, there is a need for simulation-based tools to teach future neurosurgeons the operative techniques of aneurysm clipping. Objective To develop and evaluate the usefulness of a new haptic-based virtual reality (VR) simulator in the training of neurosurgical residents. Methods A real-time sensory haptic feedback virtual reality aneurysm clipping simulator was developed using the Immersive Touch platform. A prototype middle cerebral artery aneurysm simulation was created from a computed tomography angiogram. Aneurysm and vessel volume deformation and haptic feedback are provided in a 3-D immersive VR environment. Intraoperative aneurysm rupture was also simulated. Seventeen neurosurgery residents from three residency programs tested the simulator and provided feedback on its usefulness and resemblance to real aneurysm clipping surgery. Results Residents felt that the simulation would be useful in preparing for real-life surgery. About two thirds of the residents felt that the 3-D immersive anatomical details provided a very close resemblance to real operative anatomy and accurate guidance for deciding surgical approaches. They believed the simulation is useful for preoperative surgical rehearsal and neurosurgical training. One third of the residents felt that the technology in its current form provided very realistic haptic feedback for aneurysm surgery. Conclusion Neurosurgical residents felt that the novel immersive VR simulator is helpful in their training especially since they do not get a chance to perform aneurysm clippings until very late in their residency programs. PMID:25599200
Huber, Tobias; Paschold, Markus; Hansen, Christian; Wunderling, Tom; Lang, Hauke; Kneist, Werner
2017-11-01
Virtual reality (VR) and head-mounted displays (HMDs) have advanced for multimedia and information technologies but have scarcely been used in surgical training. Motion sickness and individual psychological changes have been associated with VR. The goal was to observe first experiences and performance scores using a new combined highly immersive virtual reality (IVR) laparoscopy setup. During the study, 10 members of the surgical department performed three tasks (fine dissection, peg transfer, and cholecystectomy) on a VR simulator. We then combined a VR HMD with the VR laparoscopic simulator and displayed the simulation within a 360° video of a laparoscopic operation to create an IVR laparoscopic simulation. The tasks were then repeated. Validated questionnaires on immersion and motion sickness were used for the study. Participants' times for fine dissection were significantly longer during the IVR session (regular: 86.51 s [62.57 s; 119.62 s] vs. IVR: 112.35 s [82.08 s; 179.40 s]; p = 0.022). The cholecystectomy task had higher error rates during IVR. Motion sickness did not occur at any time for any participant. Participants experienced a high level of exhilaration, rarely thought about others in the room, and had a high impression of presence in the generated IVR world. This is the first clinical and technical feasibility study using the full IVR laparoscopy setup combined with the latest laparoscopic simulator in a 360° surrounding. Participants were exhilarated by the high level of immersion. The setup enables a completely new generation of surgical training.
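The reported comparison of task times between the regular and IVR sessions (medians with what appear to be interquartile ranges and a p-value) is the kind of result a paired non-parametric test can produce. The sketch below applies a Wilcoxon signed-rank test to made-up times for ten participants; it is one plausible analysis under that assumption, not the study's data or code.

```python
# Hedged sketch: paired comparison of task times across two conditions.
# The numbers are synthetic, for illustration only.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
regular_times = rng.normal(90, 20, size=10)           # seconds, regular VR simulator
ivr_times = regular_times + rng.normal(25, 15, size=10)  # same participants, IVR setup

stat, p_value = stats.wilcoxon(regular_times, ivr_times)
print(f"median regular = {np.median(regular_times):.1f} s, "
      f"median IVR = {np.median(ivr_times):.1f} s, p = {p_value:.3f}")
```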
Yasuda, Kazuhiro; Muroi, Daisuke; Ohira, Masahiro; Iwata, Hiroyasu
2017-10-01
Unilateral spatial neglect (USN) is defined as an impaired ability to attend to and perceive stimuli on one side, and when present, it interferes seriously with daily life. These symptoms can exist for near and far spaces combined or independently, and it is important to provide effective intervention for both near and far space neglect. The purpose of this pilot study was to propose an immersive virtual reality (VR) rehabilitation program using a head-mounted display that is able to train both near and far space neglect, and to validate the immediate effect of the VR program on both near and far space neglect. Ten USN patients underwent the VR program with a pre-post design and no control. In the virtual environment, we developed visual searching and reaching tasks using an immersive VR system. Behavioral inattention test (BIT) scores obtained pre- and immediately post-VR program were compared. BIT scores obtained pre- and post-VR program revealed that far space neglect, but not near space neglect, improved promptly after the VR program. This effect for far space neglect was observed in the cancellation task, but not in the line bisection task. Positive effects of the immersive VR program for far space neglect are suggested by the results of the present pilot study. However, further studies with rigorous designs are needed to validate its clinical effectiveness.
Virtual reality enhanced mannequin (VREM) that is well received by resuscitation experts.
Semeraro, Federico; Frisoli, Antonio; Bergamasco, Massimo; Cerchiari, Erga L
2009-04-01
The objective of this study was to test acceptance of, and interest in, a newly developed prototype of a virtual reality enhanced mannequin (VREM) on a sample of congress attendees who volunteered to participate in the evaluation session and to respond to a specifically designed questionnaire. A commercial Laerdal HeartSim 4000 mannequin was adapted to integrate virtual reality (VR) technologies with specially developed virtual reality software to increase the immersive perception of emergency scenarios. To evaluate the acceptance of the virtual reality enhanced mannequin (VREM), we presented it to a sample of 39 possible users. Each evaluation session involved one trainee and two instructors with a standardized procedure and scenario: the operator was invited by the instructor to wear the data-gloves and the head-mounted display and was briefly introduced to the scope of the simulation. The instructor helped the operator familiarize himself with the environment. After the patient's collapse, the operator was asked to check the patient's clinical condition and start CPR. Finally, the patient started to recover signs of circulation and the evaluation session was concluded. Each participant was then asked to respond to a questionnaire designed to explore the trainee's perception in the areas of user-friendliness, realism, and interaction/immersion. Overall, the evaluation of the system was very positive, as was the feeling of immersion and realism of the environment and simulation. Overall, 84.6% of the participants judged the virtual reality experience as interesting and believed that its development could be very useful for healthcare training. The prototype of the virtual reality enhanced mannequin was well liked, without interference from the interaction devices, and deserves full technological development and validation in emergency medical training.
Gatica-Rojas, Valeska; Méndez-Rebolledo, Guillermo
2014-04-15
Two key characteristics of all virtual reality applications are interaction and immersion. Systemic interaction is achieved through a variety of multisensory channels (hearing, sight, touch, and smell), permitting the user to interact with the virtual world in real time. Immersion is the degree to which a person can feel wrapped in the virtual world through a defined interface. Virtual reality interface devices such as the Nintendo® Wii and its peripherals (Nunchuk and balance board), head-mounted displays, and joysticks allow interaction and immersion in unreal environments created from computer software. Virtual environments are highly interactive, generating great activation of the visual, vestibular, and proprioceptive systems during the execution of a video game. In addition, they are entertaining and safe for the user. Recently, incorporating therapeutic purposes into virtual reality interface devices has allowed them to be used for the rehabilitation of neurological patients, e.g., balance training in older adults and dynamic stability in healthy participants. The improvements observed in neurological diseases (chronic stroke and cerebral palsy) have been shown by changes in the reorganization of neural networks in patients' brains, along with better hand function and other skills, contributing to their quality of life. The data generated by such studies could substantially contribute to physical rehabilitation strategies.
Virtual Reality to Train Diagnostic Skills in Eating Disorders. Comparison of two Low Cost Systems.
Gutiérrez-Maldonado, José; Ferrer-García, Marta; Plasanjuanelo, Joana; Andrés-Pueyo, Antonio; Talarn-Caparrós, Antoni
2015-01-01
Enhancing the ability to perform differential diagnosis and psychopathological exploration is important for students who wish to work in the clinical field, as well as for professionals already working in this area. Virtual reality (VR) simulations can immerse students totally in educational experiences in a way that is not possible using other methods. Learning in a VR environment can also be more effective and motivating than usual classroom practices. Traditionally, immersion has been considered central to the quality of a VR system; immersive VR is considered a special and unique experience that cannot be achieved by three-dimensional (3D) interactions on desktop PCs. However, some authors have suggested that if the content design is emotionally engaging, immersive systems are not always necessary. The main purpose of this study is to compare the efficacy and usability of two low-cost VR systems, offering different levels of immersion, in order to develop the ability to perform diagnostic interviews in eating disorders by means of simulations of psychopathological explorations.
Creating virtual humans for simulation-based training and planning
DOE Office of Scientific and Technical Information (OSTI.GOV)
Stansfield, S.; Sobel, A.
1998-05-12
Sandia National Laboratories has developed a distributed, high-fidelity simulation system for training and planning small team operations. The system provides an immersive environment populated by virtual objects and humans capable of displaying complex behaviors. The work has focused on developing the behaviors required to carry out complex tasks and decision making under stress. Central to this work are techniques for creating behaviors for virtual humans and for dynamically assigning behaviors to computer generated forces (CGF) to allow scenarios without fixed outcomes. Two prototype systems have been developed that illustrate these capabilities: MediSim, a trainer for battlefield medics, and VRaptor, a system for planning, rehearsing, and training assault operations.
Bombari, Dario; Schmid Mast, Marianne; Canadas, Elena; Bachmann, Manuel
2015-01-01
The goal of the present review is to explain how immersive virtual environment technology (IVET) can be used for the study of social interactions and how the use of virtual humans in immersive virtual environments can advance research and application in many different fields. Researchers studying individual differences in social interactions are typically interested in keeping the behavior and the appearance of the interaction partner constant across participants. With IVET researchers have full control over the interaction partners, can standardize them while still keeping the simulation realistic. Virtual simulations are valid: growing evidence shows that indeed studies conducted with IVET can replicate some well-known findings of social psychology. Moreover, IVET allows researchers to subtly manipulate characteristics of the environment (e.g., visual cues to prime participants) or of the social partner (e.g., his/her race) to investigate their influences on participants’ behavior and cognition. Furthermore, manipulations that would be difficult or impossible in real life (e.g., changing participants’ height) can be easily obtained with IVET. Beside the advantages for theoretical research, we explore the most recent training and clinical applications of IVET, its integration with other technologies (e.g., social sensing) and future challenges for researchers (e.g., making the communication between virtual humans and participants smoother). PMID:26157414
Immersive Virtual Reality in the Psychology Classroom: What Purpose Could it Serve?
ERIC Educational Resources Information Center
Coxon, Matthew
2013-01-01
Virtual reality is by no means a new technology, yet it is increasingly being used, to different degrees, in education, training, rehabilitation, therapy, and home entertainment. Although the exact reasons for this shift are not the subject of this short opinion piece, it is possible to speculate that decreased costs, and increased performance, of…
Get immersed in the Soil Sciences: the first community of avatars in the EGU Assembly 2015!
NASA Astrophysics Data System (ADS)
Castillo, Sebastian; Alarcón, Purificación; Beato, Mamen; Emilio Guerrero, José; José Martínez, Juan; Pérez, Cristina; Ortiz, Leovigilda; Taguas, Encarnación V.
2015-04-01
Virtual reality and immersive worlds refer to artificial computer-generated environments, with which users act and interact as in a known environment through the use of figurative virtual individuals (avatars). Virtual environments will be the technology of the early twenty-first century that will most dramatically change the way we live, particularly in the areas of training and education, product development and entertainment (Schmorrow, 2009). The usefulness of immersive worlds has been proven in different fields. They reduce geographic and social barriers between different stakeholders and create virtual social spaces which can positively impact learning and discussion outcomes (Lorenzo et al. 2012). In this work we present a series of interactive meetings in a virtual building, held to celebrate the International Year of Soil and to promote the importance of soil functions and soil conservation. In a virtual room, the avatars of different senior researchers will meet young scientist avatars to talk about: 1) what remains to be done in Soil Sciences; 2) the main current limitations and difficulties; and 3) the future hot research lines. Interactive participation does not require physical attendance at the EGU Assembly 2015. In addition, this virtual building inspired by Soil Sciences can be completed with different teaching resources from different locations around the world, and it will be used to improve the learning of Soil Sciences in a multicultural context. REFERENCES: Lorenzo C.M., Sicilia, M.A., Sánchez S. 2012. Studying the effectiveness of multi-user immersive environments for collaborative evaluation tasks. Computers & Education 59 (2012) 1361-1376. Schmorrow D.D. 2009. "Why virtual?" Theoretical Issues in Ergonomics Science 10(3): 279-282.
Virtual reality simulation: basic concepts and use in endoscopic neurosurgery training.
Cohen, Alan R; Lohani, Subash; Manjila, Sunil; Natsupakpong, Suriya; Brown, Nathan; Cavusoglu, M Cenk
2013-08-01
Virtual reality simulation is a promising way to train surgical residents outside the operating room. It is also a useful aid to anatomic study, residency training, surgical rehearsal, credentialing, and recertification. Surgical simulation is based on a virtual reality with varying degrees of immersion and realism. Simulators provide a no-risk environment for harmless and repeatable practice. Virtual reality has three main components of simulation: graphics/volume rendering, model behavior/tissue deformation, and haptic feedback. The challenge of accurately simulating the forces and tactile sensations experienced in neurosurgery limits the sophistication of a virtual simulator. The limited haptic feedback available in minimally invasive neurosurgery makes it a favorable subject for simulation. Virtual simulators with realistic graphics and force feedback have been developed for ventriculostomy, intraventricular surgery, and transsphenoidal pituitary surgery, thus allowing preoperative study of the individual anatomy and increasing the safety of the procedure. The authors also present experiences with their own virtual simulation of endoscopic third ventriculostomy.
Sensorimotor Learning during a Marksmanship Task in Immersive Virtual Reality
Rao, Hrishikesh M.; Khanna, Rajan; Zielinski, David J.; Lu, Yvonne; Clements, Jillian M.; Potter, Nicholas D.; Sommer, Marc A.; Kopper, Regis; Appelbaum, Lawrence G.
2018-01-01
Sensorimotor learning refers to improvements that occur through practice in the performance of sensory-guided motor behaviors. Leveraging novel technical capabilities of an immersive virtual environment, we probed the component kinematic processes that mediate sensorimotor learning. Twenty naïve subjects performed a simulated marksmanship task modeled after Olympic Trap Shooting standards. We measured movement kinematics and shooting performance as participants practiced 350 trials while receiving trial-by-trial feedback about shooting success. Spatiotemporal analysis of motion tracking elucidated the ballistic and refinement phases of hand movements. We found systematic changes in movement kinematics that accompanied improvements in shot accuracy during training, though reaction and response times did not change over blocks. In particular, we observed longer, slower, and more precise ballistic movements that replaced effort spent on corrections and refinement. Collectively, these results leverage developments in immersive virtual reality technology to quantify and compare the kinematics of movement during early learning of full-body sensorimotor orienting. PMID:29467693
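Segmenting hand movements into a ballistic phase and a refinement phase, as described above, can be approximated with a simple speed-threshold rule applied to tracked positions. The sketch below illustrates that generic idea on synthetic data; the threshold value and the segmentation rule are assumptions for illustration, not the authors' actual pipeline.

```python
# Illustrative segmentation of a tracked trajectory into a fast "ballistic"
# phase and a slower "refinement" phase using a speed threshold.
import numpy as np


def segment_phases(positions, dt, speed_threshold=0.2):
    """Return the sample index where speed first drops below threshold after its peak."""
    velocities = np.diff(positions, axis=0) / dt
    speeds = np.linalg.norm(velocities, axis=1)
    peak = int(np.argmax(speeds))
    below = np.where(speeds[peak:] < speed_threshold)[0]
    return peak + int(below[0]) if below.size else len(speeds)


# Synthetic 100 Hz reach: a quick sweep toward the target, then small corrections.
t = np.linspace(0.0, 1.0, 101)
trajectory = np.column_stack([np.minimum(2 * t, 1.0), 0.02 * np.sin(20 * t)])
split = segment_phases(trajectory, dt=t[1] - t[0])
print(f"ballistic phase: samples 0-{split}, refinement phase: samples {split}-{len(t) - 1}")
```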
NASA Astrophysics Data System (ADS)
McKenna, Kyra; McMenemy, Karen; Ferguson, R. S.; Dick, Alistair; Potts, Stephen
2008-02-01
Computer simulators are a popular method of training surgeons in the techniques of laparoscopy. However, for the trainee to feel totally immersed in the process, the graphical display should be as lifelike as possible and two-handed force feedback interaction is required. This paper reports on how a compelling immersive experience can be delivered at low cost using commonly available hardware components. Three specific themes are brought together. Firstly, programmable shaders executing on a standard PC graphics adapter deliver the appearance of anatomical realism, including effects of translucent tissue surfaces, semi-transparent membranes, multilayer image texturing, and real-time shadowing. Secondly, relatively inexpensive 'off the shelf' force feedback devices contribute to a holistic immersive experience. The final element described is the custom software that brings these together with hierarchically organized and optimized polygonal models of abdominal anatomy.
A Critical Review of the Use of Virtual Reality in Construction Engineering Education and Training.
Wang, Peng; Wu, Peng; Wang, Jun; Chi, Hung-Lin; Wang, Xiangyu
2018-06-08
Virtual Reality (VR) has been rapidly recognized and implemented in construction engineering education and training (CEET) in recent years due to its benefits of providing an engaging and immersive environment. The objective of this review is to critically collect and analyze the VR applications in CEET, covering all VR-related journal papers published from 1997 to 2017. The review follows a systematic three-stage analysis of VR technologies, applications, and future directions. It is found that the VR technologies adopted for CEET evolve over time, from desktop-based VR, immersive VR, and 3D game-based VR to Building Information Modelling (BIM)-enabled VR. A sibling technology, Augmented Reality (AR), has also emerged for CEET adoption in recent years. These technologies have been applied in architecture and design visualization, construction health and safety training, equipment and operational task training, as well as structural analysis. Future research directions, including the integration of VR with emerging education paradigms and visualization technologies, have also been provided. The findings are useful for both researchers and educators seeking to integrate VR into their education and training programs and improve training performance.
Virtually compliant: Immersive video gaming increases conformity to false computer judgments.
Weger, Ulrich W; Loughnan, Stephen; Sharma, Dinkar; Gonidis, Lazaros
2015-08-01
Real-life face-to-face encounters are on the decline in a world in which many routine tasks are delegated to virtual characters, a development that bears both opportunities and risks. Interacting with such virtual-reality beings is particularly common during role-playing video games, in which we incarnate as an avatar within a virtual reality. Video gaming is known to lead to the training and development of real-life skills and behaviors; hence, in the present study we sought to explore whether role-playing video gaming primes individuals' identification with a computer enough to increase computer-related social conformity. Following immersive video gaming, individuals were indeed more likely to give up their own best judgment and to follow the vote of computers, especially when the stimulus context was ambiguous. Implications for human-computer interactions and for our understanding of the formation of identity and self-concept are discussed.
Neurosurgical tactile discrimination training with haptic-based virtual reality simulation.
Patel, Achal; Koshy, Nick; Ortega-Barnett, Juan; Chan, Hoi C; Kuo, Yong-Fan; Luciano, Cristian; Rizzi, Silvio; Matulyauskas, Martin; Kania, Patrick; Banerjee, Pat; Gasco, Jaime
2014-12-01
To determine if a computer-based simulation with haptic technology can help surgical trainees improve tactile discrimination using surgical instruments. Twenty junior medical students participated in the study and were randomized into two groups. Subjects in Group A participated in virtual simulation training using the ImmersiveTouch simulator (ImmersiveTouch, Inc., Chicago, IL, USA) that required differentiating the firmness of virtual spheres using tactile and kinesthetic sensation via haptic technology. Subjects in Group B did not undergo any training. With their visual fields obscured, subjects in both groups were then evaluated on their ability to use the suction and bipolar instruments to find six elastothane objects with areas ranging from 1.5 to 3.5 cm2 embedded in a urethane foam brain cavity model while relying on tactile and kinesthetic sensation only. A total of 73.3% of the subjects in Group A (simulation training) were able to find the brain cavity objects in comparison to 53.3% of the subjects in Group B (no training) (P = 0.0183). There was a statistically significant difference in the total number of Group A subjects able to find smaller brain cavity objects (size ≤ 2.5 cm2) compared to that in Group B (72.5% vs. 40%, P = 0.0032). On the other hand, no significant difference in the number of subjects able to detect larger objects (size ≥ 3 cm2) was found between Groups A and B (75% vs. 80%, P = 0.7747). Virtual computer-based simulators with integrated haptic technology may improve tactile discrimination required for microsurgical technique.
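The group comparison of detection rates (73.3% vs. 53.3%) is the kind of result a standard test of two proportions can yield. The sketch below applies Fisher's exact test to hypothetical counts chosen only to roughly match the reported percentages; the paper's actual analysis and denominators may differ.

```python
# Hedged sketch: comparing detection proportions between a trained group and
# an untrained group. Counts are hypothetical, for illustration only.
from scipy import stats

# rows: trained (Group A), untrained (Group B); columns: objects found, missed
found_a, missed_a = 44, 16   # 44/60 = 73.3%
found_b, missed_b = 32, 28   # 32/60 = 53.3%

odds_ratio, p_value = stats.fisher_exact([[found_a, missed_a], [found_b, missed_b]])
print(f"Group A: {found_a / (found_a + missed_a):.1%}, "
      f"Group B: {found_b / (found_b + missed_b):.1%}, Fisher exact p = {p_value:.4f}")
```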
NASA Technical Reports Server (NTRS)
1998-01-01
Flogiston Corporation was included in Spinoff 1992 for its stress reducing chair based on data from NASA's Anthropometric Source Book. Now, under a NASA SBIR (Small Business Innovative Research), Flogiston developed a new form of immersion in virtual and video spaces called Neutral Immersion. The technology is called Flostation and is the commercial spinoff from the astronaut training system. Motorola purchased the first Flostation for testing and demonstrating a new warping chip it is developing.
NASA Technical Reports Server (NTRS)
Gendron, Gerald
2012-01-01
Over the next decade, those entering Service and Joint Staff positions within the military will come from a different generation than the current leadership. They will come from Generation Y and have differing preferences for learning. Immersive learning environments like serious games and virtual world initiatives can complement traditional training methods to provide a better overall training program for staffs. Generation Y members desire learning methods which are relevant and interactive, regardless of whether they are delivered over the internet or in person. This paper focuses on a project undertaken to assess alternative training methods to teach special operations staffs. It provides a summary of the needs analysis used to consider alternatives and to better posture the Department of Defense for future training development.
Virtual Reality Exposure Training for Musicians: Its Effect on Performance Anxiety and Quality.
Bissonnette, Josiane; Dubé, Francis; Provencher, Martin D; Moreno Sala, Maria T
2015-09-01
Music performance anxiety affects numerous musicians, with many of them reporting impairment of performance due to this problem. This exploratory study investigated the effects of virtual reality exposure training on students with music performance anxiety. Seventeen music students were randomly assigned to a control group (n=8) or a virtual training group (n=9). Participants were asked to play a musical piece by memory in two separate recitals within a 3-week interval. Anxiety was then measured with the Personal Report of Confidence as a Performer Scale and the S-Anxiety scale from the State-Trait Anxiety Inventory (STAI-Y). Between pre- and post-tests, the virtual training group took part in virtual reality exposure training consisting of six 1-hour long sessions of virtual exposure. The results indicate a significant decrease in performance anxiety for musicians in the treatment group for those with a high level of state anxiety, for those with a high level of trait anxiety, for women, and for musicians with high immersive tendencies. Finally, between the pre- and post-tests, we observed a significant increase in performance quality for the experimental group, but not for the control group.
Rizzo, Albert; Buckwalter, J Galen; John, Bruce; Newman, Brad; Parsons, Thomas; Kenny, Patrick; Williams, Josh
2012-01-01
The incidence of posttraumatic stress disorder (PTSD) in returning OEF/OIF military personnel is creating a significant healthcare challenge. This has served to motivate research on how to better develop and disseminate evidence-based treatments for PTSD. One emerging form of treatment for combat-related PTSD that has shown promise involves the delivery of exposure therapy using immersive Virtual Reality (VR). Initial outcomes from open clinical trials have been positive and fully randomized controlled trials are currently in progress to further validate this approach. Based on our research group's initial positive outcomes using VR to emotionally engage and successfully treat persons undergoing exposure therapy for PTSD, we have begun development in a similar VR-based approach to deliver stress resilience training with military service members prior to their initial deployment. The Stress Resilience In Virtual Environments (STRIVE) project aims to create a set of combat simulations (derived from our existing Virtual Iraq/Afghanistan exposure therapy system) that are part of a multi-episode narrative experience. Users can be immersed within challenging combat contexts and interact with virtual characters within these episodes as part of an experiential learning approach for training a range of psychoeducational and cognitive-behavioral emotional coping strategies believed to enhance stress resilience. The STRIVE project aims to present this approach to service members prior to deployment as part of a program designed to better prepare military personnel for the types of emotional challenges that are inherent in the combat environment. During these virtual training experiences users are monitored physiologically as part of a larger investigation into the biomarkers of the stress response. One such construct, Allostatic Load, is being directly investigated via physiological and neuro-hormonal analysis from specimen collections taken immediately before and after engagement in the STRIVE virtual experience.
Learning Relative Motion Concepts in Immersive and Non-Immersive Virtual Environments
ERIC Educational Resources Information Center
Kozhevnikov, Michael; Gurlitt, Johannes; Kozhevnikov, Maria
2013-01-01
The focus of the current study is to understand which unique features of an immersive virtual reality environment have the potential to improve learning relative motion concepts. Thirty-seven undergraduate students learned relative motion concepts using computer simulation either in immersive virtual environment (IVE) or non-immersive desktop…
NASA Astrophysics Data System (ADS)
Ritter, Kenneth August, III
Industry has a continuing need to train its workforce on recent engineering developments, but many engineering products and processes are hard to explain because of limitations of size, visibility, time scale, cost, and safety. The product or process might be difficult to see because it is either very large or very small, because it is enclosed within an opaque container, or because it happens very fast or very slowly. Some engineering products and processes are also costly or unsafe to use for training purposes, and sometimes the domain expert is not physically available at the training location. All these limitations can potentially be addressed using advanced visualization techniques such as virtual reality. This dissertation describes the development of an immersive virtual reality application using the Six Sigma DMADV process to explain the main equipment and processes used in a concentrating solar power plant. The virtual solar energy center (VEC) application was initially developed and tested in a Cave Automatic Virtual Environment (CAVE) during 2013 and 2014. The software programs used for development were SolidWorks, 3ds Max Design, and Unity 3D. Current hardware and software technologies that could complement this research were analyzed. The NVIDIA GRID Visual Computing Appliance (VCA) was chosen as the rendering solution for animating complex CAD models in this application. The MiddleVR software toolkit was selected as the toolkit for VR interactions and CAVE display. A non-immersive 3D version of the VEC application was tested and shown to be an effective training tool in late 2015. An immersive networked version of the VEC allows the user to receive live instruction from a trainer projected via depth camera imagery from a remote location. Four comparative analysis studies were performed. These studies used the average normalized gain from pre- to post-test scores to determine the effectiveness of the various training methods. With the DMADV approach, solutions were identified and verified during each iteration of the development, which saved valuable time and yielded better results in each revision of the application; the final version received 88% positive responses and was as effective as the other methods assessed.
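The average normalized gain mentioned above is a simple, well-known metric: the improvement from pre-test to post-test expressed as a fraction of the maximum possible improvement. The sketch below computes it on invented scores, assuming a 100-point scale.

```python
# Minimal sketch of the normalized gain metric: (post - pre) / (max - pre),
# averaged over trainees. Scores below are invented for illustration.
def normalized_gain(pre, post, max_score=100.0):
    """Fraction of the available improvement actually achieved."""
    return (post - pre) / (max_score - pre)


pre_scores = [40, 55, 60, 70]
post_scores = [70, 80, 85, 95]
gains = [normalized_gain(p, q) for p, q in zip(pre_scores, post_scores)]
print(f"average normalized gain = {sum(gains) / len(gains):.2f}")
```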
An Immersive VR System for Sports Education
NASA Astrophysics Data System (ADS)
Song, Peng; Xu, Shuhong; Fong, Wee Teck; Chin, Ching Ling; Chua, Gim Guan; Huang, Zhiyong
The development of new technologies has undoubtedly promoted advances in modern education; among these, Virtual Reality (VR) technologies have made education more visually accessible for students. However, classroom education has been the focus of VR applications, whereas little research has been done on promoting sports education using VR technologies. In this paper, an immersive VR system is designed and implemented to create a more intuitive and visual way of teaching tennis. A scalable system architecture is proposed in addition to the hardware setup layout, which can be used for various immersive interactive applications such as architecture walkthroughs, military training simulations, other sports game simulations, interactive theaters, and telepresent exhibitions. A realistic interaction experience is achieved through accurate and robust hybrid tracking technology, while the virtual human opponent is animated in real time using shader-based skin deformation. Potential future extensions are also discussed to improve the teaching/learning experience.
An Onboard ISS Virtual Reality Trainer
NASA Technical Reports Server (NTRS)
Miralles, Evelyn
2013-01-01
Prior to the retirement of the Space Shuttle, many exterior repairs on the International Space Station (ISS) were carried out by shuttle astronauts, trained on the ground and flown to the station to perform these repairs. After the retirement of the shuttle, this is no longer an available option. As such, the need for ISS crew members to review scenarios while in flight, either for tasks they have already trained for or for contingency operations, has become a critical subject. In many situations, the time between the last session of Neutral Buoyancy Laboratory (NBL) training and an Extravehicular Activity (EVA) task might be 6 to 8 months. In order to help with training for contingency repairs and to maintain EVA proficiency while in flight, the Johnson Space Center Virtual Reality Lab (VRLab) designed an onboard immersive ISS Virtual Reality Trainer (VRT), incorporating a unique optical system and making use of the already successful Dynamic Onboard Ubiquitous Graphical (DOUG) graphics software, to assist crew members with current procedures and contingency EVAs while in flight. The VRT provides an immersive environment similar to the one experienced at the VRLab crew training facility at NASA Johnson Space Center. EVA tasks are critical for a mission, and as time passes the crew members may lose proficiency on previously trained tasks. In addition, there is an increased need for unplanned contingency repairs to fix problems arising as the ISS ages. The need to train and re-train crew members for EVAs and contingency scenarios is crucial and extremely demanding. ISS crew members are now asked to perform EVA tasks for which they have not trained and which they potentially have never seen before.
Virtual community centre for power wheelchair training: Experience of children and clinicians.
Torkia, Caryne; Ryan, Stephen E; Reid, Denise; Boissy, Patrick; Lemay, Martin; Routhier, François; Contardo, Resi; Woodhouse, Janet; Archambault, Phillipe S
2017-11-02
The objectives were to: 1) characterize the overall experience of using the McGill immersive wheelchair - community centre (miWe-CC) simulator; and 2) investigate the experience of presence (i.e., the sense of being in the virtual rather than in the real, physical environment) while driving a power wheelchair (PW) in the miWe-CC. A qualitative research design with structured interviews was used. Fifteen clinicians and 11 children were interviewed after driving a PW in the miWe-CC simulator. Data were analyzed using the conventional and directed content analysis approaches. Overall, participants enjoyed using the simulator and experienced a sense of presence in the virtual space. They felt a sense of being in the virtual environment, and were involved and focused on driving the virtual PW rather than on the surroundings of the actual room where they were. Participants reported several similarities between the virtual community centre layout and activities of the miWe-CC and the day-to-day reality of paediatric PW users. The simulator replicated participants' expectations of real-life PW use and promises to improve the driving skills of new PW users. Implications for rehabilitation: Among young users, the McGill immersive wheelchair (miWe) simulator provides an experience of presence within the virtual environment. This experience of presence is generated by a sense of being in the virtual scene, a sense of being involved, engaged, and focused on interacting within the virtual environment, and by the perception that the virtual environment is consistent with the real world. The miWe is a relevant and accessible approach, complementary to real-world power wheelchair training for young users.
Nararro-Haro, Maria V.; Hoffman, Hunter G.; Garcia-Palacios, Azucena; Sampaio, Mariana; Alhalabi, Wadee; Hall, Karyn; Linehan, Marsha
2016-01-01
Borderline personality disorder (BPD) is a severe mental disorder characterized by a dysfunctional pattern of affective instability, impulsivity, and disturbed interpersonal relationships. Dialectical Behavior Therapy (DBT®) is the most effective treatment for Borderline Personality Disorder, but demand for DBT® far exceeds existing clinical resources. Most patients with BPD never receive DBT®. Incorporating computer technology into the DBT® could help increase dissemination. Immersive Virtual Reality technology (VR) is becoming widely available to mainstream consumers. This case study explored the feasibility/clinical potential of using immersive virtual reality technology to enhance DBT® mindfulness skills training of a 32 year old female diagnosed with BPD. Prior to using VR, the patient experienced difficulty practicing DBT® mindfulness due to her emotional reactivity, and difficulty concentrating. To help the patient focus her attention, and to facilitate DBT® mindfulness skills learning, the patient looked into virtual reality goggles, and had the illusion of slowly “floating down” a 3D computer-generated river while listening to DBT® mindfulness training audios. Urges to commit suicide, urges to self harm, urges to quit therapy, urges to use substances, and negative emotions were all reduced after each VR mindfulness session and VR mindfulness was well accepted/liked by the patient. Although case studies are scientifically inconclusive by nature, results from this feasibility study were encouraging. Future controlled studies are needed to quantify whether VR-enhanced mindfulness training has long term benefits e.g., increasing patient acceptance and/or improving therapeutic outcome. Computerizing some of the DBT® skills treatment modules would reduce cost and increase dissemination. PMID:27853437
2004-12-01
Post Traumatic Stress Disorder (PTSD) is reported to be caused by traumatic events that are outside the range of usual human...available X-Box game, Full Spectrum Warrior. 2. POST TRAUMATIC STRESS DISORDER According to the DSM-IV (1994), PTSD is caused by traumatic events...H. (2002). Virtual reality exposure therapy for World Trade Center Post Traumatic Stress Disorder. Cyberpsychology
De Luca, Rosaria; Torrisi, Michele; Piccolo, Adriana; Bonfiglio, Giovanni; Tomasello, Provvidenza; Naro, Antonino; Calabrò, Rocco Salvatore
2017-10-11
Cognitive impairment, as well as mood and anxiety disorders, occurs frequently in patients following stroke. The aim of this study was to evaluate the effects of a combined rehabilitative treatment using conventional relaxation and respiratory techniques in a specific rehabilitative virtual environment (using Bts-Nirvana). A 58-year-old woman, affected by hemorrhagic stroke, underwent two different rehabilitation trainings: either standard relaxation techniques alone in a common clinical setting or the same psychological approach in a semi-immersive virtual environment with augmented sensorial (audio-video) and motor feedback (sensory-motor interaction). We evaluated the patient's cognitive and psychological profile before and after the two trainings using a specific psychometric battery aimed at assessing cognitive status and attention processes and at estimating the presence of mood alterations, anxiety, and coping strategies. Only at the end of the combined approach did we observe a significant improvement in attention and memory functions, with nearly complete relief of anxiety symptoms and an improvement in coping strategies. Relaxation and respiratory techniques in a semi-immersive virtual reality environment, using Bts-Nirvana, may be a promising tool for improving attention processes and coping strategies and for reducing anxiety in individuals with neurological disorders, including stroke.
Haptic, Virtual Interaction and Motor Imagery: Entertainment Tools and Psychophysiological Testing
Invitto, Sara; Faggiano, Chiara; Sammarco, Silvia; De Luca, Valerio; De Paolis, Lucio T.
2016-01-01
In this work, the perception of affordances was analysed in terms of cognitive neuroscience during an interactive experience in a virtual reality environment. In particular, we chose a virtual reality scenario based on the Leap Motion controller: this sensor device captures the movements of the user's hand and fingers, which are reproduced on a computer screen by the appropriate software applications. For our experiment, we employed a sample of 10 subjects matched by age and sex and chosen from among university students. The subjects took part in motor imagery training and in an immersive affordance condition (virtual training with Leap Motion and haptic training with real objects). After each training session, the subjects performed a recognition task in order to investigate event-related potential (ERP) components. The results revealed significant differences in the attentional components during the Leap Motion training. During the Leap Motion session, latencies increased in the occipital lobes, which handle visual sensory processing; in contrast, latencies decreased in the frontal lobe, which is mainly activated for attention and action planning. PMID:26999151
Training presence: the importance of virtual reality experience on the "sense of being there".
Gamito, Pedro; Oliveira, Jorge; Morais, Diogo; Baptista, André; Santos, Nuno; Soares, Fábio; Saraiva, Tomaz; Rosa, Pedro
2010-01-01
The nature and origin of presence are still unclear. Although it can be characterized, from a neurophysiological perspective, as a process resulting from synchrony between cognitive and perceptive systems, the multitude of associated processes reduces the chances of brain-mapping presence. Our study was therefore designed to understand the possible role of VR experience on presence in a virtual environment. Sixteen participants (M=28.39 years; SD=13.44) of both genders without computer experience were selected. The study design consisted of two assessments (initial and final), in which the participants were evaluated with the BFI, PQ, ITQ, QC, MCSDS-SF, STAI, and visual attention and behavioral measures after playing a first-person shooter (FPS) game. To manipulate the level of VR experience, the participants were trained on a different FPS during 12 weekly sessions of 30 minutes. Results revealed significant differences between the initial and final assessments for presence (F(1,15)=11.583; MSE=775.538; p<.01) and immersion scores (F(1,15)=6.234; MSE=204.962; p<.05), indicating higher levels of presence and immersion in the final assessment. No statistically significant results were obtained for cybersickness or the behavioral measures. In summary, our results showed that training and the subsequently higher computer experience levels can increase immersion and presence.
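Because the design has only two assessments, each reported F(1,15) is equivalent to the square of a paired t statistic. A minimal sketch of that equivalence on hypothetical presence (PQ) scores for 16 participants (the score distributions are invented):

import numpy as np
from scipy.stats import ttest_rel

rng = np.random.default_rng(1)
pq_initial = rng.normal(90, 15, size=16)              # hypothetical initial PQ scores
pq_final = pq_initial + rng.normal(12, 10, size=16)   # assumed gain after training

t, p = ttest_rel(pq_final, pq_initial)
print(f"t(15) = {t:.2f}, p = {p:.4f}")
# With two repeated measures, a one-way repeated-measures ANOVA gives
# F(1,15) = t**2, the form of statistic reported in the abstract.
print(f"equivalent F(1,15) = {t**2:.2f}")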
Height, social comparison, and paranoia: An immersive virtual reality experimental study
Freeman, Daniel; Evans, Nicole; Lister, Rachel; Antley, Angus; Dunn, Graham; Slater, Mel
2014-01-01
Mistrust of others may build upon perceptions of the self as vulnerable, consistent with an association of paranoia with perceived lower social rank. Height is a marker of social status and authority. Therefore we tested the effect of manipulating height, as a proxy for social rank, on paranoia. Height was manipulated within an immersive virtual reality simulation. Sixty females who reported paranoia experienced a virtual reality train ride twice: at their normal and reduced height. Paranoia and social comparison were assessed. Reducing a person's height resulted in more negative views of the self in comparison with other people and increased levels of paranoia. The increase in paranoia was fully mediated by changes in social comparison. The study provides the first demonstration that reducing height in a social situation increases the occurrence of paranoia. The findings indicate that negative social comparison is a cause of mistrust. PMID:24924485
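The abstract states that the paranoia increase was fully mediated by changes in social comparison but does not detail the mediation model. The following sketch is a generic single-mediator bootstrap analysis on invented data; for simplicity it treats height condition as a between-subjects variable, ignoring the repeated-measures structure the study actually used, so it should be read as a template rather than the authors' analysis.

import numpy as np

rng = np.random.default_rng(2)

def ols_slope(x, y):
    # Slope of y regressed on x (simple least squares)
    return np.polyfit(x, y, 1)[0]

# Invented data: 0 = normal height, 1 = reduced height
n = 60
condition = np.repeat([0, 1], n)
social_comparison = 0.8 * condition + rng.normal(size=2 * n)    # mediator
paranoia = 0.7 * social_comparison + rng.normal(size=2 * n)     # outcome

a = ols_slope(condition, social_comparison)         # path a: X -> M
X = np.column_stack([np.ones(2 * n), condition, social_comparison])
b = np.linalg.lstsq(X, paranoia, rcond=None)[0][2]  # path b: M -> Y, controlling X
indirect = a * b

# Percentile bootstrap for the indirect effect
boots = []
for _ in range(2000):
    idx = rng.integers(0, 2 * n, 2 * n)
    a_b = ols_slope(condition[idx], social_comparison[idx])
    Xb = np.column_stack([np.ones(2 * n), condition[idx], social_comparison[idx]])
    b_b = np.linalg.lstsq(Xb, paranoia[idx], rcond=None)[0][2]
    boots.append(a_b * b_b)
lo, hi = np.percentile(boots, [2.5, 97.5])
print(f"indirect effect = {indirect:.2f}, 95% CI [{lo:.2f}, {hi:.2f}]")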
ERIC Educational Resources Information Center
Savin-Baden, Maggi
2008-01-01
Learning in immersive virtual worlds (simulations and virtual worlds such as Second Life) could become a central learning approach in many curricula, but the socio-political impact of virtual world learning on higher education remains under-researched. Much of the recent research into learning in immersive virtual worlds centres around games and…
A haptic interface for virtual simulation of endoscopic surgery.
Rosenberg, L B; Stredney, D
1996-01-01
Virtual reality can be described as a convincingly realistic and naturally interactive simulation in which the user is given a first-person illusion of being immersed within a computer-generated environment. While virtual reality systems offer great potential to reduce the cost and increase the quality of medical training, many technical challenges must be overcome before such simulation platforms offer effective alternatives to more traditional training means. A primary challenge in developing effective virtual reality systems is designing the human interface hardware which allows rich sensory information to be presented to users in natural ways. When simulating a given manual procedure, task-specific human interface requirements dictate task-specific human interface hardware. The following paper explores the design of human interface hardware that satisfies the task-specific requirements of virtual reality simulation of endoscopic surgical procedures. Design parameters were derived through direct cadaver studies and interviews with surgeons. The final hardware design is presented.
Virtual reality as a tool for cross-cultural communication: an example from military team training
NASA Astrophysics Data System (ADS)
Downes-Martin, Stephen; Long, Mark; Alexander, Joanna R.
1992-06-01
A major problem with communication across cultures, whether professional or national, is that simple language translation is often insufficient to communicate the concepts. This is especially true when the communicators come from highly specialized fields of knowledge or from national cultures with long histories of divergence. This problem becomes critical when the goal of the communication is national negotiation dealing with such high-risk items as arms negotiation or trade wars. Virtual Reality technology has considerable potential for facilitating communication across cultures, by immersing the communicators within multiple visual representations of the concepts, and providing control over those representations. Military distributed team training provides a model for virtual reality suitable for cross-cultural communication such as negotiation. In both team training and negotiation, the participants must cooperate, agree on a set of goals, and achieve mastery over the concepts being negotiated. Team training technologies suitable for supporting cross-cultural negotiation exist (branch wargaming, computer image generation and visualization, distributed simulation), and have developed along different lines than traditional virtual reality technology. Team training de-emphasizes the realism of physiological interfaces between the human and the virtual reality, and emphasizes the interaction of humans with each other and with intelligent simulated agents within the virtual reality. This approach to virtual reality is suggested as being more fruitful for future work.
A model for flexible tools used in minimally invasive medical virtual environments.
Soler, Francisco; Luzon, M Victoria; Pop, Serban R; Hughes, Chris J; John, Nigel W; Torres, Juan Carlos
2011-01-01
Within the limits of current technology, many applications of a virtual environment will trade off accuracy for speed. This is not an acceptable compromise in a medical training application, where both are essential. Efficient algorithms must therefore be developed. The purpose of this project is the development and validation of a novel physics-based real-time tool manipulation model, which is easy to integrate into any medical virtual environment that requires support for the insertion of long flexible tools into complex geometries. This encompasses medical specialities such as vascular interventional radiology, endoscopy, and laparoscopy, where training, prototyping of new instruments/tools and mission rehearsal can all be facilitated by using an immersive medical virtual environment. Our model recognises and accurately uses patient-specific data and adapts to the geometrical complexity of the vessel in real time.
OR fire virtual training simulator: design and face validity.
Dorozhkin, Denis; Olasky, Jaisa; Jones, Daniel B; Schwaitzberg, Steven D; Jones, Stephanie B; Cao, Caroline G L; Molina, Marcos; Henriques, Steven; Wang, Jinling; Flinn, Jeff; De, Suvranu
2017-09-01
The Virtual Electrosurgical Skill Trainer is a tool for training surgeons in the safe operation of electrosurgery tools in both open and minimally invasive surgery. This training includes a dedicated team-training module that focuses on operating room (OR) fire prevention and response. The module was developed to allow trainees, practicing surgeons, anesthesiologists, and nurses to interact with a virtual OR environment, which includes anesthesia apparatus, electrosurgical equipment, a virtual patient, and a fire extinguisher. Wearing a head-mounted display, participants must correctly identify the "fire triangle" elements and then successfully contain an OR fire. Within these virtual reality scenarios, trainees learn to react appropriately to the simulated emergency. A study targeted at establishing the face validity of the virtual OR fire simulator was undertaken at the 2015 Society of American Gastrointestinal and Endoscopic Surgeons conference. Forty-nine subjects with varying experience participated in this Institutional Review Board-approved study. The subjects were asked to complete the OR fire training/prevention sequence in the VEST simulator. Subjects were then asked to answer a subjective preference questionnaire consisting of sixteen questions, focused on the usefulness and fidelity of the simulator. On a 5-point scale, 12 of 13 questions were rated at a mean of 3 or greater (92%). Five questions were rated above 4 (38%), particularly those focusing on the simulator's effectiveness and its usefulness in OR fire safety training. A total of 33 of the 49 participants (67%) chose the virtual OR fire trainer over traditional training methods such as a textbook or an animal model. Training for OR fire emergencies in fully immersive VR environments, such as the VEST trainer, may be the ideal training modality. The face validity of the OR fire training module of the VEST simulator was successfully established on many aspects of the simulation.
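The face-validity results are simple summaries of 5-point Likert ratings and a preference count. As an arithmetic illustration only, on invented ratings rather than the study data, the summaries could be reproduced as follows:

import numpy as np

rng = np.random.default_rng(3)
ratings = rng.integers(1, 6, size=(49, 16))   # hypothetical 5-point ratings, 49 subjects x 16 items

means = ratings.mean(axis=0)
print("items with mean rating >= 3:", int((means >= 3).sum()))
print("items with mean rating > 4:", int((means > 4).sum()))

prefers_vr = rng.integers(0, 2, size=49)      # hypothetical flags: 1 = chose the VR trainer
print(f"chose the VR trainer: {prefers_vr.mean():.0%}")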
The Persistent Issue of Simulator Sickness in Naval Aviation Training.
Geyer, Daniel J; Biggs, Adam T
2018-04-01
Virtual simulations offer nearly unlimited training potential for naval aviation due to the wide array of scenarios that can be simulated in a safe, reliable, and cost-effective environment. This versatility has created substantial interest in using existing and emerging virtual technology to enhance training scenarios. However, the virtual simulations themselves may hinder training initiatives by inducing simulator sickness among the trainees, which is a series of symptoms similar to motion sickness that can arise from simulator use. Simulator sickness has been a problem for military aviation since the first simulators were introduced. The problem has also persisted despite the increasing fidelity and sense of immersion offered by new generations of simulators. As such, it is essential to understand the various problems so that trainers can ensure the best possible use of the simulators. This review will examine simulator sickness as it pertains to naval aviation training. Topics include: the prevailing theories on why symptoms develop, methods of measurement, contributing factors, effects on training, effects when used shipboard, aftereffects, countermeasures, and recommendations for future research involving virtual simulations in an aviation training environment.Geyer DJ, Biggs AT. The persistent issue of simulator sickness in naval aviation training. Aerosp Med Hum Perform. 2018; 89(4):396-405.
Virtual reality training improves students' knowledge structures of medical concepts.
Stevens, Susan M; Goldsmith, Timothy E; Summers, Kenneth L; Sherstyuk, Andrei; Kihmm, Kathleen; Holten, James R; Davis, Christopher; Speitel, Daniel; Maris, Christina; Stewart, Randall; Wilks, David; Saland, Linda; Wax, Diane; Panaiotis; Saiki, Stanley; Alverson, Dale; Caudell, Thomas P
2005-01-01
Virtual environments can provide training that is difficult to achieve under normal circumstances. Medical students can work on high-risk cases in a realistic, time-critical environment, where they practice skills in a cognitively demanding and emotionally compelling situation. Research from cognitive science has shown that as students acquire domain expertise, their semantic organization of core domain concepts becomes more similar to that of an expert. In the current study, we hypothesized that students' knowledge structures would become more expert-like as a result of their diagnosing and treating a patient experiencing a hematoma within a virtual environment. Forty-eight medical students diagnosed and treated a hematoma case within a fully immersed virtual environment. Students' semantic organization of 25 case-related concepts was assessed prior to and after training. Students' knowledge structures became more integrated and similar to an expert knowledge structure of the concepts as a result of the learning experience. The methods used here for eliciting, representing, and evaluating knowledge structures offer a sensitive and objective means for evaluating student learning in virtual environments and medical simulations.
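The abstract does not say how similarity between student and expert knowledge structures was quantified; Pathfinder-style network comparison is common in this literature. As a simpler assumed proxy, the sketch below correlates each student's pairwise concept-relatedness ratings with an expert's ratings before and after training, using invented data (25 concepts give 300 concept pairs):

import numpy as np
from scipy.stats import ttest_rel

rng = np.random.default_rng(4)
n_students, n_pairs = 48, 300                     # 25 concepts -> 25*24/2 pairs

expert = rng.uniform(1, 7, n_pairs)               # hypothetical expert relatedness ratings
pre = rng.uniform(1, 7, (n_students, n_pairs))
post = 0.5 * pre + 0.5 * expert + rng.normal(0, 0.5, (n_students, n_pairs))

def similarity_to_expert(ratings):
    # Pearson correlation of each student's ratings with the expert's
    return np.array([np.corrcoef(r, expert)[0, 1] for r in ratings])

sim_pre, sim_post = similarity_to_expert(pre), similarity_to_expert(post)
t, p = ttest_rel(sim_post, sim_pre)
print(f"mean similarity pre = {sim_pre.mean():.2f}, post = {sim_post.mean():.2f}, "
      f"t({n_students - 1}) = {t:.2f}, p = {p:.4f}")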
Levy
1996-08-01
New interactive computer technologies are having a significant influence on medical education, training, and practice. The newest innovation in computer technology, virtual reality, allows an individual to be immersed in a dynamic computer-generated, three-dimensional environment and can provide realistic simulations of surgical procedures. A new virtual reality hysteroscope passes through a sensing device that synchronizes movements with a three-dimensional model of a uterus. Force feedback is incorporated into this model, so the user actually experiences the collision of an instrument against the uterine wall or the sensation of the resistance or drag of a resectoscope as it cuts through a myoma in a virtual environment. A variety of intrauterine pathologies and procedures are simulated, including hyperplasia, cancer, resection of a uterine septum, polyp, or myoma, and endometrial ablation. This technology will be incorporated into comprehensive training programs that will objectively assess hand-eye coordination and procedural skills. It is possible that by incorporating virtual reality into hysteroscopic training programs, a decrease in the learning curve and the number of complications presently associated with the procedures may be realized. Prospective studies are required to assess these potential benefits.
Community-based pedestrian safety training in virtual reality: A pragmatic trial.
Schwebel, David C; Combs, Tabitha; Rodriguez, Daniel; Severson, Joan; Sisiopiku, Virginia
2016-01-01
Child pedestrian injuries are a leading cause of mortality and morbidity across the United States and the world. Repeated practice at the cognitive-perceptual task of crossing a street may lead to safer pedestrian behavior. Virtual reality offers a unique opportunity for repeated practice without the risk of actual injury. This study conducted a pre-post within-subjects trial of training children in pedestrian safety using a semi-mobile, semi-immersive virtual pedestrian environment placed at schools and community centers. Pedestrian safety skills among a group of 44 seven- and eight-year-old children were assessed in a laboratory, and then children completed six 15-minute training sessions in the virtual pedestrian environment at their school or community center following pragmatic trial strategies over the course of three weeks. Following training, pedestrian safety skills were re-assessed. Results indicate improvement in delay entering traffic following training. Safe crossings did not demonstrate change. Attention to traffic and time to contact with oncoming vehicles both decreased somewhat, perhaps an indication that training was incomplete and children were in the process of actively learning to be safer pedestrians. The findings suggest virtual reality environments placed in community centers hold promise for teaching children to be safer pedestrians, but future research is needed to determine the optimal training dosage. Copyright © 2015 Elsevier Ltd. All rights reserved.
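The pre-post within-subjects design implies paired comparisons of each pedestrian measure. A minimal sketch, assuming invented start-delay values for the 44 children and a nonparametric paired test (the paper's actual statistics are not reproduced here):

import numpy as np
from scipy.stats import wilcoxon

rng = np.random.default_rng(5)
# Hypothetical mean delay (s) before entering traffic, per child
pre_delay = rng.normal(1.2, 0.4, size=44).clip(0)
post_delay = (pre_delay + rng.normal(0.3, 0.3, size=44)).clip(0)

stat, p = wilcoxon(post_delay, pre_delay)   # paired, nonparametric
print(f"Wilcoxon W = {stat:.1f}, p = {p:.4f}")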
Lo Priore, Corrado; Castelnuovo, Gianluca; Liccione, Diego; Liccione, Davide
2003-06-01
The paper discusses the use of immersive virtual reality (IVR) systems for the cognitive rehabilitation of dysexecutive syndrome, usually caused by prefrontal brain injuries. With respect to classical paper-and-pencil (P&P) and flat-screen computer rehabilitative tools, IVR systems might prove capable of evoking a more intense and compelling sense of presence, thanks to the highly naturalistic subject-environment interaction allowed. Within a constructivist framework applied to holistic rehabilitation, we suggest that this difference might enhance the ecological validity of cognitive training, partly overcoming the implicit limits of a lab setting, which seem to affect non-immersive procedures especially when applied to dysexecutive symptoms. We tested presence in a pilot study applied to a new VR-based rehabilitation tool for executive functions, V-Store; it allows patients to explore a virtual environment where they solve six series of tasks, ordered by complexity and designed to stimulate executive functions, programming, categorical abstraction, short-term memory and attention. We compared the sense of presence experienced by unskilled normal subjects, randomly assigned to immersive or non-immersive (flat screen) sessions of V-Store, through four different indices: a self-report questionnaire, a psychophysiological measure (GSR, skin conductance), a neuropsychological measure (incidental recall memory test related to auditory information coming from the "real" environment) and a count of breaks in presence (BIPs). Preliminary results show in the immersive group a significantly higher GSR response during tasks; neuropsychological data (fewer recalled elements from "reality") and fewer BIPs show a congruent but not yet significant advantage for the immersive condition; no differences were evident from the self-report questionnaire. A larger experimental group is currently under examination to evaluate the significance of these data, which might also prove interesting with respect to the question of objective versus subjective measures of presence.
Rosa, Pedro J; Morais, Diogo; Gamito, Pedro; Oliveira, Jorge; Saraiva, Tomaz
2016-03-01
Immersive virtual reality is thought to be advantageous by leading to higher levels of presence. However, and despite users getting actively involved in immersive three-dimensional virtual environments that incorporate sound and motion, there are individual factors, such as age, video game knowledge, and the predisposition to immersion, that may be associated with the quality of the virtual reality experience. Moreover, one particular concern for users engaged in immersive virtual reality environments (VREs) is the possibility of side effects, such as cybersickness. The literature suggests that at least 60% of virtual reality users report having felt symptoms of cybersickness, which reduces the quality of the virtual reality experience. The aim of this study was thus to profile the right user to be involved in a VRE through a head-mounted display. To examine which user characteristics are associated with the most effective virtual reality experience (lower cybersickness), a multiple correspondence analysis combined with a cluster analysis technique was performed. Results revealed three distinct profiles, showing that the PC gamer profile is more strongly associated with higher levels of virtual reality effectiveness, that is, a higher predisposition to be immersed and reduced cybersickness symptoms in the VRE, than the console gamer and nongamer profiles. These findings can be a useful orientation in clinical practice and future research as they help identify which users are more predisposed to benefit from immersive VREs.
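The profiling combined multiple correspondence analysis (MCA) with cluster analysis. The sketch below substitutes a rough stand-in for that pipeline (one-hot encoding plus PCA, then k-means) on invented categorical variables, to avoid MCA-specific dependencies; it is an approximation, not the authors' procedure.

import numpy as np
import pandas as pd
from sklearn.decomposition import PCA
from sklearn.cluster import KMeans

rng = np.random.default_rng(6)
n = 120
# Invented categorical user attributes
df = pd.DataFrame({
    "gamer_type": rng.choice(["pc", "console", "non"], n),
    "age_band": rng.choice(["18-25", "26-40", "40+"], n),
    "immersion_tendency": rng.choice(["low", "high"], n),
    "cybersickness": rng.choice(["none", "mild", "strong"], n),
})

# One-hot encode, reduce to two components, then cluster into three profiles
X = pd.get_dummies(df).to_numpy(dtype=float)
scores = PCA(n_components=2).fit_transform(X)
labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(scores)

# How each gamer type distributes across the three profiles
print(pd.crosstab(labels, df["gamer_type"]))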
NASA Technical Reports Server (NTRS)
Smith, Jeffrey
2003-01-01
The Bio-Visualization, Imaging and Simulation (BioVIS) Technology Center at NASA's Ames Research Center is dedicated to developing and applying advanced visualization, computation and simulation technologies to support NASA Space Life Sciences research and the objectives of the Fundamental Biology Program. Research ranges from high-resolution 3D cell imaging and structure analysis, virtual environment simulation of fine sensory-motor tasks, computational neuroscience and biophysics to biomedical/clinical applications. Computer simulation research focuses on the development of advanced computational tools for astronaut training and education. Virtual Reality (VR) and Virtual Environment (VE) simulation systems have become important training tools in many fields, from flight simulation to, more recently, surgical simulation. The type and quality of training provided by these computer-based tools ranges widely, but the value of real-time VE computer simulation as a method of preparing individuals for real-world tasks is well established. Astronauts routinely use VE systems for various training tasks, including Space Shuttle landings, robot arm manipulations and extravehicular activities (space walks). Currently, there are no VE systems to train astronauts for basic and applied research experiments, which are an important part of many missions. The Virtual Glovebox (VGX) is a prototype VE system for real-time physically-based simulation of the Life Sciences Glovebox, where astronauts will perform many complex tasks supporting research experiments aboard the International Space Station. The VGX consists of a physical display system utilizing dual LCD projectors and circular polarization to produce a desktop-sized 3D virtual workspace. Physically-based modeling tools (Arachi Inc.) provide real-time collision detection, rigid body dynamics, physical properties and force-based controls for objects. The human-computer interface consists of two magnetic tracking devices (Ascension Inc.) attached to instrumented gloves (Immersion Inc.) which co-locate the user's hands with hand/forearm representations in the virtual workspace. Force feedback is possible in a work volume defined by a Phantom Desktop device (SensAble Inc.). Graphics are written in OpenGL. The system runs on a 2.2 GHz Pentium 4 PC. The prototype VGX provides astronauts and support personnel with a real-time physically-based VE system to simulate basic research tasks both on Earth and in the microgravity of space. The immersive virtual environment of the VGX also makes it a useful tool for virtual engineering applications including CAD development, procedure design and simulation of human-system interaction in a desktop-sized work volume.
An Onboard ISS Virtual Reality Trainer
NASA Technical Reports Server (NTRS)
Miralles, Evelyn
2013-01-01
Prior to the retirement of the Space Shuttle, many exterior repairs on the International Space Station (ISS) were carried out by shuttle astronauts, trained on the ground and flown to the Station to perform these specific repairs. With the retirement of the shuttle, this is no longer an available option. As such, the need for ISS crew members to review scenarios while on flight, either for tasks they already trained for on the ground or for contingency operations has become a very critical issue. NASA astronauts prepare for Extra-Vehicular Activities (EVA) or Spacewalks through numerous training media, such as: self-study, part task training, underwater training in the Neutral Buoyancy Laboratory (NBL), hands-on hardware reviews and training at the Virtual Reality Laboratory (VRLab). In many situations, the time between the last session of a training and an EVA task might be 6 to 8 months. EVA tasks are critical for a mission and as time passes the crew members may lose proficiency on previously trained tasks and their options to refresh or learn a new skill while on flight are limited to reading training materials and watching videos. In addition, there is an increased need for unplanned contingency repairs to fix problems arising as the Station ages. In order to help the ISS crew members maintain EVA proficiency or train for contingency repairs during their mission, the Johnson Space Center's VRLab designed an immersive ISS Virtual Reality Trainer (VRT). The VRT incorporates a unique optical system that makes use of the already successful Dynamic On-board Ubiquitous Graphics (DOUG) software to assist crew members with procedure reviews and contingency EVAs while on board the Station. The need to train and re-train crew members for EVAs and contingency scenarios is crucial and extremely demanding. ISS crew members are now asked to perform EVA tasks for which they have not been trained and potentially have never seen before. The Virtual Reality Trainer (VRT) provides an immersive 3D environment similar to the one experienced at the VRLab crew training facility at the NASA Johnson Space Center. VRT bridges the gap by allowing crew members to experience an interactive, 3D environment to reinforce skills already learned and to explore new work sites and repair procedures outside the Station.
Pan, Xueni; Slater, Mel; Beacco, Alejandro; Navarro, Xavi; Bellido Rivas, Anna I.; Swapp, David; Hale, Joanna; Forbes, Paul Alexander George; Denvir, Catrina; de C. Hamilton, Antonia F.; Delacroix, Sylvie
2016-01-01
Background: Dealing with insistent patient demand for antibiotics is an all too common part of a General Practitioner's daily routine. This study explores the extent to which portable Immersive Virtual Reality technology can help us gain an accurate understanding of the factors that influence a doctor's response to the ethical challenge underlying such tenacious requests for antibiotics (given the threat posed by growing anti-bacterial resistance worldwide). It also considers the potential of such technology to train doctors to face such dilemmas. Experiment: Twelve experienced GPs and nine trainees were confronted with an increasingly angry demand by a woman to prescribe antibiotics to her mother in the face of inconclusive evidence that such antibiotic prescription is necessary. The daughter and mother were virtual characters displayed in immersive virtual reality. The specific purposes of the study were twofold: first, whether experienced GPs would be more resistant to patient demands than the trainees, and second, to investigate whether medical doctors would take the virtual situation seriously. Results: Eight out of the 9 trainees prescribed the antibiotics, whereas 7 out of the 12 GPs did so. On the basis of a Bayesian analysis, these results yield reasonable statistical evidence in favor of the notion that experienced GPs are more likely to withstand the pressure to prescribe antibiotics than trainee doctors, thus answering our first question positively. As for the second question, a post-experience questionnaire assessing the participants' level of presence (together with participants' feedback and body language) suggested that overall participants did tend towards the illusion of being in the consultation room depicted in the virtual reality and that the virtual consultation taking place was really happening. PMID:26889676
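The abstract reports a Bayesian analysis of 8/9 trainees versus 7/12 experienced GPs prescribing, without specifying the model. A common minimal approach, shown here as an assumption rather than the authors' actual analysis, is a beta-binomial comparison with uniform priors:

import numpy as np

rng = np.random.default_rng(7)

# Observed prescriptions: 8 of 9 trainees, 7 of 12 experienced GPs
trainee_k, trainee_n = 8, 9
gp_k, gp_n = 7, 12

# Beta(1, 1) priors give Beta(k + 1, n - k + 1) posteriors for each rate
draws = 200_000
p_trainee = rng.beta(trainee_k + 1, trainee_n - trainee_k + 1, draws)
p_gp = rng.beta(gp_k + 1, gp_n - gp_k + 1, draws)

# Posterior probability that trainees are more likely to prescribe
print(f"P(trainee rate > GP rate) ~ {(p_trainee > p_gp).mean():.2f}")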
Immersive virtual reality simulations in nursing education.
Kilmon, Carol A; Brown, Leonard; Ghosh, Sumit; Mikitiuk, Artur
2010-01-01
This article explores immersive virtual reality as a potential educational strategy for nursing education and describes an immersive learning experience now being developed for nurses. This pioneering project is a virtual reality application targeting speed and accuracy of nurse response in emergency situations requiring cardiopulmonary resuscitation. Other potential uses and implications for the development of virtual reality learning programs are discussed.
Foloppe, Déborah A; Richard, Paul; Yamaguchi, Takehiko; Etcharry-Bouyx, Frédérique; Allain, Philippe
2018-07-01
Impairments in performing activities of daily living occur early in the course of Alzheimer's disease (AD). There is a great need to develop non-pharmacological therapeutic interventions likely to reduce dependency in everyday activities in AD patients. This study investigated whether it was possible to increase autonomy in these patients in cooking activities using interventions based on errorless learning, vanishing-cue, and virtual reality techniques. We recruited a 79-year-old woman who met NINCDS-ADRDA criteria for probable AD. She was trained in four cooking tasks for four days per task, one hour per day, in virtual and in real conditions. Outcome measures included subjective data concerning the therapeutic intervention and the experience of virtual reality, repeated assessments of training activities, neuropsychological scores, and self-esteem and quality of life measures. The results indicated that our patient could relearn some cooking activities using virtual reality techniques. Transfer to real life was also observed. Improvement of the task performance remained stable over time. This case report supports the value of a non-immersive virtual kitchen to help people with AD to relearn cooking activities.
On the usefulness of the concept of presence in virtual reality applications
NASA Astrophysics Data System (ADS)
Mestre, Daniel R.
2015-03-01
Virtual Reality (VR) leads to realistic experimental situations, while enabling researchers to have deterministic control on these situations, and to precisely measure participants' behavior. However, because more realistic and complex situations can be implemented, important questions arise, concerning the validity and representativeness of the observed behavior, with reference to a real situation. One example is the investigation of a critical (virtually dangerous) situation, in which the participant knows that no actual threat is present in the simulated situation, and might thus exhibit a behavioral response that is far from reality. This poses serious problems, for instance in training situations, in terms of transfer of learning to a real situation. Facing this difficult question, it seems necessary to study the relationships between three factors: immersion (physical realism), presence (psychological realism) and behavior. We propose a conceptual framework, in which presence is a necessary condition for the emergence of a behavior that is representative of what is observed in real conditions. Presence itself depends not only on physical immersive characteristics of the Virtual Reality setup, but also on contextual and psychological factors.
Research on evaluation techniques for immersive multimedia
NASA Astrophysics Data System (ADS)
Hashim, Aslinda M.; Romli, Fakaruddin Fahmi; Zainal Osman, Zosipha
2013-03-01
Nowadays Immersive Multimedia is used in a tremendous range of areas, such as healthcare/surgery, military, architecture, art, entertainment, education, business, media, sport, rehabilitation/treatment and training. Moreover, because Immersive Multimedia must directly meet end-users', clients' and customers' needs for a diversity of features and purposes, multiple elements must be assembled to drive effective Immersive Multimedia system design, so evaluation techniques are crucial for Immersive Multimedia environments. A brief general idea of the virtual environment (VE) context and the "realism" concept that underpin Immersive Multimedia environments is then provided. This is followed by a concise summary of the elements of the VE assessment technique that is applied in Immersive Multimedia system design, which outlines the classification space for evaluation techniques for Immersive Multimedia environments and gives an overview of the types of results reported. A particular focus is placed on the implications of these evaluation techniques in relation to the elements of the VE assessment technique, which is the primary purpose of this research. The paper then concludes with an extensive overview of the recommendations emanating from the research.
Virtual reality training in neurosurgery: Review of current status and future applications
Alaraj, Ali; Lemole, Michael G.; Finkle, Joshua H.; Yudkowsky, Rachel; Wallace, Adam; Luciano, Cristian; Banerjee, P. Pat; Rizzi, Silvio H.; Charbel, Fady T.
2011-01-01
Background: Over the years, surgical training has been changing, and years of tradition are being challenged by legal and ethical concerns for patient safety, work hour restrictions, and the cost of operating room time. Surgical simulation and skill training offer an opportunity to teach and practice advanced techniques before attempting them on patients. Simulation training can be as straightforward as using real instruments and video equipment to manipulate simulated “tissue” in a box trainer. More advanced virtual reality (VR) simulators are now available and ready for widespread use. Early systems have demonstrated their effectiveness and discriminative ability. Newer systems enable the development of comprehensive curricula and full procedural simulations. Methods: A PubMed review of the literature was performed for the MeSH terms “Virtual Reality”, “Augmented Reality”, “Simulation”, “Training”, and “Neurosurgery”. Relevant articles were retrieved and reviewed. A review of the literature was performed for the history and current status of VR simulation in neurosurgery. Results: Surgical organizations are calling for methods to ensure the maintenance of skills, advance surgical training, and credential surgeons as technically competent. The published literature discussing the application of VR simulation in neurosurgery training has evolved over the last decade from data visualization, including stereoscopic evaluation, to more complex augmented reality models. With the revolution in computational analysis abilities, fully immersive VR models are currently available in neurosurgery training. Ventriculostomy catheter insertion and endoscopic and endovascular simulations are used in neurosurgical residency training centers across the world. Recent studies have shown the correlation of proficiency with those simulators and levels of experience in the real world. Conclusion: Fully immersive technology is starting to be applied to the practice of neurosurgery. In the near future, detailed VR neurosurgical modules will evolve to be an essential part of the curriculum of the training of neurosurgeons. PMID:21697968
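The review's PubMed search over the listed MeSH terms can be reproduced programmatically. A hedged sketch using Biopython's Entrez interface (requires Biopython, network access, and a real contact e-mail; the query string is an assumed reconstruction and may not match the authors' exact search):

from Bio import Entrez

Entrez.email = "you@example.org"   # placeholder contact address

term = ('("virtual reality" OR "augmented reality" OR simulation) '
        'AND training AND neurosurgery')
handle = Entrez.esearch(db="pubmed", term=term, retmax=20)
record = Entrez.read(handle)
handle.close()

print("hits:", record["Count"])
print("first PMIDs:", record["IdList"])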
Virtual reality training for health-care professionals.
Mantovani, Fabrizia; Castelnuovo, Gianluca; Gaggioli, Andrea; Riva, Giuseppe
2003-08-01
Emerging changes in health-care delivery are having a significant impact on the structure of health-care professionals' education. Today it is recognized that medical knowledge doubles every 6-8 years, with new medical procedures emerging every day. While the half-life of medical information is so short, the average physician practices for 30 years and the average nurse for 40 years. Continuing education thus represents an important challenge to face. Recent advances in educational technology are offering an increasing number of innovative learning tools. Among these, Virtual Reality represents a promising area with high potential for enhancing the training of health-care professionals. Virtual Reality training can provide a rich, interactive, engaging educational context, thus supporting experiential learning-by-doing; it can, in fact, contribute to raising interest and motivation in trainees and to effectively supporting skills acquisition and transfer, since the learning process can be set within an experiential framework. Current virtual training applications for health care differ considerably in both their technological/multimedia sophistication and the types of skills trained, varying for example from telesurgical applications to interactive simulations of the human body and brain, to virtual worlds for emergency training. Other interesting applications include the development of immersive 3D environments for training psychiatrists and psychologists in the treatment of mental disorders. This paper has the main aim of discussing the rationale and main benefits of the use of virtual reality in health-care education and training. Significant research and projects carried out in this field will also be presented, followed by discussion of key issues concerning current limitations and future development directions.
Social Interaction Development through Immersive Virtual Environments
ERIC Educational Resources Information Center
Beach, Jason; Wendt, Jeremy
2014-01-01
The purpose of this pilot study was to determine if participants could improve their social interaction skills by participating in a virtual immersive environment. The participants used a developing virtual reality head-mounted display to engage themselves in a fully-immersive environment. While in the environment, participants had an opportunity…
Applied virtual reality at the Research Triangle Institute
NASA Technical Reports Server (NTRS)
Montoya, R. Jorge
1994-01-01
Virtual Reality (VR) is a way for humans to use computers in visualizing, manipulating and interacting with large geometric databases. This paper describes a VR infrastructure and its application to marketing, modeling, architectural walk-through, and training problems. VR integration techniques used in these applications are based on a uniform approach which promotes portability and reusability of developed modules. For each problem, a 3D object database is created using data captured by hand or electronically. The objects' realism is enhanced through either procedural or photo textures. The virtual environment is created and populated with the database using software tools which also support interaction with and immersion in the environment. These capabilities are augmented by other sensory channels such as voice recognition, 3D sound, and tracking. Four applications are presented: a virtual furniture showroom, virtual reality models of the North Carolina Global TransPark, a walk-through of the Dresden Frauenkirche, and a maintenance training simulator for the National Guard.
Emergent Capabilities Converging into M and S 2.0
NASA Technical Reports Server (NTRS)
Reitz, Emilie; Reist, Jay
2012-01-01
The continued complexity of the operational environment faced by the Department of Defense, despite a restricted resource environment, mandates greater adaptability and availability in joint training. To address these constraints, this paper proposes a model for the potential integration of adaptability training, virtual world capabilities and immersive training into the wider Joint Live Virtual and Constructive (JLVC) Federation, supported by human, social, cultural and behavior modeling, and measurement and assessment. By fusing those capabilities and modeling and simulation enhancements into the JLVC federation, it will create a force that is more apt to arrive at and implement correct decisions, and more able to appropriately seize the initiative in the field. The model would allow for the testing and training of capabilities and TTPs that cannot reasonably be explored to their logical conclusions in a 'live' environment, as well as enhance training fidelity for all echelons and tasks.
Russo, Margherita; De Luca, Rosaria; Naro, Antonino; Sciarrone, Francesca; Aragona, Bianca; Silvestri, Giuseppe; Manuli, Alfredo; Bramanti, Alessia; Casella, Carmela; Bramanti, Placido; Calabrò, Rocco Salvatore
2017-09-01
Aim of the present study was to evaluate whether the presence of body shadows during virtual reality (VR) training with BTS NIRVANA (BTs-N) may lead to a better functional recovery. We enrolled 20 poststroke rehabilitation inpatients, who underwent a neurocognitive-rehabilitative training consisting of 24 sessions (3 times a week for 8 weeks) of BTs-N. All the patients were randomized into 2 groups: semi-immersive virtual training with (S-IVTS group) or without (S-IVT group) body shadows. Each participant was evaluated before (T0) and immediately (T1) after the end of the training (Trial Registration Number: NCT03095560). The S-IVTS group showed a greater improvement in visuo-constructive skills and sustained attention, as compared with the S-IVT group. The other measures showed nonsignificant within-group and between-group differences. Our results showed that body shadow may represent a high-priority class of stimuli that act by "pushing" attention toward the body itself. Further studies are needed to clarify the role of body shadow in promoting the internal representation construction and thus self-recognition.
[Virtual reality therapy in anxiety disorders].
Mitrousia, V; Giotakos, O
2016-01-01
During the last decade a number of studies have been conducted to examine whether virtual reality exposure therapy can be an alternative form of therapy for the treatment of mental disorders, and particularly for the treatment of anxiety disorders. Imaginal exposure therapy, which is one of the components of Cognitive Behavioral Therapy, cannot be easily applied to all patients, and in such cases virtual reality can be used as an alternative or supportive psychotherapeutic technique. Most studies using virtual reality have focused on anxiety disorders, mainly specific phobias, but some extend to other disorders such as eating disorders, drug dependence, pain control and palliative care, and rehabilitation. The main characteristics of virtual reality therapy are "interaction", "immersion", and "presence". High levels of "immersion" and "presence" are associated with increased response to exposure therapy in virtual environments, as well as better therapeutic outcomes and sustained therapeutic gains. Typical devices used to achieve the patient's immersion are head-mounted displays (HMDs), which are for individual use only, and the cave automatic virtual environment (CAVE), which supports multiple users. The disadvantages of virtual reality therapy lie in the specialized technology skills demanded, the cost of the devices, and side effects. Therapists' training is necessary so that they are able to manipulate the software and the hardware and to adjust it to each case's needs. Device cost is high, but it constantly decreases as technology improves. Immersion during virtual reality therapy can induce mild and temporary side effects such as nausea, dizziness or headache. Until today, however, experience shows that virtual reality offers several advantages. The patient's avoidance of exposure to phobic stimuli is reduced via the use of virtual reality, since the patient can be exposed to them as many times as he or she wishes and under the supervision of the therapist. The technique takes place in the therapist's office, which ensures confidentiality and privacy. The therapist is able to control unpredicted events that can occur during the patient's exposure in real environments. Above all, the therapist can control the intensity of exposure and adapt it to the patient's needs. Virtual reality can prove particularly useful in some specific psychological conditions, for instance in patients with post-traumatic stress disorder (PTSD), who are prone to avoid reminders of the traumatic events. Exposure in virtual reality can solve this problem by providing the patient with a large number of stimuli that activate the senses, causing the necessary physiological and psychological anxiety reactions regardless of his or her willingness or ability to recall the traumatic event in imagination.
Declarative Knowledge Acquisition in Immersive Virtual Learning Environments
ERIC Educational Resources Information Center
Webster, Rustin
2016-01-01
The author investigated the interaction effect of immersive virtual reality (VR) in the classroom. The objective of the project was to develop and provide a low-cost, scalable, and portable VR system containing purposely designed and developed immersive virtual learning environments for the US Army. The purpose of the mixed design experiment was…
Machinima Interventions: Innovative Approaches to Immersive Virtual World Curriculum Integration
ERIC Educational Resources Information Center
Middleton, Andrew John; Mather, Richard
2008-01-01
The educational value of Immersive Virtual Worlds (IVWs) seems to be in their social immersive qualities and as an accessible simulation technology. In contrast to these synchronous applications this paper discusses the use of educational machinima developed in IVW virtual film sets. It also introduces the concept of media intervention, proposing…
Active Learning through the Use of Virtual Environments
ERIC Educational Resources Information Center
Mayrose, James
2012-01-01
Immersive Virtual Reality (VR) has seen explosive growth over the last decade. Immersive VR attempts to give users the sensation of being fully immersed in a synthetic environment by providing them with 3D hardware, and allowing them to interact with objects in virtual worlds. The technology is extremely effective for learning and exploration, and…
Alverson, Dale C; Saiki, Stanley M; Jacobs, Joshua; Saland, Linda; Keep, Marcus F; Norenberg, Jeffrey; Baker, Rex; Nakatsu, Curtis; Kalishman, Summers; Lindberg, Marlene; Wax, Diane; Mowafi, Moad; Summers, Kenneth L; Holten, James R; Greenfield, John A; Aalseth, Edward; Nickles, David; Sherstyuk, Andrei; Haines, Karen; Caudell, Thomas P
2004-01-01
Medical knowledge and skills essential for tomorrow's healthcare professionals continue to change faster than ever before creating new demands in medical education. Project TOUCH (Telehealth Outreach for Unified Community Health) has been developing methods to enhance learning by coupling innovations in medical education with advanced technology in high performance computing and next generation Internet2 embedded in virtual reality environments (VRE), artificial intelligence and experiential active learning. Simulations have been used in education and training to allow learners to make mistakes safely in lieu of real-life situations, learn from those mistakes and ultimately improve performance by subsequent avoidance of those mistakes. Distributed virtual interactive environments are used over distance to enable learning and participation in dynamic, problem-based, clinical, artificial intelligence rules-based, virtual simulations. The virtual reality patient is programmed to dynamically change over time and respond to the manipulations by the learner. Participants are fully immersed within the VRE platform using a head-mounted display and tracker system. Navigation, locomotion and handling of objects are accomplished using a joy-wand. Distribution is managed via the Internet2 Access Grid using point-to-point or multi-casting connectivity through which the participants can interact. Medical students in Hawaii and New Mexico (NM) participated collaboratively in problem solving and managing of a simulated patient with a closed head injury in VRE; dividing tasks, handing off objects, and functioning as a team. Students stated that opportunities to make mistakes and repeat actions in the VRE were extremely helpful in learning specific principles. VRE created higher performance expectations and some anxiety among VRE users. VRE orientation was adequate but students needed time to adapt and practice in order to improve efficiency. This was also demonstrated successfully between Western Australia and UNM. We successfully demonstrated the ability to fully immerse participants in a distributed virtual environment independent of distance for collaborative team interaction in medical simulation designed for education and training. The ability to make mistakes in a safe environment is well received by students and has a positive impact on their understanding, as well as memory of the principles involved in correcting those mistakes. Bringing people together as virtual teams for interactive experiential learning and collaborative training, independent of distance, provides a platform for distributed "just-in-time" training, performance assessment and credentialing. Further validation is necessary to determine the potential value of the distributed VRE in knowledge transfer, improved future performance and should entail training participants to competence in using these tools.
Kleinert, Robert; Heiermann, Nadine; Wahba, Roger; Chang, De-Huan; Hölscher, Arnulf H; Stippel, Dirk L
2015-01-01
Immersive patient simulators (IPS) allow an illusionary immersion into a synthetic world in which the user can freely navigate through a 3-dimensional environment, similar to computer games. Playful learning with IPS allows internalization of medical workflows without harming real patients. Ideally, IPS show high student acceptance and can have a positive effect on knowledge gain. Development of IPS with high technical quality is resource-intensive; therefore most of the "high-fidelity" IPS are commercially driven. Usage of IPS in the daily curriculum is still rare. There is no academic-driven simulator that is freely accessible to every student and combines a high immersion grade with a profound amount of medical content. It was therefore our aim to develop an academic-driven IPS prototype that is free to use and combines a high immersion grade with profound medical content. In addition, a first validation of the prototype was conducted. The conceptual design included definition of the following parameters: amount of curricular content, grade of technical quality, availability, and level of validation. A preliminary validation was done with 25 students. Students' opinion about acceptance was evaluated by a Likert-scale questionnaire. The effect on knowledge gain was determined by testing concordance and predictive validity. A custom-made simulator prototype (Artificial Learning Interface for Clinical Education [ALICE]) displays a virtual clinic environment that can be explored from a first-person view, similar to a video game. By controlling an avatar, the user navigates through the environment, is able to treat virtual patients, and faces the consequences of different decisions. ALICE showed high student acceptance. There was a positive correlation for both concordance validity and predictive validity. Simulator usage had a positive effect on reproduction of trained content and declarative knowledge. We successfully developed a university-based IPS prototype (ALICE) with profound medical content. ALICE is a nonprofit simulator, easy to use, and showed high student acceptance; thus it potentially provides an additional tool for supporting student teaching in the daily clinical curriculum. Copyright © 2015 Association of Program Directors in Surgery. Published by Elsevier Inc. All rights reserved.
An innovative virtual reality training tool for orthognathic surgery.
Pulijala, Y; Ma, M; Pears, M; Peebles, D; Ayoub, A
2018-02-01
Virtual reality (VR) surgery using Oculus Rift and Leap Motion devices is a multi-sensory, holistic surgical training experience. A multimedia combination including 360° videos, three-dimensional interaction, and stereoscopic videos in VR has been developed to enable trainees to experience a realistic surgery environment. The innovation allows trainees to interact with the individual components of the maxillofacial anatomy and apply surgical instruments while watching close-up stereoscopic three-dimensional videos of the surgery. In this study, a novel training tool for Le Fort I osteotomy based on immersive virtual reality (iVR) was developed and validated. Seven consultant oral and maxillofacial surgeons evaluated the application for face and content validity. Using a structured assessment process, the surgeons commented on the content of the developed training tool, its realism and usability, and the applicability of VR surgery for orthognathic surgical training. The results confirmed the clinical applicability of VR for delivering training in orthognathic surgery. Modifications were suggested to improve the user experience and interactions with the surgical instruments. This training tool is ready for testing with surgical trainees. Copyright © 2018 International Association of Oral and Maxillofacial Surgeons. Published by Elsevier Ltd. All rights reserved.
Evaluating a "Second Life" Problem-Based Learning (PBL) Demonstrator Project: What Can We Learn?
ERIC Educational Resources Information Center
Beaumont, Chris; Savin-Baden, Maggi; Conradi, Emily; Poulton, Terry
2014-01-01
This article reports the findings of a demonstrator project to evaluate how effectively Immersive Virtual Worlds (IVWs) could support problem-based learning. The project designed, created and evaluated eight scenarios within "Second Life" (SL) for undergraduate courses in health care management and paramedic training. Evaluation was…
The ALIVE Project: Astronomy Learning in Immersive Virtual Environments
NASA Astrophysics Data System (ADS)
Yu, K. C.; Sahami, K.; Denn, G.
2008-06-01
The Astronomy Learning in Immersive Virtual Environments (ALIVE) project seeks to discover learning modes and optimal teaching strategies using immersive virtual environments (VEs). VEs are computer-generated, three-dimensional environments that can be navigated to provide multiple perspectives. Immersive VEs provide the additional benefit of surrounding a viewer with the simulated reality. ALIVE evaluates the incorporation of an interactive, real-time "virtual universe" into formal college astronomy education. In the experiment, pre-course, post-course, and curriculum tests will be used to determine the efficacy of immersive visualizations presented in a digital planetarium versus the same visual simulations in the non-immersive setting of a normal classroom, as well as a control case using traditional classroom multimedia. To normalize for inter-instructor variability, each ALIVE instructor will teach at least one class in each of the three test groups.
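The planned comparison of the three instruction conditions is not specified beyond the pre-course, post-course, and curriculum tests. One plausible analysis, sketched here on invented normalized gain scores with invented group sizes, is a one-way ANOVA across the planetarium, classroom, and control groups:

import numpy as np
from scipy.stats import f_oneway

rng = np.random.default_rng(8)
# Hypothetical normalized learning gains, (post - pre) / (max - pre)
dome = rng.normal(0.45, 0.15, 40)       # immersive digital planetarium group
classroom = rng.normal(0.40, 0.15, 40)  # same visuals, non-immersive classroom
control = rng.normal(0.35, 0.15, 40)    # traditional classroom multimedia

F, p = f_oneway(dome, classroom, control)
print(f"F(2, {3 * 40 - 3}) = {F:.2f}, p = {p:.4f}")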
Khademi, Maryam; Hondori, Hossein Mousavi; Dodakian, Lucy; Cramer, Steve; Lopes, Cristina V
2013-01-01
Introducing computer games to the rehabilitation market has led to the development of numerous Virtual Reality (VR) training applications. Although VR has provided tremendous benefit to patients and caregivers, it has inherent limitations, some of which might be solved by replacing it with Augmented Reality (AR). The task of pick-and-place, which is part of many activities of daily living (ADLs), is one of the major affected functions that stroke patients expect to recover. We developed an exercise consisting of moving an object between various points, following a flashing light that indicates the next target. The results show superior performance of subjects in the spatial AR setting versus the non-immersive VR setting. This could be due to the extraneous hand-eye coordination required in VR, which is eliminated in spatial AR.
An Interactive Logistics Centre Information Integration System Using Virtual Reality
NASA Astrophysics Data System (ADS)
Hong, S.; Mao, B.
2018-04-01
The logistics industry plays a very important role in the operation of modern cities. Meanwhile, the development of the logistics industry has given rise to various problems that urgently need to be solved, such as the safety of logistics products. This paper combines the study of logistics industry traceability and logistics centre environment safety supervision with virtual reality technology, creating an interactive logistics centre information integration system. The proposed system utilizes the immersive characteristics of virtual reality to simulate the real logistics centre scene distinctly, so that operations staff can conduct safety supervision training at any time without regional restrictions. On the one hand, large volumes of sensor data can be used to simulate a variety of disaster and emergency situations; on the other hand, personnel operation data can be collected and analysed to identify improper operations, which can greatly improve training efficiency.
ERIC Educational Resources Information Center
Coffey, Amy Jo; Kamhawi, Rasha; Fishwick, Paul; Henderson, Julie
2017-01-01
Relatively few studies have empirically tested computer-based immersive virtual environments' efficacy in teaching or enhancing pro-social attitudes, such as intercultural sensitivity. This channel study experiment was conducted (N = 159) to compare what effects, if any, an immersive 3D virtual environment would have upon subjects' intercultural…
Resident perspectives on communication training that utilizes immersive virtual reality.
Real, Francis J; DeBlasio, Dominick; Ollberding, Nicholas J; Davis, David; Cruse, Bradley; Mclinden, Daniel; Klein, Melissa D
2017-01-01
Communication skills can be difficult to teach and assess in busy outpatient settings. These skills are important for effective counseling, such as in cases of influenza vaccine hesitancy. It is critical to consider novel educational methods to supplement current strategies aimed at teaching relational skills. An immersive virtual reality (VR) curriculum on addressing influenza vaccine hesitancy was developed using Kern's six-step approach to curriculum design. The curriculum was meant to teach best-practice communication skills in cases of influenza vaccine hesitancy. Eligible participants included postgraduate level (PL)-2 and PL-3 pediatric residents (n = 24). Immediately following the curriculum, a survey was administered to assess residents' attitudes toward the VR curriculum and perceptions regarding the effectiveness of VR in comparison to other educational modalities. A survey was administered 1 month following the VR curriculum to assess trainee-perceived impact of the curriculum on clinical practice. All eligible residents (n = 24) completed the curriculum. Ninety-two percent (n = 22) agreed or strongly agreed that VR simulations were like real-life patient encounters. Seventy-five percent (n = 18) felt that VR was as effective as standardized patient (SP) encounters and less effective than bedside teaching (P < 0.001). At 1-month follow-up, 67% of residents (n = 16) agreed or strongly agreed that the VR experience improved how they counseled families in cases of influenza vaccine hesitancy. An immersive VR curriculum at our institution was well received by learners, and residents rated VR as equally effective as SP encounters. As such, immersive VR may be a promising modality for communication training.
ERIC Educational Resources Information Center
Lorenzo, Gonzalo; Pomares, Jorge; Lledo, Asuncion
2013-01-01
This paper presents the use of immersive virtual reality systems in educational interventions with Asperger students. The starting point of this study is the features of these students' cognitive style, which require an explicit teaching style supported by visual aids and highly structured environments. The proposed immersive virtual reality…
Measurement Tools for the Immersive Visualization Environment: Steps Toward the Virtual Laboratory.
Hagedorn, John G; Dunkers, Joy P; Satterfield, Steven G; Peskin, Adele P; Kelso, John T; Terrill, Judith E
2007-01-01
This paper describes a set of tools for performing measurements of objects in a virtual reality based immersive visualization environment. These tools enable the use of the immersive environment as an instrument for extracting quantitative information from data representations that hitherto had been used solely for qualitative examination. We provide, within the virtual environment, ways for the user to analyze and interact with the quantitative data generated. We describe results generated by these methods to obtain dimensional descriptors of tissue-engineered medical products. We regard this toolbox as our first step in the implementation of a virtual measurement laboratory within an immersive visualization environment.
Use of immersive virtual reality to assess episodic memory: A validation study in older adults.
Corriveau Lecavalier, Nick; Ouellet, Émilie; Boller, Benjamin; Belleville, Sylvie
2018-05-29
Virtual reality (VR) allows for the creation of ecological environments that could be used for cognitive assessment and intervention. This study comprises two parts that describe and assess an immersive VR task, the Virtual Shop, which can be used to measure episodic memory. Part 1 addresses its applicability in healthy older adults by measuring presence, motivation, and cybersickness symptoms. Part 2 addresses its construct validity by investigating correlations between performance in the VR task and on a traditional experimental memory task, and by measuring whether the VR task is sensitive to age-related memory differences. Fifty-seven older and 20 younger adults were assessed in the Virtual Shop, in which they memorised and fetched 12 familiar items. Part 1 showed high levels of presence, higher levels of motivation for the VR than for the traditional task, and negligible cybersickness symptoms. Part 2 indicates that memory performance in the VR task is positively correlated with performance on a traditional memory task for both age groups, and age-related differences were found on the VR and traditional memory tasks. Thus, the use of VR is feasible in older adults and the Virtual Shop is a valid task to assess and train episodic memory in this population.
Neurofeedback training with virtual reality for inattention and impulsiveness.
Cho, Baek-Hwan; Kim, Saebyul; Shin, Dong Ik; Lee, Jang Han; Lee, Sang Min; Kim, In Young; Kim, Sun I
2004-10-01
In this research, the effectiveness of neurofeedback, along with virtual reality (VR), in reducing the level of inattention and impulsiveness was investigated. Twenty-eight male participants, aged 14-18, with social problems, took part in this study. They were separated into three groups: a control group, a VR group, and a non-VR group. The VR and non-VR groups underwent eight sessions of neurofeedback training over 2 weeks, while the control group simply waited during the same period. The VR group used a head-mounted display (HMD) and a head tracker, which let them look around the virtual world. In contrast, the non-VR group used only a computer monitor with a fixed viewpoint. All participants performed a continuous performance task (CPT) before and after the complete training session. The results showed that both the VR and non-VR groups achieved better scores on the CPT after the training session, while the control group showed no significant difference. Compared with the other groups, the VR group showed a tendency toward better results, suggesting that immersive VR is applicable to neurofeedback for the rehabilitation of inattention and impulsiveness.
Low cost heads-up virtual reality (HUVR) with optical tracking and haptic feedback
NASA Astrophysics Data System (ADS)
Margolis, Todd; DeFanti, Thomas A.; Dawe, Greg; Prudhomme, Andrew; Schulze, Jurgen P.; Cutchin, Steve
2011-03-01
Researchers at the University of California, San Diego, have created a new, relatively low-cost augmented reality system that enables users to touch the virtual environment they are immersed in. The Heads-Up Virtual Reality device (HUVR) couples a consumer 3D HD flat screen TV with a half-silvered mirror to project any graphic image onto the user's hands and into the space surrounding them. With his or her head position optically tracked to generate the correct perspective view, the user maneuvers a force-feedback (haptic) device to interact with the 3D image, literally 'touching' the object's angles and contours as if it was a tangible physical object. HUVR can be used for training and education in structural and mechanical engineering, archaeology and medicine as well as other tasks that require hand-eye coordination. One of the most unique characteristics of HUVR is that a user can place their hands inside of the virtual environment without occluding the 3D image. Built using open-source software and consumer level hardware, HUVR offers users a tactile experience in an immersive environment that is functional, affordable and scalable.
ERIC Educational Resources Information Center
Hartley, Melissa D.; Ludlow, Barbara L.; Duff, Michael C.
2015-01-01
Many colleges and universities rely upon online programs to support distance delivery of personnel preparation programs in special education and related services. These distance education programs enable individuals who live or work in rural communities to access training programs to earn teaching certification and assist rural schools in…
NASA Technical Reports Server (NTRS)
Smith, Jeffrey D.; Twombly, I. Alexander; Maese, A. Christopher; Cagle, Yvonne; Boyle, Richard
2003-01-01
The International Space Station demonstrates the greatest capabilities of human ingenuity, international cooperation and technology development. The complexity of this space structure is unprecedented, and training astronaut crews to maintain all its systems, as well as perform a multitude of research experiments, requires the most advanced training tools and techniques. Computer simulation and virtual environments are currently used by astronauts to train for robotic arm manipulations and extravehicular activities; but now, with the latest computer technologies and recent successes in areas of medical simulation, the capability exists to train astronauts for more hands-on research tasks using immersive virtual environments. We have developed a new technology, the Virtual Glovebox (VGX), for simulation of experimental tasks that astronauts will perform aboard the Space Station. The VGX may also be used by crew support teams for design of experiments, testing equipment integration capability and optimizing the procedures astronauts will use. This is done through a 3D, desktop-sized, reach-in virtual environment that can simulate the microgravity environment in space. Additional features of the VGX allow for networking multiple users over the internet and operation of tele-robotic devices through an intuitive user interface. Although the system was developed for astronaut training and assisting support crews, Earth-bound applications, many emphasizing homeland security, have also been identified. Examples include training experts to handle hazardous biological and/or chemical agents in a safe simulation, operation of tele-robotic systems for assessing and defusing threats such as bombs, and providing remote medical assistance to field personnel through a collaborative virtual environment. Thus, the emerging VGX simulation technology, while developed for space-based applications, can serve a dual use, facilitating homeland security here on Earth.
Virtual reality environments for post-stroke arm rehabilitation.
Subramanian, Sandeep; Knaut, Luiz A; Beaudoin, Christian; McFadyen, Bradford J; Feldman, Anatol G; Levin, Mindy F
2007-06-22
Optimal practice and feedback elements are essential requirements for maximal motor recovery in patients with motor deficits due to central nervous system lesions. A virtual environment (VE) was created that incorporates practice and feedback elements necessary for maximal motor recovery. It permits varied and challenging practice in a motivating environment that provides salient feedback. The VE gives the user knowledge of results feedback about motor behavior and knowledge of performance feedback about the quality of pointing movements made in a virtual elevator. Movement distances are related to length of body segments. We describe an immersive and interactive experimental protocol developed in a virtual reality environment using the CAREN system. The VE can be used as a training environment for the upper limb in patients with motor impairments.
HTC Vive MeVisLab integration via OpenVR for medical applications.
Egger, Jan; Gall, Markus; Wallner, Jürgen; Boechat, Pedro; Hann, Alexander; Li, Xing; Chen, Xiaojun; Schmalstieg, Dieter
2017-01-01
Virtual Reality, an immersive technology that replicates an environment via computer-simulated reality, receives a great deal of attention in the entertainment industry. However, VR also has great potential in other areas, such as the medical domain; examples are intervention planning, training and simulation. This is especially useful for medical operations where an aesthetic outcome is important, such as facial surgery. Unfortunately, importing medical data into Virtual Reality devices is not necessarily trivial, in particular when a direct connection to a proprietary application is desired. Moreover, most researchers do not build their medical applications from scratch, but rather leverage platforms like MeVisLab, MITK, OsiriX or 3D Slicer. These platforms have in common that they use libraries like ITK and VTK and provide a convenient graphical interface; however, ITK and VTK do not support Virtual Reality directly. In this study, the use of a Virtual Reality device for medical data under the MeVisLab platform is presented. The OpenVR library is integrated into the MeVisLab platform, allowing direct and uncomplicated use of the HTC Vive head-mounted display inside MeVisLab. Medical data coming from other MeVisLab modules can be connected directly via drag-and-drop to the Virtual Reality module, rendering the data inside the HTC Vive for immersive virtual reality inspection.
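To make the integration step described above more concrete, the following is a minimal sketch of the kind of OpenVR device handshake involved, written against the community pyopenvr Python bindings; this is an illustrative assumption on our part, since the study itself links the C++ OpenVR library directly into MeVisLab modules.

```python
# Minimal sketch of an OpenVR handshake, assuming the pyopenvr bindings
# ("pip install openvr"). The paper integrates the C++ OpenVR library into
# MeVisLab directly, so this stands in only as an illustration.
import openvr

def probe_hmd():
    if not openvr.isHmdPresent():
        print("No head-mounted display detected.")
        return
    # Initialize OpenVR as a "scene" application (a full 3D renderer).
    vr_system = openvr.init(openvr.VRApplication_Scene)
    try:
        # Per-eye render target size recommended by the runtime; a rendering
        # module would allocate its eye textures at this resolution.
        width, height = vr_system.getRecommendedRenderTargetSize()
        print(f"Recommended per-eye render target: {width} x {height}")
    finally:
        openvr.shutdown()

if __name__ == "__main__":
    probe_hmd()
```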
Immersive Virtual Moon Scene System Based on Panoramic Camera Data of Chang'E-3
NASA Astrophysics Data System (ADS)
Gao, X.; Liu, J.; Mu, L.; Yan, W.; Zeng, X.; Zhang, X.; Li, C.
2014-12-01
The system "Immersive Virtual Moon Scene" is used to show the virtual environment of Moon surface in immersive environment. Utilizing stereo 360-degree imagery from panoramic camera of Yutu rover, the system enables the operator to visualize the terrain and the celestial background from the rover's point of view in 3D. To avoid image distortion, stereo 360-degree panorama stitched by 112 images is projected onto inside surface of sphere according to panorama orientation coordinates and camera parameters to build the virtual scene. Stars can be seen from the Moon at any time. So we render the sun, planets and stars according to time and rover's location based on Hipparcos catalogue as the background on the sphere. Immersing in the stereo virtual environment created by this imaged-based rendering technique, the operator can zoom, pan to interact with the virtual Moon scene and mark interesting objects. Hardware of the immersive virtual Moon system is made up of four high lumen projectors and a huge curve screen which is 31 meters long and 5.5 meters high. This system which take all panoramic camera data available and use it to create an immersive environment, enable operator to interact with the environment and mark interesting objects contributed heavily to establishment of science mission goals in Chang'E-3 mission. After Chang'E-3 mission, the lab with this system will be open to public. Besides this application, Moon terrain stereo animations based on Chang'E-1 and Chang'E-2 data will be showed to public on the huge screen in the lab. Based on the data of lunar exploration,we will made more immersive virtual moon scenes and animations to help the public understand more about the Moon in the future.
Restorative effects of virtual nature settings.
Valtchanov, Deltcho; Barton, Kevin R; Ellard, Colin
2010-10-01
Previous research regarding the potential benefits of exposing individuals to surrogate nature (photographs and videos) has found that such immersion results in restorative effects such as increased positive affect, decreased negative affect, and decreased stress. In the current experiment, we examined whether immersion in a virtual computer-generated nature setting could produce restorative effects. Twenty-two participants were equally divided between two conditions, while controlling for gender. In each condition, participants performed a stress-induction task, and were then immersed in virtual reality (VR) for 10 minutes. The control condition featured a slide show in VR, and the nature experimental condition featured an active exploration of a virtual forest. Participants in the nature condition were found to exhibit increased positive affect and decreased stress after immersion in VR when compared to those in the control condition. The results suggest that immersion in virtual nature settings has similar beneficial effects as exposure to surrogate nature. These results also suggest that VR can be used as a tool to study and understand restorative effects.
Chen, Karen B; Ponto, Kevin; Tredinnick, Ross D; Radwin, Robert G
2015-06-01
This study was a proof of concept for virtual exertions, a novel method that involves the use of body tracking and electromyography for grasping and moving projections of objects in virtual reality (VR). The user views objects in his or her hands during rehearsed co-contractions of the same agonist-antagonist muscles normally used for the desired activities to suggest exerting forces. Unlike physical objects, virtual objects are images and lack mass. There is currently no practical physically demanding way to interact with virtual objects to simulate strenuous activities. Eleven participants grasped and lifted similar physical and virtual objects of various weights in an immersive 3-D Cave Automatic Virtual Environment. Muscle activity, localized muscle fatigue, ratings of perceived exertions, and NASA Task Load Index were measured. Additionally, the relationship between levels of immersion (2-D vs. 3-D) was studied. Although the overall magnitude of biceps activity and workload were greater in VR, muscle activity trends and fatigue patterns for varying weights within VR and physical conditions were the same. Perceived exertions for varying weights were not significantly different between VR and physical conditions. Perceived exertion levels and muscle activity patterns corresponded to the assigned virtual loads, which supported the hypothesis that the method evoked the perception of physical exertions and showed that the method was promising. Ultimately this approach may offer opportunities for research and training individuals to perform strenuous activities under potentially safer conditions that mimic situations while seeing their own body and hands relative to the scene. © 2014, Human Factors and Ergonomics Society.
Does body shadow improve the efficacy of virtual reality-based training with BTS NIRVANA?
Russo, Margherita; De Luca, Rosaria; Naro, Antonino; Sciarrone, Francesca; Aragona, Bianca; Silvestri, Giuseppe; Manuli, Alfredo; Bramanti, Alessia; Casella, Carmela; Bramanti, Placido; Calabrò, Rocco Salvatore
2017-01-01
Background: The aim of the present study was to evaluate whether the presence of body shadows during virtual reality (VR) training with BTS NIRVANA (BTs-N) may lead to better functional recovery. Methods: We enrolled 20 poststroke rehabilitation inpatients, who underwent a neurocognitive-rehabilitative training consisting of 24 sessions (3 times a week for 8 weeks) of BTs-N. All the patients were randomized into 2 groups: semi-immersive virtual training with (S-IVTS group) or without (S-IVT group) body shadows. Each participant was evaluated before (T0) and immediately after (T1) the end of the training (Trial Registration Number: NCT03095560). Results: The S-IVTS group showed a greater improvement in visuo-constructive skills and sustained attention than the S-IVT group. The other measures showed nonsignificant within-group and between-group differences. Conclusion: Our results suggest that body shadows may represent a high-priority class of stimuli that act by “pushing” attention toward the body itself. Further studies are needed to clarify the role of body shadows in promoting the construction of internal representations and thus self-recognition. PMID:28930852
Ketelhut, Diane Jass; Niemi, Steven M
2007-01-01
This article examines several new and exciting communication technologies. Many of the technologies were developed by the entertainment industry; however, other industries are adopting and modifying them for their own needs. These new technologies allow people to collaborate across distance and time and to learn in simulated work contexts. The article explores the potential utility of these technologies for advancing laboratory animal care and use through better education and training. Descriptions include emerging technologies such as augmented reality and multi-user virtual environments, which offer new approaches with different capabilities. Augmented reality interfaces, characterized by the use of handheld computers to infuse the virtual world into the real one, result in deeply immersive simulations. In these simulations, users can access virtual resources and communicate with real and virtual participants. Multi-user virtual environments enable multiple participants to simultaneously access computer-based three-dimensional virtual spaces, called "worlds," and to interact with digital tools. They allow for authentic experiences that promote collaboration, mentoring, and communication. Because individuals may learn or train differently, it is advantageous to combine the capabilities of these technologies and applications with more traditional methods to increase the number of students who are served by using current methods alone. The use of these technologies in animal care and use programs can create detailed training and education environments that allow students to learn the procedures more effectively, teachers to assess their progress more objectively, and researchers to gain insights into animal care.
A Discussion of Virtual Reality As a New Tool for Training Healthcare Professionals.
Fertleman, Caroline; Aubugeau-Williams, Phoebe; Sher, Carmel; Lim, Ai-Nee; Lumley, Sophie; Delacroix, Sylvie; Pan, Xueni
2018-01-01
Virtual reality technology is an exciting and emerging field with vast applications. Our study sets out the viewpoint that virtual reality software could be a new focus of direction in the development of training tools in medical education. We carried out a panel discussion at the Center for Behavior Change 3rd Annual Conference, prompted by the study, "The Responses of Medical General Practitioners to Unreasonable Patient Demand for Antibiotics--A Study of Medical Ethics Using Immersive Virtual Reality" (1). In Pan et al.'s study, 21 general practitioners (GPs) and GP trainees took part in a videoed, 15-min virtual reality scenario involving unnecessary patient demands for antibiotics. This paper was discussed in-depth at the Center for Behavior Change 3rd Annual Conference; the content of this paper is a culmination of findings and feedback from the panel discussion. The experts involved have backgrounds in virtual reality, general practice, medicines management, medical education and training, ethics, and philosophy. Virtual reality is an unexplored methodology to instigate positive behavioral change among clinicians where other methods have been unsuccessful, such as antimicrobial stewardship. There are several arguments in favor of use of virtual reality in medical education: it can be used for "difficult to simulate" scenarios and to standardize a scenario, for example, for use in exams. However, there are limitations to its usefulness because of the cost implications and the lack of evidence that it results in demonstrable behavior change.
Using virtual reality to train children in safe street-crossing skills.
Schwebel, David C; McClure, Leslie A
2010-02-01
Pedestrian injuries are among the leading causes of morbidity and mortality in middle childhood. One limitation to existing pedestrian safety interventions is that they do not provide children with repeated practice needed to develop the complex perceptual and cognitive skills required for safe street crossing. Virtual reality offers training through repeated unsupervised practice without risk, automated feedback on success of crossings, adjustment of traffic to match children's skill and a fun, appealing environment for training. To test the efficacy of virtual reality to train child pedestrians in safe street crossing. Birmingham, Alabama, USA. A randomised controlled trial is underway with an expected sample of four groups of 60 children aged 7-8 years (total N=240). One group receives training in an interactive, immersive virtual pedestrian environment. A second receives pedestrian safety training via widely used video and computer strategies. The third group receives what is judged to be the most efficacious treatment currently available, individualised behavioural training at streetside locations. The fourth group serves as a no-contact control group. All participants are exposed to a range of field and laboratory-based measures of pedestrian skill during baseline and post-intervention visits, as well as during a 6-month follow-up assessment. Primary analyses will be conducted through linear mixed models testing change over time in the four intervention groups. Three pedestrian safety measures will serve as primary outcomes: temporal gap before initiating crossing, temporal gap remaining after crossing and attention to traffic while waiting to cross.
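The protocol's primary analysis, a linear mixed model of change over time across the four groups, could be specified roughly as in the sketch below; the column names (child_id, visit, group, gap_before_crossing) are hypothetical placeholders, not the trial's actual dataset.

```python
# Hedged sketch of the stated primary analysis: a linear mixed model testing
# change over time in a pedestrian-safety outcome across four groups.
# Column names are hypothetical placeholders, not the trial's variables.
import pandas as pd
import statsmodels.formula.api as smf

def fit_primary_model(df: pd.DataFrame):
    # Random intercept per child; fixed effects for assessment visit,
    # intervention group, and their interaction (the effect of interest).
    model = smf.mixedlm(
        "gap_before_crossing ~ C(visit) * C(group)",
        data=df,
        groups=df["child_id"],
    )
    return model.fit()

# result = fit_primary_model(df)
# print(result.summary())
```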
Web-based three-dimensional Virtual Body Structures: W3D-VBS.
Temkin, Bharti; Acosta, Eric; Hatfield, Paul; Onal, Erhan; Tong, Alex
2002-01-01
Major efforts are being made to improve the teaching of human anatomy to foster cognition of visuospatial relationships. The Visible Human Project of the National Library of Medicine makes it possible to create virtual reality-based applications for teaching anatomy. Integration of traditional cadaver and illustration-based methods with Internet-based simulations brings us closer to this goal. Web-based three-dimensional Virtual Body Structures (W3D-VBS) is a next-generation immersive anatomical training system for teaching human anatomy over the Internet. It uses Visible Human data to dynamically explore, select, extract, visualize, manipulate, and stereoscopically palpate realistic virtual body structures with a haptic device. Tracking user's progress through evaluation tools helps customize lesson plans. A self-guided "virtual tour" of the whole body allows investigation of labeled virtual dissections repetitively, at any time and place a user requires it.
The Integrated Virtual Environment Rehabilitation Treadmill System
Feasel, Jeff; Whitton, Mary C.; Kassler, Laura; Brooks, Frederick P.; Lewek, Michael D.
2015-01-01
Slow gait speed and interlimb asymmetry are prevalent in a variety of disorders. Current approaches to locomotor retraining emphasize the need for appropriate feedback during intensive, task-specific practice. This paper describes the design and feasibility testing of the integrated virtual environment rehabilitation treadmill (IVERT) system intended to provide real-time, intuitive feedback regarding gait speed and asymmetry during training. The IVERT system integrates an instrumented, split-belt treadmill with a front-projection, immersive virtual environment. The novel adaptive control system uses only ground reaction force data from the treadmill to continuously update the speeds of the two treadmill belts independently, as well as to control the speed and heading in the virtual environment in real time. Feedback regarding gait asymmetry is presented 1) visually as walking a curved trajectory through the virtual environment and 2) proprioceptively in the form of different belt speeds on the split-belt treadmill. A feasibility study involving five individuals with asymmetric gait found that these individuals could effectively control the speed of locomotion and perceive gait asymmetry during the training session. Although minimal changes in overground gait symmetry were observed immediately following a single training session, further studies should be done to determine the IVERT’s potential as a tool for rehabilitation of asymmetric gait by providing patients with congruent visual and proprioceptive feedback. PMID:21652279
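The abstract notes that the controller updates the two belt speeds independently using only ground reaction force data, but does not give the control law. The sketch below shows one plausible proportional update based on per-limb stance-phase impulse; the asymmetry measure, gain, and speed limits are assumptions for illustration only and are not the published IVERT algorithm.

```python
# Hedged sketch of a split-belt speed update driven only by ground reaction
# forces (GRF). The impulse-based asymmetry measure, gain, and speed limits
# are illustrative assumptions, not the published IVERT control law.
def update_belt_speeds(left_speed, right_speed, left_impulse, right_impulse,
                       gain=0.05, min_speed=0.2, max_speed=1.6):
    """Return new (left, right) belt speeds in m/s.

    left_impulse / right_impulse: vertical GRF integrated over the most
    recent stance phase of each limb (N*s)."""
    total = left_impulse + right_impulse
    if total <= 0.0:
        return left_speed, right_speed      # no valid step detected
    # Signed asymmetry in [-1, 1]; positive means the left limb pushed harder.
    asymmetry = (left_impulse - right_impulse) / total
    # Speed up the belt under the stronger limb and slow the other, one simple
    # way to reflect asymmetry back to the walker proprioceptively.
    new_left = left_speed * (1.0 + gain * asymmetry)
    new_right = right_speed * (1.0 - gain * asymmetry)
    clamp = lambda s: max(min_speed, min(max_speed, s))
    return clamp(new_left), clamp(new_right)

# Example: left limb producing about 10% less impulse than the right.
print(update_belt_speeds(1.0, 1.0, left_impulse=270.0, right_impulse=300.0))
```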
Training healthcare personnel for mass-casualty incidents in a virtual emergency department: VED II.
Heinrichs, Wm Leroy; Youngblood, Patricia; Harter, Phillip; Kusumoto, Laura; Dev, Parvati
2010-01-01
Training emergency personnel on the clinical management of a mass-casualty incident (MCI) with prior chemical, biological, radioactive, nuclear, or explosives (CBRNE) -exposed patients is a component of hospital preparedness procedures. The objective of this research was to determine whether a Virtual Emergency Department (VED), designed after the Stanford University Medical Center's Emergency Department (ED) and populated with 10 virtual patient victims who suffered from a dirty bomb blast (radiological) and 10 who suffered from exposure to a nerve toxin (chemical), is an effective clinical environment for training ED physicians and nurses for such MCIs. Ten physicians with an average of four years of post-training experience, and 12 nurses with an average of 9.5 years of post-graduate experience at Stanford University Medical Center and San Mateo County Medical Center participated in this IRB-approved study. All individuals were provided electronic information about the clinical features of patients exposed to a nerve toxin or radioactive blast before the study date and an orientation to the "game" interface, including an opportunity to practice using it immediately prior to the study. An exit questionnaire was conducted using a Likert Scale test instrument. Among these 22 trainees, two-thirds of whom had prior Code Triage (multiple casualty incident) training, and one-half had prior CBRNE training, about two-thirds felt immersed in the virtual world much or all of the time. Prior to the training, only four trainees (18%) were confident about managing CBRNE MCIs. After the training, 19 (86%) felt either "confident" or "very confident", with 13 (59%) attributing this change to practicing in the virtual ED. Twenty-one (95%) of the trainees reported that the scenarios were useful for improving healthcare team skills training, the primary objective for creating them. Eighteen trainees (82%) believed that the cases also were instructive in learning about clinical skills management of such incidents. These data suggest that training healthcare teams in online, virtual environments with dynamic virtual patients is an effective method of training for management of MCIs, particularly for uncommonly occurring incidents.
Training Effectiveness of a Wide Area Virtual Environment in Medical Simulation.
Wier, Grady S; Tree, Rebekah; Nusr, Rasha
2017-02-01
The success of war fighters and medical personnel handling traumatic injuries largely depends on the quality of training they receive before deployment. The purpose of this study was to gauge the utility of a Wide Area Virtual Environment (WAVE) as a training adjunct by comparing and evaluating student performance, measuring sense of realism, and assessing the impact on student satisfaction with their training exposure in an immersive versus a field environment. This comparative prospective cohort study examined the utility of a three-screen WAVE where subjects were immersed in the training environment with medical simulators. Standard field training commenced for the control group subjects. Medical skills, time to completion, and Team Strategies and Tools to Enhance Performance and Patient Safety objective metrics were assessed for each team (n = 94). In addition, self-efficacy questionnaires were collected for each subject (N = 470). Medical teams received poorer overall team scores (F(1,186) = 0.756, P = 0.001), took longer to complete the scenario (F(1,186) = 25.15, P = 0.001), and scored lower on The National Registry of Emergency Medical Technicians trauma assessment checklist (F(1,186) = 1.13, P = 0.000) in the WAVE versus the field environment. Critical thinking and realism factors within the self-efficacy questionnaires scored higher in the WAVE versus the field [(F(1,466) = 8.04, P = 0.005), (F(1,465) = 18.57, P = 0.000), and (F(1,466) = 53.24, P = 0.000), respectively]. Environmental and emotional stressors may negatively affect critical thinking and clinical skill performance of medical teams. However, by introducing more advanced simulation trainings with added stressors, students may be able to adapt and overcome barriers to performance found in high-stress environments.
Coercive Narratives, Motivation and Role Playing in Virtual Worlds
2002-01-01
resource for making immersive virtual environments highly engaging. Interaction also appeals to our natural desire to discover. Reading a book contains… participation in an open-ended Virtual Environment (VE). I intend to take advantage of a participant's natural tendency to prefer interaction when possible… I hope this work will expand the potential of experience within virtual worlds. Keywords: Immersive Environments, Virtual Environments
Orientation Preferences and Motion Sickness Induced in a Virtual Reality Environment.
Chen, Wei; Chao, Jian-Gang; Zhang, Yan; Wang, Jin-Kun; Chen, Xue-Wen; Tan, Cheng
2017-10-01
Astronauts' orientation preferences tend to correlate with their susceptibility to space motion sickness (SMS). Orientation preferences appear universally, since variable sensory cue priorities are used between individuals. However, SMS susceptibility changes after proper training, while orientation preferences seem to be intrinsic proclivities. The present study was conducted to investigate whether orientation preferences change if susceptibility is reduced after repeated exposure to a virtual reality (VR) stimulus environment that induces SMS. A horizontal supine posture was chosen to create a sensory context similar to weightlessness, and two VR devices were used to produce a highly immersive virtual scene. Subjects were randomly allocated to an experimental group (trained through exposure to a provocative rotating virtual scene) and a control group (untrained). All subjects' orientation preferences were measured twice with the same interval, but the experimental group was trained three times during the interval, while the control group was not. Trained subjects were less susceptible to SMS, with symptom scores reduced by 40%. Compared with untrained subjects, trained subjects' orientation preferences were significantly different between pre- and posttraining assessments. Trained subjects depended less on visual cues, whereas few subjects demonstrated the opposite tendency. Results suggest that visual information may be inefficient and unreliable for body orientation and stabilization in a rotating visual scene, while reprioritizing preferences for different sensory cues was dynamic and asymmetric between individuals. The present findings should facilitate customization of efficient and proper training for astronauts with different sensory prioritization preferences and dynamic characteristics.Chen W, Chao J-G, Zhang Y, Wang J-K, Chen X-W, Tan C. Orientation preferences and motion sickness induced in a virtual reality environment. Aerosp Med Hum Perform. 2017; 88(10):903-910.
Digital Immersive Virtual Environments and Instructional Computing
ERIC Educational Resources Information Center
Blascovich, Jim; Beall, Andrew C.
2010-01-01
This article reviews theory and research relevant to the development of digital immersive virtual environment-based instructional computing systems. The review is organized within the context of a multidimensional model of social influence and interaction within virtual environments that models the interaction of four theoretical factors: theory…
Immersive Education, an Annotated Webliography
ERIC Educational Resources Information Center
Pricer, Wayne F.
2011-01-01
In this second installment of a two-part feature on immersive education a webliography will provide resources discussing the use of various types of computer simulations including: (a) augmented reality, (b) virtual reality programs, (c) gaming resources for teaching with technology, (d) virtual reality lab resources, (e) virtual reality standards…
NASA Technical Reports Server (NTRS)
Head, James W.; Huffman, J. N.; Forsberg, A. S.; Hurwitz, D. M.; Basilevsky, A. T.; Ivanov, M. A.; Dickson, J. L.; Kumar, P. Senthil
2008-01-01
We are currently investigating new technological developments in computer visualization and analysis in order to assess their importance and utility in planetary geological analysis and mapping [1,2]. Last year we reported on the range of technologies available and on our application of these to various problems in planetary mapping [3]. In this contribution we focus on the application of these techniques and tools to Venus geological mapping at the 1:5M quadrangle scale. In our current Venus mapping projects we have utilized and tested the various platforms to understand their capabilities and assess their usefulness in defining units, establishing stratigraphic relationships, mapping structures, reaching consensus on interpretations and producing map products. We are specifically assessing how computer visualization display qualities (e.g., level of immersion, stereoscopic vs. monoscopic viewing, field of view, large vs. small display size, etc.) influence performance on scientific analysis and geological mapping. We have been exploring four different environments: 1) conventional desktops (DT), 2) semi-immersive Fishtank VR (FT) (i.e., a conventional desktop with head-tracked stereo and 6DOF input), 3) tiled wall displays (TW), and 4) fully immersive virtual reality (IVR) (e.g., "Cave Automatic Virtual Environment," or Cave system). Formal studies demonstrate that fully immersive Cave environments are superior to desktop systems for many tasks [e.g., 4].
NASA Astrophysics Data System (ADS)
Krum, David M.; Sadek, Ramy; Kohli, Luv; Olson, Logan; Bolas, Mark
2010-01-01
As part of the Institute for Creative Technologies and the School of Cinematic Arts at the University of Southern California, the Mixed Reality lab develops technologies and techniques for presenting realistic immersive training experiences. Such experiences typically place users within a complex ecology of social actors, physical objects, and collections of intents, motivations, relationships, and other psychological constructs. Currently, it remains infeasible to completely synthesize the interactivity and sensory signatures of such ecologies. For this reason, the lab advocates mixed reality methods for training and conducts experiments exploring such methods. Currently, the lab focuses on understanding and exploiting the elasticity of human perception with respect to representational differences between real and virtual environments. This paper presents an overview of three projects: techniques for redirected walking, displays for the representation of virtual humans, and audio processing to increase stress.
Renaud, Patrice; Trottier, Dominique; Nolet, Kevin; Rouleau, Joanne L; Goyette, Mathieu; Bouchard, Stéphane
2014-04-01
The eye movements and penile responses of 20 male participants were recorded while they were immersed with virtual sexual stimuli. These participants were divided into two groups according to their capacity to focus their attention in immersion (high and low focus). In order to understand sexual self-regulation better, we subjected participants to three experimental conditions: (a) immersion with a preferred sexual stimulus, without sexual inhibition; (b) immersion with a preferred sexual stimulus, with sexual inhibition; and (c) immersion with a neutral stimulus. A significant difference was observed between the effects of each condition on erectile response and scanpath. The groups differed on self-regulation of their erectile responses and on their scanpath patterns. High focus participants had more difficulties than low focus participants with inhibiting their sexual responses and displayed less scattered eye movement trajectories over the critical areas of the virtual sexual stimuli. Results are interpreted in terms of sexual self-regulation and cognitive absorption in virtual immersion. In addition, the use of validated virtual sexual stimuli is presented as a methodological improvement over static and moving pictures, since it paves the way for the study of the role of social interaction in an ecologically valid and well-controlled way.
Proffitt, Rachel; Lange, Belinda; Chen, Christina; Winstein, Carolee
2015-01-01
The purpose of this study was to explore the subjective experience of older adults interacting with both virtual and real environments. Thirty healthy older adults engaged with real and virtual tasks of similar motor demands: reaching to a target in standing and stepping stance. Immersive tendencies and absorption scales were administered before the session. Game engagement and experience questionnaires were completed after each task, followed by a semistructured interview at the end of the testing session. Data were analyzed respectively using paired t tests and grounded theory methodology. Participants preferred the virtual task over the real task. They also reported an increase in presence and absorption with the virtual task, describing an external focus of attention. Findings will be used to inform future development of appropriate game-based balance training applications that could be embedded in the home or community settings as part of evidence-based fall prevention programs.
Virtual reality triage training provides a viable solution for disaster-preparedness.
Andreatta, Pamela B; Maslowski, Eric; Petty, Sean; Shim, Woojin; Marsh, Michael; Hall, Theodore; Stern, Susan; Frankel, Jen
2010-08-01
The objective of this study was to compare the relative impact of two simulation-based methods for training emergency medicine (EM) residents in disaster triage using the Simple Triage and Rapid Treatment (START) algorithm, full-immersion virtual reality (VR), and standardized patient (SP) drill. Specifically, are there differences between the triage performances and posttest results of the two groups, and do both methods differentiate between learners of variable experience levels? Fifteen Postgraduate Year 1 (PGY1) to PGY4 EM residents were randomly assigned to two groups: VR or SP. In the VR group, the learners were effectively surrounded by a virtual mass disaster environment projected on four walls, ceiling, and floor and performed triage by interacting with virtual patients in avatar form. The second group performed likewise in a live disaster drill using SP victims. Setting and patient presentations were identical between the two modalities. Resident performance of triage during the drills and knowledge of the START triage algorithm pre/post drill completion were assessed. Analyses included descriptive statistics and measures of association (effect size). The mean pretest scores were similar between the SP and VR groups. There were no significant differences between the triage performances of the VR and SP groups, but the data showed an effect in favor of the SP group performance on the posttest. Virtual reality can provide a feasible alternative for training EM personnel in mass disaster triage, comparing favorably to SP drills. Virtual reality provides flexible, consistent, on-demand training options, using a stable, repeatable platform essential for the development of assessment protocols and performance standards.
Lemole, G Michael; Banerjee, P Pat; Luciano, Cristian; Neckrysh, Sergey; Charbel, Fady T
2007-07-01
Mastery of the neurosurgical skill set involves many hours of supervised intraoperative training. Convergence of political, economic, and social forces has limited neurosurgical resident operative exposure. There is need to develop realistic neurosurgical simulations that reproduce the operative experience, unrestricted by time and patient safety constraints. Computer-based, virtual reality platforms offer just such a possibility. The combination of virtual reality with dynamic, three-dimensional stereoscopic visualization, and haptic feedback technologies makes realistic procedural simulation possible. Most neurosurgical procedures can be conceptualized and segmented into critical task components, which can be simulated independently or in conjunction with other modules to recreate the experience of a complex neurosurgical procedure. We use the ImmersiveTouch (ImmersiveTouch, Inc., Chicago, IL) virtual reality platform, developed at the University of Illinois at Chicago, to simulate the task of ventriculostomy catheter placement as a proof-of-concept. Computed tomographic data are used to create a virtual anatomic volume. Haptic feedback offers simulated resistance and relaxation with passage of a virtual three-dimensional ventriculostomy catheter through the brain parenchyma into the ventricle. A dynamic three-dimensional graphical interface renders changing visual perspective as the user's head moves. The simulation platform was found to have realistic visual, tactile, and handling characteristics, as assessed by neurosurgical faculty, residents, and medical students. We have developed a realistic, haptics-based virtual reality simulator for neurosurgical education. Our first module recreates a critical component of the ventriculostomy placement task. This approach to task simulation can be assembled in a modular manner to reproduce entire neurosurgical procedures.
Christiansen, C; Abreu, B; Ottenbacher, K; Huffman, K; Masel, B; Culpepper, R
1998-08-01
This report describes a reliability study using a prototype computer-simulated virtual environment to assess basic daily living skills in a sample of persons with traumatic brain injury (TBI). The benefits of using virtual reality in training for situations where safety is a factor have been established in defense and industry, but have not been demonstrated in rehabilitation. Thirty subjects with TBI receiving comprehensive rehabilitation services at a residential facility. An immersive virtual kitchen was developed in which a meal preparation task involving multiple steps could be performed. The prototype was tested using subjects who completed the task twice within 7 days. The stability of performance was estimated using intraclass correlation coefficients (ICCs). The ICC value for total performance based on all steps involved in the meal preparation task was .73. When three items with low variance were removed the ICC improved to .81. Little evidence of vestibular optical side-effects was noted in the subjects tested. Adequate initial reliability exists to continue development of the environment as an assessment and training prototype for persons with brain injury.
Virtual reality and paranoid ideations in people with an 'at-risk mental state' for psychosis.
Valmaggia, Lucia R; Freeman, Daniel; Green, Catherine; Garety, Philippa; Swapp, David; Antley, Angus; Prescott, Corinne; Fowler, David; Kuipers, Elizabeth; Bebbington, Paul; Slater, Mel; Broome, Matthew; McGuire, Philip K
2007-12-01
Virtual reality provides a means of studying paranoid thinking in controlled laboratory conditions. However, this method has not been used with a clinical group. To establish the feasibility and safety of using virtual reality methodology in people with an at-risk mental state and to investigate the applicability of a cognitive model of paranoia to this group. Twenty-one participants with an at-risk mental state were assessed before and after entering a virtual reality environment depicting the inside of an underground train. Virtual reality did not raise levels of distress at the time of testing or cause adverse experiences over the subsequent week. Individuals attributed mental states to virtual reality characters including hostile intent. Persecutory ideation in virtual reality was predicted by higher levels of trait paranoia, anxiety, stress, immersion in virtual reality, perseveration and interpersonal sensitivity. Virtual reality is an acceptable experimental technique for use with individuals with at-risk mental states. Paranoia in virtual reality was understandable in terms of the cognitive model of persecutory delusions.
Immersion and the illusion of presence in virtual reality.
Slater, Mel
2018-05-21
This commentary briefly reviews the history of virtual reality and its use for psychology research, and clarifies the concepts of immersion and the illusion of presence. © 2018 The British Psychological Society.
Full Immersive Virtual Environment Cave[TM] in Chemistry Education
ERIC Educational Resources Information Center
Limniou, M.; Roberts, D.; Papadopoulos, N.
2008-01-01
By comparing two-dimensional (2D) chemical animations designed for the computer desktop with three-dimensional (3D) chemical animations designed for the fully immersive virtual reality environment CAVE[TM], we studied how virtual reality environments could raise students' interest and motivation for learning. By using 3ds max[TM], we can visualize…
ERIC Educational Resources Information Center
Ritz, Leah T.; Buss, Alan R.
2016-01-01
Increasing availability of immersive virtual reality (IVR) systems, such as the Cave Automatic Virtual Environment (CAVE) and head-mounted displays, for use in education contexts is providing new opportunities and challenges for instructional designers. By highlighting the affordances of IVR specific to the CAVE, the authors emphasize the…
Using Virtual Reality to Help Students with Social Interaction Skills
ERIC Educational Resources Information Center
Beach, Jason; Wendt, Jeremy
2015-01-01
The purpose of this study was to determine if participants could improve their social interaction skills by participating in a virtual immersive environment. The participants used a developing virtual reality head-mounted display to engage themselves in a fully-immersive environment. While in the environment, participants had an opportunity to…
Student Responses to Their Immersion in a Virtual Environment.
ERIC Educational Resources Information Center
Taylor, Wayne
Undertaken in conjunction with a larger study that investigated the educational efficacy of students building their own virtual worlds, this study measures the reactions of students in grades 4-12 to the experience of being immersed in virtual reality (VR). The study investigated the sense of "presence" experienced by the students, the…
Vourvopoulos, Athanasios; Bermúdez I Badia, Sergi
2016-08-09
The use of Brain-Computer Interface (BCI) technology in neurorehabilitation provides new strategies to overcome stroke-related motor limitations. Recent studies demonstrated the brain's capacity for functional and structural plasticity through BCI. However, it is not fully clear how we can take full advantage of the neurobiological mechanisms underlying recovery and how to maximize restoration through BCI. In this study we investigate the role of multimodal virtual reality (VR) simulations and motor priming (MP) in an upper limb motor-imagery BCI task in order to maximize the engagement of sensory-motor networks in a broad range of patients who can benefit from virtual rehabilitation training. In order to investigate how different BCI paradigms impact brain activation, we designed 3 experimental conditions in a within-subject design, including an immersive Multimodal Virtual Reality with Motor Priming (VRMP) condition where users had to perform motor-execution before BCI training, an immersive Multimodal VR condition, and a control condition with standard 2D feedback. Further, these were also compared to overt motor-execution. Finally, a set of questionnaires were used to gather subjective data on Workload, Kinesthetic Imagery and Presence. Our findings show increased capacity to modulate and enhance brain activity patterns in all extracted EEG rhythms matching more closely those present during motor-execution and also a strong relationship between electrophysiological data and subjective experience. Our data suggest that both VR and particularly MP can enhance the activation of brain patterns present during overt motor-execution. Further, we show changes in the interhemispheric EEG balance, which might play an important role in the promotion of neural activation and neuroplastic changes in stroke patients in a motor-imagery neurofeedback paradigm. In addition, electrophysiological correlates of psychophysiological responses provide us with valuable information about the motor and affective state of the user that has the potential to be used to predict MI-BCI training outcome based on user's profile. Finally, we propose a BCI paradigm in VR, which gives the possibility of motor priming for patients with low level of motor control.
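The comparison above hinges on band-limited EEG rhythms (e.g., mu and beta) measured during motor imagery and motor execution. As a rough illustration of that kind of feature, the sketch below estimates mu (8-12 Hz) and beta (13-30 Hz) band power for a single channel with Welch's method; the sampling rate and synthetic signal are assumptions, not the authors' recording setup or analysis pipeline.

```python
# Hedged sketch: mu (8-12 Hz) and beta (13-30 Hz) band power for one EEG
# channel via Welch's method. The 256 Hz sampling rate and synthetic signal
# are illustrative assumptions, not the study's acquisition setup.
import numpy as np
from scipy.signal import welch

def band_power(signal, fs, low, high):
    freqs, psd = welch(signal, fs=fs, nperseg=fs * 2)
    mask = (freqs >= low) & (freqs <= high)
    return np.sum(psd[mask]) * (freqs[1] - freqs[0])  # approx. band integral

fs = 256                                   # assumed sampling rate (Hz)
t = np.arange(0, 10, 1 / fs)
# Synthetic "EEG": a 10 Hz mu-band component plus broadband noise.
eeg = 2.0 * np.sin(2 * np.pi * 10 * t) + np.random.randn(t.size)

print("mu power:  ", band_power(eeg, fs, 8, 12))
print("beta power:", band_power(eeg, fs, 13, 30))
```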
Employing immersive virtual environments for innovative experiments in health care communication.
Persky, Susan
2011-03-01
This report reviews the literature for studies that employ immersive virtual environment technology methods to conduct experimental studies in health care communication. Advantages and challenges of using these tools for research in this area are also discussed. A literature search was conducted using the Scopus database. Results were hand-searched to identify the body of studies, conducted since 1995, that are related to the report objective. The review identified four relevant studies that stem from two unique projects. One project focused on the impact of a clinician's characteristics and behavior on health care communication; the other focused on the characteristics of the patient. Both projects illustrate key methodological advantages conferred by immersive virtual environments, including the ability to simultaneously maintain high experimental control and realism, the ability to manipulate variables in new ways, and unique behavioral measurement opportunities. Though implementation challenges exist for immersive virtual environment-based research methods, given the technology's unique capabilities, benefits can outweigh the costs in many instances. Immersive virtual environments may therefore prove an important addition to the array of tools available for advancing our understanding of communication in health care. Published by Elsevier Ireland Ltd.
Enabling scientific workflows in virtual reality
Kreylos, O.; Bawden, G.; Bernardin, T.; Billen, M.I.; Cowgill, E.S.; Gold, R.D.; Hamann, B.; Jadamec, M.; Kellogg, L.H.; Staadt, O.G.; Sumner, D.Y.
2006-01-01
To advance research and improve the scientific return on data collection and interpretation efforts in the geosciences, we have developed methods of interactive visualization, with a special focus on immersive virtual reality (VR) environments. Earth sciences employ a strongly visual approach to the measurement and analysis of geologic data due to the spatial and temporal scales over which such data ranges. As observations and simulations increase in size and complexity, the Earth sciences are challenged to manage and interpret increasing amounts of data. Reaping the full intellectual benefits of immersive VR requires us to tailor exploratory approaches to scientific problems. These applications build on the visualization method's strengths, using both 3D perception and interaction with data and models, to take advantage of the skills and training of the geological scientists exploring their data in the VR environment. This interactive approach has enabled us to develop a suite of tools that are adaptable to a range of problems in the geosciences and beyond. Copyright © 2008 by the Association for Computing Machinery, Inc.
Virtual hydrology observatory: an immersive visualization of hydrology modeling
NASA Astrophysics Data System (ADS)
Su, Simon; Cruz-Neira, Carolina; Habib, Emad; Gerndt, Andreas
2009-02-01
The Virtual Hydrology Observatory will provide students with the ability to observe the integrated hydrology simulation with an instructional interface by using a desktop-based or immersive virtual reality setup. It is the goal of the virtual hydrology observatory application to facilitate the introduction of field experience and observational skills into hydrology courses through innovative virtual techniques that mimic activities during actual field visits. The simulation part of the application is developed from the integrated atmospheric forecast model, Weather Research and Forecasting (WRF), and the hydrology model, Gridded Surface/Subsurface Hydrologic Analysis (GSSHA). The outputs from both the WRF and GSSHA models are then used to generate the final visualization components of the Virtual Hydrology Observatory. The visualization data processing techniques provided by VTK are 2D Delaunay triangulation and data optimization. Once all the visualization components are generated, they are integrated with the simulation data using the VRFlowVis and VR Juggler software toolkits. VR Juggler is used primarily to provide the Virtual Hydrology Observatory application with a fully immersive, real-time 3D interaction experience, while VRFlowVis provides the integration framework for the hydrologic simulation data, graphical objects, and user interaction. A six-sided CAVE™-like system is used to run the Virtual Hydrology Observatory to provide the students with a fully immersive experience.
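The VTK processing step named above (2D Delaunay triangulation of scattered model output before hand-off to VRFlowVis and VR Juggler) can be illustrated with VTK's Python bindings. This is a generic sketch on synthetic points, not the project's actual pipeline; the point cloud, scalar name, and output file name are made up for illustration.

```python
# Sketch: triangulate scattered hydrologic surface points with VTK's 2D
# Delaunay filter, producing a mesh suitable for later visualization.
# The point cloud is synthetic; a real pipeline would read WRF/GSSHA output.
import random
import vtk

points = vtk.vtkPoints()
elevation = vtk.vtkDoubleArray()
elevation.SetName("elevation")

random.seed(1)
for _ in range(500):
    x, y = random.uniform(0, 100), random.uniform(0, 100)
    z = 0.05 * x + 0.02 * y + random.uniform(-0.5, 0.5)  # stand-in surface
    points.InsertNextPoint(x, y, z)
    elevation.InsertNextValue(z)

poly = vtk.vtkPolyData()
poly.SetPoints(points)
poly.GetPointData().SetScalars(elevation)

delaunay = vtk.vtkDelaunay2D()   # triangulates in the x-y plane
delaunay.SetInputData(poly)
delaunay.Update()

writer = vtk.vtkXMLPolyDataWriter()
writer.SetFileName("hydrology_surface.vtp")
writer.SetInputConnection(delaunay.GetOutputPort())
writer.Write()
```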
Kim, Kwanguk; Kim, Chan-Hyung; Cha, Kyung Ryeol; Park, Junyoung; Han, Kiwan; Kim, Yun Ki; Kim, Jae-Jin; Kim, In Young; Kim, Sun I
2008-12-01
The current study is a preliminary test of a virtual reality (VR) anxiety-provoking tool using a sample of participants with obsessive-compulsive disorder (OCD). The tasks were administered to 33 participants with OCD and 30 healthy control participants. In the VR task, participants navigated through a virtual environment using a joystick and head-mounted display. The virtual environment consisted of three phases: training, distraction, and the main task. After the training and distraction phases, participants were allowed to check (a common OCD behavior) freely, as they would in the real world, and a visual analogue scale of anxiety was recorded during VR. Participants' anxiety in the virtual environment was measured with a validated measure of psychiatric symptoms and functions and analyzed with a VR questionnaire. Results revealed that those with OCD had significantly higher anxiety in the virtual environment than did healthy controls, and the decreased ratio of anxiety in participants with OCD was also higher than that of healthy controls. Moreover, the degree of anxiety of an individual with OCD was positively correlated with his or her symptom score and immersive tendency score. These results suggest the possibility that VR technology has value as an anxiety-provoking or treatment tool for OCD.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Markidis, S.; Rizwan, U.
The use of a virtual nuclear control room can be an effective and powerful tool for training personnel working in nuclear power plants. Operators could experience and simulate the functioning of the plant, even in critical situations, without being in a real power plant or running any risk. 3D models can be exported to Virtual Reality formats and then displayed in the Virtual Reality environment, providing an immersive 3D experience. However, two major limitations of this approach are that 3D models exhibit static textures, and they are not fully interactive and therefore cannot be used effectively in training personnel. In this paper we first describe a possible solution for embedding the output of a computer application in a 3D virtual scene, coupling real-world applications and VR systems. The VR system reported here grabs the output of an application running on an X server, creates a texture from the output, and then displays it on a screen or a wall in the virtual reality environment. We then propose a simple model for providing interaction between the user in the VR system and the running simulator. This approach is based on the use of an internet-based application that can be commanded by a laptop or tablet PC added to the virtual environment. (authors)
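A hedged sketch of the "grab the application's output and turn it into a texture" idea described above is given below, using the cross-platform mss library instead of a raw X server grab; the library choice, window geometry, and channel ordering are assumptions, since the paper's implementation is not reproduced here. A rendering loop would then upload the resulting array each frame (for example with glTexImage2D) onto a wall or screen in the virtual scene.

```python
# Sketch: capture a region of the desktop (e.g., the simulator window) and
# convert it to an RGBA array that a renderer could upload as a texture.
# Library choice (mss) and the capture region are assumptions.
import numpy as np
import mss

def grab_texture(left=0, top=0, width=800, height=600):
    region = {"left": left, "top": top, "width": width, "height": height}
    with mss.mss() as grabber:
        shot = grabber.grab(region)                    # raw BGRA screenshot
    frame = np.frombuffer(shot.bgra, dtype=np.uint8)
    frame = frame.reshape(shot.height, shot.width, 4)  # H x W x BGRA
    return frame[:, :, [2, 1, 0, 3]]                   # reorder to RGBA

if __name__ == "__main__":
    tex = grab_texture()
    print(tex.shape, tex.dtype)  # e.g. (600, 800, 4) uint8
```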
VILLAGE--Virtual Immersive Language Learning and Gaming Environment: Immersion and Presence
ERIC Educational Resources Information Center
Wang, Yi Fei; Petrina, Stephen; Feng, Francis
2017-01-01
3D virtual worlds are promising for immersive learning in English as a Foreign Language (EFL). Unlike English as a Second Language (ESL), EFL typically takes place in the learners' home countries, and the potential of the language is limited by geography. Although learning contexts where English is spoken are important, in most EFL courses at the…
ERIC Educational Resources Information Center
Lawless-Reljic, Sabine Karine
2010-01-01
Growing interest of educational institutions in desktop 3D graphic virtual environments for hybrid and distance education prompts questions on the efficacy of such tools. Virtual worlds, such as Second Life[R], enable computer-mediated immersion and interactions encompassing multimodal communication channels including audio, video, and text.…
Immersive virtual reality for visualization of abdominal CT
NASA Astrophysics Data System (ADS)
Lin, Qiufeng; Xu, Zhoubing; Li, Bo; Baucom, Rebeccah; Poulose, Benjamin; Landman, Bennett A.; Bodenheimer, Robert E.
2013-03-01
Immersive virtual environments use a stereoscopic head-mounted display and data glove to create high fidelity virtual experiences in which users can interact with three-dimensional models and perceive relationships at their true scale. This stands in stark contrast to traditional PACS-based infrastructure in which images are viewed as stacks of two-dimensional slices, or, at best, disembodied renderings. Although there has been substantial innovation in immersive virtual environments for entertainment and consumer media, these technologies have not been widely applied in clinical applications. Here, we consider potential applications of immersive virtual environments for ventral hernia patients with abdominal computed tomography imaging data. Nearly a half million ventral hernias occur in the United States each year, and hernia repair is the most commonly performed general surgery operation worldwide. A significant problem in these conditions is communicating the urgency, degree of severity, and impact of a hernia (and potential repair) on patient quality of life. Hernias are defined by ruptures in the abdominal wall (i.e., the absence of healthy tissues) rather than a growth (e.g., cancer); therefore, understanding a hernia necessitates understanding the entire abdomen. Our environment allows surgeons and patients to view body scans at scale and interact with these virtual models using a data glove. This visualization and interaction allows users to perceive the relationship between physical structures and medical imaging data. The system provides close integration of PACS-based CT data with immersive virtual environments and creates opportunities to study and optimize interfaces for patient communication, operative planning, and medical education.
Perception of Graphical Virtual Environments by Blind Users via Sensory Substitution
Maidenbaum, Shachar; Buchs, Galit; Abboud, Sami; Lavi-Rotbain, Ori; Amedi, Amir
2016-01-01
Graphical virtual environments are currently far from accessible to blind users as their content is mostly visual. This is especially unfortunate as these environments hold great potential for this population for purposes such as safe orientation, education, and entertainment. Previous tools have increased accessibility but there is still a long way to go. Visual-to-audio Sensory-Substitution-Devices (SSDs) can increase accessibility generically by sonifying on-screen content regardless of the specific environment and offer increased accessibility without the use of expensive dedicated peripherals like electrode/vibrator arrays. Using SSDs virtually utilizes similar skills as when using them in the real world, enabling both training on the device and training on environments virtually before real-world visits. This could enable more complex, standardized and autonomous SSD training and new insights into multisensory interaction and the visually-deprived brain. However, whether congenitally blind users, who have never experienced virtual environments, will be able to use this information for successful perception and interaction within them is currently unclear. We tested this using the EyeMusic SSD, which conveys whole-scene visual information, to perform virtual tasks otherwise impossible without vision. Congenitally blind users had to navigate virtual environments and find doors, differentiate between them based on their features (Experiment 1: task 1) and surroundings (Experiment 1: task 2) and walk through them; these tasks were accomplished with a 95% and 97% success rate, respectively. We further explored the reactions of congenitally blind users during their first interaction with a more complex virtual environment than in the previous tasks: walking down a virtual street, recognizing different features of houses and trees, navigating to cross-walks, etc. Users reacted enthusiastically and reported feeling immersed within the environment. They highlighted the potential usefulness of such environments for understanding what visual scenes are supposed to look like and their potential for complex training and suggested many future environments they wished to experience. PMID:26882473
The interplays among technology and content, immersant and VE
NASA Astrophysics Data System (ADS)
Song, Meehae; Gromala, Diane; Shaw, Chris; Barnes, Steven J.
2010-01-01
The research program aims to explore and examine the fine balance necessary for maintaining the interplays between technology and the immersant, including identifying qualities that contribute to creating and maintaining a sense of "presence" and "immersion" in an immersive virtual reality (IVR) experience. Building upon and extending previous work, we compare sitting meditation with walking meditation in a virtual environment (VE). The Virtual Meditative Walk, a new work-in-progress, integrates VR and biofeedback technologies with a self-directed, uni-directional treadmill. As immersants learn how to meditate while walking, robust, real-time biofeedback technology continuously measures breathing, skin conductance and heart rate. The physiological states of the immersant will in turn affect the audio and stereoscopic visual media through shutter glasses. We plan to test the potential benefits and limitations of this physically active form of meditation with data from a sitting form of meditation. A mixed-methods approach to testing user outcomes parallels the knowledge bases of the collaborative team: a physician, computer scientists and artists.
Virtual reality 3D headset based on DMD light modulators
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bernacki, Bruce E.; Evans, Allan; Tang, Edward
We present the design of an immersion-type 3D headset suitable for virtual reality applications based upon digital micro-mirror devices (DMD). Our approach leverages silicon micro mirrors offering 720p resolution displays in a small form-factor. Supporting chip sets allow rapid integration of these devices into wearable displays with high resolution and low power consumption. Applications include night driving, piloting of UAVs, fusion of multiple sensors for pilots, training, vision diagnostics and consumer gaming. Our design is described in which light from the DMD is imaged to infinity and the user’s own eye lens forms a real image on the user’s retina.
Immersive realities: articulating the shift from VR to mobile AR through artistic practice
NASA Astrophysics Data System (ADS)
Margolis, Todd; Cornish, Tracy; Berry, Rodney; DeFanti, Thomas A.
2012-03-01
Our contemporary imaginings of technological engagement with digital environments have transitioned from flying through Virtual Reality to mobile interactions with the physical world through personal media devices. Experiences technologically mediated through social interactivity within physical environments are now being preferred over isolated environments such as CAVEs or HMDs. Examples of this trend can be seen in early tele-collaborative artworks which strove to use advanced networking to join multiple participants in shared virtual environments. Recent developments in mobile AR allow untethered access to such shared realities in places far removed from labs and home entertainment environments, and without the bulky and expensive technologies attached to our bodies that accompany most VR. This paper addresses the emerging trend favoring socially immersive artworks via mobile Augmented Reality rather than sensorially immersive Virtual Reality installations. With particular focus on AR as a mobile, locative technology, we will discuss how concepts of immersion and interactivity are evolving with this new medium. Immersion in the context of mobile AR can be redefined to describe socially interactive experiences. Having distinctly different sensory, spatial and situational properties, mobile AR offers a new form for remixing elements from traditional virtual reality with physically based social experiences. This type of immersion offers a wide array of potential for mobile AR art forms. We are beginning to see examples of how artists can use mobile AR to create socially immersive and interactive experiences.
Ali, Saad; Qandeel, Monther; Ramakrishna, Rishi; Yang, Carina W
2018-02-01
Fluoroscopy-guided lumbar puncture (FGLP) is a basic procedural component of radiology residency and neuroradiology fellowship training. Performance of the procedure with limited experience is associated with increased patient discomfort as well as increased radiation dose, puncture attempts, and complication rate. Simulation in health care is a developing field that has potential for enhancing procedural training. We demonstrate the design and utility of a virtual reality simulator for performing FGLP. An FGLP module was developed on an ImmersiveTouch platform, which digitally reproduces the procedural environment with a hologram-like projection. From computed tomography datasets of healthy adult spines, we constructed a 3-D model of the lumbar spine and overlying soft tissues. We assigned different physical characteristics to each tissue type, which the user can experience through haptic feedback while advancing a virtual spinal needle. Virtual fluoroscopy as well as 3-D images can be obtained for procedural planning and guidance. The number of puncture attempts, the distance to the target, the number of fluoroscopic shots, and the approximate radiation dose can be calculated. Preliminary data from users who participated in the simulation were obtained in a postsimulation survey. All users found the simulation to be a realistic replication of the anatomy and procedure and would recommend it to a colleague. On a scale of 1-5 (lowest to highest) rating the virtual simulator training overall, the mean score was 4.3 (range 3-5). We describe the design of a virtual reality simulator for performing FGLP and present the initial experience with this new technique. Copyright © 2018 The Association of University Radiologists. Published by Elsevier Inc. All rights reserved.
Modulation of cortical activity in 2D versus 3D virtual reality environments: an EEG study.
Slobounov, Semyon M; Ray, William; Johnson, Brian; Slobounov, Elena; Newell, Karl M
2015-03-01
There is growing empirical evidence that virtual reality (VR) is valuable for education, training, entertainment and medical rehabilitation due to its capacity to represent real-life events and situations. However, the neural mechanisms underlying behavioral confounds in VR environments are still poorly understood. In two experiments, we examined the effect of fully immersive 3D stereoscopic presentations and less immersive 2D VR environments on brain functions and behavioral outcomes. In Experiment 1 we examined behavioral and neural underpinnings of spatial navigation tasks using electroencephalography (EEG). In Experiment 2, we examined EEG correlates of postural stability and balance. Our major findings showed that fully immersive 3D VR induced a higher subjective sense of presence along with an enhanced success rate of spatial navigation compared to 2D. In Experiment 1, the power of frontal midline EEG theta (FM-theta) was significantly higher during the encoding phase of route presentation in the 3D VR. In Experiment 2, the 3D VR resulted in greater postural instability and modulation of EEG patterns as a function of 3D versus 2D environments. The findings support the inference that the fully immersive 3D enriched environment requires allocation of more brain and sensory resources for cognitive/motor control during both tasks than 2D presentations. This is further evidence that 3D VR tasks using EEG may be a promising approach for performance enhancement and potential applications in clinical/rehabilitation settings. Copyright © 2014 Elsevier B.V. All rights reserved.
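The FM-theta contrast reported above is, at its core, a within-subject comparison of band power between the 2D and 3D conditions. The sketch below shows that contrast as a paired t-test on per-subject theta-power values; the numbers are synthetic placeholders, not the study's data.

```python
# Sketch: paired contrast of frontal-midline theta power between 2D and 3D
# VR conditions. Per-subject values are synthetic placeholders.
import numpy as np
from scipy.stats import ttest_rel

rng = np.random.default_rng(42)
n_subjects = 15
theta_2d = rng.normal(loc=4.0, scale=0.8, size=n_subjects)              # uV^2, illustrative
theta_3d = theta_2d + rng.normal(loc=0.6, scale=0.5, size=n_subjects)   # higher in 3D

t_stat, p_value = ttest_rel(theta_3d, theta_2d)
print(f"mean 2D = {theta_2d.mean():.2f}, mean 3D = {theta_3d.mean():.2f}")
print(f"paired t({n_subjects - 1}) = {t_stat:.2f}, p = {p_value:.4f}")
```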
Evaluating an immersive virtual environment prototyping and simulation system
NASA Astrophysics Data System (ADS)
Nemire, Kenneth
1997-05-01
An immersive virtual environment (IVE) modeling and simulation tool is being developed for designing advanced weapon and training systems. One unique feature of the tool is that the design, and not just visualization of the design, is accomplished with the IVE tool. Acceptance of IVE tools requires comparisons with current commercial applications. In this pilot study, expert users of a popular desktop 3D graphics application performed identical modeling and simulation tasks using both the desktop and IVE applications. The IVE tool consisted of a head-mounted display, 3D spatialized sound, spatial trackers on head and hands, instrumented gloves, and a simulated speech recognition system. The results are preliminary because performance from only four users has been examined. When using the IVE system, users completed the tasks to criteria in less time than when using the desktop application. Subjective ratings of the visual displays in each system were similar. Ratings for the desktop controls were higher than for the IVE controls. Ratings of immersion and user enjoyment were higher for the IVE than for the desktop application. These results are particularly remarkable because participants had used the desktop application regularly for three to five years and the prototype IVE tool for only three to six hours.
Design of an immersive simulator for assisted power wheelchair driving.
Devigne, Louise; Babel, Marie; Nouviale, Florian; Narayanan, Vishnu K; Pasteau, Francois; Gallien, Philippe
2017-07-01
Driving a power wheelchair is a difficult and complex visual-cognitive task. As a result, some people with visual and/or cognitive disabilities cannot access the benefits of a power wheelchair because their impairments prevent them from driving safely. In order to improve their access to mobility, we have previously designed a semi-autonomous assistive wheelchair system which progressively corrects the trajectory as the user manually drives the wheelchair and smoothly avoids obstacles. Developing and testing such systems for wheelchair driving assistance requires a significant amount of material resources and clinician time. With Virtual Reality technology, prototypes can be developed and tested in a risk-free and highly flexible Virtual Environment before equipping and testing a physical prototype. Additionally, users can "virtually" test and train more easily during the development process. In this paper, we introduce a power wheelchair driving simulator allowing the user to navigate with a standard wheelchair in an immersive 3D Virtual Environment. The simulation framework is designed to be flexible so that we can use different control inputs. In order to validate the framework, we first performed tests on the simulator with able-bodied participants during which the user's Quality of Experience (QoE) was assessed through a set of questionnaires. Results show that the simulator is a promising tool for future works as it generates a good sense of presence and requires rather low cognitive effort from users.
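The progressive trajectory correction described above is a form of shared control: the joystick command is blended with an obstacle-avoidance command, with the avoidance weight growing as obstacles get closer. The sketch below shows one common linear blending law; the thresholds and the law itself are illustrative assumptions, not the authors' controller.

```python
# Sketch: shared-control blending of a user joystick command with an
# obstacle-avoidance command. The blending law and thresholds are assumptions.
import numpy as np

def blend_command(user_cmd, avoid_cmd, obstacle_dist, d_safe=0.5, d_free=2.0):
    """Return a blended (linear velocity, angular velocity) command.

    obstacle_dist : distance (m) to the closest detected obstacle
    d_safe        : below this distance, avoidance takes over fully
    d_free        : above this distance, the user is fully in control
    """
    user_cmd = np.asarray(user_cmd, dtype=float)
    avoid_cmd = np.asarray(avoid_cmd, dtype=float)
    # alpha = 0 -> user only, alpha = 1 -> avoidance only, linear in between
    alpha = np.clip((d_free - obstacle_dist) / (d_free - d_safe), 0.0, 1.0)
    return (1.0 - alpha) * user_cmd + alpha * avoid_cmd

# Example: the user pushes straight ahead; a wall on the right suggests a left turn.
print(blend_command(user_cmd=(0.8, 0.0), avoid_cmd=(0.3, 0.6), obstacle_dist=1.0))
```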
DOE Office of Scientific and Technical Information (OSTI.GOV)
Shawver, D.M.; Stansfield, S.
This overview presents current research at Sandia National Laboratories in the Virtual Reality and Intelligent Simulation Lab. Into an existing distributed VR environment which we have been developing, and which provides shared immersion for multiple users, we are adding virtual actor support. The virtual actor support we are adding to this environment is intended to provide semi-autonomous actors, with oversight and high-level guiding control by a director/user, and to allow the overall action to be driven by a scenario. We present an overview of the environment into which our virtual actors will be added in Section 3, and discuss the direction of the Virtual Actor research itself in Section 4. We will briefly review related work in Section 2. First, however, we need to place the research in the context of what motivates it. The motivation for our construction of this environment, and the line of research associated with it, is based on a long-term program of providing support, through simulation, for situational training, by which we mean a type of training in which students learn to handle multiple situations or scenarios. In these situations, the student may encounter events ranging from the routine occurrence to the rare emergency. Indeed, the appeal of such training systems is that they could allow the student to experience and develop effective responses for situations they would otherwise have no opportunity to practice until they happened to encounter an actual occurrence. Examples of the type of students for this kind of training would be security forces or emergency response forces. An example of the type of training scenario we would like to support is given in Section 4.2.
Encarnação, L Miguel; Bimber, Oliver
2002-01-01
Collaborative virtual environments for diagnosis and treatment planning are increasingly gaining importance in our global society. Virtual and Augmented Reality approaches promised to provide valuable means for the involved interactive data analysis, but the underlying technologies still create a cumbersome work environment that is inadequate for clinical employment. This paper addresses two of the shortcomings of such technology: Intuitive interaction with multi-dimensional data in immersive and semi-immersive environments as well as stereoscopic multi-user displays combining the advantages of Virtual and Augmented Reality technology.
Evaluation of a low-cost 3D sound system for immersive virtual reality training systems.
Doerr, Kai-Uwe; Rademacher, Holger; Huesgen, Silke; Kubbat, Wolfgang
2007-01-01
Since Head Mounted Displays (HMD), datagloves, tracking systems, and powerful computer graphics resources are nowadays in an affordable price range, the usage of PC-based "Virtual Training Systems" becomes very attractive. However, due to the limited field of view of HMD devices, additional modalities have to be provided to benefit from 3D environments. A 3D sound simulation can improve the capabilities of VR systems dramatically. Unfortunately, realistic 3D sound simulations are expensive and demand a tremendous amount of computational power to calculate reverberation, occlusion, and obstruction effects. To use 3D sound in a PC-based training system as a way to direct and guide trainees to observe specific events in 3D space, a cheaper alternative has to be provided, so that a broader range of applications can take advantage of this modality. To address this issue, we focus in this paper on the evaluation of a low-cost 3D sound simulation that is capable of providing traceable 3D sound events. We describe our experimental system setup using conventional stereo headsets in combination with a tracked HMD device and present our results with regard to precision, speed, and used signal types for localizing simulated sound events in a virtual training environment.
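One low-cost way to make a sound event "traceable" over a conventional stereo headset, in the spirit of the system above, is constant-power amplitude panning driven by the source azimuth relative to the tracked head. The sketch below shows only that panning law; it ignores reverberation, occlusion, and elevation, and the azimuth convention and parameters are illustrative assumptions.

```python
# Sketch: constant-power stereo panning of a mono cue as a cheap stand-in
# for full 3D audio. Azimuth convention and parameters are assumptions.
import numpy as np

def pan_stereo(mono, azimuth_deg):
    """Pan a mono signal to stereo with constant total power.

    azimuth_deg: source direction relative to the listener's head,
                 -90 (full left) .. 0 (front) .. +90 (full right).
    Returns an (N, 2) array of left/right samples.
    """
    az = np.clip(azimuth_deg, -90.0, 90.0)
    theta = (az + 90.0) / 180.0 * (np.pi / 2.0)   # map to 0..pi/2
    gain_left, gain_right = np.cos(theta), np.sin(theta)
    return np.column_stack((gain_left * mono, gain_right * mono))

# Example: a 440 Hz beep placed 45 degrees to the listener's right.
fs = 44100
t = np.arange(int(0.25 * fs)) / fs
beep = 0.2 * np.sin(2 * np.pi * 440 * t)
print(pan_stereo(beep, azimuth_deg=45).shape)  # (11025, 2)
```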
Dunne, James R; McDonald, Claudia L
2010-07-01
Pulse!! The Virtual Clinical Learning Lab at Texas A&M University-Corpus Christi, in collaboration with the United States Navy, has developed a model for research and technological development that they believe is an essential element in the future of military and civilian medical education. The Pulse!! project models a strategy for providing cross-disciplinary expertise and resources to educational, governmental, and business entities challenged with meeting looming health care crises. It includes a three-dimensional virtual learning platform that provides unlimited, repeatable, immersive clinical experiences without risk to patients, and is available anywhere there is a computer. Pulse!! utilizes expertise in the fields of medicine, medical education, computer science, software engineering, physics, computer animation, art, and architecture. Lab scientists collaborate with the commercial virtual-reality simulation industry to produce research-based learning platforms based on cutting-edge computer technology.
Virtual worlds: a new frontier for nurse education?
Green, Janet; Wyllie, Aileen; Jackson, Debra
2014-01-01
Virtual worlds have the potential to offer nursing students social networking and learning opportunities through the use of collaborative and immersive learning. If nursing educators are to stay abreast of contemporary learning opportunities, an exploration of the potential benefits of virtual worlds and their possibilities is needed. Literature was sourced that explored virtual worlds and their use in education, but not nursing education specifically. It is clear that immersive learning has positive benefits for nursing; however, the best way to approach virtual reality in nursing education has yet to be ascertained.
Huang, Xianwei; Naghdy, Fazel; Naghdy, Golshah; Du, Haiping; Todd, Catherine
2018-01-01
Robot-assisted therapy is regarded as an effective and reliable method for the delivery of highly repetitive training that is needed to trigger neuroplasticity following a stroke. However, the lack of fully adaptive assist-as-needed control of the robotic devices and an inadequate immersive virtual environment that can promote active participation during training are obstacles hindering the achievement of better training results with fewer training sessions required. This study thus focuses on these research gaps by combining these 2 key components into a rehabilitation system, with special attention on the rehabilitation of fine hand motion skills. The effectiveness of the proposed system is tested by conducting clinical trials on a chronic stroke patient and verified through clinical evaluation methods by measuring the key kinematic features such as active range of motion (ROM), finger strength, and velocity. By comparing the pretraining and post-training results, the study demonstrates that the proposed method can further enhance the effectiveness of fine hand motion rehabilitation training by improving finger ROM, strength, and coordination. Copyright © 2018 National Stroke Association. Published by Elsevier Inc. All rights reserved.
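The "assist-as-needed" control the abstract identifies as a gap can be sketched as a law that adds force only when the patient's motion falls outside a tolerance band around the target trajectory. The gains, deadband, and torque cap below are illustrative assumptions, not the authors' controller.

```python
# Sketch: a simple assist-as-needed law for one finger joint. Assistance is
# applied only when tracking error exceeds a deadband; gains are assumptions.
import numpy as np

def assist_as_needed(target_angle, actual_angle, deadband=5.0, gain=0.08, max_torque=0.6):
    """Return assistive torque (N*m) for a joint-angle error given in degrees."""
    error = target_angle - actual_angle
    if abs(error) <= deadband:
        return 0.0                    # patient is close enough: no assistance
    excess = abs(error) - deadband
    torque = gain * excess * np.sign(error)
    return float(np.clip(torque, -max_torque, max_torque))

# Example: the patient lags 20 degrees behind the target flexion angle.
print(assist_as_needed(target_angle=60.0, actual_angle=40.0))   # modest assist
print(assist_as_needed(target_angle=60.0, actual_angle=57.0))   # inside deadband: 0.0
```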
The Evolution of Constructivist Learning Environments: Immersion in Distributed, Virtual Worlds.
ERIC Educational Resources Information Center
Dede, Chris
1995-01-01
Discusses the evolution of constructivist learning environments and examines the collaboration of simulated software models, virtual environments, and evolving mental models via immersion in artificial realities. A sidebar gives a realistic example of a student navigating through cyberspace. (JMV)
Immersive Virtual Reality with Applications to Tele-Operation and Training
2016-03-07
to design accurate models for the control of a remote agent by retargeting human gestures (or body part movements) on the control structure of the...which is designed to co-operate with human inhabitants will need to possess, on some levels, a theory of mind [20]. This will enable the system to...University of Houston-Victoria, a designated Hispanic Serving Institution of higher education. The requested equipment and instrumentation will be
Freeman, Daniel; Antley, Angus; Ehlers, Anke; Dunn, Graham; Thompson, Claire; Vorontsova, Natasha; Garety, Philippa; Kuipers, Elizabeth; Glucksman, Edward; Slater, Mel
2014-09-01
Presentation of social situations via immersive virtual reality (VR) has the potential to be an ecologically valid way of assessing psychiatric symptoms. In this study we assess the occurrence of paranoid thinking and of symptoms of posttraumatic stress disorder (PTSD) in response to a single neutral VR social environment as predictors of later psychiatric symptoms assessed by standard methods. One hundred six people entered an immersive VR social environment (a train ride), presented via a head-mounted display, 4 weeks after having attended hospital because of a physical assault. Paranoid thinking about the neutral computer-generated characters and the occurrence of PTSD symptoms in VR were assessed. Reactions in VR were then used to predict the occurrence 6 months later of symptoms of paranoia and PTSD, as assessed by standard interviewer and self-report methods. Responses to VR predicted the severity of paranoia and PTSD symptoms as assessed by standard measures 6 months later. The VR assessments also added predictive value to the baseline interviewer methods, especially for paranoia. Brief exposure to environments presented via virtual reality provides a symptom assessment with predictive ability over many months. VR assessment may be of particular benefit for difficult to assess problems, such as paranoia, that have no gold standard assessment method. In the future, VR environments may be used in the clinic to complement standard self-report and clinical interview methods. PsycINFO Database Record (c) 2014 APA, all rights reserved.
NASA Astrophysics Data System (ADS)
Farkas, Attila J.; Hajnal, Alen; Shiratuddin, Mohd F.; Szatmary, Gabriella
In this paper, we propose a novel approach of using interactive virtual environment technology in Vision Restoration Therapy for vision loss caused by Traumatic Brain Injury. We call the new system the Interactive Visuotactile Virtual Environment, and it holds promise for expanding the scope of already existing rehabilitation techniques. Traditional vision rehabilitation methods are based on passive psychophysical training procedures, and can last up to six months before any modest improvements can be seen in patients. A highly immersive and interactive virtual environment will allow the patient to practice everyday activities such as object identification and object manipulation through the use of 3D motion-sensing handheld devices such as a data glove or the Nintendo Wiimote. Employing both perceptual and action components in the training procedures holds the promise of more efficient sensorimotor rehabilitation. Increased stimulation of visual and sensorimotor areas of the brain should facilitate a comprehensive recovery of visuomotor function by exploiting the plasticity of the central nervous system. Integrated with a motion tracking system and an eye tracking device, the interactive virtual environment allows for the creation and manipulation of a wide variety of stimuli, as well as real-time recording of hand, eye, and body movements and coordination. The goal of the project is to design a cost-effective and efficient vision restoration system.
Design and development of a virtual reality simulator for advanced cardiac life support training.
Vankipuram, Akshay; Khanal, Prabal; Ashby, Aaron; Vankipuram, Mithra; Gupta, Ashish; DrummGurnee, Denise; Josey, Karen; Smith, Marshall
2014-07-01
The use of virtual reality (VR) training tools for medical education could lead to improvements in the skills of clinicians while providing economic incentives for healthcare institutions. The use of VR tools can also mitigate some of the drawbacks currently associated with providing medical training in a traditional clinical environment such as scheduling conflicts and the need for specialized equipment (e.g., high-fidelity manikins). This paper presents the details of the framework and the development methodology associated with a VR-based training simulator for advanced cardiac life support, a time-critical, team-based medical scenario. In addition, we also report the key findings of a usability study conducted to assess the efficacy of various features of this VR simulator through a postuse questionnaire administered to various care providers. The usability questionnaires were completed by two groups that used two different versions of the VR simulator. One version consisted of the VR trainer with all its features; the other was a minified version with certain immersive features disabled. We found an increase in usability scores from the minified group to the full VR group.
ERIC Educational Resources Information Center
Lanier, Jaron
2001-01-01
Describes tele-immersion, a new medium for human interaction enabled by digital technologies. It combines the display and interaction techniques of virtual reality with new vision technologies that transcend the traditional limitations of a camera. Tele-immersion stations observe people as moving sculptures without favoring a single point of view.…
Proffitt, Rachel; Lange, Belinda; Chen, Christina; Winstein, Carolee
2014-01-01
The purpose of this study was to explore the subjective experience of older adults interacting with both virtual and real environments. Thirty healthy older adults engaged with real and virtual tasks of similar motor demands: reaching to a target in standing and stepping stance. Immersive tendencies and absorption scales were administered before the session. Game engagement and experience questionnaires were completed after each task, followed by a semi-structured interview at the end of the testing session. Data were analyzed respectively using paired t-tests and grounded theory methodology. Participants preferred the virtual task over the real task. They also reported an increase in presence and absorption with the virtual task, describing an external focus of attention. Findings will be used to inform future development of appropriate game-based balance training applications that could be embedded in the home or community settings as part of evidence-based fall prevention programs. PMID:24334299
Virtual reality simulation in neurosurgery: technologies and evolution.
Chan, Sonny; Conti, François; Salisbury, Kenneth; Blevins, Nikolas H
2013-01-01
Neurosurgeons are faced with the challenge of learning, planning, and performing increasingly complex surgical procedures in which there is little room for error. With improvements in computational power and advances in visual and haptic display technologies, virtual surgical environments can now offer potential benefits for surgical training, planning, and rehearsal in a safe, simulated setting. This article introduces the various classes of surgical simulators and their respective purposes through a brief survey of representative simulation systems in the context of neurosurgery. Many technical challenges currently limit the application of virtual surgical environments. Although we cannot yet expect a digital patient to be indistinguishable from reality, new developments in computational methods and related technology bring us closer every day. We recognize that the design and implementation of an immersive virtual reality surgical simulator require expert knowledge from many disciplines. This article highlights a selection of recent developments in research areas related to virtual reality simulation, including anatomic modeling, computer graphics and visualization, haptics, and physics simulation, and discusses their implication for the simulation of neurosurgery.
BIM based virtual environment for fire emergency evacuation.
Wang, Bin; Li, Haijiang; Rezgui, Yacine; Bradley, Alex; Ong, Hoang N
2014-01-01
Recent building emergency management research has highlighted the need for the effective utilization of dynamically changing building information. BIM (building information modelling) can play a significant role in this process due to its comprehensive and standardized data format and integrated process. This paper introduces a BIM based virtual environment supported by virtual reality (VR) and a serious game engine to address several key issues for building emergency management, for example, timely two-way information updating and better emergency awareness training. The focus of this paper lies in how to utilize BIM as a comprehensive building information provider to work with virtual reality technologies to build an adaptable immersive serious game environment to provide real-time fire evacuation guidance. The innovation lies in the seamless integration between BIM and a serious game based virtual reality (VR) environment aiming at practical problem solving by leveraging state-of-the-art computing technologies. The system has been tested for its robustness and functionality against the development requirements, and the results showed promising potential to support more effective emergency management.
Alaraj, Ali; Charbel, Fady T; Birk, Daniel; Tobin, Matthew; Luciano, Cristian; Banerjee, Pat P; Rizzi, Silvio; Sorenson, Jeff; Foley, Kevin; Slavin, Konstantin; Roitberg, Ben
2013-01-01
Recent studies have shown that mental script-based rehearsal and simulation-based training improve the transfer of surgical skills in various medical disciplines. Despite significant advances in technology and intraoperative techniques over the last several decades, surgical skills training on neurosurgical operations still carries significant risk of serious morbidity or mortality. Potentially avoidable technical errors are well recognized as contributing to poor surgical outcome. Surgical education is undergoing overwhelming change, as a result of the reduction of work hours and current trends focusing on patient safety and linking reimbursement with clinical outcomes. Thus, there is a need for adjunctive means of neurosurgical training, one of which is the recent advancement in simulation technology. ImmersiveTouch is an augmented reality system that integrates a haptic device and a high-resolution stereoscopic display. This simulation platform uses multiple sensory modalities, re-creating many of the environmental cues experienced during an actual procedure. Modules available include ventriculostomy, bone drilling, percutaneous trigeminal rhizotomy, and simulated spinal modules such as pedicle screw placement, vertebroplasty, and lumbar puncture. We present our experience with the development of such augmented reality neurosurgical modules and the feedback from neurosurgical residents.
Experiencing Soil Science from your office through virtual experiences
NASA Astrophysics Data System (ADS)
Beato, M. Carmen; González-Merino, Ramón; Campillo, M. Carmen; Fernández-Ahumada, Elvira; Ortiz, Leovigilda; Taguas, Encarnación V.; Guerrero, José Emilio
2017-04-01
Currently, numerous tools based on the new information and communication technologies offer a wide range of possibilities for the implementation of interactive methodologies in Education and Science. In particular, virtual reality and immersive worlds - artificially generated computer environments where users interact through a figurative individual that represents them in that environment (their "avatar") - have been identified as the technology that will change the way we live, particularly in educational terms, product development and entertainment areas (Schmorrow, 2009). Gisbert-Cervera et al. (2011) consider that 3D worlds in education, among others, provide a unique training and knowledge-exchange environment which allows goal reflection to support activities and achieve learning outcomes. In Soil Sciences, the experimental component is essential to acquire the necessary knowledge to understand the biogeochemical processes taking place and their interactions with time, climate, topography and the living organisms present. In this work, an immersive virtual environment which reproduces a series of soil pits has been developed to evaluate and differentiate soil characteristics such as texture, structure, consistency, color and other physical-chemical and biological properties for educational purposes. Bibliographical material such as pictures, books, and papers was collected in order to classify the information needed and to build the soil profiles into the virtual environment. The virtual recreation was built with Unreal Engine 4 (UE4; https://www.unrealengine.com/unreal-engine-4). This engine was chosen because it provides two toolsets for programmers that can be used in tandem to accelerate development workflows. In addition, Unreal Engine 4 technology powers hundreds of games as well as real-time 3D films, training simulations, and visualizations, and it produces very realistic graphics. For the evaluation of its impact and its usefulness in teaching, a series of surveys will be presented to undergraduate students and teachers. REFERENCES: Gisbert-Cervera M., Esteve-Gonzalez V., Camacho-Marti M.M. (2011). Delve into the Deep: Learning Potential in Metaverses and 3D Worlds. eLearning Papers (25). ISSN 1887-1542. Schmorrow D.D. (2009). Why virtual? Theoretical Issues in Ergonomics Science 10(3): 279-282.
Using smartphone technology to deliver a virtual pedestrian environment: usability and validation.
Schwebel, David C; Severson, Joan; He, Yefei
2017-09-01
Various programs effectively teach children to cross streets more safely, but all are labor- and cost-intensive. Recent developments in mobile phone technology offer opportunity to deliver virtual reality pedestrian environments to mobile smartphone platforms. Such an environment may offer a cost- and labor-effective strategy to teach children to cross streets safely. This study evaluated usability, feasibility, and validity of a smartphone-based virtual pedestrian environment. A total of 68 adults completed 12 virtual crossings within each of two virtual pedestrian environments, one delivered by smartphone and the other a semi-immersive kiosk virtual environment. Participants completed self-report measures of perceived realism and simulator sickness experienced in each virtual environment, plus self-reported demographic and personality characteristics. All participants followed system instructions and used the smartphone-based virtual environment without difficulty. No significant simulator sickness was reported or observed. Users rated the smartphone virtual environment as highly realistic. Convergent validity was detected, with many aspects of pedestrian behavior in the smartphone-based virtual environment matching behavior in the kiosk virtual environment. Anticipated correlations between personality and kiosk virtual reality pedestrian behavior emerged for the smartphone-based system. A smartphone-based virtual environment can be usable and valid. Future research should develop and evaluate such a training system.
Validation of virtual reality as a tool to understand and prevent child pedestrian injury.
Schwebel, David C; Gaines, Joanna; Severson, Joan
2008-07-01
In recent years, virtual reality has emerged as an innovative tool for health-related education and training. Among the many benefits of virtual reality is the opportunity for novice users to engage unsupervised in a safe environment when the real environment might be dangerous. Virtual environments are only useful for health-related research, however, if behavior in the virtual world validly matches behavior in the real world. This study was designed to test the validity of an immersive, interactive virtual pedestrian environment. A sample of 102 children and 74 adults was recruited to complete simulated road-crossings in both the virtual environment and the identical real environment. In both the child and adult samples, construct validity was demonstrated via significant correlations between behavior in the virtual and real worlds. Results also indicate construct validity through developmental differences in behavior; convergent validity by showing correlations between parent-reported child temperament and behavior in the virtual world; internal reliability of various measures of pedestrian safety in the virtual world; and face validity, as measured by users' self-reported perception of realism in the virtual world. We discuss issues of generalizability to other virtual environments, and the implications for application of virtual reality to understanding and preventing pediatric pedestrian injuries.
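The construct-validity check described above, correlating the same pedestrian measure across the virtual and real environments, reduces to a correlation per measure. The sketch below shows that computation on synthetic crossing-gap data; the measure name and values are illustrative, not the study's.

```python
# Sketch: convergent validity as a correlation between the same pedestrian
# measure collected in the virtual and the real environment. Data are synthetic.
import numpy as np
from scipy.stats import pearsonr

rng = np.random.default_rng(7)
n_children = 40
# Hypothetical measure: mean accepted traffic gap (seconds) per participant.
gap_real = rng.normal(loc=4.5, scale=1.0, size=n_children)
gap_virtual = 0.8 * gap_real + rng.normal(loc=1.0, scale=0.6, size=n_children)

r, p = pearsonr(gap_virtual, gap_real)
print(f"r = {r:.2f}, p = {p:.4f}")  # a significant positive r supports validity
```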
The effects of immersiveness on physiology.
Wiederhold, B K; Davis, R; Wiederhold, M D
1998-01-01
The effects of varying levels of immersion in virtual reality environments on participants' heart rate, respiration rate, peripheral skin temperature, and skin resistance levels were examined. Subjective reports of presence were also noted. Participants were presented with a virtual environment of an airplane flight both as seen from a two-dimensional computer screen and as seen from within a head-mounted display. Subjects were randomly assigned to different orders of condition presentation, but all subjects received both conditions. Differences between the non-phobics' physiological responses and the phobics' responses when placed in a virtual environment related to the phobia were noted. Also noted were changes in physiology based on degree of immersion.
Vora, Jeenal; Nair, Santosh; Gramopadhye, Anand K; Duchowski, Andrew T; Melloy, Brian J; Kanki, Barbara
2002-11-01
The aircraft maintenance industry is a complex system consisting of several interrelated human and machine components. Recognizing this, the Federal Aviation Administration (FAA) has pursued human factors-related research. In the maintenance arena the research has focused on the aircraft inspection process and the aircraft inspector. Training has been identified as the primary intervention strategy to improve the quality and reliability of aircraft inspection. If training is to be successful, it is critical that we provide aircraft inspectors with appropriate training tools and environments. In response to this need, the paper outlines the development of a virtual reality (VR) system for aircraft inspection training. VR has generated much excitement but little formal proof that it is useful. However, since VR interfaces are difficult and expensive to build, the computer graphics community needs to be able to predict which applications will benefit from VR. To address this important issue, this research measured the degree of immersion and presence felt by subjects in a virtual environment simulator. Specifically, it conducted two controlled studies using the VR system developed for the visual inspection task of an aft-cargo bay at the VR Lab of Clemson University. Beyond assembling the visual inspection virtual environment, a significant goal of this project was to explore subjective presence as it affects task performance. The results of this study indicated that the system scored high on the issues related to the degree of presence felt by the subjects. As a next logical step, this study then compared VR to an existing PC-based aircraft inspection simulator. The results showed that the VR system was better than and preferred over the PC-based training tool.
Multisensory Integration in the Virtual Hand Illusion with Active Movement
Satoh, Satoru; Hachimura, Kozaburo
2016-01-01
Improving the sense of immersion is one of the core issues in virtual reality. Perceptual illusions of ownership can be perceived over a virtual body in a multisensory virtual reality environment. Rubber Hand and Virtual Hand Illusions showed that body ownership can be manipulated by applying suitable visual and tactile stimulation. In this study, we investigate the effects of multisensory integration in the Virtual Hand Illusion with active movement. A virtual xylophone playing system which can interactively provide synchronous visual, tactile, and auditory stimulation was constructed. We conducted two experiments regarding different movement conditions and different sensory stimulations. Our results demonstrate that multisensory integration with free active movement can improve the sense of immersion in virtual reality. PMID:27847822
Chau, Brian; Phelan, Ivan; Ta, Phillip; Humbert, Sarah; Hata, Justin; Tran, Duc
2017-01-01
Objective: Phantom limb pain is a condition frequently experienced after amputation. One treatment for phantom limb pain is traditional mirror therapy, yet some patients do not respond to this intervention, and immersive virtual reality mirror therapy offers some potential advantages. We report the case of a patient with severe phantom limb pain following an upper limb amputation and successful treatment with therapy in a custom virtual reality environment. Methods: An interactive 3-D kitchen environment was developed based on the principles of mirror therapy to allow for control of virtual hands while wearing a motion-tracked, head-mounted virtual reality display. The patient used myoelectric control of a virtual hand as well as motion-tracking control in this setting for five therapy sessions. Pain scale measurements and subjective feedback were elicited at each session. Results: Analysis of the measured pain scales showed statistically significant decreases per session [Visual Analog Scale, Short Form McGill Pain Questionnaire, and Wong-Baker FACES pain scores decreased by 55 percent (p=0.0143), 60 percent (p=0.023), and 90 percent (p=0.0024), respectively]. Significant subjective pain relief persisting between sessions was also reported, as well as marked immersion within the virtual environments. At follow-up at six weeks, the patient noted a continued decrease in phantom limb pain symptoms. Conclusions: Currently available immersive virtual reality technology with myoelectric and motion-tracking control may represent a possible therapy option for treatment-resistant phantom limb pain.
High-immersion three-dimensional display of the numerical computer model
NASA Astrophysics Data System (ADS)
Xing, Shujun; Yu, Xunbo; Zhao, Tianqi; Cai, Yuanfa; Chen, Duo; Chen, Zhidong; Sang, Xinzhu
2013-08-01
High-immersion three-dimensional (3D) displays are valuable tools for many applications, such as architectural design of buildings and houses, industrial design, aeronautics, scientific research, entertainment, media advertisement, military uses and so on. However, most technologies provide a 3D display in front of screens that are parallel to the walls, and the sense of immersion is decreased. To obtain a correct multi-view stereo ground image, the cameras' photosensitive surfaces should be parallel to the common focus plane, and the cameras' optical axes should be offset toward the center of the common focus plane in both the vertical and horizontal directions. It is very common to use virtual cameras, which are ideal pinhole cameras, to display a 3D model in a computer system. We can use virtual cameras to simulate the shooting method of multi-view ground-based stereo images. Here, two virtual shooting methods for ground-based high-immersion 3D display are presented. The position of the virtual camera is determined by the observer's eye position in the real world. When the observer stands within the circumcircle of the 3D ground display, offset perspective projection virtual cameras are used. If the observer stands outside the circumcircle of the 3D ground display, offset perspective projection virtual cameras and orthogonal projection virtual cameras are adopted. In this paper, we mainly discuss the parameter settings of the virtual cameras. The near clip plane setting is the main point in the first method, while the rotation angle of the virtual cameras is the main point in the second method. To validate the results, we use D3D and OpenGL to render scenes from different viewpoints and generate a stereoscopic image. A realistic visualization system for 3D models is constructed and demonstrated for viewing horizontally, which provides high-immersion 3D visualization. The displayed 3D scenes are compared with real objects in the real world.
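The abstract does not give the projection formulas, so the following is only a minimal sketch, assuming a standard off-axis (offset) perspective frustum driven by the tracked eye position relative to the display plane; the function name, coordinate convention, and example numbers are illustrative, not taken from the paper.

```python
import numpy as np

def offset_perspective(eye, screen_w, screen_h, near, far):
    """Asymmetric (off-axis) frustum for a viewer at `eye`, given in the
    coordinate frame of the display plane (origin at screen centre,
    x right, y "up" in the screen plane, z toward the viewer).
    Returns the standard glFrustum-style 4x4 projection matrix."""
    ex, ey, ez = eye                        # ez > 0: eye in front of the screen
    # Project the screen borders onto the near clip plane.
    left   = (-screen_w / 2.0 - ex) * near / ez
    right  = ( screen_w / 2.0 - ex) * near / ez
    bottom = (-screen_h / 2.0 - ey) * near / ez
    top    = ( screen_h / 2.0 - ey) * near / ez
    m = np.zeros((4, 4))
    m[0, 0] = 2.0 * near / (right - left)
    m[1, 1] = 2.0 * near / (top - bottom)
    m[0, 2] = (right + left) / (right - left)
    m[1, 2] = (top + bottom) / (top - bottom)
    m[2, 2] = -(far + near) / (far - near)
    m[2, 3] = -2.0 * far * near / (far - near)
    m[3, 2] = -1.0
    return m

# Example: viewer 0.4 m off-centre and 1.6 m above a 2 m x 2 m floor screen.
P = offset_perspective(eye=(0.4, 0.0, 1.6), screen_w=2.0, screen_h=2.0,
                       near=0.1, far=50.0)
```

In such a setup the accompanying view matrix would also translate the scene by the negated eye position; the near-plane extents computed above are presumably the kind of "near clip plane parameter setting" the first method refers to.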
Bugnariu, Nicoleta L.
2016-01-01
Virtual environments (VEs) may be useful for delivering social skills interventions to individuals with autism spectrum disorder (ASD). Immersive VEs provide opportunities for individuals with ASD to learn and practice skills in a controlled replicable setting. However, not all VEs are delivered using the same technology, and the level of immersion differs across settings. We group studies into low-, moderate-, and high-immersion categories by examining five aspects of immersion. In doing so, we draw conclusions regarding the influence of this technical manipulation on the efficacy of VEs as a tool for assessing and teaching social skills. We also highlight ways in which future studies can advance our understanding of how manipulating aspects of immersion may impact intervention success. PMID:26919157
NASA's Hybrid Reality Lab: One Giant Leap for Full Dive
NASA Technical Reports Server (NTRS)
Delgado, Francisco J.; Noyes, Matthew
2017-01-01
This presentation demonstrates how NASA is using consumer VR headsets, game engine technology and NVIDIA's GPUs to create highly immersive future training systems augmented with extremely realistic haptic feedback, sound, and additional sensory information, and how these can be used to improve the engineering workflow. Included in this presentation are an environment simulation of the ISS, where users can interact with virtual objects, handrails, and tracked physical objects while inside VR; the integration of consumer VR headsets with the Active Response Gravity Offload System; and a space habitat architectural evaluation tool. Attendees will learn how the best elements of real and virtual worlds can be combined into a hybrid reality environment with tangible engineering and scientific applications.
The development, assessment and validation of virtual reality for human anatomy instruction
NASA Technical Reports Server (NTRS)
Marshall, Karen Benn
1996-01-01
This research project seeks to meet the objective of science training by developing, assessing, validating and utilizing VR as a human anatomy training medium. Current anatomy instruction is primarily in the form of lectures and usage of textbooks. In ideal situations, anatomic models, computer-based instruction, and cadaver dissection are utilized to augment traditional methods of instruction. At many institutions, lack of financial resources limits anatomy instruction to textbooks and lectures. However, human anatomy is three-dimensional, unlike the one-dimensional depiction found in textbooks and the two-dimensional depiction found on the computer. Virtual reality allows one to step through the computer screen into a 3-D artificial world. The primary objective of this project is to produce a virtual reality application of the abdominopelvic region of a human cadaver that can be taken back to the classroom. The hypothesis is that an immersive learning environment affords quicker anatomic recognition and orientation and a greater level of retention in human anatomy instruction. The goal is to augment not replace traditional modes of instruction.
Heart rate variability (HRV) during virtual reality immersion
Malińska, Marzena; Zużewicz, Krystyna; Bugajska, Joanna; Grabowski, Andrzej
2015-01-01
The goal of the study was to assess the effects of hour-long sessions of handling a virtual environment (sVR) and of watching a stereoscopic 3D movie on the mechanisms of autonomic heart rate (HR) regulation among subjects who were not predisposed to motion sickness. In order to exclude predispositions to motion sickness, all the participants (n=19) underwent a Coriolis test. During the exposure to 3D and sVR, the ECG signal was continuously recorded using the Holter method. For the twelve consecutive 5-min epochs of the ECG signal, the analysis of heart rate variability (HRV) in the time and frequency domains was conducted. Thirty minutes after the beginning of the training in handling the virtual workstation, a significant increase in LF spectral power was noted. The values of the sympathovagal LF/HF index during sVR indicated a significant increase in sympathetic predominance in four time intervals, namely between the 5th and the 10th minute, between the 15th and the 20th minute, between the 35th and the 40th minute, and between the 55th and the 60th minute of exposure. PMID:26327262
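The abstract does not specify the exact HRV processing pipeline, so the sketch below only illustrates, under common assumptions, how an LF/HF index can be computed for one 5-min epoch of RR intervals: resample the irregular RR series to an evenly spaced tachogram, estimate the power spectrum with Welch's method, and integrate the standard LF (0.04–0.15 Hz) and HF (0.15–0.40 Hz) bands. Function names and parameter values are illustrative.

```python
import numpy as np
from scipy.signal import welch
from scipy.interpolate import interp1d

def lf_hf_ratio(rr_ms, fs=4.0):
    """LF/HF sympathovagal index for one epoch of RR intervals (milliseconds)."""
    rr = np.asarray(rr_ms, dtype=float)
    t = np.cumsum(rr) / 1000.0                      # beat times in seconds
    t -= t[0]
    even_t = np.arange(0.0, t[-1], 1.0 / fs)        # evenly spaced time grid
    tachogram = interp1d(t, rr, kind='cubic')(even_t)
    tachogram -= tachogram.mean()                   # remove the DC component
    f, psd = welch(tachogram, fs=fs, nperseg=min(256, len(tachogram)))
    lf_band = (f >= 0.04) & (f < 0.15)
    hf_band = (f >= 0.15) & (f < 0.40)
    lf = np.trapz(psd[lf_band], f[lf_band])         # LF spectral power
    hf = np.trapz(psd[hf_band], f[hf_band])         # HF spectral power
    return lf, hf, lf / hf
```

A 5-min epoch at typical resting heart rates contains roughly 300–400 beats, which is comfortably enough for the interpolation and Welch estimate above.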
The use of high fidelity CAD models as the basis for training on complex systems
NASA Technical Reports Server (NTRS)
Miller, Kellie; Tanner, Steve
1993-01-01
During the design phases of large and complex systems such as NASA's Space Station Freedom (SSF), there are few, if any, physical prototypes built. This is often due to their expense and the realization that the design is likely to change. This poses a problem for training, maintainability, and operations groups who are tasked to lay the foundation of plans for using these systems. The Virtual Reality and Visualization Laboratory at the Boeing Advanced Computing Group's Huntsville facility is supporting the use of high fidelity, detailed design models that are generated during the initial design phases, for use in training, maintainability and operations exercises. This capability was used in its non-immersive form to great effect at the SSF Critical Design Review (CDR) during February 1993. Allowing the user to move about within a CAD design supports many efforts, including training and scenario study. We will demonstrate via a video of the Maintainability SSF CDR how this type of approach can be used and why it is so effective in conveying large amounts of information quickly and concisely. We will also demonstrate why high fidelity models are so important for this type of training system and how its immersive aspects may be exploited as well.
A Study on Immersion and Presence of a Portable Hand Haptic System for Immersive Virtual Reality
Kim, Mingyu; Jeon, Changyu; Kim, Jinmo
2017-01-01
This paper proposes a portable hand haptic system using Leap Motion as a haptic interface that can be used in various virtual reality (VR) applications. The proposed hand haptic system was designed around an Arduino-based sensor architecture to enable a variety of tactile sensations at low cost, and is also equipped with a portable wristband. As a haptic system designed for tactile feedback, the proposed system first identifies the left and right hands and then sends tactile cues (vibration and heat) to each fingertip (thumb and index finger). It is incorporated into a wearable band-type system, making its use easy and convenient. Next, hand motion is accurately captured using the sensor of the hand tracking system and is used for virtual object control, thus achieving interaction that enhances immersion. A VR application was designed with the purpose of testing the immersion and presence aspects of the proposed system. Lastly, technical and statistical tests were carried out to assess whether the proposed haptic system can provide a new immersive presence to users. According to the results of the presence questionnaire and the simulator sickness questionnaire, we confirmed that the proposed hand haptic system, in comparison to the existing interaction that uses only the hand tracking system, provided greater presence and a more immersive environment in the virtual reality. PMID:28513545
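The paper's actual firmware and communication protocol are not described in the abstract, so the following is only a hypothetical host-side sketch of the kind of mapping such a system needs: collision events from the hand-tracking layer are translated into per-fingertip vibration and heat commands and pushed to an Arduino wristband over a serial link. The one-byte command layout, port name, and depth-to-intensity mapping are all assumptions made for illustration.

```python
import serial  # pyserial; the wristband is assumed to enumerate as a serial port

# Hypothetical 1-byte command layout (not from the paper):
#   bit 7   : hand   (0 = left, 1 = right)
#   bit 6   : finger (0 = thumb, 1 = index)
#   bit 5   : channel (0 = vibration, 1 = heat)
#   bits 0-4: intensity, 0-31
def encode_cmd(hand, finger, channel, intensity):
    return bytes([(hand << 7) | (finger << 6) | (channel << 5) | (intensity & 0x1F)])

def send_contact_feedback(port, contacts):
    """contacts: list of (hand, finger, penetration_depth_mm) tuples produced by
    the hand-tracking/collision step of the VR application (placeholder input)."""
    for hand, finger, depth in contacts:
        intensity = min(31, int(depth * 4))                   # crude depth-to-intensity map
        port.write(encode_cmd(hand, finger, 0, intensity))    # vibration channel
        port.write(encode_cmd(hand, finger, 1, intensity // 2))  # gentler heat channel

if __name__ == "__main__":
    # Assumed device path and baud rate for the wristband's Arduino.
    with serial.Serial("/dev/ttyUSB0", 115200, timeout=0.01) as band:
        # e.g. right index fingertip touching a virtual object 3 mm deep
        send_contact_feedback(band, [(1, 1, 3.0)])
```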
New Desktop Virtual Reality Technology in Technical Education
ERIC Educational Resources Information Center
Ausburn, Lynna J.; Ausburn, Floyd B.
2008-01-01
Virtual reality (VR) that immerses users in a 3D environment through use of headwear, body suits, and data gloves has demonstrated effectiveness in technical and professional education. Immersive VR is highly engaging and appealing to technically skilled young Net Generation learners. However, technical difficulty and very high costs have kept…
Situating Pedagogies, Positions and Practices in Immersive Virtual Worlds
ERIC Educational Resources Information Center
Savin-Baden, Maggi; Gourlay, Lesley; Tombs, Cathy; Steils, Nicole; Tombs, Gemma; Mawer, Matt
2010-01-01
Background: The literature on immersive virtual worlds and e-learning to date largely indicates that technology has led the pedagogy. Although rationales for implementing e-learning have included flexibility of provision and supporting diversity, none of these recommendations has helped to provide strong pedagogical location. Furthermore, there is…
A Virtual World for Collaboration: The AETZone
ERIC Educational Resources Information Center
Cheney, Amelia W.; Sanders, Robert L.; Matzen, Nita J.; Bronack, Stephen C.; Riedl, Richard E.; Tashner, John H.
2009-01-01
Participation in learning communities, and the construction of knowledge in communities of practice, are important considerations in the use of 3D immersive worlds. This article describes the creation of this type of learning environment in AETZone, an immersive virtual environment in use within graduate programs at Appalachian State University…
Building a Collaborative Online Literary Experience
ERIC Educational Resources Information Center
Essid, Joe; Wilde, Fran
2011-01-01
Effective virtual simulations can embed participants in imaginary worlds. Researchers working in virtual worlds and gaming often refer to "immersion," a state in which a participant or player loses track of time and becomes one with the simulation. Immersive settings have been shown to deepen learning. Ken Hudson's work with students…
Workshop Report on Virtual Worlds and Immersive Environments
NASA Technical Reports Server (NTRS)
Langhoff, Stephanie R.; Cowan-Sharp, Jessy; Dodson, Karen E.; Damer, Bruce; Ketner, Bob
2009-01-01
The workshop revolved around three framing ideas or scenarios about the evolution of virtual environments: 1. Remote exploration: The ability to create high fidelity environments rendered from external data or models such that exploration, design and analysis that is truly interoperable with the physical world can take place within them. 2. We all get to go: The ability to engage anyone in being a part of or contributing to an experience (such as a space mission), no matter their training or location. It is the creation of a new paradigm for education, outreach, and the conduct of science in society that is truly participatory. 3. Become the data: A vision of a future where boundaries between the physical and the virtual have ceased to be meaningful. What would this future look like? Is this plausible? Is it desirable? Why and why not?
Immersive Virtual Worlds in University-Level Human Geography Courses
ERIC Educational Resources Information Center
Dittmer, Jason
2010-01-01
This paper addresses the potential for increased deployment of immersive virtual worlds in higher geographic education. An account of current practice regarding popular culture in the geography classroom is offered, focusing on the objectification of popular culture rather than its constitutive role vis-a-vis place. Current e-learning practice is…
ERIC Educational Resources Information Center
Orman, Evelyn K.; Price, Harry E.; Russell, Christine R.
2017-01-01
Acquiring nonverbal skills necessary to appropriately communicate and educate members of performing ensembles is essential for wind band conductors. Virtual reality learning environments (VRLEs) provide a unique setting for developing these proficiencies. For this feasibility study, we used an augmented immersive VRLE to enhance eye contact, torso…
The Utility of Using Immersive Virtual Environments for the Assessment of Science Inquiry Learning
ERIC Educational Resources Information Center
Code, Jillianne; Clarke-Midura, Jody; Zap, Nick; Dede, Chris
2013-01-01
Determining the effectiveness of any educational technology depends upon teachers' and learners' perception of the functional utility of that tool for teaching, learning, and assessment. The Virtual Performance project at Harvard University is developing and studying the feasibility of using immersive technology to develop performance…
The Design, Development and Evaluation of a Virtual Reality Based Learning Environment
ERIC Educational Resources Information Center
Chen, Chwen Jen
2006-01-01
Many researchers and instructional designers increasingly recognise the benefits of utilising three dimensional virtual reality (VR) technology in instruction. In general, there are two types of VR system, the immersive system and the non-immersive system. This article focuses on the latter system that merely uses the conventional personal…
Measuring Flow Experience in an Immersive Virtual Environment for Collaborative Learning
ERIC Educational Resources Information Center
van Schaik, P.; Martin, S.; Vallance, M.
2012-01-01
In contexts other than immersive virtual environments, theoretical and empirical work has identified flow experience as a major factor in learning and human-computer interaction. Flow is defined as a "holistic sensation that people feel when they act with total involvement". We applied the concept of flow to modeling the experience of…
Grewe, Philip; Kohsik, Agnes; Flentge, David; Dyck, Eugen; Botsch, Mario; Winter, York; Markowitsch, Hans J; Bien, Christian G; Piefke, Martina
2013-04-23
To increase the ecological validity of neuropsychological instruments the use of virtual reality (VR) applications can be considered as an effective tool in the field of cognitive neurorehabilitation. Despite the growing use of VR programs, only few studies have considered the application of everyday activities like shopping or travelling in VR training devices. We developed a novel 360°-VR supermarket, which is displayed on a circular arrangement of 8 touch-screens--the "OctaVis". In this setting, healthy human adults had to memorize an auditorily presented shopping list (list A) and subsequently buy all remembered products of this list in the VR supermarket. This procedure was accomplished on three consecutive days. On day four, a new shopping list (list B) was introduced and participants had to memorize and buy only products of this list. On day five, participants had to buy all remembered items of list A again, but without new presentation of list A. Additionally, we obtained measures of participants' presence, immersion and figural-spatial memory abilities. We also tested a sample of patients with focal epilepsy with an extended version of our shopping task, which consisted of eight days of training. We observed a comprehensive and stable effect of learning for the number of correct products, the required time for shopping, and the length of movement trajectories in the VR supermarket in the course of the training program. Task performance was significantly correlated with participants' figural-spatial memory abilities and subjective level of immersion into the VR. Learning effects in our paradigm extend beyond mere verbal learning of the shopping list as the data show evidence for multi-layered learning (at least visual-spatial, strategic, and verbal) on concordant measures. Importantly, learning also correlated with measures of figural-spatial memory and the degree of immersion into the VR. We propose that cognitive training with the VR supermarket program in the OctaVis will be efficient for the assessment and training of real-life cognitive abilities in healthy subjects and patients with epilepsy. It is most likely that our findings will also apply for patients with cognitive disabilities resulting from other neurological and psychiatric syndromes.
Highly immersive virtual reality laparoscopy simulation: development and future aspects.
Huber, Tobias; Wunderling, Tom; Paschold, Markus; Lang, Hauke; Kneist, Werner; Hansen, Christian
2018-02-01
Virtual reality (VR) applications with head-mounted displays (HMDs) have had an impact on information and multimedia technologies. The current work aimed to describe the process of developing a highly immersive VR simulation for laparoscopic surgery. We combined a VR laparoscopy simulator (LapSim) and a VR-HMD to create a user-friendly VR simulation scenario. Continuous clinical feedback was an essential aspect of the development process. We created an artificial VR (AVR) scenario by integrating the simulator video output with VR game components of figures and equipment in an operating room. We also created a highly immersive VR surrounding (IVR) by integrating the simulator video output with a 360° video of a standard laparoscopy scenario in the department's operating room. Clinical feedback led to optimization of the visualization, synchronization, and resolution of the virtual operating rooms (in both the IVR and the AVR). Preliminary testing results revealed that individuals experienced a high degree of exhilaration and presence, with rare events of motion sickness. The technical performance showed no significant difference compared to that achieved with the standard LapSim. Our results provided a proof of concept for the technical feasibility of a custom highly immersive VR-HMD setup. Future technical research is needed to improve the visualization, immersion, and capability of interacting within the virtual scenario.
vTrain: a novel curriculum for patient surge training in a multi-user virtual environment (MUVE).
Greci, Laura S; Ramloll, Rameshsharma; Hurst, Samantha; Garman, Karen; Beedasy, Jaishree; Pieper, Eric B; Huang, Ricky; Higginbotham, Erin; Agha, Zia
2013-06-01
During an influenza pandemic, emergency departments will be overwhelmed with a large influx of patients seeking care. Although all hospitals should have a written plan for dealing with this surge of health care utilization, most hospitals struggle with ways to educate the staff and practice for potentially catastrophic events. Hypothesis/Problem: To better prepare hospital staff for a patient surge, a novel educational curriculum was developed utilizing an emergency department for a patient surge functional drill. A multidisciplinary team of medical educators, evaluators, emergency preparedness experts, and technology specialists developed a curriculum to: (1) train novice users to function in their job class in a multi-user virtual environment (MUVE); (2) obtain appropriate pre-drill disaster preparedness training; (3) perform functional team exercises in a MUVE; and (4) reflect on their performance after the drill. A total of 14 students participated in one of two iterations of the pilot training program; seven nurses completed the emergency department triage course, and seven hospital administrators completed the Command Post (CP) course. All participants reported positive experiences in written course evaluations and structured verbal debriefings, and self-reported an increase in disaster preparedness knowledge. Students also reported improved team communication, planning, team decision making, and the ability to visualize and reflect on their performance. Data from this pilot program suggest that the immersive, virtual teaching method is well suited to team-based, reflective practice and learning of disaster management skills.
Emerging Utility of Virtual Reality as a Multidisciplinary Tool in Clinical Medicine.
Pourmand, Ali; Davis, Steven; Lee, Danny; Barber, Scott; Sikka, Neal
2017-10-01
Among the more recent products borne of the evolution of digital technology, virtual reality (VR) is gaining a foothold in clinical medicine as an adjunct to traditional therapies. Early studies suggest a growing role for VR applications in pain management, clinical skills training, cognitive assessment and cognitive therapy, and physical rehabilitation. To complete a review of the literature, we searched PubMed and MEDLINE databases with the following search terms: "virtual reality," "procedural medicine," "oncology," "physical therapy," and "burn." We further limited our search to publications in the English language. Boolean operators were used to combine search terms. The included search terms yielded 97 potential articles, of which 45 were identified as meeting study criteria, and are included in this review. These articles provide data, which strongly support the hypothesis that VR simulations can enhance pain management (by reducing patient perception of pain and anxiety), can augment clinical training curricula and physical rehabilitation protocols (through immersive audiovisual environments), and can improve clinical assessment of cognitive function (through improved ecological validity). Through computer-generated, life-like digital landscapes, VR stands to change the current approach to pain management, medical training, neurocognitive diagnosis, and physical rehabilitation. Additional studies are needed to help define best practices in VR utilization, and to explore new therapeutic uses for VR in clinical practice.
Volmer, Joe; Burkert, Malte; Krumm, Heiko; Abodahab, Abdurrahman; Dinklage, Patrick; Feltmann, Marius; Kröger, Chris; Panta, Pernes; Schäfer, Felix; Scheidt, David; Sellung, Marcel; Singerhoff, Hauke; Steingrefer, Christofer; Schmidt, Thomas; Hoffmann, Jan-Dirk; Willemsen, Detlev; Reiss, Nils
2017-01-01
Although regular physical activities reduce mortality and increase quality of life many cardiac patients discontinue training due to lack of motivation, lack of time or having health concerns because of a too high training intensity. Therefore, we developed an exergaming based system to enhance long-term motivation in the context of rehabilitation training. We combined different hardware components such as vital sensors, a virtual reality headset, a motion detecting camera, a bicycle ergometer and a motion platform to create an immersive and fun experience for the training user without having to worry about any negative health impact. Our evaluation shows that the system is well accepted by the users and is capable of tackling the aforementioned reasons for an inactive lifestyle. The system is designed to be easily extensible, safe to use and enables professionals to adjust and to telemonitor the training at any time.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Do, Phuong T.; Moreland, John R.; Delgado, Catherine
Our research provides an innovative solution for optimizing learning effectiveness and improving postsecondary education through the development of virtual simulators that can be easily used and integrated into existing wind energy curriculum. Two 3D virtual simulators are developed in our laboratory for use in an immersive 3D virtual reality (VR) system or for 3D display on a 2D screen. Our goal is to apply these prototypical simulators to train postsecondary students and professionals in wind energy education; and to offer experiential learning opportunities in 3D modeling, simulation, and visualization. The issue of transferring learned concepts to practical applications is a widespread problem in postsecondary education. Related to this issue is a critical demand to educate and train a generation of professionals for the wind energy industry. With initiatives such as the U.S. Department of Energy's “20% Wind Energy by 2030” outlining an exponential increase of wind energy capacity over the coming years, revolutionary educational reform is needed to meet the demand for education in the field of wind energy. These developments and implementation of Virtual Simulators and accompanying curriculum will propel national reforms, meeting the needs of the wind energy industrial movement and addressing broader educational issues that affect a number of disciplines.
Game design in virtual reality systems for stroke rehabilitation.
Goude, Daniel; Björk, Staffan; Rydmark, Martin
2007-01-01
We propose a model for the structured design of games for post-stroke rehabilitation. The model is based on experiences with game development for a haptic and stereo vision immersive workbench intended for daily use in stroke patients' homes. A central component of this rehabilitation system is a library of games that are simultaneously entertaining for the patient and beneficial for rehabilitation [1], and where each game is designed for specific training tasks through the use of the model.
Interactive Immersive Virtualmuseum: Digital Documentation for Virtual Interaction
NASA Astrophysics Data System (ADS)
Clini, P.; Ruggeri, L.; Angeloni, R.; Sasso, M.
2018-05-01
Thanks to their playful and educational approach, Virtual Museum systems are very effective for the communication of Cultural Heritage. Among the latest technologies, Immersive Virtual Reality is probably the most appealing and potentially effective to serve this purpose; nevertheless, due to poor user-system interaction, caused by the incomplete maturity of the specific technology for museum applications, it is still quite uncommon to find immersive installations in museums. This paper explores the possibilities offered by this technology and presents a workflow that, starting from digital documentation, makes possible an interaction with archaeological finds or any other cultural heritage inside different kinds of immersive virtual reality spaces. Two different case studies are presented: the National Archaeological Museum of Marche in Ancona and the 3D reconstruction of the Roman Forum of Fanum Fortunae. The two approaches differ not only conceptually but also in content: while the Archaeological Museum is represented in the application simply using spherical panoramas to give the perception of the third dimension, the Roman Forum is a 3D model that allows visitors to move in the virtual space as in the real one. In both cases, the acquisition phase of the artefacts is central; artefacts are digitized with the photogrammetric technique Structure from Motion and then integrated inside the immersive virtual space using a PC with an HTC Vive system that allows the user to interact with the 3D models, turning the manipulation of objects into a fun and exciting experience. The challenge, taking advantage of the latest opportunities made available by photogrammetry and ICT, is to enrich visitors' experience in the Real Museum by making possible interaction with perishable, damaged or lost objects and public access to inaccessible or no longer existing places, promoting in this way the preservation of fragile sites.
Virtual reality simulation for the optimization of endovascular procedures: current perspectives.
Rudarakanchana, Nung; Van Herzeele, Isabelle; Desender, Liesbeth; Cheshire, Nicholas J W
2015-01-01
Endovascular technologies are rapidly evolving, often requiring coordination and cooperation between clinicians and technicians from diverse specialties. These multidisciplinary interactions lead to challenges that are reflected in the high rate of errors occurring during endovascular procedures. Endovascular virtual reality (VR) simulation has evolved from simple benchtop devices to full physics simulators with advanced haptics and dynamic imaging and physiological controls. The latest developments in this field include the use of fully immersive simulated hybrid angiosuites to train whole endovascular teams in crisis resource management and novel technologies that enable practitioners to build VR simulations based on patient-specific anatomy. As our understanding of the skills, both technical and nontechnical, required for optimal endovascular performance improves, the requisite tools for objective assessment of these skills are being developed and will further enable the use of VR simulation in the training and assessment of endovascular interventionalists and their entire teams. Simulation training that allows deliberate practice without danger to patients may be key to bridging the gap between new endovascular technology and improved patient outcomes.
ERIC Educational Resources Information Center
Waller, David; Richardson, Adam R.
2008-01-01
The tendency to underestimate egocentric distances in immersive virtual environments (VEs) is not well understood. However, previous research (A. R. Richardson & D. Waller, 2007) has demonstrated that a brief period of interaction with the VE prior to making distance judgments can effectively eliminate subsequent underestimation. Here the authors…
ERIC Educational Resources Information Center
Nussli, Natalie; Oh, Kevin; McCandless, Kevin
2014-01-01
The purpose of this mixed methods study was to help pre-service teachers experience and evaluate the potential of Second Life, a three-dimensional immersive virtual environment, for potential integration into their future teaching. By completing collaborative assignments in Second Life, nineteen pre-service general education teachers explored an…
ERIC Educational Resources Information Center
Yang, Mau-Tsuen; Liao, Wan-Che
2014-01-01
The physical-virtual immersion and real-time interaction play an essential role in cultural and language learning. Augmented reality (AR) technology can be used to seamlessly merge virtual objects with real-world images to realize immersions. Additionally, computer vision (CV) technology can recognize free-hand gestures from live images to enable…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Drouhard, Margaret MEG G; Steed, Chad A; Hahn, Steven E
In this paper, we propose strategies and objectives for immersive data visualization with applications in materials science using the Oculus Rift virtual reality headset. We provide background on currently available analysis tools for neutron scattering data and other large-scale materials science projects. In the context of the current challenges facing scientists, we discuss immersive virtual reality visualization as a potentially powerful solution. We introduce a prototype immersive visualization system, developed in conjunction with materials scientists at the Spallation Neutron Source, which we have used to explore large crystal structures and neutron scattering data. Finally, we offer our perspective on the greatest challenges that must be addressed to build effective and intuitive virtual reality analysis tools that will be useful for scientists in a wide range of fields.
3D Modelling and Mapping for Virtual Exploration of Underwater Archaeology Assets
NASA Astrophysics Data System (ADS)
Liarokapis, F.; Kouřil, P.; Agrafiotis, P.; Demesticha, S.; Chmelík, J.; Skarlatos, D.
2017-02-01
This paper investigates immersive technologies to increase exploration time in an underwater archaeological site, both for the public, as well as, for researchers and scholars. Focus is on the Mazotos shipwreck site in Cyprus, which is located 44 meters underwater. The aim of this work is two-fold: (a) realistic modelling and mapping of the site and (b) an immersive virtual reality visit. For 3D modelling and mapping optical data were used. The underwater exploration is composed of a variety of sea elements including: plants, fish, stones, and artefacts, which are randomly positioned. Users can experience an immersive virtual underwater visit in Mazotos shipwreck site and get some information about the shipwreck and its contents for raising their archaeological knowledge and cultural awareness.
Interventional radiology virtual simulator for liver biopsy.
Villard, P F; Vidal, F P; ap Cenydd, L; Holbrey, R; Pisharody, S; Johnson, S; Bulpitt, A; John, N W; Bello, F; Gould, D
2014-03-01
Training in Interventional Radiology currently uses the apprenticeship model, where clinical and technical skills of invasive procedures are learnt during practice in patients. This apprenticeship training method is increasingly limited by regulatory restrictions on working hours, concerns over patient risk through trainees' inexperience and the variable exposure to case mix and emergencies during training. To address this, we have developed a computer-based simulation of visceral needle puncture procedures. A real-time framework has been built that includes: segmentation, physically based modelling, haptics rendering, pseudo-ultrasound generation and the concept of a physical mannequin. It is the result of a close collaboration between different universities, involving computer scientists, clinicians, clinical engineers and occupational psychologists. The technical implementation of the framework is a robust and real-time simulation environment combining a physical platform and an immersive computerized virtual environment. The face, content and construct validation have been previously assessed, showing the reliability and effectiveness of this framework, as well as its potential for teaching visceral needle puncture. A simulator for ultrasound-guided liver biopsy has been developed. It includes functionalities and metrics extracted from cognitive task analysis. This framework can be useful during training, particularly given the known difficulties in gaining significant practice of core skills in patients.
A Review of Simulators with Haptic Devices for Medical Training.
Escobar-Castillejos, David; Noguez, Julieta; Neri, Luis; Magana, Alejandra; Benes, Bedrich
2016-04-01
Medical procedures often involve the use of the tactile sense to manipulate organs or tissues by using special tools. Doctors require extensive preparation in order to perform them successfully; for example, research shows that a minimum of 750 operations are needed to acquire sufficient experience to perform medical procedures correctly. Haptic devices have become an important training alternative and they have been considered to improve medical training because they let users interact with virtual environments by adding the sense of touch to the simulation. Previous articles in the field state that haptic devices enhance the learning of surgeons compared to current training environments used in medical schools (corpses, animals, or synthetic skin and organs). Consequently, virtual environments use haptic devices to improve realism. The goal of this paper is to provide a state of the art review of recent medical simulators that use haptic devices. In particular we focus on stitching, palpation, dental procedures, endoscopy, laparoscopy, and orthopaedics. These simulators are reviewed and compared from the viewpoint of used technology, the number of degrees of freedom, degrees of force feedback, perceived realism, immersion, and feedback provided to the user. In the conclusion, several observations per area and suggestions for future work are provided.
Buttussi, Fabio; Chittaro, Luca
2018-02-01
The increasing availability of head-mounted displays (HMDs) for home use motivates the study of the possible effects that adopting this new hardware might have on users. Moreover, while the impact of display type has been studied for different kinds of tasks, it has been scarcely explored in procedural training. Our study considered three different types of displays used by participants for training in aviation safety procedures with a serious game. The three displays were respectively representative of: (i) desktop VR (a standard desktop monitor), (ii) many setups for immersive VR used in the literature (an HMD with narrow field of view and a 3-DOF tracker), and (iii) new setups for immersive home VR (an HMD with wide field of view and 6-DOF tracker). We assessed effects on knowledge gain, and different self-reported measures (self-efficacy, engagement, presence). Unlike previous studies of display type that measured effects only immediately after the VR experience, we considered also a longer time span (2 weeks). Results indicated that the display type played a significant role in engagement and presence. The training benefits (increased knowledge and self-efficacy) were instead obtained, and maintained at two weeks, regardless of the display used. The paper discusses the implications of these results.
KinImmerse: Macromolecular VR for NMR ensembles
Block, Jeremy N; Zielinski, David J; Chen, Vincent B; Davis, Ian W; Vinson, E Claire; Brady, Rachael; Richardson, Jane S; Richardson, David C
2009-01-01
Background: In molecular applications, virtual reality (VR) and immersive virtual environments have generally been used and valued for the visual and interactive experience – to enhance intuition and communicate excitement – rather than as part of the actual research process. In contrast, this work develops a software infrastructure for research use and illustrates such use on a specific case. Methods: The Syzygy open-source toolkit for VR software was used to write the KinImmerse program, which translates the molecular capabilities of the kinemage graphics format into software for display and manipulation in the DiVE (Duke immersive Virtual Environment) or other VR system. KinImmerse is supported by the flexible display construction and editing features in the KiNG kinemage viewer and it implements new forms of user interaction in the DiVE. Results: In addition to molecular visualizations and navigation, KinImmerse provides a set of research tools for manipulation, identification, co-centering of multiple models, free-form 3D annotation, and output of results. The molecular research test case analyzes the local neighborhood around an individual atom within an ensemble of nuclear magnetic resonance (NMR) models, enabling immersive visual comparison of the local conformation with the local NMR experimental data, including target curves for residual dipolar couplings (RDCs). Conclusion: The promise of KinImmerse for production-level molecular research in the DiVE is shown by the locally co-centered RDC visualization developed there, which gave new insights now being pursued in wider data analysis. PMID:19222844
Visualization of reservoir simulation data with an immersive virtual reality system
DOE Office of Scientific and Technical Information (OSTI.GOV)
Williams, B.K.
1996-10-01
This paper discusses an investigation into the use of an immersive virtual reality (VR) system to visualize reservoir simulation output data. The hardware and software configurations of the test-immersive VR system are described and compared to a nonimmersive VR system and to an existing workstation screen-based visualization system. The structure of 3D reservoir simulation data and the actions to be performed on the data within the VR system are discussed. The subjective results of the investigation are then presented, followed by a discussion of possible future work.
BIM Based Virtual Environment for Fire Emergency Evacuation
Rezgui, Yacine; Ong, Hoang N.
2014-01-01
Recent building emergency management research has highlighted the need for the effective utilization of dynamically changing building information. BIM (building information modelling) can play a significant role in this process due to its comprehensive and standardized data format and integrated process. This paper introduces a BIM-based virtual environment supported by virtual reality (VR) and a serious game engine to address several key issues for building emergency management, for example, timely two-way information updating and better emergency awareness training. The focus of this paper lies in how to utilize BIM as a comprehensive building information provider to work with virtual reality technologies to build an adaptable immersive serious game environment that provides real-time fire evacuation guidance. The innovation lies in the seamless integration between BIM and a serious game based virtual reality (VR) environment, aiming at practical problem solving by leveraging state-of-the-art computing technologies. The system has been tested for its robustness and functionality against the development requirements, and the results showed promising potential to support more effective emergency management. PMID:25197704
Zanbaka, Catherine A; Lok, Benjamin C; Babu, Sabarish V; Ulinski, Amy C; Hodges, Larry F
2005-01-01
We describe a between-subjects experiment that compared four different methods of travel and their effect on cognition and paths taken in an immersive virtual environment (IVE). Participants answered a set of questions based on Crook's condensation of Bloom's taxonomy that assessed their cognition of the IVE with respect to knowledge, understanding and application, and higher mental processes. Participants also drew a sketch map of the IVE and the objects within it. The users' sense of presence was measured using the Steed-Usoh-Slater Presence Questionnaire. The participants' position and head orientation were automatically logged during their exposure to the virtual environment. These logs were later used to create visualizations of the paths taken. Path analysis, such as exploring the overlaid path visualizations and dwell data information, revealed further differences among the travel techniques. Our results suggest that, for applications where problem solving and evaluation of information is important or where opportunity to train is minimal, then having a large tracked space so that the participant can walk around the virtual environment provides benefits over common virtual travel techniques.
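The abstract notes that logged position and head-orientation data were turned into overlaid path visualizations and dwell information; the study's own tooling is not described, so the sketch below is only an illustration, under an assumed logging format of (time, x, y) samples, of how path length and per-cell dwell time can be derived from such a log.

```python
import numpy as np

def path_metrics(log, cell=0.5):
    """Compute simple path metrics from a tracker log.

    log: array of shape (N, 3) with columns (t_seconds, x, y) sampled during
    the IVE exposure (illustrative format, not the study's actual log schema).
    cell: side length in metres of the grid cells used for dwell analysis."""
    t, x, y = log[:, 0], log[:, 1], log[:, 2]
    steps = np.hypot(np.diff(x), np.diff(y))
    path_length = steps.sum()                      # total distance travelled
    # Dwell time per grid cell: attribute each sample interval to the cell
    # the participant occupied at its start.
    dt = np.diff(t)
    cells = np.floor(np.column_stack([x[:-1], y[:-1]]) / cell).astype(int)
    dwell = {}
    for (cx, cy), d in zip(map(tuple, cells), dt):
        dwell[(cx, cy)] = dwell.get((cx, cy), 0.0) + d
    return path_length, dwell
```

Overlaying several participants' (x, y) traces on a floor plan and shading cells by the dwell dictionary reproduces, in spirit, the kind of path comparison across travel techniques the study describes.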
Stefan, P; Pfandler, M; Wucherer, P; Habert, S; Fürmetz, J; Weidert, S; Euler, E; Eck, U; Lazarovici, M; Weigl, M; Navab, N
2018-04-01
Surgical simulators are being increasingly used as an attractive alternative to clinical training in addition to conventional animal models and human specimens. Typically, surgical simulation technology is designed for the purpose of teaching technical surgical skills (so-called task trainers). Simulator training in surgery is therefore in general limited to the individual training of the surgeon and disregards the participation of the rest of the surgical team. The objective of the project Assessment and Training of Medical Experts based on Objective Standards (ATMEOS) is to develop an immersive simulated operating room environment that enables the training and assessment of multidisciplinary surgical teams under various conditions. Using a mixed reality approach, a synthetic patient model, real surgical instruments and radiation-free virtual X‑ray imaging are combined into a simulation of spinal surgery. In previous research studies, the concept was evaluated in terms of realism, plausibility and immersiveness. In the current research, assessment measurements for technical and non-technical skills are developed and evaluated. The aim is to observe multidisciplinary surgical teams in the simulated operating room during minimally invasive spinal surgery and objectively assess the performance of the individual team members and the entire team. Moreover, the effectiveness of training methods and surgical techniques or success-critical factors, e.g., management of crisis situations, can be captured and objectively assessed in the controlled environment.
A Pilot Study of Motivational Interviewing Training in a Virtual World
Heyden, Robin; Heyden, Neil; Schroy, Paul; Andrew, Stephen; Sadikova, Ekaterina; Wiecha, John
2011-01-01
Background Motivational interviewing (MI) is an evidence-based, patient-centered counseling strategy proven to support patients seeking health behavior change. Yet the time and travel commitment for MI training is often a barrier to the adoption of MI by health care professionals. Virtual worlds such as Second Life (SL) are rapidly becoming part of the educational technology landscape and offer not only the potential to improve access to MI training but also to deepen the MI training experience through the use of immersive online environments. Despite SL’s potential for medical education applications, little work is published studying its use for this purpose and still less is known of educational outcomes for physician training in MI using a virtual-world platform. Objective Our aims were to (1) explore the feasibility, acceptability, and effectiveness of a virtual-world platform for delivering MI training designed for physicians and (2) pilot test instructional designs using SL for MI training. Methods We designed and pilot tested an MI training program in the SL virtual world. We trained and enrolled 13 primary care physicians in a two-session, interactive program in SL on the use of MI for counseling patients about colorectal cancer screening. We measured self-reported changes in confidence and clinical practice patterns for counseling on colorectal cancer screening, and acceptability of the virtual-world learning environment and the MI instructional design. Effectiveness of the MI training was assessed by coding and scoring tape-recorded interviews with a blinded mock patient conducted pre- and post-training. Results A total of 13 physicians completed the training. Acceptability ratings for the MI training ranged from 4.1 to 4.7 on a 5-point scale. The SL learning environment was also highly rated, with 77% (n = 10) of the doctors reporting SL to be an effective educational medium. Learners’ confidence and clinical practice patterns for colorectal cancer screening improved after training. Pre- to post-training mean confidence scores for the ability to elicit and address barriers to colorectal cancer screening (4.5 to 6.2, P = .004) and knowledge of decision-making psychology (4.5 to 5.7, P = .02) and behavior change psychology (4.9 to 6.2, P = .02) increased significantly. Global MI skills scores increased significantly and component scores for the MI skills also increased, with statistically significant improvements in 4 of the 5 component skills: empathy (3.12 to 3.85, P = .001), autonomy (3.07 to 3.85, P < .001), collaboration (2.88 to 3.46, P = .02), and evocative response (2.80 to 3.61, P = .008). Conclusions The results of this pilot study suggest that virtual worlds offer the potential for a new medical education pedagogy that will enhance learning outcomes for patient-centered communication skills training. PMID:21946183
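The pre/post confidence comparisons reported above are within-subject contrasts; a paired test of that form can be sketched as follows. The ratings below are fabricated for illustration and are not the study data.

```python
import numpy as np
from scipy import stats

# Hypothetical pre/post self-confidence ratings for 13 physicians (made-up values).
pre  = np.array([4, 5, 4, 5, 3, 5, 4, 5, 4, 5, 4, 6, 4], dtype=float)
post = np.array([6, 6, 5, 7, 5, 6, 6, 7, 6, 6, 5, 7, 6], dtype=float)

t, p = stats.ttest_rel(post, pre)           # paired (within-subject) comparison
print(f"mean change = {np.mean(post - pre):.2f}, t = {t:.2f}, p = {p:.4f}")
```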
ERIC Educational Resources Information Center
Plumert, Jodie M.; Kearney, Joseph K.; Cremer, James F.
2004-01-01
This study examined gap choices and crossing behavior in children and adults using an immersive, interactive bicycling simulator. Ten- and 12-year-olds and adults rode a bicycle mounted on a stationary trainer through a virtual environment consisting of a street with 6 intersections. Participants faced continuous cross traffic traveling at 25mph…
Cognitive factors associated with immersion in virtual environments
NASA Technical Reports Server (NTRS)
Psotka, Joseph; Davison, Sharon
1993-01-01
Immersion into the dataspace provided by a computer, and the feeling of really being there or 'presence', are commonly acknowledged as the uniquely important features of virtual reality environments. How immersed one feels appears to be determined by a complex set of physical components and affordances of the environment, and as yet poorly understood psychological processes. Pimentel and Teixeira say that the experience of being immersed in a computer-generated world involves the same mental shift of 'suspending your disbelief for a period of time' as 'when you get wrapped up in a good novel or become absorbed in playing a computer game'. That sounds as if it could be right, but it would be good to get some evidence for these important conclusions. It might be even better to try to connect these statements with theoretical positions that try to do justice to complex cognitive processes. The basic precondition for understanding Virtual Reality (VR) is understanding the spatial representation systems that localize our bodies or egocenters in space. The effort to understand these cognitive processes is being driven with new energy by the pragmatic demands of successful virtual reality environments, but the literature is largely sparse and anecdotal.
Pereira, Michael; Argelaguet, Ferran; Millán, José Del R; Lécuyer, Anatole
2018-01-01
Competition changes the environment for athletes. The difficulty of training for such stressful events can lead to the well-known effect of "choking" under pressure, which prevents athletes from performing at their best level. To study the effect of competition on the human brain, we recorded pilot electroencephalography (EEG) data while novice shooters were immersed in a realistic virtual environment representing a shooting range. We found a differential between-subject effect of competition on mu (8-12 Hz) oscillatory activity during aiming; compared to training, the more the subject was able to desynchronize his mu rhythm during competition, the better was his shooting performance. Because this differential effect could not be explained by differences in simple measures of the kinematics and muscular activity, nor by the effect of competition or shooting performance per se, we interpret our results as evidence that mu desynchronization has a positive effect on performance during competition.
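A generic way to quantify mu desynchronization of the kind reported here is to compare 8-12 Hz band power during aiming against a baseline epoch; the sketch below uses Welch's method. The sampling rate and epoching are assumptions, and this is not the authors' analysis pipeline.

```python
import numpy as np
from scipy.signal import welch

def mu_erd_percent(baseline, aiming, fs=512, band=(8.0, 12.0)):
    """Percentage mu-band desynchronization of the aiming epoch relative to baseline;
    positive values mean the mu rhythm was suppressed during aiming."""
    def band_power(x):
        f, pxx = welch(np.asarray(x, dtype=float), fs=fs, nperseg=fs)   # 1-second segments
        mask = (f >= band[0]) & (f <= band[1])
        return float(pxx[mask].sum() * (f[1] - f[0]))
    p_base, p_aim = band_power(baseline), band_power(aiming)
    return 100.0 * (p_base - p_aim) / p_base
```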
DOE Office of Scientific and Technical Information (OSTI.GOV)
Stansfield, S.; Shawver, D.; Sobel, A.
This paper presents a prototype virtual reality (VR) system for training medical first responders. The initial application is to battlefield medicine and focuses on the training of medical corpsmen and other front-line personnel who might be called upon to provide emergency triage on the battlefield. The system is built upon Sandia's multi-user, distributed VR platform and provides an interactive, immersive simulation capability. The user is represented by an avatar and is able to manipulate his virtual instruments and carry out medical procedures. A dynamic casualty simulation provides realistic cues to the patient's condition (e.g. changing blood pressure and pulse) and responds to the actions of the trainee (e.g. a change in the color of a patient's skin may result from a check of the capillary refill rate). The current casualty simulation is of an injury resulting in a tension pneumothorax. This casualty model was developed by the University of Pennsylvania and integrated into the Sandia MediSim system.
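The behaviour of such a dynamic casualty simulation can be pictured with a toy state model: untreated, the vital signs deteriorate over time, and a correct intervention reverses the trend. The rates and thresholds below are invented for illustration and do not represent the University of Pennsylvania casualty model.

```python
from dataclasses import dataclass

@dataclass
class Casualty:
    """Toy vital-sign model for a tension pneumothorax scenario (illustrative only)."""
    systolic_bp: float = 120.0
    pulse: float = 80.0
    decompressed: bool = False

    def tick(self, dt_s: float) -> None:
        """Advance the simulation: untreated, BP falls and pulse climbs."""
        if not self.decompressed:
            self.systolic_bp = max(50.0, self.systolic_bp - 0.5 * dt_s)
            self.pulse = min(160.0, self.pulse + 0.8 * dt_s)
        else:                                   # treated: vitals drift back toward normal
            self.systolic_bp = min(120.0, self.systolic_bp + 0.3 * dt_s)
            self.pulse = max(80.0, self.pulse - 0.5 * dt_s)

    def apply_needle_decompression(self) -> None:
        self.decompressed = True
```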
Armstrong, Ryan; de Ribaupierre, Sandrine; Eagleson, Roy
2014-04-01
This paper describes the design and development of a software tool for the evaluation and training of surgical residents using an interactive, immersive, virtual environment. Our objective was to develop a tool to evaluate user spatial reasoning skills and knowledge in a neuroanatomical context, as well as to augment their performance through interactivity. In the visualization, manually segmented anatomical surface images of MRI scans of the brain were rendered using a stereo display to improve depth cues. A magnetically tracked wand was used as a 3D input device for localization tasks within the brain. The movement of the wand was made to correspond to movement of a spherical cursor within the rendered scene, providing a reference for localization. Users can be tested on their ability to localize structures within the 3D scene, and their ability to place anatomical features at the appropriate locations within the rendering. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.
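Scoring a localization trial in such a system reduces to comparing the cursor position against the ground-truth location of the target structure; a minimal sketch follows, with assumed millimetre units and invented scoring cut-offs.

```python
import numpy as np

def localization_error(cursor_position, target_centroid):
    """Euclidean distance (mm) between the placed cursor and the structure's centroid."""
    return float(np.linalg.norm(np.asarray(cursor_position, dtype=float) -
                                np.asarray(target_centroid, dtype=float)))

def score_trial(error_mm, full_marks_mm=5.0, zero_marks_mm=30.0):
    """Linear score in [0, 1]; the cut-offs are assumptions for illustration."""
    return float(np.clip((zero_marks_mm - error_mm) / (zero_marks_mm - full_marks_mm), 0.0, 1.0))

print(score_trial(localization_error([10.0, 2.0, 3.0], [12.0, 2.0, 0.0])))  # ~1.0 for a 3.6 mm error
```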
Postural and Spatial Orientation Driven by Virtual Reality
Keshner, Emily A.; Kenyon, Robert V.
2009-01-01
Orientation in space is a perceptual variable intimately related to postural orientation that relies on visual and vestibular signals to correctly identify our position relative to vertical. We have combined a virtual environment with motion of a posture platform to produce visual-vestibular conditions that allow us to explore how motion of the visual environment may affect perception of vertical and, consequently, affect postural stabilizing responses. In order to involve a higher level perceptual process, we needed to create a visual environment that was immersive. We did this by developing visual scenes that possess contextual information using color, texture, and 3-dimensional structures. Update latency of the visual scene was close to physiological latencies of the vestibulo-ocular reflex. Using this system we found that even when healthy young adults stand and walk on a stable support surface, they are unable to ignore wide field of view visual motion and they adapt their postural orientation to the parameters of the visual motion. Balance training within our environment elicited measurable rehabilitation outcomes. Thus we believe that virtual environments can serve as a clinical tool for evaluation and training of movement in situations that closely reflect conditions found in the physical world. PMID:19592796
Ahn, Dohyun; Seo, Youngnam; Kim, Minkyung; Kwon, Joung Huem; Jung, Younbo; Ahn, Jungsun
2014-01-01
This study examined the role of display size and mode in increasing users' sense of being together with and of their psychological immersion in a virtual character. Using a high-resolution three-dimensional virtual character, this study employed a 2×2 (stereoscopic mode vs. monoscopic mode × actual human size vs. small size display) factorial design in an experiment with 144 participants randomly assigned to each condition. Findings showed that stereoscopic mode had a significant effect on both users' sense of being together and psychological immersion. However, display size affected only the sense of being together. Furthermore, display size was not found to moderate the effect of stereoscopic mode. PMID:24606057
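A 2×2 between-subjects design of this kind is typically analysed with a two-way ANOVA; the sketch below shows the shape of that analysis with placeholder data and invented column names, not the authors' analysis or data.

```python
import pandas as pd
import statsmodels.api as sm
from statsmodels.formula.api import ols

# One row per participant; the ratings are placeholders.
df = pd.DataFrame({
    "mode": ["stereo", "stereo", "mono", "mono"] * 36,
    "size": ["life", "small"] * 72,
    "being_together": pd.Series(range(144), dtype=float) % 7,
})

model = ols("being_together ~ C(mode) * C(size)", data=df).fit()
print(sm.stats.anova_lm(model, typ=2))   # main effects of mode and size plus their interaction
```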
Routine clinical application of virtual reality in abdominal surgery.
Sampogna, Gianluca; Pugliese, Raffaele; Elli, Marco; Vanzulli, Angelo; Forgione, Antonello
2017-06-01
The advantages of 3D reconstruction, immersive virtual reality (VR) and 3D printing in abdominal surgery have been enunciated for many years, but even today their application in routine clinical practice is almost nil. We investigated their feasibility, user appreciation and clinical impact. Fifteen patients undergoing pancreatic, hepatic or renal surgery were studied by realizing a 3D reconstruction of the target anatomy. An immersive VR environment was then developed to import the 3D models, and some details of the 3D scene were printed. All phases of our workflow employed open-source software and low-cost hardware, easily implementable by other surgical services. A qualitative evaluation of the three approaches was performed by 20 surgeons, who filled in a specific questionnaire regarding a clinical case for each organ considered. Preoperative surgical planning and intraoperative guidance were feasible for all patients included in the study. The vast majority of surgeons interviewed scored their quality and usefulness as very good. Despite the extra time, costs and effort necessary to implement these systems, the benefits shown by the analysis of the questionnaires support investing more resources in training physicians to adopt these technologies routinely, even if further and larger studies are still needed.
Schmitt, Yuko S; Hoffman, Hunter G; Blough, David K; Patterson, David R; Jensen, Mark P; Soltani, Maryam; Carrougher, Gretchen J; Nakamura, Dana; Sharar, Sam R
2011-02-01
This randomized, controlled, within-subjects (crossover design) study examined the effects of immersive virtual reality as an adjunctive analgesic technique for hospitalized pediatric burn inpatients undergoing painful physical therapy. Fifty-four subjects (6-19 years old) performed range-of-motion exercises under a therapist's direction for 1-5 days. During each session, subjects spent equivalent time in both the virtual reality and the control conditions (treatment order randomized and counterbalanced). Graphic rating scale scores assessing the sensory, affective, and cognitive components of pain were obtained for each treatment condition. Secondary outcomes assessed subjects' perception of the virtual reality experience and maximum range-of-motion. Results showed that on study day one, subjects reported significant decreases (27-44%) in pain ratings during virtual reality. They also reported improved affect ("fun") during virtual reality. The analgesia and affect improvements were maintained with repeated virtual reality use over multiple therapy sessions. Maximum range-of-motion was not different between treatment conditions, but was significantly greater after the second treatment condition (regardless of treatment order). These results suggest that immersive virtual reality is an effective nonpharmacologic, adjunctive pain reduction technique in the pediatric burn population undergoing painful rehabilitation therapy. The magnitude of the analgesic effect is clinically meaningful and is maintained with repeated use. Copyright © 2010 Elsevier Ltd and ISBI. All rights reserved.
Man, mind, and machine: the past and future of virtual reality simulation in neurologic surgery.
Robison, R Aaron; Liu, Charles Y; Apuzzo, Michael L J
2011-11-01
To review virtual reality in neurosurgery, including the history of simulation and virtual reality and some of the current implementations; to examine some of the technical challenges involved; and to propose a potential paradigm for the development of virtual reality in neurosurgery going forward. A search was made on PubMed using key words surgical simulation, virtual reality, haptics, collision detection, and volumetric modeling to assess the current status of virtual reality in neurosurgery. Based on previous results, investigators extrapolated the possible integration of existing efforts and potential future directions. Simulation has a rich history in surgical training, and there are numerous currently existing applications and systems that involve virtual reality. All existing applications are limited to specific task-oriented functions and typically sacrifice visual realism for real-time interactivity or vice versa, owing to numerous technical challenges in rendering a virtual space in real time, including graphic and tissue modeling, collision detection, and direction of the haptic interface. With ongoing technical advancements in computer hardware and graphic and physical rendering, incremental or modular development of a fully immersive, multipurpose virtual reality neurosurgical simulator is feasible. The use of virtual reality in neurosurgery is predicted to change the nature of neurosurgical education, and to play an increased role in surgical rehearsal and the continuing education and credentialing of surgical practitioners. Copyright © 2011 Elsevier Inc. All rights reserved.
ERIC Educational Resources Information Center
Knutzen, K. Brant; Kennedy, David M.
2012-01-01
This article describes the findings of a 3-month study on how social encounters mediated by an online Virtual Immersive Environment (VIE) impacted on the relational self-concept of adolescents. The study gathered data from two groups of students as they took an Introduction to Design and Programming class. Students in group 1 undertook course…
ERIC Educational Resources Information Center
Dalgarno, Barney; Lee, Mark J. W.; Carlson, Lauren; Gregory, Sue; Tynan, Belinda
2011-01-01
This article describes the research design of, and reports selected findings from, a scoping study aimed at examining current and planned applications of 3D immersive virtual worlds at higher education institutions across Australia and New Zealand. The scoping study is the first of its kind in the region, intended to parallel and complement a…
NASA Astrophysics Data System (ADS)
Kersten, T. P.; Büyüksalih, G.; Tschirschwitz, F.; Kan, T.; Deggim, S.; Kaya, Y.; Baskaraca, A. P.
2017-05-01
Recent advances in contemporary Virtual Reality (VR) technologies are going to have a significant impact on everyday life. Through VR it is possible to virtually explore a computer-generated environment as a different reality, and to immerse oneself into the past or in a virtual museum without leaving the current real-life situation. For the ultimate VR experience, the user should see only the virtual world. Currently, the user must wear a VR headset which fits around the head and over the eyes to visually separate themselves from the physical world. Via the headset, images are fed to the eyes through two small lenses. Cultural heritage monuments are ideally suited both for thorough multi-dimensional geometric documentation and for realistic interactive visualisation in immersive VR applications. Additionally, the game industry offers tools for interactive visualisation of objects to motivate users to virtually visit objects and places. In this paper the generation of a virtual 3D model of the Selimiye mosque in the city of Edirne, Turkey and its processing for data integration into the game engine Unity is presented. The project has been carried out as a co-operation between BİMTAŞ, a company of the Greater Municipality of Istanbul, Turkey and the Photogrammetry & Laser Scanning Lab of the HafenCity University Hamburg, Germany to demonstrate an immersive and interactive visualisation using the new VR system HTC Vive. The workflow from data acquisition to VR visualisation, including the necessary programming for navigation, is described. Furthermore, the possible use (including simultaneous multi-user environments) of such a VR visualisation for a CH monument is discussed in this contribution.
Immersive Technologies and Language Learning
ERIC Educational Resources Information Center
Blyth, Carl
2018-01-01
This article briefly traces the historical conceptualization of linguistic and cultural immersion through technological applications, from the early days of locally networked computers to the cutting-edge technologies known as virtual reality and augmented reality. Next, the article explores the challenges of immersive technologies for the field…
AVESTAR Center for Operational Excellence of Electricity Generation Plants
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zitney, Stephen
2012-08-29
To address industry challenges in attaining operational excellence for electricity generation plants, the U.S. Department of Energy’s (DOE) National Energy Technology Laboratory (NETL) has launched a world-class facility for Advanced Virtual Energy Simulation Training and Research (AVESTAR™). This presentation will highlight the AVESTAR™ Center simulators, facilities, and comprehensive training, education, and research programs focused on the operation and control of high-efficiency, near-zero-emission electricity generation plants. The AVESTAR Center brings together state-of-the-art, real-time, high-fidelity dynamic simulators with full-scope operator training systems (OTSs) and 3D virtual immersive training systems (ITSs) into an integrated energy plant and control room environment. AVESTAR’s initial offering combines, for the first time, a “gasification with CO2 capture” process simulator with a “combined-cycle” power simulator together in a single OTS/ITS solution for an integrated gasification combined cycle (IGCC) power plant with carbon dioxide (CO2) capture. IGCC systems are an attractive technology option for power generation, especially when capturing and storing CO2 is necessary to satisfy emission targets. The AVESTAR training program offers a variety of courses that merge classroom learning, simulator-based OTS learning in a control-room operations environment, and immersive learning in the interactive 3D virtual plant environment or ITS. All of the courses introduce trainees to base-load plant operation, control, startups, and shutdowns. Advanced courses require participants to become familiar with coordinated control, fuel switching, power-demand load shedding, and load following, as well as to problem solve equipment and process malfunctions. Designed to ensure work force development, training is offered for control room and plant field operators, as well as engineers and managers. Such comprehensive simulator-based instruction allows for realistic training without compromising worker, equipment, and environmental safety. It also better prepares operators and engineers to manage the plant closer to economic constraints while minimizing or avoiding the impact of any potentially harmful, wasteful, or inefficient events. The AVESTAR Center is also used to augment graduate and undergraduate engineering education in the areas of process simulation, dynamics, control, and safety. Students and researchers gain hands-on simulator-based training experience and learn how the commercial-scale power plants respond dynamically to changes in manipulated inputs, such as coal feed flow rate and power demand. Students also analyze how the regulatory control system impacts power plant performance and stability. In addition, students practice start-up, shutdown, and malfunction scenarios. The 3D virtual ITSs are used for plant familiarization, walk-through, equipment animations, and safety scenarios. To further leverage the AVESTAR facilities and simulators, NETL and its university partners are pursuing an innovative and collaborative R&D program. In the area of process control, AVESTAR researchers are developing enhanced strategies for regulatory control and coordinated plant-wide control, including gasifier and gas turbine lead, as well as advanced process control using model predictive control (MPC) techniques.
Other AVESTAR R&D focus areas include high-fidelity equipment modeling using partial differential equations, dynamic reduced order modeling, optimal sensor placement, 3D virtual plant simulation, and the modern grid. NETL and its partners plan to continue building the AVESTAR portfolio of dynamic simulators, immersive training systems, and advanced research capabilities to satisfy industry’s growing need for training and experience with the operation and control of clean energy plants. Future dynamic simulators under development include natural gas combined cycle (NGCC) and supercritical pulverized coal (SCPC) plants with post-combustion CO2 capture. These dynamic simulators are targeted for use in establishing a Virtual Carbon Capture Center (VCCC), similar in concept to the DOE’s National Carbon Capture Center for slipstream testing. The VCCC will enable developers of CO2 capture technologies to integrate, test, and optimize the operation of their dynamic capture models within the context of baseline power plant dynamic models. The objective is to provide hands-on, simulator-based “learn-by-operating” test platforms to accelerate the scale-up and deployment of CO2 capture technologies. Future AVESTAR plans also include pursuing R&D on the dynamics, operation, and control of integrated electricity generation and storage systems for the modern grid era. Special emphasis will be given to combining load-following energy plants with renewable and distributed generating supplies and fast-ramping energy storage systems to provide near constant baseload power.
NASA Astrophysics Data System (ADS)
Murphy, M.; Chenaux, A.; Keenaghan, G.; Gibson, V.; Butler, J.; Pybus, C.
2017-08-01
In this paper the recording and design for a Virtual Reality Immersive Model of Armagh Observatory is presented, which will replicate the historic buildings and landscape, with distant meridian markers and the position of its principal historic instruments, within a model of the night sky showing the position of bright stars. The virtual reality model can be used for educational purposes, allowing the instruments within the historic building model to be manipulated within 3D space to demonstrate how the position measurements of stars were made in the 18th century. A description is given of current student and researcher activities concerning on-site recording and surveying and the virtual modelling of the buildings and landscape. This is followed by a design for a Virtual Reality Immersive Model of Armagh Observatory using game engines, virtual learning platforms and related concepts.
How do children learn to cross the street? The process of pedestrian safety training
Schwebel, David C.; Shen, Jiabin; McClure, Leslie A.
2016-01-01
Objective Pedestrian injuries are a leading cause of child death, and may be reduced by training children to cross streets more safely. Such training is most effective when children receive repeated practice at the complex cognitive-perceptual task of judging moving traffic and selecting safe crossing gaps, but there are limited data on how much practice is required for children to reach adult levels of functioning. Using existing data, we examined how children’s pedestrian skill changed over the course of six pedestrian safety training sessions, each comprised of 45 crossings within a virtual pedestrian environment. Methods As part of a randomized controlled trial on pedestrian safety training, 59 children ages 7-8 crossed the street within a semi-immersive virtual pedestrian environment 270 times over a 3-week period (6 sessions of 45 crossings each). Feedback was provided after each crossing, and traffic speed and density were advanced as children’s skill improved. Post-intervention pedestrian behavior was assessed a week later in the virtual environment and compared to adult behavior with identical traffic patterns. Results Over the course of training, children entered traffic gaps more quickly and chose tighter gaps to cross within; their crossing efficiency appeared to increase. By the end of training, some aspects of children’s pedestrian behavior were comparable to adult behavior but other aspects were not, indicating the training was worthwhile but insufficient for most children to achieve adult levels of functioning. Conclusions Repeated practice in a simulated pedestrian environment helps children learn aspects of safe and efficient pedestrian behavior. Six twice-weekly training sessions of 45 crossings each were insufficient for children to reach adult pedestrian functioning, however, and future research should continue to study the trajectory and quantity of child pedestrian safety training needed for children to become competent pedestrians. PMID:26760077
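The crossing task can be made concrete with a back-of-the-envelope gap model: a gap is safe only if the time until the next car arrives exceeds the child's start-up delay plus the time needed to clear the street. The street width, walking speed and start-up delay below are assumptions for illustration, not parameters from the study.

```python
MPH_TO_MPS = 0.44704

def time_gap_s(gap_distance_m: float, traffic_speed_mph: float) -> float:
    """Temporal size of a traffic gap: seconds until the next car arrives."""
    return gap_distance_m / (traffic_speed_mph * MPH_TO_MPS)

def time_to_spare_s(gap_distance_m, traffic_speed_mph=25.0,
                    street_width_m=9.0, walking_speed_mps=1.5, startup_delay_s=0.7):
    """Seconds left between clearing the lane and the car's arrival (negative = unsafe)."""
    crossing_time = startup_delay_s + street_width_m / walking_speed_mps
    return time_gap_s(gap_distance_m, traffic_speed_mph) - crossing_time

# Example: a 90 m gap at 25 mph leaves about 1.35 s to spare under these assumptions.
print(round(time_to_spare_s(90.0), 2))
```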
Optale, Gabriele; Urgesi, Cosimo; Busato, Valentina; Marin, Silvia; Piron, Lamberto; Priftis, Konstantinos; Gamberini, Luciano; Capodieci, Salvatore; Bordin, Adalberto
2010-05-01
Memory decline is a prevalent aspect of aging but may also be the first sign of cognitive pathology. Virtual reality (VR) using immersion and interaction may provide new approaches to the treatment of memory deficits in elderly individuals. The authors implemented a VR training intervention to try to lessen cognitive decline and improve memory functions. The authors randomly assigned 36 elderly residents of a rest care facility (median age 80 years) who were impaired on the Verbal Story Recall Test either to the experimental group (EG) or the control group (CG). The EG underwent 6 months of VR memory training (VRMT) that involved auditory stimulation and VR experiences in path finding. The initial training phase lasted 3 months (3 auditory and 3 VR sessions every 2 weeks), and there was a booster training phase during the following 3 months (1 auditory and 1 VR session per week). The CG underwent equivalent face-to-face training sessions using music therapy. Both groups participated in social and creative and assisted-mobility activities. Neuropsychological and functional evaluations were performed at baseline, after the initial training phase, and after the booster training phase. The EG showed significant improvements in memory tests, especially in long-term recall with an effect size of 0.7 and in several other aspects of cognition. In contrast, the CG showed progressive decline. The authors suggest that VRMT may improve memory function in elderly adults by enhancing focused attention.
Effectiveness of Immersive Virtual Reality in Surgical Training-A Randomized Control Trial.
Pulijala, Yeshwanth; Ma, Minhua; Pears, Matthew; Peebles, David; Ayoub, Ashraf
2018-05-01
Surgical training methods are evolving with the technological advancements, including the application of virtual reality (VR) and augmented reality. However, 28 to 40% of novice residents are not confident in performing a major surgical procedure. VR surgery, an immersive VR (iVR) experience, was developed using Oculus Rift and Leap Motion devices (Leap Motion, Inc, San Francisco, CA) to address this challenge. Our iVR is a multisensory, holistic surgical training application that demonstrates a maxillofacial surgical technique, the Le Fort I osteotomy. The main objective of the present study was to evaluate the effect of using VR surgery on the self-confidence and knowledge of surgical residents. A multisite, single-blind, parallel, randomized controlled trial (RCT) was performed. The participants were novice surgical residents with limited experience in performing the Le Fort I osteotomy. The primary outcome measures were the self-assessment scores of trainee confidence using a Likert scale and an objective assessment of the cognitive skills. Ninety-five residents from 7 dental schools were included in the RCT. The participants were randomly divided into a study group of 51 residents and a control group of 44. Participants in the study group used the VR surgery application on an Oculus Rift with Leap Motion device. The control group participants used similar content in a standard PowerPoint presentation on a laptop. Repeated measures multivariate analysis of variance was applied to the data to assess the overall effect of the intervention on the confidence of the residents. The study group participants showed significantly greater perceived self-confidence levels compared with those in the control group (P = .034; α = 0.05). Novices in the first year of their training showed the greatest improvement in their confidence compared with those in their second and third year. iVR experiences improve the knowledge and self-confidence of the surgical residents. Copyright © 2017 American Association of Oral and Maxillofacial Surgeons. Published by Elsevier Inc. All rights reserved.
Embodying compassion: a virtual reality paradigm for overcoming excessive self-criticism.
Falconer, Caroline J; Slater, Mel; Rovira, Aitor; King, John A; Gilbert, Paul; Antley, Angus; Brewin, Chris R
2014-01-01
Virtual reality has been successfully used to study and treat psychological disorders such as phobias and posttraumatic stress disorder but has rarely been applied to clinically-relevant emotions other than fear and anxiety. Self-criticism is a ubiquitous feature of psychopathology and can be treated by increasing levels of self-compassion. We exploited the known effects of identification with a virtual body to arrange for healthy female volunteers high in self-criticism to experience self-compassion from an embodied first-person perspective within immersive virtual reality. Whereas observation and practice of compassionate responses reduced self-criticism, the additional experience of embodiment also increased self-compassion and feelings of being safe. The results suggest potential new uses for immersive virtual reality in a range of clinical conditions.
Harnessing Neuroplasticity to Promote Rehabilitation: CI Therapy for TBI
2016-10-01
scheduled plus 33 to be enrolled, because we assume that the proportion of withdrawals will be the same as experienced to date, i.e., 24%. (Victor Mark, Investigator.) Interactive Immersive Virtual Reality Walking for SCI Neuropathic Pain (Trost), 0.24 calendar months, Kim Cerise... Direct Costs: $149,999. This project designs and tests an immersive virtual reality treatment method to control neuropathic pain following traumatic spinal cord injury.
Virtually numbed: immersive video gaming alters real-life experience.
Weger, Ulrich W; Loughnan, Stephen
2014-04-01
As actors in a highly mechanized environment, we are citizens of a world populated not only by fellow humans, but also by virtual characters (avatars). Does immersive video gaming, during which the player takes on the mantle of an avatar, prompt people to adopt the coldness and rigidity associated with robotic behavior and desensitize them to real-life experience? In one study, we correlated participants' reported video-gaming behavior with their emotional rigidity (as indicated by the number of paperclips that they removed from ice-cold water). In a second experiment, we manipulated immersive and nonimmersive gaming behavior and then likewise measured the extent of the participants' emotional rigidity. Both studies yielded reliable impacts, and thus suggest that immersion into a robotic viewpoint desensitizes people to real-life experiences in oneself and others.
Schmitz, Patric; Hildebrandt, Julian; Valdez, Andre Calero; Kobbelt, Leif; Ziefle, Martina
2018-04-01
In virtual environments, the space that can be explored by real walking is limited by the size of the tracked area. To enable unimpeded walking through large virtual spaces in small real-world surroundings, redirection techniques are used. These unnoticeably manipulate the user's virtual walking trajectory. It is important to know how strongly such techniques can be applied without the user noticing the manipulation, or getting cybersick. Previously, this was estimated by measuring a detection threshold (DT) in highly controlled psychophysical studies, which experimentally isolate the effect but do not aim for perceived immersion in the context of VR applications. While these studies suggest that only relatively low degrees of manipulation are tolerable, we claim that, besides establishing detection thresholds, it is important to know when the user's immersion breaks. We hypothesize that the degree of unnoticed manipulation is significantly different from the detection threshold when the user is immersed in a task. We conducted three studies: a) to devise an experimental paradigm to measure the threshold of limited immersion (TLI), b) to measure the TLI for slowly decreasing and increasing rotation gains, and c) to establish a baseline of cybersickness for our experimental setup. For rotation gains greater than 1.0, we found that immersion breaks quite late after the gain becomes detectable. However, for gains less than 1.0, some users reported a break of immersion even before established detection thresholds were reached. Apparently, the developed metric measures an additional quality of user experience. This article contributes to the development of effective spatial compression methods by utilizing the break of immersion as a benchmark for redirection techniques.
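The rotation gains studied here scale the mapping from physical to virtual head rotation; a minimal sketch of that mechanism, with illustrative numbers, is given below.

```python
def apply_rotation_gain(prev_physical_yaw_deg: float,
                        curr_physical_yaw_deg: float,
                        virtual_yaw_deg: float,
                        gain: float) -> float:
    """Update the virtual yaw for one tracking frame: the camera turns by the
    user's physical head rotation multiplied by the gain (>1 amplifies, <1 dampens)."""
    delta = curr_physical_yaw_deg - prev_physical_yaw_deg      # physical head turn this frame
    return (virtual_yaw_deg + gain * delta) % 360.0

# A user turning 10 degrees per frame with gain 1.2 turns 12 degrees per frame virtually.
yaw = 0.0
for frame in range(1, 10):
    yaw = apply_rotation_gain((frame - 1) * 10.0, frame * 10.0, yaw, gain=1.2)
print(round(yaw, 1))   # 108.0 after 90 degrees of physical rotation
```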
Comparative study on collaborative interaction in non-immersive and immersive systems
NASA Astrophysics Data System (ADS)
Shahab, Qonita M.; Kwon, Yong-Moo; Ko, Heedong; Mayangsari, Maria N.; Yamasaki, Shoko; Nishino, Hiroaki
2007-09-01
This research studies Virtual Reality simulation for collaborative interaction, so that different people from different places can interact with one object concurrently. Our focus is the real-time handling of inputs from multiple users, where the object's behavior is determined by the combination of the multiple inputs. Issues addressed in this research are: 1) the effects of using haptics on collaborative interaction, and 2) the possibilities of collaboration between users from different environments. We conducted user tests on our system in several cases: 1) comparison between non-haptic and haptic collaborative interaction over LAN, 2) comparison between non-haptic and haptic collaborative interaction over the Internet, and 3) analysis of collaborative interaction between non-immersive and immersive display environments. The case studies are the interaction of users in two scenarios: collaborative authoring of a 3D model by two users, and collaborative haptic interaction by multiple users. In Virtual Dollhouse, users can observe the laws of physics while constructing a dollhouse from existing building blocks under gravity effects. In Virtual Stretcher, multiple users can collaborate on moving a stretcher together while feeling each other's haptic motions.
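One simple way to determine an object's behaviour from the combination of multiple users' inputs is to treat each input as a force and integrate the combined force every frame; the sketch below shows this generic scheme, which is not the specific method used in Virtual Dollhouse or Virtual Stretcher.

```python
import numpy as np

def step_shared_object(position, velocity, user_forces, mass=5.0, dt=1/60, damping=0.9):
    """position, velocity: (3,) arrays; user_forces: list of (3,) arrays in newtons.
    Gravity acts along -y; damping crudely stands in for friction."""
    total_force = np.sum(user_forces, axis=0) + np.array([0.0, -9.81 * mass, 0.0])
    velocity = damping * (np.asarray(velocity, dtype=float) + (total_force / mass) * dt)
    position = np.asarray(position, dtype=float) + velocity * dt
    return position, velocity

# Two users pushing the same stretcher in roughly the same direction.
pos, vel = np.zeros(3), np.zeros(3)
pos, vel = step_shared_object(pos, vel, [np.array([30.0, 49.05, 0.0]),
                                         np.array([25.0, 49.05, 5.0])])
print(pos, vel)
```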
Decentralized real-time simulation of forest machines
NASA Astrophysics Data System (ADS)
Freund, Eckhard; Adam, Frank; Hoffmann, Katharina; Rossmann, Juergen; Kraemer, Michael; Schluse, Michael
2000-10-01
To develop realistic forest machine simulators is a demanding task. A useful simulator has to provide a close-to-reality simulation of the forest environment as well as the simulation of the physics of the vehicle. Customers demand a highly realistic three dimensional forestry landscape and the realistic simulation of the complex motion of the vehicle even in rough terrain in order to be able to use the simulator for operator training under close-to-reality conditions. The realistic simulation of the vehicle, especially with the driver's seat mounted on a motion platform, greatly improves the effect of immersion into the virtual reality of a simulated forest and the achievable level of education of the driver. Thus, the connection of the real control devices of forest machines to the simulation system has to be supported, i.e. the real control devices like the joysticks or the board computer system to control the crane, the aggregate etc. Beyond, the fusion of the board computer system and the simulation system is realized by means of sensors, i.e. digital and analog signals. The decentralized system structure allows several virtual reality systems to evaluate and visualize the information of the control devices and the sensors. So, while the driver is practicing, the instructor can immerse into the same virtual forest to monitor the session from his own viewpoint. In this paper, we are describing the realized structure as well as the necessary software and hardware components and application experiences.
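The decentralized structure described above, in which one node reads the real control devices while several virtual reality stations visualize the same session, can be sketched as a simple state broadcast; the multicast address, port and packet layout below are assumptions for illustration, not the system's actual protocol.

```python
import socket
import struct
import time

GROUP, PORT = "239.0.0.42", 5005
PACKET = struct.Struct("!dffff")   # timestamp, crane joystick x/y, boom, grapple

def broadcast_state(sock, crane_x, crane_y, boom, grapple):
    """Send one frame of control-device state to all listening visualisation stations."""
    payload = PACKET.pack(time.time(), crane_x, crane_y, boom, grapple)
    sock.sendto(payload, (GROUP, PORT))

if __name__ == "__main__":
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.setsockopt(socket.IPPROTO_IP, socket.IP_MULTICAST_TTL, 1)
    broadcast_state(sock, 0.3, -0.1, 0.8, 0.0)   # one sample frame
```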
Effects of sensory cueing in virtual motor rehabilitation. A review.
Palacios-Navarro, Guillermo; Albiol-Pérez, Sergio; García-Magariño García, Iván
2016-04-01
To critically identify studies that evaluate the effects of cueing in virtual motor rehabilitation in patients with different neurological disorders and to make recommendations for future studies. Data from MEDLINE®, IEEExplore, Science Direct, the Cochrane Library and Web of Science were searched up to February 2015. We included studies that investigate the effects of cueing in virtual motor rehabilitation related to interventions for the upper or lower extremities using auditory, visual, and tactile cues on motor performance in non-immersive, semi-immersive, or fully immersive virtual environments. These studies compared virtual cueing with an alternative or no intervention. Ten studies with a total of 153 patients were included in the review. All of them refer to the impact of cueing in virtual motor rehabilitation, regardless of the pathological condition. After selecting the articles, the following variables were extracted: year of publication, sample size, study design, type of cueing, intervention procedures, outcome measures, and main findings. The outcome evaluation was done at baseline and at the end of the treatment in most of the studies. All of the studies except one showed improvements in some or all outcomes after the intervention or, in some cases, in favor of the virtual rehabilitation group compared to the control group. Virtual cueing seems to be a promising approach to improve motor learning, providing a channel for non-pharmacological therapeutic intervention in different neurological disorders. However, further studies using larger and more homogeneous groups of patients are required to confirm these findings. Copyright © 2016 Elsevier Inc. All rights reserved.
A standardized set of 3-D objects for virtual reality research and applications.
Peeters, David
2018-06-01
The use of immersive virtual reality as a research tool is rapidly increasing in numerous scientific disciplines. By combining ecological validity with strict experimental control, immersive virtual reality provides the potential to develop and test scientific theories in rich environments that closely resemble everyday settings. This article introduces the first standardized database of colored three-dimensional (3-D) objects that can be used in virtual reality and augmented reality research and applications. The 147 objects have been normed for name agreement, image agreement, familiarity, visual complexity, and corresponding lexical characteristics of the modal object names. The availability of standardized 3-D objects for virtual reality research is important, because reaching valid theoretical conclusions hinges critically on the use of well-controlled experimental stimuli. Sharing standardized 3-D objects across different virtual reality labs will allow for science to move forward more quickly.
Virtual surgery in a (tele-)radiology framework.
Glombitza, G; Evers, H; Hassfeld, S; Engelmann, U; Meinzer, H P
1999-09-01
This paper presents telemedicine as an extension of a teleradiology framework through tools for virtual surgery. To classify the described methods and applications, the research field of virtual reality (VR) is broadly reviewed. Differences with respect to technical equipment, methodological requirements and areas of application are pointed out. Desktop VR, augmented reality, and virtual reality are differentiated and discussed in some typical contexts of diagnostic support, surgical planning, therapeutic procedures, simulation and training. Visualization techniques are compared as a prerequisite for virtual reality and assigned to distinct levels of immersion. The advantage of a hybrid visualization kernel is emphasized with respect to the desktop VR applications that are subsequently shown. Moreover, software design aspects are considered by outlining functional openness in the architecture of the host system. Here, a teleradiology workstation was extended by dedicated tools for surgical planning through a plug-in mechanism. Examples of recent areas of application are introduced such as liver tumor resection planning, diagnostic support in heart surgery, and craniofacial surgery planning. In the future, surgical planning systems will become more important. They will benefit from improvements in image acquisition and communication, new image processing approaches, and techniques for data presentation. This will facilitate preoperative planning and intraoperative applications.
Gruzelier, John; Inoue, Atsuko; Smart, Roger; Steed, Anthony; Steffert, Tony
2010-08-16
Actors were trained in sensory-motor rhythm (SMR) neurofeedback interfaced with a computer rendition of a theatre auditorium. Enhancement of SMR led to changes in the lighting, while inhibition of theta and high beta led to a reduction in intrusive audience noise. Participants were randomised to a virtual reality (VR) representation in a ReaCTor, with surrounding image projection seen through glasses, or to a 2D computer screen, which is the conventional neurofeedback medium. In addition there was a no-training comparison group. Acting performance was evaluated by three experts from both filmed, studio monologues and Hamlet excerpts on the stage of Shakespeare's Globe Theatre. Neurofeedback learning reached an asymptote earlier, as did identification of the required mental state, following training in the ReaCTor compared with the computer screen, though both groups reached the same asymptote. These advantages were paralleled by higher ratings of acting performance overall, well-rounded performance, and especially the creativity subscale including imaginative expression, conviction and characterisation. On the Flow State scales both neurofeedback groups scored higher than the no-training controls on self-ratings of sense of control, confidence and feeling at one. This is the first demonstration of enhancement of artistic performance with eyes-open neurofeedback training, previously demonstrated only with eyes-closed slow-wave training. Efficacy is attributed to psychological engagement through the ecologically relevant learning context of the acting space, putatively allowing transfer to the real world otherwise achieved with slow-wave training through imaginative visualisation. The immersive VR technology was more successful than a 2D rendition. Copyright 2010 Elsevier Ireland Ltd. All rights reserved.
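The feedback contingencies described above (reward SMR, inhibit theta and high beta) can be sketched as a band-power mapping onto the virtual auditorium; the bands, thresholds and sampling rate below are illustrative assumptions rather than the study's exact protocol.

```python
import numpy as np
from scipy.signal import welch

FS = 256
BANDS = {"smr": (12.0, 15.0), "theta": (4.0, 7.0), "high_beta": (22.0, 30.0)}

def band_power(eeg_window, lo, hi):
    f, pxx = welch(np.asarray(eeg_window, dtype=float), fs=FS, nperseg=FS)
    mask = (f >= lo) & (f <= hi)
    return float(pxx[mask].sum() * (f[1] - f[0]))

def feedback(eeg_window, smr_threshold, theta_threshold, beta_threshold):
    """Map band powers to stage lighting (reward) and audience noise (inhibit)."""
    p = {name: band_power(eeg_window, lo, hi) for name, (lo, hi) in BANDS.items()}
    lighting = float(np.clip(p["smr"] / smr_threshold, 0.0, 1.0))     # brighter as SMR rises
    noise = 1.0 if (p["theta"] > theta_threshold or
                    p["high_beta"] > beta_threshold) else 0.0         # intrusive noise on inhibit breach
    return lighting, noise
```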
NASA Astrophysics Data System (ADS)
Yue, Kang; Wang, Danli; Yang, Xinpan; Hu, Haichen; Liu, Yuqing; Zhu, Xiuqing
2016-10-01
To date, VR-based training systems have differed across application fields; the characteristics of the application field should therefore be taken into account, and different evaluation methods adopted, when evaluating the user experience of such training systems. In this paper, we propose a method to evaluate the user experience of a virtual astronaut training system, and we design an experiment based on the proposed method. The proposed method takes learning performance as one of the evaluation dimensions and combines it with other dimensions such as presence, immersion, pleasure, satisfaction and fatigue to evaluate the user experience of the system. We collected subjective and objective data. The subjective data come mainly from a questionnaire designed around the evaluation dimensions and from user interviews conducted before and after the experiment, while the objective data consist of electrocardiogram (ECG) recordings, reaction times, numbers of reaction errors and video recorded during the experiment. For the data analysis, we calculate an integrated score for each evaluation dimension using factor analysis. To improve the credibility of the assessment, we use the ECG signal and reaction-test data collected before and after the experiment to validate the changes in fatigue during the experiment, and typical behavioral features extracted from the experiment video to explain the results of the subjective questionnaire. Experimental results show that the system provides a good user experience and learning performance, although slight visual fatigue exists after the experiment.
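The integrated-score step can be illustrated with a small sketch that extracts a single factor from a block of questionnaire items and rescales it so the dimensions can be compared; the item counts, ratings and single-factor choice are assumptions, not the study's actual instrument.

```python
import numpy as np
from sklearn.decomposition import FactorAnalysis

# 20 users x 4 Likert items (1-7) for one evaluation dimension, e.g. presence.
rng = np.random.default_rng(0)
presence_items = rng.integers(1, 8, size=(20, 4)).astype(float)   # placeholder ratings

fa = FactorAnalysis(n_components=1, random_state=0)
presence_factor = fa.fit_transform(presence_items)                # one latent score per user

# Rescale the factor scores to 0-100 for comparison with other dimensions.
lo, hi = presence_factor.min(), presence_factor.max()
presence_score = 100.0 * (presence_factor - lo) / (hi - lo)
print(presence_score.ravel().round(1))
```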
Cranial implant design using augmented reality immersive system.
Ai, Zhuming; Evenhouse, Ray; Leigh, Jason; Charbel, Fady; Rasmussen, Mary
2007-01-01
Software tools that utilize haptics for sculpting precise fitting cranial implants are utilized in an augmented reality immersive system to create a virtual working environment for the modelers. The virtual environment is designed to mimic the traditional working environment as closely as possible, providing more functionality for the users. The implant design process uses patient CT data of a defective area. This volumetric data is displayed in an implant modeling tele-immersive augmented reality system where the modeler can build a patient specific implant that precisely fits the defect. To mimic the traditional sculpting workspace, the implant modeling augmented reality system includes stereo vision, viewer centered perspective, sense of touch, and collaboration. To achieve optimized performance, this system includes a dual-processor PC, fast volume rendering with three-dimensional texture mapping, the fast haptic rendering algorithm, and a multi-threading architecture. The system replaces the expensive and time consuming traditional sculpting steps such as physical sculpting, mold making, and defect stereolithography. This augmented reality system is part of a comprehensive tele-immersive system that includes a conference-room-sized system for tele-immersive small group consultation and an inexpensive, easily deployable networked desktop virtual reality system for surgical consultation, evaluation and collaboration. This system has been used to design patient-specific cranial implants with precise fit.
Gasco, Jaime; Patel, Achal; Luciano, Cristian; Holbrook, Thomas; Ortega-Barnett, Juan; Kuo, Yong-Fang; Rizzi, Silvio; Kania, Patrick; Banerjee, Pat; Roitberg, Ben Z
2013-12-01
To understand the perceived utility of a novel simulator to improve operative skill, eye-hand coordination, and depth perception. We used the ImmersiveTouch simulation platform (ImmersiveTouch, Inc., Chicago, Illinois, USA) in two U.S. Accreditation Council for Graduate Medical Education-accredited neurosurgical training programs: the University of Chicago and the University of Texas Medical Branch. A total of 54 trainees participated in the study, which consisted of 14 residents (group A), 20 senior medical students who were neurosurgery candidates (group B), and 20 junior medical students (group C). The participants performed a simulation task that established bipolar hemostasis in a virtual brain cavity and provided qualitative feedback regarding perceived benefits in eye-hand coordination, depth perception, and potential to assist in improving operating skills. The perceived ability of the simulator to positively influence skills judged by the three groups: group A, residents; group B, senior medical students; and group C, junior medical students was, respectively, 86%, 100%, and 100% for eye-hand coordination; 86%, 100%, and 95% for depth perception; and 79%, 100%, and 100% for surgical skills in the operating room. From all groups, 96.2% found the simulation somewhat or very useful to improve eye-hand coordination, and 94% considered it beneficial to improve depth perception and operating room skills. This simulation module may be suitable for resident training, as well as for the development of career interest and skill acquisition; however, validation for this type of simulation needs to be further developed. Copyright © 2013 Elsevier Inc. All rights reserved.
Perez-Marcos, Daniel; Solazzi, Massimiliano; Steptoe, William; Oyekoya, Oyewole; Frisoli, Antonio; Weyrich, Tim; Steed, Anthony; Tecchia, Franco; Slater, Mel; Sanchez-Vives, Maria V.
2012-01-01
Although telerehabilitation systems represent one of the most technologically appealing clinical solutions for the immediate future, they still present limitations that prevent their standardization. Here we propose an integrated approach that includes three key and novel factors: (a) fully immersive virtual environments, including virtual body representation and ownership; (b) multimodal interaction with remote people and virtual objects including haptic interaction; and (c) a physical representation of the patient at the hospital through embodiment agents (e.g., as a physical robot). The importance of secure and rapid communication between the nodes is also stressed and an example implemented solution is described. Finally, we discuss the proposed approach with reference to the existing literature and systems. PMID:22787454
Virtual Reality: Emerging Applications and Future Directions
ERIC Educational Resources Information Center
Ludlow, Barbara L.
2015-01-01
Virtual reality is an emerging technology that has resulted in rapid expansion in the development of virtual immersive environments for use as educational simulations in schools, colleges and universities. This article presents an overview of virtual reality, describes a number of applications currently being used by special educators for…
ERIC Educational Resources Information Center
O'Connor, Eileen A.; Domingo, Jelia
2017-01-01
With the advent of open source virtual environments, the associated cost reductions, and the more flexible options, avatar-based virtual reality environments are within reach of educators. By using and repurposing readily available virtual environments, instructors can bring engaging, community-building, and immersive learning opportunities to…
Recent developments in virtual experience design and production
NASA Astrophysics Data System (ADS)
Fisher, Scott S.
1995-03-01
Today, the media of VR and Telepresence are in their infancy and the emphasis is still on technology and engineering. But, it is not the hardware people might use that will determine whether VR becomes a powerful medium--instead, it will be the experiences that they are able to have that will drive its acceptance and impact. A critical challenge in the elaboration of these telepresence capabilities will be the development of environments that are as unpredictable and rich in interconnected processes as an actual location or experience. This paper will describe the recent development of several Virtual Experiences including: `Menagerie', an immersive Virtual Environment inhabited by virtual characters designed to respond to and interact with its users; and `The Virtual Brewery', an immersive public VR installation that provides multiple levels of interaction in an artistic interpretation of the brewing process.
Using Interactive Visualization to Analyze Solid Earth Data and Geodynamics Models
NASA Astrophysics Data System (ADS)
Kellogg, L. H.; Kreylos, O.; Billen, M. I.; Hamann, B.; Jadamec, M. A.; Rundle, J. B.; van Aalsburg, J.; Yikilmaz, M. B.
2008-12-01
The geological sciences are challenged to manage and interpret increasing volumes of data as observations and simulations increase in size and complexity. Major projects such as EarthScope and GeoEarthScope are producing the data needed to characterize the structure and kinematics of Earth's surface and interior at unprecedented resolution. At the same time, high-performance computing enables high-precision and fine-detail simulation of geodynamics processes, complementing the observational data. To facilitate interpretation and analysis of these datasets, to evaluate models, and to drive future calculations, we have developed methods of interactive visualization with a special focus on using immersive virtual reality (VR) environments to interact with models of Earth's surface and interior. VR has traditionally been used primarily as a presentation tool allowing active navigation through data. Reaping the full intellectual benefits of immersive VR as a tool for accelerated scientific analysis requires building on the method's strengths, that is, using both 3D perception and interaction with observed or simulated data. Our approach to VR takes advantage of the specialized skills of geoscientists who are trained to interpret geological and geophysical data generated from field observations. Interactive tools allow the scientist to explore and interpret geodynamic models, tomographic models, and topographic observations, while feature extraction tools support quantitative measurement of structures that emerge from numerical simulations or field observations. The use of VR technology enables us to improve our interpretation of crust and mantle structure and of geodynamical processes. Mapping tools based on computer visualization allow virtual "field studies" in inaccessible regions, and an interactive tool allows us to construct digital fault models for use in numerical models. Using the interactive tools on a high-end platform, such as an immersive virtual reality room known as a Cave Automatic Virtual Environment (CAVE), enables the scientist to stand inside a three-dimensional dataset while taking measurements. The CAVE involves three or more projection surfaces arranged as walls in a room. Stereo projectors combined with a motion tracking system recreate an immersive experience of carrying out research in the field. This high-end system provides significant advantages for scientists working with complex volumetric data. The interactive tools also work on low-cost platforms that provide stereo views and the potential for interactivity, such as a Geowall or a 3D-enabled TV. The Geowall is also a well-established tool for education and, in combination with the tools we have developed, enables the rapid transfer of research data and new knowledge to the classroom. The interactive visualization tools can also be used on a desktop or laptop with or without stereo capability. Further information about the Virtual Reality User Interface (VRUI), the 3DVisualizer, the virtual mapping tools, and the LIDAR viewer can be found on the KeckCAVES website, www.keckcaves.org.
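As a hedged illustration of one analysis step such tools support (not KeckCAVES or 3DVisualizer code), the sketch below extracts an isosurface mesh from a gridded scalar volume, such as a tomographic anomaly model, using marching cubes; the synthetic volume and iso level are assumptions for the demo.

```python
# A minimal sketch (not KeckCAVES/3DVisualizer code): extract an isosurface mesh
# from a gridded scalar volume, e.g. a seismic-velocity anomaly model, with
# marching cubes. The synthetic volume and iso level are assumptions for the demo.
import numpy as np
from skimage import measure

def isosurface(volume: np.ndarray, level: float):
    """Return vertices and triangle faces of the `level` isosurface."""
    verts, faces, normals, values = measure.marching_cubes(volume, level=level)
    return verts, faces

if __name__ == "__main__":
    z, y, x = np.mgrid[-1:1:64j, -1:1:64j, -1:1:64j]
    anomaly = np.exp(-8 * (x ** 2 + y ** 2 + z ** 2))   # toy "slow anomaly" blob
    verts, faces = isosurface(anomaly, level=0.5)
    print(f"{len(verts)} vertices, {len(faces)} triangles")
```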
ERIC Educational Resources Information Center
Voelkel, Robert H.; Johnson, Christie W.; Gilbert, Kristen A.
2016-01-01
The purpose of this article is to present how one university incorporates immersive simulations through platforms which employ avatars to enhance graduate student understanding and learning in educational leadership programs. While using simulations and immersive virtual environments continues to grow, the literature suggests limited evidence of…
Hudak, Justin; Blume, Friederike; Dresler, Thomas; Haeussinger, Florian B; Renner, Tobias J; Fallgatter, Andreas J; Gawrilow, Caterina; Ehlis, Ann-Christine
2017-01-01
Based on neurofeedback (NF) training as a neurocognitive treatment in attention-deficit/hyperactivity disorder (ADHD), we designed a randomized, controlled functional near-infrared spectroscopy (fNIRS) NF intervention embedded in an immersive virtual reality classroom in which participants learned to control overhead lighting with their dorsolateral prefrontal brain activation. We tested the efficacy of the intervention on healthy adults displaying high impulsivity as a sub-clinical population sharing common features with ADHD. Twenty participants, 10 in an experimental and 10 in a shoulder muscle-based electromyography control group, underwent eight training sessions across 2 weeks. Training was bookended by a pre- and post-test including go/no-go, n-back, and stop-signal tasks (SST). Results indicated a significant reduction in commission errors on the no-go task with a simultaneous increase in prefrontal oxygenated hemoglobin concentration for the experimental group, but not for the control group. Furthermore, the ability of the subjects to gain control over the feedback parameter correlated strongly with the reduction in commission errors for the experimental, but not for the control group, indicating the potential importance of learning feedback control in moderating behavioral outcomes. In addition, participants of the fNIRS group showed a reduction in reaction time variability on the SST. Results indicate a clear effect of our NF intervention in reducing impulsive behavior possibly via a strengthening of frontal lobe functioning. Virtual reality additions to conventional NF may be one way to improve the ecological validity and symptom-relevance of the training situation, hence positively affecting transfer of acquired skills to real life.
Bhargava, Ayush; Bertrand, Jeffrey W; Gramopadhye, Anand K; Madathil, Kapil C; Babu, Sabarish V
2018-04-01
With costs of head-mounted displays (HMDs) and tracking technology decreasing rapidly, various virtual reality applications are being widely adopted for education and training. Hardware advancements have enabled replication of real-world interactions in virtual environments to a large extent, paving the way for commercial-grade applications that provide a safe and risk-free training environment at a fraction of the cost. But this also mandates the need to develop more intrinsic interaction techniques and to empirically evaluate them in a more comprehensive manner. Although there exists a body of previous research that examines the benefits of selected levels of interaction fidelity on performance, few studies have investigated the constituent components of fidelity along an Interaction Fidelity Continuum (IFC) with several system instances and their respective effects on performance and learning in the context of a real-world skills training application. Our work describes a large between-subjects investigation, conducted over several years, that utilizes bimanual interaction metaphors at six discrete levels of interaction fidelity to teach basic precision metrology concepts in a near-field spatial interaction task in VR. A combined analysis performed on the data compares and contrasts the six different conditions and their overall effects on performance and learning outcomes, eliciting patterns in the results between the discrete application points on the IFC. With respect to some performance variables, results indicate that simpler, restrictive interaction metaphors and the highest-fidelity metaphors perform better than medium-fidelity interaction metaphors. In light of these results, a set of general guidelines is presented for developers of spatial interaction metaphors in immersive virtual environments for precise fine-motor skills training simulations.
Military medical modeling and simulation in the 21st century.
Moses, G; Magee, J H; Bauer, J J; Leitch, R
2001-01-01
As we enter the 21st century, military medicine struggles with critical issues. One of the most important is how to train medical personnel in peace for the realities of war. In April 1998, the General Accounting Office (GAO) reported, "Military medical personnel have almost no chance during peacetime to practice battlefield trauma care skills. As a result, physicians both within and outside the Department of Defense (DOD) believe that military medical personnel are not prepared to provide trauma care to severely injured soldiers in wartime." With some of today's training methods disappearing, the challenge of providing both initial and sustainment training for almost 100,000 military medical personnel is becoming insurmountable. The "training gap" is huge and impediments to training are mounting. For example, restrictions on animal use are increasing and the cost of conducting live mass casualty exercises is prohibitive. Many medical simulation visionaries believe that four categories of medical simulation are emerging to address these challenges: PC-based multimedia, digital mannequins, virtual workbenches, and total immersion virtual reality (TIVR). The use of simulation training can provide a risk-free, realistic learning environment for the spectrum of medical skills training, from buddy-aid to trauma surgery procedures. This will, in turn, enhance limited hands-on training opportunities and revolutionize the way we train in peace to deliver medicine in war. High-fidelity modeling will permit manufacturers to prototype new devices before manufacture. Also, engineers will be able to test a device for themselves in a variety of simulated anatomical representations, permitting them to "practice medicine".
NASA Astrophysics Data System (ADS)
Mekuria, Rufael; Cesar, Pablo; Doumanis, Ioannis; Frisiello, Antonella
2015-09-01
Compression of 3D object-based video is relevant for 3D immersive applications. Nevertheless, the perceptual aspects of the degradation introduced by codecs for meshes and point clouds are not well understood. In this paper we evaluate the subjective and objective degradations introduced by such codecs in a state-of-the-art 3D immersive virtual room. In the 3D immersive virtual room, users are captured with multiple cameras, and their surfaces are reconstructed as photorealistic colored/textured 3D meshes or point clouds. To test the perceptual effect of compression and transmission, we render degraded versions with different frame rates in different contexts (near/far) in the scene. A quantitative subjective study with 16 users shows that negligible distortion of decoded surfaces, compared to the original reconstructions, can be achieved in the 3D virtual room. In addition, a qualitative task-based analysis in a full prototype field trial shows increased presence, emotion, and user and state recognition for the reconstructed 3D human representation compared to animated computer avatars.
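As a hedged illustration of an objective geometric distortion measure for compressed point clouds (a common proxy metric, not necessarily the one used in this evaluation; the clouds below are synthetic stand-ins):

```python
# A minimal sketch: symmetric point-to-point RMSE between an original and a
# decoded point cloud, a common objective proxy for geometric compression
# distortion. Assumes point clouds as (N, 3) arrays; not the paper's metric.
import numpy as np
from scipy.spatial import cKDTree

def point_to_point_rmse(original: np.ndarray, decoded: np.ndarray) -> float:
    """Symmetric RMSE of nearest-neighbour distances between two clouds."""
    d_fwd, _ = cKDTree(decoded).query(original)   # original -> decoded
    d_bwd, _ = cKDTree(original).query(decoded)   # decoded -> original
    return max(np.sqrt(np.mean(d_fwd ** 2)), np.sqrt(np.mean(d_bwd ** 2)))

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    cloud = rng.random((10_000, 3))
    noisy = cloud + rng.normal(scale=0.001, size=cloud.shape)  # stand-in for a decoded cloud
    print(f"point-to-point RMSE: {point_to_point_rmse(cloud, noisy):.5f}")
```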
Fieldwork Skills in Virtual Worlds
NASA Astrophysics Data System (ADS)
Craven, Benjamin; Lloyd, Geoffrey; Gordon, Clare; Houghton, Jacqueline; Morgan, Daniel
2017-04-01
Virtual reality has an increasingly significant role to play in teaching and research, but for geological applications realistic landscapes are required that contain sufficient detail to prove viable for investigation by both inquisitive students and critical researchers. To create such virtual landscapes, we combine DTM data with digitally modelled outcrops in the game engine Unity. Our current landscapes are fictional worlds, invented to focus on generation techniques and on strategic and spatial immersion within a digital environment. These have proved very successful in undergraduate teaching; however, we are now moving on to recreating real landscapes for more advanced teaching and research. The first of these focuses on Rhoscolyn, situated within the Ynys Mon Geopark on Anglesey, UK. It is a popular area for both teaching and research in structural geology, so it has a wide usage demographic. The base of the model is created from DTM data, both 1 m LiDAR and 5 m GPS point data, and manipulated with QGIS before import into Unity. Substance is added to the world via models of architectural elements (e.g. walls and buildings) and appropriate flora and fauna, including sounds. Texturing of these models is performed using 25 cm aerial imagery and field photographs. Whilst such elements enhance immersion, it is the use of digital outcrop models that fully completes the experience. From fieldwork, we have a library of photogrammetric outcrops that can be modelled into 3D features using free (VisualSFM and MeshLab) and non-free (AgiSoft Photoscan) tools. These models are then refined and converted in Maya for insertion into the Unity environment. The finished product is a virtual landscape: a Rhoscolyn 'world' that is sufficiently detailed to provide a base not only for geological teaching and training but also for geological research. Additionally, the 'Rhoscolyn World' represents a significant tool for those students who are unable to attend conventional field classes and enhances their learning experience. This project is part of the larger Virtual Landscapes project, which is a collaboration between The University of Leeds and Leeds College of Art, UK. All our current virtual landscapes are freely available online at www.see.leeds.ac.uk/virtual-landscapes/.
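A hedged sketch of the first step in such a terrain pipeline follows (not the authors' actual workflow; the file name, grid spacing, and demo heights are assumptions): triangulate a gridded DTM height array and write a Wavefront OBJ mesh that a game engine such as Unity can import.

```python
# A minimal sketch (hypothetical file name, not the authors' pipeline): turn a
# gridded DTM height array into a triangulated mesh and write it as a Wavefront
# OBJ that a game engine such as Unity can import. Assumes heights in metres on
# a regular grid with spacing `cell` metres.
import numpy as np

def dtm_to_obj(heights: np.ndarray, cell: float, path: str) -> None:
    rows, cols = heights.shape
    with open(path, "w") as f:
        # one vertex per grid node: x = column * cell, z = row * cell, y = height
        for r in range(rows):
            for c in range(cols):
                f.write(f"v {c * cell} {heights[r, c]} {r * cell}\n")
        # two triangles per grid cell; OBJ indices are 1-based
        for r in range(rows - 1):
            for c in range(cols - 1):
                i = r * cols + c + 1
                f.write(f"f {i} {i + cols} {i + 1}\n")
                f.write(f"f {i + 1} {i + cols} {i + cols + 1}\n")

if __name__ == "__main__":
    demo = np.random.default_rng(1).random((64, 64)) * 10.0  # stand-in for LiDAR-derived heights
    dtm_to_obj(demo, cell=1.0, path="rhoscolyn_demo.obj")
```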
Immersive Virtual Reality to Improve Walking Abilities in Cerebral Palsy: A Pilot Study.
Gagliardi, Chiara; Turconi, Anna Carla; Biffi, Emilia; Maghini, Cristina; Marelli, Alessia; Cesareo, Ambra; Diella, Eleonora; Panzeri, Daniele
2018-04-27
Immersive virtual reality (IVR) offers new possibilities to perform treatments in an ecological and interactive environment with multimodal online feedback. Sixteen school-aged children (mean age 11 ± 2.4 years) with bilateral CP (diplegia), attending mainstream schools, were recruited for a pilot study in a pre-post treatment experimental design. The intervention focused on walking competence and endurance and was performed with the Gait Real-time Analysis Interactive Lab (GRAIL), an innovative treadmill platform based on IVR. The participants underwent eighteen therapy sessions in 4 weeks. Functional evaluations, instrumental measures including gait analysis, and a parental questionnaire were used to assess the treatment effects. Walking pattern (stride length, left and right side, respectively p = 0.001 and 0.003; walking speed, p = 0.001), endurance (6MWT, p = 0.026), gross motor abilities (GMFM-88, p = 0.041), and most kinematic and kinetic parameters significantly improved after the intervention. The changes were mainly predicted by age and cognitive abilities. The effect may be due to the capacity of IVR to foster integration of motor and perceptual competences beyond training of the walking ability, also giving older and previously treated children a chance of improvement.
Stanney, Kay M; Hale, Kelly S; Nahmens, Isabelina; Kennedy, Robert S
2003-01-01
For those interested in using head-coupled PC-based immersive virtual environment (VE) technology to train, entertain, or inform, it is essential to understand the effects this technology has on its users. This study investigated potential adverse effects, including the sickness associated with exposure and extreme responses (emesis, flashbacks). Participants were exposed to a VE for 15 to 60 min, with either complete or streamlined navigational control and simple or complex scenes, after which time measures of sickness were obtained. More than 80% of participants experienced nausea, oculomotor disturbances, and/or disorientation, with disorientation potentially lasting > 24 hr. Of the participants, 12.9% prematurely ended their exposure because of adverse effects; of these, 9.2% experienced an emetic response, whereas only 1.2% of all participants experienced emesis. The results indicate that designers may be able to reduce these rates by limiting exposure duration and reducing the degrees of freedom of the user's navigational control. Results from gender, body mass, and past experience comparisons indicated it may be possible to identify those who will experience adverse effects attributable to exposure and warn such individuals. Applications for this research include military, entertainment, and any other interactive systems for which designers seek to avoid adverse effects associated with exposure.
A Storm's Approach; Hurricane Shelter Training in a Digital Age
NASA Technical Reports Server (NTRS)
Boyarsky, Andrew; Burden, David; Gronstedt, Anders; Jinman, Andrew
2012-01-01
New York City's Office of Emergency Management (OEM) originally ran hundreds of classroom-based courses, in which it brought together civil servants to learn how to run a Hurricane Shelter (HS). This approach was found to be costly and time consuming, and it lacked any sense of an impending disaster and need for emergency response. In partnership with the City University of New York School of Professional Studies, the Gronstedt Group, and Daden Limited, the OEM wanted to create a simulation that overcame these issues, providing users with a more immersive and realistic approach at a lower cost. The HS simulation was built in the virtual world Second Life (SL). Virtual worlds are a genre of online communities that often take the form of computer-based simulated environments, through which users can interact with one another and use or create objects. Using this technology allowed managers to apply their knowledge in both classroom and remote learning environments. The shelter simulation is operational 24/7, guiding users through a 4 1/2 hour narrative from start to finish. This paper will describe the rationale for the project, the technical approach taken - particularly the use of a web-based authoring tool to create and manage the immersive simulation - and the results from operational use.
Using virtual reality to augment perception, enhance sensorimotor adaptation, and change our minds.
Wright, W Geoffrey
2014-01-01
Technological advances that involve human sensorimotor processes can have both intended and unintended effects on the central nervous system (CNS). This mini review focuses on the use of virtual environments (VE) to augment brain functions by enhancing perception, eliciting automatic motor behavior, and inducing sensorimotor adaptation. VE technology is becoming increasingly prevalent in medical rehabilitation, training simulators, gaming, and entertainment. Although these VE applications have often been shown to optimize outcomes, whether it be to speed recovery, reduce training time, or enhance immersion and enjoyment, there are inherent drawbacks to environments that can potentially change sensorimotor calibration. Across numerous VE studies over the years, we have investigated the effects of combining visual and physical motion on perception, motor control, and adaptation. Recent results from our research involving exposure to dynamic passive motion within a visually depicted VE reveal that short-term exposure to augmented sensorimotor discordance can result in systematic aftereffects that last beyond the exposure period. Whether these adaptations are advantageous or not remains to be seen. Benefits as well as risks of using VE-driven sensorimotor stimulation to enhance brain processes will be discussed.
ConfocalVR: Immersive Visualization Applied to Confocal Microscopy.
Stefani, Caroline; Lacy-Hulbert, Adam; Skillman, Thomas
2018-06-24
ConfocalVR is a virtual reality (VR) application created to improve the ability of researchers to study the complexity of cell architecture. Confocal microscopes take pictures of fluorescently labeled proteins or molecules at different focal planes to create a stack of 2D images throughout the specimen. Current software applications reconstruct the 3D image and render it as a 2D projection onto a computer screen, where users need to rotate the image to expose the full 3D structure. This process is mentally taxing, breaks down if you stop the rotation, and does not take advantage of the eye's full field of view. ConfocalVR exploits consumer-grade VR systems to fully immerse the user in the 3D cellular image. In this virtual environment the user can: 1) adjust image viewing parameters without leaving the virtual space, 2) reach out and grab the image to quickly rotate and scale it to focus on key features, and 3) interact with other users in a shared virtual space, enabling real-time collaborative exploration and discussion. We found that immersive VR technology allows the user to rapidly understand cellular architecture and protein or molecule distribution. We note that it is impossible to understand the value of immersive visualization without experiencing it firsthand, so we encourage readers to get access to a VR system, download this software, and evaluate it for yourself. The ConfocalVR software is available for download at http://www.confocalvr.com, and is free for nonprofits. Copyright © 2018. Published by Elsevier Ltd.
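A hedged sketch of the underlying data preparation (not the ConfocalVR code; the file pattern is hypothetical): assemble a confocal z-stack of 2D slice images into a single 3D intensity volume, the data structure a VR volume renderer consumes.

```python
# A minimal sketch (hypothetical file pattern, not the ConfocalVR code): assemble a
# confocal z-stack of 2D slice images into one 3D intensity volume. PNG is used
# here for simplicity; real confocal stacks are typically TIFF.
import glob
import os
import numpy as np
import imageio.v3 as iio

def load_stack(pattern: str) -> np.ndarray:
    """Read slice images matching `pattern` and stack them along a new z axis."""
    paths = sorted(glob.glob(pattern))          # e.g. slice_000.png, slice_001.png, ...
    slices = [iio.imread(p) for p in paths]     # each slice: (height, width) or (height, width, channels)
    return np.stack(slices, axis=0)             # volume: (z, height, width[, channels])

if __name__ == "__main__":
    os.makedirs("stack", exist_ok=True)
    rng = np.random.default_rng(0)
    for i in range(8):                          # write a tiny synthetic stack for the demo
        iio.imwrite(f"stack/slice_{i:03d}.png", (rng.random((64, 64)) * 255).astype(np.uint8))
    volume = load_stack("stack/slice_*.png")
    print(volume.shape, volume.dtype)           # -> (8, 64, 64) uint8
```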
Gokeler, Alli; Bisschop, Marsha; Myer, Gregory D; Benjaminse, Anne; Dijkstra, Pieter U; van Keeken, Helco G; van Raay, Jos J A M; Burgerhof, Johannes G M; Otten, Egbert
2016-07-01
The purpose of this study was to evaluate the influence of immersion in a virtual reality environment on knee biomechanics in patients after ACL reconstruction (ACLR). It was hypothesized that virtual reality techniques aimed at changing attentional focus would influence altered knee flexion angle, knee extension moment, and peak vertical ground reaction force (vGRF) in patients following ACLR. Twenty athletes following ACLR and 20 healthy controls (CTRL) performed a step-down task in both a non-virtual reality environment and a virtual reality environment displaying a pedestrian traffic scene. A motion analysis system and force plates were used to measure kinematics and kinetics during the step-down task to analyse each single-leg landing. No significant main effect of environment was found for knee flexion excursion (n.s.). Significant interaction effects between environment and group were found for vGRF (P = 0.004), knee moment (P < 0.001), knee angle at peak vGRF (P = 0.01), and knee flexion excursion (P = 0.03). There was a larger effect of the virtual reality environment on knee biomechanics in patients after ACLR compared with controls. Patients after ACLR immersed in the virtual reality environment demonstrated knee joint biomechanics that approximate those of CTRL. The results of this study indicate that a realistic virtual reality scenario may distract patients after ACLR from conscious motor control. Application of clinically available technology may aid current rehabilitation programmes in targeting altered movement patterns after ACLR. Diagnostic study, Level III.
Vogt, Tobias; Herpers, Rainer; Askew, Christopher D.; Scherfgen, David; Strüder, Heiko K.; Schneider, Stefan
2015-01-01
Virtual reality environments are increasingly being used to encourage individuals to exercise more regularly, including as part of treatment for those with mental health or neurological disorders. The success of virtual environments likely depends on whether a sense of presence can be established, where participants become fully immersed in the virtual environment. Exposure to virtual environments is associated with physiological responses, including cortical activation changes. Whether the addition of real exercise within a virtual environment alters the perception of presence, or the accompanying physiological changes, is not known. In a randomized and controlled study design, moderate-intensity Exercise (i.e., self-paced cycling) and No-Exercise (i.e., automatic propulsion) trials were performed within three levels of virtual environment exposure. Each trial was 5 minutes in duration and was followed by post-trial assessments of heart rate, perceived sense of presence, EEG, and mental state. Changes in psychological strain and physical state were generally mirrored by neural activation patterns. Furthermore, these changes indicated that exercise augments the demands of virtual environment exposures, and this likely contributed to an enhanced sense of presence. PMID:26366305
Virtual Enterprise: Transforming Entrepreneurship Education
ERIC Educational Resources Information Center
Borgese, Anthony
2011-01-01
Entrepreneurship education is ripe for utilizing experiential learning methods. Experiential methods are best learned when there is constant immersion into the subject matter. One such transformative learning methodology is Virtual Enterprise (VE). Virtual Enterprise is a multi-faceted, experiential learning methodology disseminated by the City…
Enhancing Pre-Service Teachers' Awareness to Pupils' Test-Anxiety with 3D Immersive Simulation
ERIC Educational Resources Information Center
Passig, David; Moshe, Ronit
2008-01-01
This study investigated whether participating in a 3D immersive virtual reality world simulating the experience of test-anxiety would affect preservice teachers' awareness to the phenomenon. Ninety subjects participated in this study, and were divided into three groups. The experimental group experienced a 3D immersive simulation which made…
‘My Virtual Dream’: Collective Neurofeedback in an Immersive Art Environment
Kovacevic, Natasha; Ritter, Petra; Tays, William; Moreno, Sylvain; McIntosh, Anthony Randal
2015-01-01
While human brains are specialized for complex and variable real-world tasks, most neuroscience studies reduce environmental complexity, which limits the range of behaviours that can be explored. Motivated to overcome this limitation, we conducted a large-scale experiment with electroencephalography (EEG)-based brain-computer interface (BCI) technology as part of an immersive multimedia science-art installation. Data from 523 participants were collected in a single night. The exploratory experiment was designed as a collective computer game in which players manipulated mental states of relaxation and concentration with neurofeedback targeting modulation of relative spectral power in the alpha and beta frequency ranges. Besides validating robust time-of-night effects, gender differences, and distinct spectral power patterns for the two mental states, our results also show differences in neurofeedback learning outcome. The unusually large sample size allowed us to detect learning-related changes in the power spectrum on an unprecedentedly short time scale (~1 min). Moreover, we found that participants' baseline brain activity predicted subsequent neurofeedback beta training, indicating state-dependent learning. Besides revealing these training effects, which are relevant for BCI applications, our results validate a novel platform engaging art and science and fostering the understanding of brains under natural conditions. PMID:26154513
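A hedged sketch of the kind of feedback parameter described above (not the installation's code; the sampling rate, band edges, and synthetic signal are assumptions): relative alpha and beta band power from one EEG channel via Welch's method.

```python
# A minimal sketch (not the installation's code): relative alpha and beta band
# power from a single EEG channel using Welch's periodogram, the kind of feedback
# parameter a relaxation/concentration neurofeedback game could map to a visual cue.
import numpy as np
from scipy.signal import welch

def relative_band_power(eeg: np.ndarray, fs: float, band: tuple[float, float],
                        total: tuple[float, float] = (1.0, 40.0)) -> float:
    freqs, psd = welch(eeg, fs=fs, nperseg=int(2 * fs))
    in_band = (freqs >= band[0]) & (freqs < band[1])
    in_total = (freqs >= total[0]) & (freqs < total[1])
    return np.trapz(psd[in_band], freqs[in_band]) / np.trapz(psd[in_total], freqs[in_total])

if __name__ == "__main__":
    fs = 256.0
    t = np.arange(0, 10, 1 / fs)
    eeg = np.sin(2 * np.pi * 10 * t) + 0.5 * np.random.default_rng(0).standard_normal(t.size)
    print("relative alpha (8-12 Hz):", round(relative_band_power(eeg, fs, (8, 12)), 3))
    print("relative beta (13-30 Hz):", round(relative_band_power(eeg, fs, (13, 30)), 3))
```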
VERS: a virtual environment for reconstructive surgery planning
NASA Astrophysics Data System (ADS)
Montgomery, Kevin N.
1997-05-01
The virtual environment for reconstructive surgery (VERS) project at the NASA Ames Biocomputation Center is applying virtual reality technology to aid surgeons in planning surgeries. We are working with a craniofacial surgeon at Stanford to assemble and visualize the bone structure of patients requiring reconstructive surgery due to developmental abnormalities or trauma. This project is an extension of our previous work in 3D reconstruction, mesh generation, and immersive visualization. The VR system, consisting of an SGI Onyx RE2, a FakeSpace BOOM and ImmersiveWorkbench, a Virtual Technologies CyberGlove, and an Ascension Technologies tracker, is currently in development and has already been used to visualize defects preoperatively. In the near future it will be used to more fully plan the surgery and compute the projected result for soft tissue structure. This paper presents the work in progress and details the production of a high-performance, collaborative, and networked virtual environment.
Crowd behaviour during high-stress evacuations in an immersive virtual environment.
Moussaïd, Mehdi; Kapadia, Mubbasir; Thrash, Tyler; Sumner, Robert W; Gross, Markus; Helbing, Dirk; Hölscher, Christoph
2016-09-01
Understanding the collective dynamics of crowd movements during stressful emergency situations is central to reducing the risk of deadly crowd disasters. Yet, their systematic experimental study remains a challenging open problem due to ethical and methodological constraints. In this paper, we demonstrate the viability of shared three-dimensional virtual environments as an experimental platform for conducting crowd experiments with real people. In particular, we show that crowds of real human subjects moving and interacting in an immersive three-dimensional virtual environment exhibit typical patterns of real crowds as observed in real-life crowded situations. These include the manifestation of social conventions and the emergence of self-organized patterns during egress scenarios. High-stress evacuation experiments conducted in this virtual environment reveal movements characterized by mass herding and dangerous overcrowding as they occur in crowd disasters. We describe the behavioural mechanisms at play under such extreme conditions and identify critical zones where overcrowding may occur. Furthermore, we show that herding spontaneously emerges from a density effect without the need to assume an increase of the individual tendency to imitate peers. Our experiments reveal the promise of immersive virtual environments as an ethical, cost-efficient, yet accurate platform for exploring crowd behaviour in high-risk situations with real human subjects. © 2016 The Authors.
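A hedged sketch of one analysis idea the abstract raises, identifying critical zones of overcrowding from tracked positions (a generic grid-density estimate, not the study's analysis code; the 4 persons/m² threshold and the synthetic positions are illustrative assumptions):

```python
# A minimal sketch (not the study's analysis code): estimate local crowd density
# on a grid from tracked 2D positions and flag cells above a critical level.
# The 4 persons/m^2 threshold is an illustrative value often cited for dangerous
# crowding, not a figure taken from the paper.
import numpy as np

def density_map(positions: np.ndarray, area: tuple[float, float], cell: float = 1.0) -> np.ndarray:
    """Persons per square metre in each `cell` x `cell` bin of a rectangular area."""
    width, height = area
    bins = [np.arange(0, width + cell, cell), np.arange(0, height + cell, cell)]
    counts, _, _ = np.histogram2d(positions[:, 0], positions[:, 1], bins=bins)
    return counts / (cell * cell)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # 300 simulated pedestrians crowding toward the right end of a 20 m x 10 m hall
    people = np.column_stack([rng.normal(18, 2, 300).clip(0, 20), rng.normal(5, 1.5, 300).clip(0, 10)])
    dens = density_map(people, area=(20.0, 10.0))
    critical = np.argwhere(dens > 4.0)
    print(f"max local density: {dens.max():.1f} persons/m^2, critical cells: {len(critical)}")
```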
Female artists and the VR crucible: expanding the aesthetic vocabulary
NASA Astrophysics Data System (ADS)
Morie, Jacquelyn Ford
2012-03-01
Virtual Reality was a technological wonder in its early days, and it was widely held to be a domain where men were the main practitioners. However, a survey done in 2007 of VR Artworks (Immersive Virtual Environments or VEs) showed that women have actually created the majority of artistic immersive works. This argues against the popular idea that the field has been totally dominated by men. While men have made great contributions in advancing the field, especially technologically, it appears most artistic works emerge from a decidedly feminine approach. Such an approach seems well suited to immersive environments as it incorporates aspects of inclusion, wholeness, and a blending of the body and the spirit. Female attention to holistic concerns fits the gestalt approach needed to create in a fully functional yet open-ended virtual world, which focuses not so much on producing a finished object (like a text or a sculpture) but rather on creating a possibility for becoming, like bringing a child into the world. Immersive VEs are not objective works of art to be hung on a wall and critiqued. They are vehicles for experience, vessels to live within for a piece of time.
Immersion of virtual reality for rehabilitation - Review.
Rose, Tyler; Nam, Chang S; Chen, Karen B
2018-05-01
Virtual reality (VR) shows promise in healthcare applications because it presents patients with an immersive, often entertaining, approach to accomplishing the goal of performance improvement. Eighteen studies were reviewed to understand human performance and health outcomes after utilizing VR rehabilitation systems. We aimed to understand: (1) the influence of immersion on VR performance and health outcomes; (2) the relationship between enjoyment and potential patient adherence to a VR rehabilitation routine; and (3) the influence of haptic feedback on performance in VR. Performance measures including postural stability, navigation task performance, and joint mobility showed varying relations to immersion. Limited data did not allow a solid conclusion about the relationship between enjoyment and adherence, but patient enjoyment and willingness to participate were reported in care plans that incorporate VR. Finally, different haptic devices such as gloves and controllers showed both strengths and weaknesses in areas such as movement velocity, movement accuracy, and path efficiency. Copyright © 2018 Elsevier Ltd. All rights reserved.
Design of a Serious Game for Handling Obstetrical Emergencies.
Jean Dit Gautier, Estelle; Bot-Robin, Virginie; Libessart, Aurélien; Doucède, Guillaume; Cosson, Michel; Rubod, Chrystèle
2016-12-21
The emergence of new technologies in the obstetrical field should lead to the development of learning applications, specifically for obstetrical emergencies. Many childbirth simulations have been developed recently; however, to date none of them has been integrated into a serious game. Our objective was to design a new type of immersive serious game, using virtual glasses, to facilitate the learning of pregnancy and childbirth pathologies. We developed a new game engine that places the student in maternity emergency situations and delivery room simulations. A gynecologist initially wrote a scenario based on a real clinical situation. He also designed, along with an educational engineer, a tree diagram that served as a guide for dialogues and actions. A game engine, developed especially for this case, enabled us to connect actions to the graphic universe (fully 3D modeled and based on photographic references). We used the Oculus Rift to immerse the player in virtual reality. Each action in the game was linked to a certain number of score points, which could be either positive or negative. The following pathological pregnancy situations have been targeted: care of spontaneous miscarriage, threat of preterm birth, forceps operative delivery for abnormal fetal heart rate, and reduction of a shoulder dystocia. The first phase immerses the learner, as a doctor, in an action scene. The second phase asks the student to make a diagnosis. Once the diagnosis is made, different treatments are suggested. Our serious game offers a new perspective for obstetrical emergency management training and provides students with active learning by immersing them in an environment that recreates all or part of the real world of obstetrical emergencies. It is consistent with the latest recommendations, which clarify the importance of simulation in teaching and in ongoing professional development. ©Estelle Jean dit Gautier, Virginie Bot-Robin, Aurélien Libessart, Guillaume Doucède, Michel Cosson, Chrystèle Rubod. Originally published in JMIR Serious Games (http://games.jmir.org), 21.12.2016.
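A hedged sketch of the scored decision-tree structure the abstract describes (the case content, point values, and class names below are invented for illustration and are not the authors' engine):

```python
# A minimal sketch of a scored decision tree for a branching emergency scenario,
# where each player action maps to a node and carries positive or negative points.
# All case content and point values are invented for illustration.
from dataclasses import dataclass, field

@dataclass
class Choice:
    label: str
    points: int                         # positive for correct actions, negative for errors
    next_node: "Node | None" = None

@dataclass
class Node:
    prompt: str
    choices: list[Choice] = field(default_factory=list)

def play(node: Node, picks: list[int]) -> int:
    """Follow a fixed sequence of choice indices and return the accumulated score."""
    score = 0
    for pick in picks:
        choice = node.choices[pick]
        score += choice.points
        if choice.next_node is None:
            break
        node = choice.next_node
    return score

if __name__ == "__main__":
    # toy branch: recognise an abnormal fetal heart rate and call for help
    call_help = Node("Fetal heart rate remains abnormal.", [Choice("Call senior obstetrician", +10)])
    start = Node("CTG shows late decelerations.", [
        Choice("Reposition the patient and reassess", +5, call_help),
        Choice("Ignore and continue routine monitoring", -10),
    ])
    print("score:", play(start, picks=[0, 0]))   # -> 15
```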
Study on Collaborative Object Manipulation in Virtual Environment
NASA Astrophysics Data System (ADS)
Mayangsari, Maria Niken; Yong-Moo, Kwon
This paper presents a comparative study of networked collaboration performance under different degrees of immersion. In particular, the relationship between user collaboration performance and the degree of immersion provided by the system is addressed and compared on the basis of several experiments. The user tests of our system cover several cases: 1) comparison between non-haptic and haptic collaborative interaction over a LAN, 2) comparison between non-haptic and haptic collaborative interaction over the Internet, and 3) analysis of collaborative interaction between non-immersive and immersive display environments.
ERIC Educational Resources Information Center
Jacob, Laura Beth
2012-01-01
Virtual world environments have evolved from object-oriented, text-based online games to complex three-dimensional immersive social spaces where the lines between reality and computer-generated begin to blur. Educators use virtual worlds to create engaging three-dimensional learning spaces for students, but the impact of virtual worlds in…
Faculty Perspectives of Faculty Persona in a Virtual World
ERIC Educational Resources Information Center
Blackmon, Stephanie J.
2013-01-01
Immersive virtual worlds provide a new way to deliver online courses or parts of online and face-to-face courses. There is a growing body of research on online learning, and the data on virtual worlds is also increasing. However, literature concerning professors' experiences with specific aspects of virtual worlds is limited. For example,…
Will Anything Useful Come Out of Virtual Reality? Examination of a Naval Application
1993-05-01
The term virtual reality can encompass varying meanings, but some generally accepted attributes of a virtual environment are that it is immersive... technology, but at present there are few practical applications which are utilizing the broad range of virtual reality technology. This paper will discuss an... Keywords: operability, operator functions, virtual reality, man-machine interface, decision aids/decision making, decision support, ASW.
Exploring Learning through Audience Interaction in Virtual Reality Dome Theaters
NASA Astrophysics Data System (ADS)
Apostolellis, Panagiotis; Daradoumis, Thanasis
Informal learning in public spaces like museums, science centers, and planetariums has become increasingly popular in recent years. Recent advancements in large-scale displays have allowed contemporary technology-enhanced museums to be equipped with digital domes, some with real-time capabilities like virtual reality systems. Through an extensive literature review, we have come to the conclusion that little to no research has been carried out on the learning outcomes that the combination of VR and audience interaction can provide in the immersive environments of dome theaters. Thus, we propose that audience collaboration in immersive virtual reality environments presents a promising approach to supporting effective learning in groups of school-aged children.
Dimbwadyo-Terrer, Iris; Trincado-Alonso, Fernando; de Los Reyes-Guzmán, Ana; Aznar, Miguel A; Alcubilla, Cesar; Pérez-Nombela, Soraya; Del Ama-Espinosa, Antonio; Polonio-López, Begoña; Gil-Agudo, Ángel
2016-08-01
Purpose: The aim of this preliminary study was to test a data glove, CyberTouch™, combined with a virtual reality (VR) environment for use in therapeutic training of reaching movements after spinal cord injury (SCI). Nine patients with thoracic SCI were selected to perform a pilot study comparing two treatments: patients in the intervention group (IG) underwent VR training based on the use of the CyberTouch™ data glove for 2 weeks, while patients in the control group (CG) only underwent traditional rehabilitation. Furthermore, two functional parameters were implemented in order to assess patients' performance in the sessions: normalized trajectory length and repeatability. Although no statistical significance was found, the data glove group seemed to show clinical changes in muscle balance (MB) and functional parameters, and in the dexterity, coordination, and fine grip tests. Moreover, every patient showed variations in at least one of the functional parameters, either along the Y-axis trajectory or the Z-axis trajectory. This study may be a step forward in the investigation of new uses of motion capture systems in neurorehabilitation, making it possible to train activities of daily living (ADLs) in motivational environments while objectively measuring the patient's functional evolution. Implications for Rehabilitation. Key findings: a motion capture application based on a data glove is presented for use as a virtual reality tool for rehabilitation; this application has provided objective data about the patient's functional performance. What the study has added: (1) this study opens new areas of research based on the use of different motion capture systems as rehabilitation tools, making it possible to train activities of daily living in motivational environments; (2) furthermore, this study could contribute to the development of clinical protocols to identify which types of patients will benefit most from VR treatments, which interfaces are most suitable for use in neurorehabilitation, and what types of virtual exercises will work best.
Sankaranarayanan, Ganesh; Wooley, Lizzy; Hogg, Deborah; Dorozhkin, Denis; Olasky, Jaisa; Chauhan, Sanket; Fleshman, James W; De, Suvranu; Scott, Daniel; Jones, Daniel B
2018-01-25
The SAGES FUSE curriculum provides didactic knowledge on OR fire prevention. The objective of this study is to evaluate the impact of an immersive virtual reality (VR)-based OR fire training simulation system in combination with FUSE didactics. The study compared a control group with a simulation group. After a pre-test questionnaire that assessed baseline knowledge, both groups were given didactic material consisting of a 10-min presentation and reading materials about precautions and stopping an OR fire from the FUSE manual. The simulation group practiced on the OR fire simulation for one session, consisting of five trials, within a week of the pre-test. One week later, both groups were reassessed using a questionnaire. A week after the post-test, both groups also participated in a simulated OR fire scenario while their performance was videotaped for assessment. A total of 20 subjects (ten per group) participated in this IRB-approved study. Median test scores for the control group increased from 5.5 to 9.0 (p = 0.011) and for the simulation group from 5.0 to 8.5 (p = 0.005). Both groups started at the same baseline (pre-test, p = 0.529) and reached a similar level of cognitive knowledge (post-test, p = 0.853). However, when tested in the mock OR fire scenario, 70% of the simulation group subjects were able to perform the correct sequence of steps in extinguishing the simulated fire, whereas only 20% of subjects in the control group were able to do so (p = 0.003). The simulation group was better than the control group at correctly identifying the oxidizer (p = 0.03) and the ignition source (p = 0.014). Interactive VR-based hands-on training was found to be a relatively inexpensive and effective mode for teaching OR fire prevention and management scenarios.
Visuomotor adaptation in head-mounted virtual reality versus conventional training
Anglin, J. M.; Sugiyama, T.; Liew, S.-L.
2017-01-01
Immersive, head-mounted virtual reality (HMD-VR) provides a unique opportunity to understand how changes in sensory environments affect motor learning. However, potential differences in the mechanisms of motor learning and adaptation in HMD-VR versus a conventional training (CT) environment have not been extensively explored. Here, we investigated whether adaptation on a visuomotor rotation task in HMD-VR yields adaptation effects similar to those in CT, and whether these effects are achieved through similar mechanisms. Specifically, recent work has shown that visuomotor adaptation may occur via both an implicit, error-based internal model and a more cognitive, explicit strategic component. We sought to measure both overall adaptation and the balance between implicit and explicit mechanisms in HMD-VR versus CT. Twenty-four healthy individuals were placed in either HMD-VR or CT and trained on an identical visuomotor adaptation task that measured both implicit and explicit components. Our results showed that the overall time course of adaptation was similar in HMD-VR and CT. However, HMD-VR participants utilized a greater cognitive strategy than CT participants, while CT participants engaged in greater implicit learning. These results suggest that while both conditions produce similar overall adaptation, the mechanisms by which visuomotor adaptation occurs in HMD-VR appear to rely more on cognitive strategies. PMID:28374808
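A hedged sketch of a common decomposition used in visuomotor rotation studies to separate explicit and implicit components (the aim-report method; not necessarily this experiment's exact procedure, and the trial values below are invented):

```python
# A minimal sketch of the aim-report decomposition often used in visuomotor
# rotation studies; not necessarily this experiment's exact procedure. Total
# adaptation is the hand heading relative to the target, the explicit component
# is the reported aiming direction, and the implicit component is the remainder:
# implicit = hand - aim (all angles in degrees).
import numpy as np

def decompose_adaptation(hand_angles: np.ndarray, aim_reports: np.ndarray):
    """Return (total, explicit, implicit) adaptation per trial."""
    total = np.asarray(hand_angles, dtype=float)
    explicit = np.asarray(aim_reports, dtype=float)
    implicit = total - explicit
    return total, explicit, implicit

if __name__ == "__main__":
    # toy trials under a 45-degree rotation: hand heading and reported aim per trial
    hand = np.array([5.0, 20.0, 35.0, 44.0])
    aim = np.array([0.0, 15.0, 25.0, 30.0])
    total, explicit, implicit = decompose_adaptation(hand, aim)
    print("implicit component per trial:", implicit)   # [ 5.  5. 10. 14.]
```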
Marketing analysis of a positive technology app for the self-management of psychological stress.
Wiederhold, Brenda K; Boyd, Chelsie; Sulea, Camelia; Gaggioli, Andrea; Riva, Giuseppe
2014-01-01
The INTERSTRESS project developed a completely new concept in the treatment of psychological stress: Interreality, a concept that combines cognitive behavioral therapy with a hybrid, closed-loop empowering experience bridging real and virtual worlds. This model provides the opportunity for individual citizens to become active participants in their own health and well-being. This article contains the results of the Marketing Trial and analysis of the opinions of individual consumers/end users of the INTERSTRESS product. The specific objective of this study was to evaluate the feasibility, efficacy and user acceptance of a novel mobile-based relaxation training tool in combination with biofeedback exercises and wearable biosensors. Relaxation was aided through immersion in a mobile virtual scenario (a virtual island) featuring pre-recorded audio narratives guiding a series of relaxation exercises. During biofeedback exercises, a wearable biosensor system provided data which directly modified the virtual reality experience in real-time. Thirty-six participants evaluated the product and overall feedback from users was positive, with some variation seen based on participant gender. A larger market study is now underway to understand if there are cultural variations in acceptability of the device.
The use of virtual reality in memory rehabilitation: current findings and future directions.
Brooks, B M; Rose, F D
2003-01-01
There is considerable potential for using virtual reality (VR) in memory rehabilitation which is only just beginning to be realized. PC-based virtual environments are probably better suited for this purpose than more immersive virtual environments because they are relatively inexpensive and portable, and less frightening to patients. Those exploratory studies that have so far been performed indicate that VR involvement would be usefully directed towards improving assessments of memory impairments and in memory remediation using reorganization techniques. In memory assessment, the use of VR could provide more comprehensive, ecologically-valid, and controlled evaluations of prospective, incidental, and spatial memory in a rehabilitation setting than is possible using standardized assessment tests. The additional knowledge gained from these assessments could more effectively direct rehabilitation towards specific impairments of individual patients. In memory remediation, VR training has been found to promote procedural learning in people with memory impairments, and this learning has been found to transfer to improved real-world performance. Future research should investigate ways in which the procedural knowledge gained during VR interaction can be adapted to offset the many disabilities which result from different forms of memory impairment.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Eric A. Wernert; William R. Sherman; Patrick O'Leary
Immersive visualization makes use of the medium of virtual reality (VR) - it is a subset of virtual reality focused on the application of VR technologies to scientific and information visualization. As the name implies, there is a particular focus on the physically immersive aspect of VR that more fully engages the perceptual and kinesthetic capabilities of the scientist, with the goal of producing greater insight. The immersive visualization community is uniquely positioned to address the analysis needs of the wide spectrum of domain scientists who are becoming increasingly overwhelmed by data. The outputs of computational science simulations and high-resolution sensors are creating a data deluge. Data is coming in faster than it can be analyzed, and there are countless opportunities for discovery that are missed as the data speeds by. By more fully utilizing the scientist's visual and other sensory systems, and by offering a more natural user interface with which to interact with computer-generated representations, immersive visualization offers great promise in taming this data torrent. However, increasing the adoption of immersive visualization in scientific research communities can only happen by simultaneously lowering the engagement threshold while raising the measurable benefits of adoption. Scientists' time spent immersed with their data will thus be rewarded with higher productivity, deeper insight, and improved creativity. Immersive visualization ties together technologies and methodologies from a variety of related but frequently disjoint areas, including hardware, software, and human-computer interaction (HCI) disciplines. In many ways, hardware is a solved problem. There are well-established technologies, including large walk-in systems such as the CAVE™ and head-based systems such as the Wide-5™. The advent of new consumer-level technologies now enables an entirely new generation of immersive displays, with smaller footprints and costs, widening the potential consumer base. While one would be hard-pressed to call software a solved problem, we now understand considerably more about best practices for designing and developing sustainable, scalable software systems, and we have useful software examples that illuminate the way to even better implementations. As with any research endeavour, HCI will always be exploring new topics in interface design, but we now have a sizable knowledge base of the strengths and weaknesses of the human perceptual systems, and we know how to design effective interfaces for immersive systems. So, in a research landscape with a clear need for better visualization and analysis tools, a methodology in immersive visualization that has been shown to effectively address some of those needs, and vastly improved supporting technologies and knowledge of hardware, software, and HCI, why hasn't immersive visualization 'caught on' more with scientists? What can we do as a community of immersive visualization researchers and practitioners to facilitate greater adoption by scientific communities, so as to make the transition from 'the promise of virtual reality' to 'the reality of virtual reality'?
Procedural wound geometry and blood flow generation for medical training simulators
NASA Astrophysics Data System (ADS)
Aras, Rifat; Shen, Yuzhong; Li, Jiang
2012-02-01
Efficient application of wound treatment procedures is vital in both emergency room and battle zone scenes. In order to train first responders for such situations, physical casualty simulation kits, which are composed of tens of individual items, are commonly used. As with any other training scenario, computer simulations can be an effective means for wound treatment training purposes. For immersive and high-fidelity virtual reality applications, realistic 3D models are key components. However, creation of such models is a labor-intensive process. In this paper, we propose a procedural wound geometry generation technique that parameterizes key simulation inputs to establish the variability of the training scenarios without the need for labor-intensive remodeling of the 3D geometry. The procedural techniques described in this work are handled entirely by the graphics processing unit (GPU) to enable interactive real-time operation of the simulation and to relieve the CPU for other computational tasks. The Visible Human dataset is processed and used as a volumetric texture for the internal visualization of the wound geometry. To further enhance the fidelity of the simulation, we also employ a surface flow model for blood visualization. This model is realized as a dynamic texture that is composed of a height field and a normal map and animated at each simulation step on the GPU. The procedural wound geometry and the blood flow model are applied to a thigh model and the efficiency of the technique is demonstrated in a virtual surgery scene.
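To make the height-field/normal-map idea above concrete, the following is a minimal CPU-side sketch in Python/NumPy (the paper performs these steps on the GPU); the relaxation rule, constants, and function names are illustrative assumptions rather than the authors' implementation.

```python
# Minimal CPU-side sketch of a blood "dynamic texture": a height field is
# advanced each frame and converted to a normal map for shading. The flow
# rule and constants are assumptions for illustration only.
import numpy as np

def step_height_field(height, flow_rate=0.25, decay=0.995):
    """Spread blood toward neighbouring texels and slowly thin the film."""
    padded = np.pad(height, 1, mode="edge")
    neighbour_avg = 0.25 * (padded[:-2, 1:-1] + padded[2:, 1:-1] +
                            padded[1:-1, :-2] + padded[1:-1, 2:])
    # Relax toward the neighbourhood average (a crude surface-flow model).
    height = (1.0 - flow_rate) * height + flow_rate * neighbour_avg
    return height * decay

def height_to_normal_map(height, strength=4.0):
    """Convert the height field to a tangent-space normal map packed in [0, 1]."""
    dy, dx = np.gradient(height)
    normals = np.dstack((-strength * dx, -strength * dy, np.ones_like(height)))
    normals /= np.linalg.norm(normals, axis=2, keepdims=True)
    return 0.5 * (normals + 1.0)

# Usage: seed a hypothetical wound footprint and animate a few frames.
blood = np.zeros((256, 256))
blood[100:140, 110:150] = 1.0
for _ in range(60):
    blood = step_height_field(blood)
normal_map = height_to_normal_map(blood)   # would be uploaded as a texture each frame
```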
Scenario-Based Spoken Interaction with Virtual Agents
ERIC Educational Resources Information Center
Morton, Hazel; Jack, Mervyn A.
2005-01-01
This paper describes a CALL approach which integrates software for speaker independent continuous speech recognition with embodied virtual agents and virtual worlds to create an immersive environment in which learners can converse in the target language in contextualised scenarios. The result is a self-access learning package: SPELL (Spoken…
Exploring the Utility of a Virtual Performance Assessment
ERIC Educational Resources Information Center
Clarke-Midura, Jody; Code, Jillianne; Zap, Nick; Dede, Chris
2011-01-01
With funding from the Institute of Education Sciences (IES), the Virtual Performance Assessment project at the Harvard Graduate School of Education is developing and studying the feasibility of immersive virtual performance assessments (VPAs) to assess scientific inquiry of middle school students as a standardized component of an accountability…
Learning through Place-Making: Virtual Environments and Future Literacies
ERIC Educational Resources Information Center
Berry, Maryanne Susan
2010-01-01
This study examines a project through which elementary school and high school students collaborated with university Architecture/New Media students in building models of virtual, immersive libraries. It presents the project in the context of multiple and cross-disciplinary fields currently investigating the use of virtual and immersive…
Antoniou, Panagiotis E; Athanasopoulou, Christina A; Dafli, Eleni
2014-01-01
Background Since their inception, virtual patients have provided health care educators with a way to engage learners in an experience simulating the clinician’s environment without danger to learners and patients. This has led this learning modality to be accepted as an essential component of medical education. With the advent of the visually and audio-rich 3-dimensional multi-user virtual environment (MUVE), a new deployment platform has emerged for educational content. Immersive, highly interactive, multimedia-rich, MUVEs that seamlessly foster collaboration provide a new hotbed for the deployment of medical education content. Objective This work aims to assess the suitability of the Second Life MUVE as a virtual patient deployment platform for undergraduate dental education, and to explore the requirements and specifications needed to meaningfully repurpose Web-based virtual patients in MUVEs. Methods Through the scripting capabilities and available art assets in Second Life, we repurposed an existing Web-based periodontology virtual patient into Second Life. Through a series of point-and-click interactions and multiple-choice queries, the user experienced a specific periodontology case and was asked to provide the optimal responses for each of the challenges of the case. A focus group of 9 undergraduate dentistry students experienced both the Web-based and the Second Life version of this virtual patient. The group convened 3 times and discussed relevant issues such as the group’s computer literacy, the assessment of Second Life as a virtual patient deployment platform, and compared the Web-based and MUVE-deployed virtual patients. Results A comparison between the Web-based and the Second Life virtual patient revealed the inherent advantages of the more experiential and immersive Second Life virtual environment. However, several challenges for the successful repurposing of virtual patients from the Web to the MUVE were identified. The identified challenges for repurposing of Web virtual patients to the MUVE platform from the focus group study were (1) increased case complexity to facilitate the user’s gaming preconception in a MUVE, (2) necessity to decrease textual narration and provide the pertinent information in a more immersive sensory way, and (3) requirement to allow the user to actuate the solutions of problems instead of describing them through narration. Conclusions For a successful systematic repurposing effort of virtual patients to MUVEs such as Second Life, the best practices of experiential and immersive game design should be organically incorporated in the repurposing workflow (automated or not). These findings are pivotal in an era in which open educational content is transferred to and shared among users, learners, and educators of various open repositories/environments. PMID:24927470
Virtual Reality Educational Tool for Human Anatomy.
Izard, Santiago González; Juanes Méndez, Juan A; Palomera, Pablo Ruisoto
2017-05-01
Virtual Reality is becoming widespread in our society within very different areas, from industry to entertainment. It has many advantages in education as well, since it allows visualizing almost any object or going anywhere in a unique way. We focus on medical education, and more specifically anatomy, where its use is especially interesting because it allows studying any structure of the human body by placing the user inside each one. By allowing virtual immersion in a body structure such as the interior of the cranium, stereoscopic vision goggles make these innovative teaching technologies a powerful tool for training in all areas of health sciences. The aim of this study is to illustrate the teaching potential of applying Virtual Reality in the field of human anatomy, where it can be used as a tool for education in medicine. Virtual Reality software was developed as an educational tool. The procedure is based entirely on software running in stereoscopic goggles to give users the sensation of being in a virtual environment, clearly showing the different bones and foramina which make up the cranium, accompanied by audio explanations. In the results, the structure of the cranium is described in detail from both inside and outside. The importance of exhaustive morphological knowledge of the cranial fossae is further discussed, and applications to the design of microsurgery are also considered.
ERIC Educational Resources Information Center
Huang, Hsiu-Mei; Liaw, Shu-Sheng; Lai, Chung-Min
2016-01-01
Advanced technologies have been widely applied in medical education, including human-patient simulators, immersive virtual reality Cave Automatic Virtual Environment systems, and video conferencing. Evaluating learner acceptance of such virtual reality (VR) learning environments is a critical issue for ensuring that such technologies are used to…
ERIC Educational Resources Information Center
Annetta, Leonard; Klesath, Marta; Meyer, John
2009-01-01
A 3-D virtual field trip was integrated into an online college entomology course and developed as a trial for the possible incorporation of future virtual environments to supplement online higher education laboratories. This article provides an explanation of the rationale behind creating the virtual experience, the Bug Farm; the method and…
ERIC Educational Resources Information Center
Allison, John
2008-01-01
This paper will undertake a critical review of the impact of virtual reality tools on the teaching of history. Virtual reality is useful in several different ways. History educators, elementary and secondary school teachers and professors, can all profit from the digital environment. Challenges arise quickly however. Virtual reality technologies…
The Fidelity of ’Feel’: Emotional Affordance in Virtual Environments
2005-07-01
Morie, Jacquelyn Ford; Williams, Josh; Dozois, Aimee; Luigi, Donat-Pierre
...environment but also the participant. We do this with the focus on what emotional affordances this manipulation will provide. Our first evaluation scenario... emotionally affective VEs.
Keywords: Immersive Environments, Virtual Environments, VEs, Virtual Reality, emotion, affordance, fidelity, presence
Ontological implications of being in immersive virtual environments
NASA Astrophysics Data System (ADS)
Morie, Jacquelyn F.
2008-02-01
The idea of Virtual Reality once conjured up visions of new territories to explore, and expectations of awaiting worlds of wonder. VR has matured to become a practical tool for therapy, medicine and commercial interests, yet artists, in particular, continue to expand the possibilities for the medium. Artistic virtual environments created over the past two decades probe the phenomenological nature of these virtual environments. When we inhabit a fully immersive virtual environment, we have entered into a new form of Being. Not only does our body continue to exist in the real, physical world, but we are also embodied within the virtual by means of technology that translates our bodied actions into interactions with the virtual environment. Very few states in human existence allow this bifurcation of our Being, where we can exist simultaneously in two spaces at once, with the possible exception of metaphysical states such as shamanistic trance and out-of-body experiences. This paper discusses the nature of this simultaneous Being, how we enter the virtual space, what forms of persona we can don there, what forms of spaces we can inhabit, and what type of wondrous experiences we can both hope for and expect.
Virtual Reality Simulation of the International Space Welding Experiment
NASA Technical Reports Server (NTRS)
Phillips, James A.
1996-01-01
Virtual Reality (VR) is a set of breakthrough technologies that allow a human being to enter and fully experience a 3-dimensional, computer simulated environment. A true virtual reality experience meets three criteria: (1) It involves 3-dimensional computer graphics; (2) It includes real-time feedback and response to user actions; and (3) It must provide a sense of immersion. Good examples of a virtual reality simulator are the flight simulators used by all branches of the military to train pilots for combat in high performance jet fighters. The fidelity of such simulators is extremely high -- but so is the price tag, typically millions of dollars. Virtual reality teaching and training methods are manifestly effective, and we have therefore implemented a VR trainer for the International Space Welding Experiment. My role in the development of the ISWE trainer consisted of the following: (1) created texture-mapped models of the ISWE's rotating sample drum, technology block, tool stowage assembly, sliding foot restraint, and control panel; (2) developed C code for control panel button selection and rotation of the sample drum; (3) In collaboration with Tim Clark (Antares Virtual Reality Systems), developed a serial interface box for the PC and the SGI Indigo so that external control devices, similar to ones actually used on the ISWE, could be used to control virtual objects in the ISWE simulation; (4) In collaboration with Peter Wang (SFFP) and Mark Blasingame (Boeing), established the interference characteristics of the VIM 1000 head-mounted-display and tested software filters to correct the problem; (5) In collaboration with Peter Wang and Mark Blasingame, established software and procedures for interfacing the VPL DataGlove and the Polhemus 6DOF position sensors to the SGI Indigo serial ports. The majority of the ISWE modeling effort was conducted on a PC-based VR Workstation, described below.
Simulation Exploration through Immersive Parallel Planes
DOE Office of Scientific and Technical Information (OSTI.GOV)
Brunhart-Lupo, Nicholas J; Bush, Brian W; Gruchalla, Kenny M
We present a visualization-driven simulation system that tightly couples systems dynamics simulations with an immersive virtual environment to allow analysts to rapidly develop and test hypotheses in a high-dimensional parameter space. To accomplish this, we generalize the two-dimensional parallel-coordinates statistical graphic as an immersive 'parallel-planes' visualization for multivariate time series emitted by simulations running in parallel with the visualization. In contrast to traditional parallel coordinates, which map the multivariate dimensions onto coordinate axes represented by a series of parallel lines, we map pairs of the multivariate dimensions onto a series of parallel rectangles. As in the case of parallel coordinates, each individual observation in the dataset is mapped to a polyline whose vertices coincide with its coordinate values. Regions of the rectangles can be 'brushed' to highlight and select observations of interest; a 'slider' control allows the user to filter the observations by their time coordinate. In an immersive virtual environment, users interact with the parallel planes using a joystick that can select regions on the planes, manipulate selections, and filter time. The brushing and selection actions are used both to explore existing data and to launch additional simulations corresponding to the visually selected portions of the input parameter space. As soon as the new simulations complete, their resulting observations are displayed in the virtual environment. This tight feedback loop between simulation and immersive analytics accelerates users' realization of insights about the simulation and its output.
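As a rough illustration of the parallel-planes mapping described above, the following Python/NumPy sketch places pairs of dimensions on a series of parallel rectangles and builds one 3-D polyline per observation, with a simple brushing filter; the plane spacing, normalisation, and brushing rule are assumptions made for this sketch, not the authors' implementation.

```python
# Illustrative sketch of mapping multivariate observations onto parallel planes.
import numpy as np

def to_polyline(observation, dim_pairs, mins, maxs, plane_spacing=1.0):
    """Return an array of 3-D vertices, one per plane, for one observation."""
    vertices = []
    for k, (i, j) in enumerate(dim_pairs):
        u = (observation[i] - mins[i]) / (maxs[i] - mins[i])   # horizontal on plane k
        v = (observation[j] - mins[j]) / (maxs[j] - mins[j])   # vertical on plane k
        vertices.append((k * plane_spacing, u, v))             # plane k sits at x = k*spacing
    return np.array(vertices)

def brush(data, dim_pairs, plane_index, u_range, v_range, mins, maxs):
    """Select observations whose polyline crosses a brushed rectangle on one plane."""
    i, j = dim_pairs[plane_index]
    u = (data[:, i] - mins[i]) / (maxs[i] - mins[i])
    v = (data[:, j] - mins[j]) / (maxs[j] - mins[j])
    return (u >= u_range[0]) & (u <= u_range[1]) & (v >= v_range[0]) & (v <= v_range[1])

# Usage: 6-dimensional samples mapped onto three parallel planes.
rng = np.random.default_rng(0)
data = rng.random((500, 6))
pairs = [(0, 1), (2, 3), (4, 5)]
mins, maxs = data.min(axis=0), data.max(axis=0)
lines = [to_polyline(row, pairs, mins, maxs) for row in data]
selected = brush(data, pairs, plane_index=1, u_range=(0.2, 0.5),
                 v_range=(0.0, 0.4), mins=mins, maxs=maxs)
```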
The (human) science of medical virtual learning environments.
Stone, Robert J
2011-01-27
The uptake of virtual simulation technologies in both military and civilian surgical contexts has been both slow and patchy. The failure of the virtual reality community in the 1990s and early 2000s to deliver affordable and accessible training systems stems not only from an obsessive quest to develop the 'ultimate' in so-called 'immersive' hardware solutions, from head-mounted displays to large-scale projection theatres, but also from a comprehensive lack of attention to the needs of the end users. While many still perceive the science of simulation to be defined by technological advances, such as computing power, specialized graphics hardware, advanced interactive controllers, displays and so on, the true science underpinning simulation--the science that helps to guarantee the transfer of skills from the simulated to the real--is that of human factors, a well-established discipline that focuses on the abilities and limitations of the end user when designing interactive systems, as opposed to the more commercially explicit components of technology. Based on three surgical simulation case studies, the importance of a human factors approach to the design of appropriate simulation content and interactive hardware for medical simulation is illustrated. The studies demonstrate that it is unnecessary to pursue real-world fidelity in all instances in order to achieve psychological fidelity--the degree to which the simulated tasks reproduce and foster knowledge, skills and behaviours that can be reliably transferred to real-world training applications.
The effectiveness of virtual reality distraction for pain reduction: a systematic review.
Malloy, Kevin M; Milling, Leonard S
2010-12-01
Virtual reality technology enables people to become immersed in a computer-simulated, three-dimensional environment. This article provides a comprehensive review of controlled research on the effectiveness of virtual reality (VR) distraction for reducing pain. To be included in the review, studies were required to use a between-subjects or mixed model design in which VR distraction was compared with a control condition or an alternative intervention in relieving pain. An exhaustive search identified 11 studies satisfying these criteria. VR distraction was shown to be effective for reducing experimental pain, as well as the discomfort associated with burn injury care. Studies of needle-related pain provided less consistent findings. Use of more sophisticated virtual reality technology capable of fully immersing the individual in a virtual environment was associated with greater relief. Overall, controlled research suggests that VR distraction may be a useful tool for clinicians who work with a variety of pain problems. Copyright © 2010 Elsevier Ltd. All rights reserved.
Immersive Environments - A Connectivist Approach
NASA Astrophysics Data System (ADS)
Loureiro, Ana; Bettencourt, Teresa
We are conducting a research project with the aim of achieving better and more efficient ways to facilitate teaching and learning in Higher Level Education. We have chosen virtual environments, with particular emphasis on the Second Life® platform augmented by web 2.0 tools, to develop the study. The Second Life® environment has some interesting characteristics that captured our attention: it is immersive; it is a real-world simulator; it is a social network; it allows real-time communication, cooperation, collaboration and interaction; and it is a safe and controlled environment. We specifically chose tools from web 2.0 that enable sharing and a collaborative way of learning. Through understanding the characteristics of this learning environment, we believe that immersive learning, along with other virtual tools, can be integrated into today's pedagogical practices.
NASA Astrophysics Data System (ADS)
Ikeda, Sei; Sato, Tomokazu; Kanbara, Masayuki; Yokoya, Naokazu
2004-05-01
Technology that enables users to experience a remote site virtually is called telepresence. A telepresence system using real-environment images is expected to be used in fields such as entertainment, medicine and education. This paper describes a novel telepresence system which enables users to walk through a photorealistic virtualized environment by actually walking. To realize such a system, a wide-angle high-resolution movie is projected on an immersive multi-screen display to present the virtualized environment to users, and a treadmill is controlled according to the user's detected locomotion. In this study, we use an omnidirectional multi-camera system to acquire images of a real outdoor scene. The proposed system provides users with a rich sense of walking in a remote site.
Magdalon, Eliane C; Michaelsen, Stella M; Quevedo, Antonio A; Levin, Mindy F
2011-09-01
Virtual reality (VR) technology is being used with increasing frequency as a training medium for motor rehabilitation. However, before addressing training effectiveness in virtual environments (VEs), it is necessary to identify if movements made in such environments are kinematically similar to those made in physical environments (PEs) and the effect of provision of haptic feedback on these movement patterns. These questions are important since reach-to-grasp movements may be inaccurate when visual or haptic feedback is altered or absent. Our goal was to compare kinematics of reaching and grasping movements to three objects performed in an immersive three-dimensional (3D) VE with haptic feedback (cyberglove/grasp system) viewed through a head-mounted display to those made in an equivalent physical environment (PE). We also compared movements in PE made with and without wearing the cyberglove/grasp haptic feedback system. Ten healthy subjects (8 women, 62.1 ± 8.8 years) reached and grasped objects requiring 3 different grasp types (can, diameter 65.6 mm, cylindrical grasp; screwdriver, diameter 31.6 mm, power grasp; pen, diameter 7.5 mm, precision grasp) in PE and visually similar virtual objects in VE. Temporal and spatial arm and trunk kinematics were analyzed. Movements were slower and grip apertures were wider when wearing the glove in both the PE and the VE compared to movements made in the PE without the glove. When wearing the glove, subjects used similar reaching trajectories in both environments, preserved the coordination between reaching and grasping and scaled grip aperture to object size for the larger object (cylindrical grasp). However, in VE compared to PE, movements were slower and had longer deceleration times, elbow extension was greater when reaching to the smallest object and apertures were wider for the power and precision grip tasks. Overall, the differences in spatial and temporal kinematics of movements between environments were greater than those due only to wearing the cyberglove/grasp system. Differences in movement kinematics due to the viewing environment were likely due to a lack of prior experience with the virtual environment, an uncertainty of object location and the restricted field-of-view when wearing the head-mounted display. The results can be used to inform the design and disposition of objects within 3D VEs for the study of the control of prehension and for upper limb rehabilitation. Copyright © 2011 Elsevier B.V. All rights reserved.
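For readers unfamiliar with the kinematic measures compared here, the sketch below (Python/NumPy) computes grip aperture from thumb and index markers and derives movement time, peak velocity, and deceleration time from a wrist trajectory; the sampling rate and onset/offset rule are illustrative assumptions, not the authors' processing pipeline.

```python
# Hedged sketch of common reach-to-grasp measures from motion-capture markers.
import numpy as np

def grip_aperture(thumb_xyz, index_xyz):
    """Frame-by-frame distance between thumb and index markers (same units)."""
    return np.linalg.norm(thumb_xyz - index_xyz, axis=1)

def reach_metrics(wrist_xyz, fs=100.0, onset_fraction=0.05):
    """Movement time, peak velocity, and deceleration time of one reach."""
    velocity = np.linalg.norm(np.gradient(wrist_xyz, 1.0 / fs, axis=0), axis=1)
    v_peak = velocity.max()
    moving = velocity > onset_fraction * v_peak          # crude onset/offset rule
    onset = np.argmax(moving)
    offset = len(moving) - np.argmax(moving[::-1]) - 1
    t_peak = np.argmax(velocity)
    return {
        "movement_time_s": (offset - onset) / fs,
        "peak_velocity": v_peak,
        "deceleration_time_s": (offset - t_peak) / fs,
    }

# Usage with a synthetic reach that has a bell-shaped velocity profile.
t = np.linspace(0, 1, 100)
x = 0.15 * (1 - np.cos(np.pi * t))                       # 30 cm forward displacement
wrist = np.column_stack([x, np.zeros_like(t), np.zeros_like(t)])
print(reach_metrics(wrist))
```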
A Virtual Reality Curriculum for Pediatric Residents Decreases Rates of Influenza Vaccine Refusal.
Real, Francis J; DeBlasio, Dominick; Beck, Andrew F; Ollberding, Nicholas J; Davis, David; Cruse, Bradley; Samaan, Zeina; McLinden, Daniel; Klein, Melissa D
Influenza vaccine hesitancy is common in the primary care setting. Though physicians can affect caregivers' attitudes toward vaccination, physicians report uneasiness discussing vaccine hesitancy. Few studies have targeted physician-patient communication training as a means to decrease vaccination refusal. An immersive virtual reality (VR) curriculum was created to teach pediatric residents communication skills when discussing influenza vaccine hesitancy. This pilot curriculum consisted of 3 VR simulations during which residents counseled graphical character representatives (avatars) who expressed vaccine hesitancy. Participants were randomized to the intervention (n = 24) or control (n = 21) group. Only residents in the intervention group underwent the VR curriculum. Impact of the curriculum was assessed through difference in influenza vaccine refusal rates between the intervention and control groups in the 3 months after the VR curriculum. Participants included postgraduate level (PL) 2 and PL3 pediatric residents. All eligible residents (n = 45) participated; the survey response rate was 100%. In patients aged 6 to 59 months, residents in the intervention group had a decreased rate of influenza vaccination refusal in the postcurriculum period compared to the control group (27.8% vs 37.1%; P = .03). Immersive VR may be an effective modality to teach communication skills to medical trainees. Next steps include evaluation of the curriculum in a larger, multisite trial. Copyright © 2017 Academic Pediatric Association. Published by Elsevier Inc. All rights reserved.
Learning Experience with Virtual Worlds
ERIC Educational Resources Information Center
Wagner, Christian
2008-01-01
Virtual worlds create a new opportunity to enrich the educational experience through media-rich immersive learning. Virtual worlds have gained notoriety in games such as World of Warcraft (WoW), which has become the most successful online game ever, and in "general purpose" worlds, such as Second Life (SL), whose participation levels (more than 10…
A System for Governmental Virtual Institutions Based on Ontologies and Interaction Protocols
ERIC Educational Resources Information Center
de Araujo, Claudia J. Abrao; da Silva, Flavio S. Correa
2012-01-01
The authors believe that the adoption of virtual worlds is suitable for electronic government applications as it can increase the capillarity of public services, facilitate the access to government services and provide citizens with a natural and immersive experience. They present a Government Virtual Institution Model (GVI) for the provision of…
ERIC Educational Resources Information Center
Patera, Marianne; Draper, Steve; Naef, Martin
2008-01-01
This paper presents an exploratory study that created a virtual reality environment (VRE) to stimulate motivation and creativity in imaginative writing at primary school level. The main aim of the study was to investigate if an interactive, semi-immersive virtual reality world could increase motivation and stimulate pupils' imagination in the…
The Pixelated Professor: Faculty in Immersive Virtual Worlds
ERIC Educational Resources Information Center
Blackmon, Stephanie
2015-01-01
Online environments, particularly virtual worlds, can sometimes complicate issues of self expression. For example, the faculty member who loves punk rock has an opportunity, through hairstyle and attire choices in the virtual world, to share that part of herself with students. However, deciding to share that part of the self can depend on a number…
Teaching Literature in Virtual Worlds: Immersive Learning in English Studies
ERIC Educational Resources Information Center
Webb, Allen, Ed.
2011-01-01
What are the realities and possibilities of utilizing on-line virtual worlds as teaching tools for specific literary works? Through engaging and surprising stories from classrooms where virtual worlds are in use, this book invites readers to understand and participate in this emerging and valuable pedagogy. It examines the experience of high…
An Investigation into Cooperative Learning in a Virtual World Using Problem-Based Learning
ERIC Educational Resources Information Center
Parson, Vanessa; Bignell, Simon
2017-01-01
Three-dimensional multi-user virtual environments (MUVEs) have the potential to provide experiential learning qualitatively similar to that found in the real world. MUVEs offer a pedagogically-driven immersive learning opportunity for educationalists that is cost-effective and enjoyable. A family of digital virtual avatars was created within…
Pre-Service Teachers Designing Virtual World Learning Environments
ERIC Educational Resources Information Center
Jacka, Lisa; Booth, Kate
2012-01-01
Integrating Information Technology Communications in the classroom has been an important part of pre-service teacher education for over a decade. The advent of virtual worlds provides the pre-service teacher with an opportunity to study teaching and learning in a highly immersive 3D computer-based environment. Virtual worlds also provide a place…
Embodying self-compassion within virtual reality and its effects on patients with depression.
Falconer, Caroline J; Rovira, Aitor; King, John A; Gilbert, Paul; Antley, Angus; Fearon, Pasco; Ralph, Neil; Slater, Mel; Brewin, Chris R
2016-01-01
Self-criticism is a ubiquitous feature of psychopathology and can be combatted by increasing levels of self-compassion. However, some patients are resistant to self-compassion. To investigate whether the effects of self-identification with virtual bodies within immersive virtual reality could be exploited to increase self-compassion in patients with depression. We developed an 8-minute scenario in which 15 patients practised delivering compassion in one virtual body and then experienced receiving it from themselves in another virtual body. In an open trial, three repetitions of this scenario led to significant reductions in depression severity and self-criticism, as well as to a significant increase in self-compassion, from baseline to 4-week follow-up. Four patients showed clinically significant improvement. The results indicate that interventions using immersive virtual reality may have considerable clinical potential and that further development of these methods preparatory to a controlled trial is now warranted. None. © The Royal College of Psychiatrists 2016. This is an open access article distributed under the terms of the Creative Commons Attribution (CC BY) licence.
NASA Astrophysics Data System (ADS)
Tsoupikova, Daria
2006-02-01
This paper will explore how the aesthetics of the virtual world affects, transforms, and enhances the immersive emotional experience of the user. What we see and what we do upon entering the virtual environment influences our feelings, mental state, physiological changes and sensibility. To create a unique virtual experience the important component to design is the beauty of the virtual world based on the aesthetics of the graphical objects such as textures, models, animation, and special effects. The aesthetic potency of the images that comprise the virtual environment can make the immersive experience much stronger and more compelling. The aesthetic qualities of the virtual world as born out through images and graphics can influence the user's state of mind. Particular changes and effects on the user can be induced through the application of techniques derived from the research fields of psychology, anthropology, biology, color theory, education, art therapy, music, and art history. Many contemporary artists and developers derive much inspiration for their work from their experience with traditional arts such as painting, sculpture, design, architecture and music. This knowledge helps them create a higher quality of images and stereo graphics in the virtual world. The understanding of the close relation between the aesthetic quality of the virtual environment and the resulting human perception is the key to developing an impressive virtual experience.
Envisioning the future of home care: applications of immersive virtual reality.
Brennan, Patricia Flatley; Arnott Smith, Catherine; Ponto, Kevin; Radwin, Robert; Kreutz, Kendra
2013-01-01
Accelerating the design of technologies to support health in the home requires (1) better understanding of how the household context shapes consumer health behaviors and (2) the opportunity to afford engineers, designers, and health professionals the chance to systematically study the home environment. We developed the Living Environments Laboratory (LEL) with a fully immersive, six-sided virtual reality CAVE to enable recreation of a broad range of household environments. We have successfully developed a virtual apartment, including a kitchen, living space, and bathroom. Over 2000 people have visited the LEL CAVE. Participants use an electronic wand to activate common household affordances such as opening a refrigerator door or lifting a cup. Challenges currently being explored include creating natural gestures to interface with virtual objects, developing robust, simple procedures to capture actual living environments and render them in a 3D visualization, and devising systematic, stable terminologies to characterize home environments.
Cue-exposure software for the treatment of bulimia nervosa and binge eating disorder.
Gutiérrez-Maldonado, José; Pla-Sanjuanelo, Joana; Ferrer-García, Marta
2016-11-01
Cue-exposure therapy (CET) has proven its efficacy in treating patients with bulimia nervosa and binge eating disorder who are resistant to standard treatment. Furthermore, incorporating virtual reality (VR) technology is increasingly considered a valid exposure method that may help to increase the efficacy of standard treatments in a variety of eating disorders. Although immersive displays improve the beneficial effects, expensive technology is not always necessary. We aimed to assess whether exposure to food related virtual environments could decrease food craving in a non-clinical sample. In addition, we specifically compared the effects of two VR systems (one non-immersive and one immersive) during CET. We therefore applied a one-session CET to 113 undergraduate students. Decreased food craving was found during exposure to both VR environments compared with pre-treatment levels, supporting the efficacy of VR-CET in reducing food craving. We found no significant differences in craving between immersive and non-immersive systems. Low-cost non-immersive systems applied through 3D laptops can improve the accessibility of this technique. By reducing the costs and improving the usability, VR-CET on 3D laptops may become a viable option that can be readily applied in a greater range of clinical contexts.
Real-time interactive virtual tour on the World Wide Web (WWW)
NASA Astrophysics Data System (ADS)
Yoon, Sanghyuk; Chen, Hai-jung; Hsu, Tom; Yoon, Ilmi
2003-12-01
Web-based virtual tours have become a desirable and in-demand application, yet they are challenging due to the nature of a web application's running environment, such as limited bandwidth and no guarantee of high computation power on the client side. The image-based rendering approach has attractive advantages over the traditional 3D rendering approach in such web applications. The traditional approach, such as VRML, requires a labor-intensive 3D modeling process and high bandwidth and computation power, especially for photo-realistic virtual scenes. QuickTime VR and IPIX, as examples of the image-based approach, use panoramic photos, and the virtual scenes can be generated directly from photos, skipping the modeling process. However, these image-based approaches may require special cameras or effort to take panoramic views, and they provide only fixed-point look-around and zooming in and out rather than 'walk around', which is a very important feature for providing an immersive experience to virtual tourists. The web-based virtual tour using Tour into the Picture employs pseudo-3D geometry with an image-based rendering approach to provide viewers with an immersive experience of walking around the virtual space using several snapshots of conventional photos.
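A rough sketch of the "Tour into the Picture" construction this approach relies on is shown below: a user-chosen back-wall rectangle splits a single photo into five quads (back wall, floor, ceiling, left and right walls) that can then be texture-mapped onto a box for a pseudo-3D walk-through. The corner ordering and plain-tuple interface are assumptions for illustration only, not the paper's implementation.

```python
# Illustrative construction of the five image-space quads used by a
# Tour-into-the-Picture style pseudo-3D scene.

def tip_quads(image_w, image_h, back_rect):
    """back_rect = (left, top, right, bottom) of the inner rectangle, in pixels.
    Returns five image-space quads as lists of (x, y) corners."""
    l, t, r, b = back_rect
    W, H = image_w, image_h
    return {
        "back":    [(l, t), (r, t), (r, b), (l, b)],
        "floor":   [(l, b), (r, b), (W, H), (0, H)],
        "ceiling": [(0, 0), (W, 0), (r, t), (l, t)],
        "left":    [(0, 0), (l, t), (l, b), (0, H)],
        "right":   [(r, t), (W, 0), (W, H), (r, b)],
    }

# Usage: a 1024x768 photo whose back wall was outlined around the centre.
quads = tip_quads(1024, 768, back_rect=(400, 280, 620, 480))
```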
Offenders become the victim in virtual reality: impact of changing perspective in domestic violence.
Seinfeld, S; Arroyo-Palacios, J; Iruretagoyena, G; Hortensius, R; Zapata, L E; Borland, D; de Gelder, B; Slater, M; Sanchez-Vives, M V
2018-02-09
The role of empathy and perspective-taking in preventing aggressive behaviors has been highlighted in several theoretical models. In this study, we used immersive virtual reality to induce a full body ownership illusion that allows offenders to be in the body of a victim of domestic abuse. A group of male domestic violence offenders and a control group without a history of violence experienced a virtual scene of abuse in first-person perspective. During the virtual encounter, the participants' real bodies were replaced with a life-sized virtual female body that moved synchronously with their own real movements. Participants' emotion recognition skills were assessed before and after the virtual experience. Our results revealed that offenders have a significantly lower ability to recognize fear in female faces compared to controls, with a bias towards classifying fearful faces as happy. After being embodied in a female victim, offenders improved their ability to recognize fearful female faces and reduced their bias towards recognizing fearful faces as happy. For the first time, we demonstrate that changing the perspective of an aggressive population through immersive virtual reality can modify socio-perceptual processes such as emotion recognition, thought to underlie this specific form of aggressive behaviors.
Using virtual reality to augment perception, enhance sensorimotor adaptation, and change our minds
Wright, W. Geoffrey
2014-01-01
Technological advances that involve human sensorimotor processes can have both intended and unintended effects on the central nervous system (CNS). This mini review focuses on the use of virtual environments (VE) to augment brain functions by enhancing perception, eliciting automatic motor behavior, and inducing sensorimotor adaptation. VE technology is becoming increasingly prevalent in medical rehabilitation, training simulators, gaming, and entertainment. Although these VE applications have often been shown to optimize outcomes, whether it be to speed recovery, reduce training time, or enhance immersion and enjoyment, there are inherent drawbacks to environments that can potentially change sensorimotor calibration. Across numerous VE studies over the years, we have investigated the effects of combining visual and physical motion on perception, motor control, and adaptation. Recent results from our research involving exposure to dynamic passive motion within a visually-depicted VE reveal that short-term exposure to augmented sensorimotor discordance can result in systematic aftereffects that last beyond the exposure period. Whether these adaptations are advantageous or not, remains to be seen. Benefits as well as risks of using VE-driven sensorimotor stimulation to enhance brain processes will be discussed. PMID:24782724
Persky, Susan; Kaphingst, Kimberly A.; McCall, Cade; Lachance, Christina; Beall, Andrew C.; Blascovich, Jim
2009-01-01
Presence in virtual learning environments (VLEs) has been associated with a number of outcome factors related to a user’s ability and motivation to learn. The extant but relatively small body of research suggests that a high level of presence is related to better performance on learning outcomes in VLEs. Different configurations of form and content variables such as those associated with active (self-driven, interactive activities) versus didactic (reading or lecture) learning may, however, influence how presence operates and on what content it operates. We compared the influence of presence between two types of immersive VLEs (i.e., active versus didactic techniques) on comprehension and engagement-related outcomes. The findings revealed that the active VLE promoted greater presence. Although we found no relationship between presence and learning comprehension outcomes for either virtual environment, presence was related to information engagement variables in the didactic immersive VLE but not the active environment. Results demonstrate that presence is not uniformly elicited or effective across immersive VLEs. Educational delivery mode and environment complexity may influence the impact of presence on engagement. PMID:19366319
Ames Lab 101: C6: Virtual Engineering
McCorkle, Doug
2018-01-01
Ames Laboratory scientist Doug McCorkle explains the importance of virtual engineering and talks about the C6. The C6 is a three-dimensional, fully-immersive synthetic environment residing in the center atrium of Iowa State University's Howe Hall.
Adaptive Effects on Locomotion Performance Following Exposure to a Rotating Virtual Environment
NASA Technical Reports Server (NTRS)
Mulavara, A. P.; Richards, J. T.; Marshburn, A. M.; Bucello, R.; Bloomberg, J. J.
2003-01-01
During long-duration spaceflight, astronauts experience alterations in vestibular and somatosensory cues that result in adaptive disturbances in balance and coordination upon return to Earth. These changes can pose a risk to crew safety and to mission objectives if nominal or emergency vehicle egress is required immediately following long-duration spaceflight. At present, no operational countermeasure is available to mitigate the adaptive sensorimotor component underlying the locomotor disturbances that occur after spaceflight. Therefore, the goal of this study is to develop an inflight training regimen that facilitates recovery of locomotor function after long-duration spaceflight. The countermeasure we are proposing is based on the concept of adaptive generalization. During this type of training the subject gains experience producing the appropriate adaptive motor behavior under a variety of sensory conditions and response constraints. As a result of this training a subject learns to solve a class of motor problems, rather than a specific motor solution to one problem; i.e., the subject learns response generalizability, or the ability to "learn to learn" under a variety of environmental constraints. We are developing an inflight countermeasure built around treadmill exercise activities. By manipulating the sensory conditions of exercise by varying visual flow patterns, body load and speed, we will systematically and repeatedly promote adaptive change in locomotor behavior. It has been shown that variable practice training increases adaptability to novel visuo-motor situations. While walking over ground in a stereoscopic virtual environment that oscillated in roll, subjects have shown compensatory torso rotation in the direction of scene rotation that resulted in positional variation away from a desired linear path. Thus, postural sway and locomotor stability in 1-g can be modulated by visual flow patterns and used during inflight treadmill training to promote adaptive generalization. The purpose of this study was to determine if adaptive modification in locomotor performance could be achieved by viewing simulated self-motion in a passive-immersive virtual environment over a prolonged period during treadmill locomotion.
NASA Astrophysics Data System (ADS)
McFadden, D.; Tavakkoli, A.; Regenbrecht, J.; Wilson, B.
2017-12-01
Virtual Reality (VR) and Augmented Reality (AR) applications have recently seen an impressive growth, thanks to the advent of commercial Head Mounted Displays (HMDs). This new visualization era has opened the possibility of presenting researchers from multiple disciplines with data visualization techniques not possible via traditional 2D screens. In a purely VR environment, researchers are presented with the visual data in a virtual environment, whereas in a purely AR application, a virtual object is projected into the real world, with which researchers can interact. There are several limitations to purely VR or AR applications when taken within the context of remote planetary exploration. For example, in a purely VR environment, contents of the planet surface (e.g. rocks, terrain, or other features) need to be created offline from a multitude of images using image processing techniques to generate the 3D mesh data that will populate the virtual surface of the planet. This process usually takes a tremendous amount of computational resources and cannot be delivered in real time. As an alternative, video frames may be superimposed on the virtual environment to save processing time. However, such rendered video frames will lack 3D visual information, i.e. depth information. In this paper, we present a technique to utilize a remotely situated robot's stereoscopic cameras to provide a live visual feed from the real world into the virtual environment in which planetary scientists are immersed. Moreover, the proposed technique blends the virtual environment with the real world in such a way as to preserve both the depth and visual information from the real world while allowing for the sensation of immersion when the entire sequence is viewed via an HMD such as the Oculus Rift. The figure shows the virtual environment with an overlay of the real-world stereoscopic video being presented in real time into the virtual environment. Notice the preservation of the object's shape, shadows, and depth information. The distortions shown in the image are due to the rendering of the stereoscopic data into a 2D image for the purpose of taking screenshots.
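One way to realize the kind of depth-preserving blend described above is sketched below in Python with OpenCV: a disparity map is estimated from the stereo pair, converted to depth, and the camera pixel is kept wherever it is closer than the virtual geometry. The camera parameters and the availability of a virtual depth buffer are assumed inputs; this is a sketch of depth-based compositing in general, not the authors' pipeline.

```python
# Hedged sketch: composite a live stereo camera feed over a rendered virtual
# scene using per-pixel depth comparison.
import cv2
import numpy as np

def composite_stereo_over_virtual(left_bgr, right_bgr, virtual_bgr,
                                  virtual_depth_m, fx_px, baseline_m):
    left_gray = cv2.cvtColor(left_bgr, cv2.COLOR_BGR2GRAY)
    right_gray = cv2.cvtColor(right_bgr, cv2.COLOR_BGR2GRAY)

    # Block-matching disparity; OpenCV returns fixed-point values scaled by 16.
    matcher = cv2.StereoBM_create(numDisparities=64, blockSize=15)
    disparity = matcher.compute(left_gray, right_gray).astype(np.float32) / 16.0

    # Depth from disparity (metres); invalid matches are treated as infinitely far.
    depth = np.full(disparity.shape, np.inf, dtype=np.float32)
    valid = disparity > 0
    depth[valid] = fx_px * baseline_m / disparity[valid]

    # Keep the real-world pixel wherever it is nearer than the virtual surface.
    mask = (depth < virtual_depth_m)[..., None]
    return np.where(mask, left_bgr, virtual_bgr)
```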
Contextual EFL Learning in a 3D Virtual Environment
ERIC Educational Resources Information Center
Lan, Yu-Ju
2015-01-01
The purposes of the current study are to develop virtually immersive EFL learning contexts for EFL learners in Taiwan to pre- and review English materials beyond the regular English class schedule. A 2-iteration action research lasting for one semester was conducted to evaluate the effects of virtual contexts on learners' EFL learning. 132…
ERIC Educational Resources Information Center
Hack, Catherine Jane
2016-01-01
Using the delivery of a large postgraduate distance learning module in bioethics to health professionals as an illustrative example, the type of learning activity that could be enhanced through delivery in an immersive virtual world (IVW) was explored. Several activities were repurposed from the "traditional" virtual learning environment…
Enhance Learning on Software Project Management through a Role-Play Game in a Virtual World
ERIC Educational Resources Information Center
Maratou, Vicky; Chatzidaki, Eleni; Xenos, Michalis
2016-01-01
This article presents a role-play game for software project management (SPM) in a three-dimensional online multiuser virtual world. The Opensimulator platform is used for the creation of an immersive virtual environment that facilitates students' collaboration and realistic interaction, in order to manage unexpected events occurring during the…
ERIC Educational Resources Information Center
Winkelmann, Kurt; Keeney-Kennicutt, Wendy; Fowler, Debra; Macik, Maria
2017-01-01
Virtual worlds are a potential medium for teaching college-level chemistry laboratory courses. To determine the feasibility of conducting chemistry experiments in such an environment, undergraduate students performed two experiments in the immersive virtual world of Second Life (SL) as part of their regular General Chemistry 2 laboratory course.…
Teaching Physics to Deaf College Students in a 3-D Virtual Lab
ERIC Educational Resources Information Center
Robinson, Vicki
2013-01-01
Virtual worlds are used in many educational and business applications. At the National Technical Institute for the Deaf at Rochester Institute of Technology (NTID/RIT), deaf college students are introduced to the virtual world of Second Life, which is a 3-D immersive, interactive environment, accessed through computer software. NTID students use…
L2 Immersion in 3D Virtual Worlds: The Next Thing to Being There?
ERIC Educational Resources Information Center
Paillat, Edith
2014-01-01
Second Life is one of the many three-dimensional virtual environments accessible through a computer and a fast broadband connection. Thousands of participants connect to this platform to interact virtually with the world, join international communities of practice and, for some, role play groups. Unlike online role play games however, Second Life…
ERIC Educational Resources Information Center
Lau, Kung Wong; Lee, Pui Yuen
2015-01-01
This paper discusses the roles of simulation in creativity education and how to apply immersive virtual environments to enhance students' learning experiences in university, through the provision of interactive simulations. An empirical study of a simulated virtual reality was carried out in order to investigate the effectiveness of providing…
ERIC Educational Resources Information Center
Kim, Mi Hwa
2013-01-01
The purpose of this experimental study was to investigate the impact of the use of a virtual environment for learning Korean history on high school students' learning outcomes and attitudes toward virtual worlds (collaboration, engagement, general use of SL [Second Life], and immersion). In addition, this experiment examined the relationships…
Students' First Impression of Second Life: A Case from the United Arab Emirates
ERIC Educational Resources Information Center
Abdallah, Salam; Douglas, Jamal
2010-01-01
Emerging 3D virtual worlds such as Second Life can offer students with opportunities to enhance learning using rich collaborative asynchronous media. Virtual worlds are believed to impact the future of higher education and therefore, universities across the world are immersing themselves inside virtual worlds to establish a unique learning and…
"Immersive Education" Submerges Students in Online Worlds Made for Learning
ERIC Educational Resources Information Center
Foster, Andrea L.
2007-01-01
Immersive Education is a multimillion-dollar project devoted to build virtual-reality software exclusively for education within commercial and nonprofit fantasy spaces like Second Life. The project combines interactive three-dimensional graphics, Web cameras, Internet-based telephony, and other digital media. Some critics have complained that…
Architectures for Developing Multiuser, Immersive Learning Scenarios
ERIC Educational Resources Information Center
Nadolski, Rob J.; Hummel, Hans G. K.; Slootmaker, Aad; van der Vegt, Wim
2012-01-01
Multiuser immersive learning scenarios hold strong potential for lifelong learning as they can support the acquisition of higher order skills in an effective, efficient, and attractive way. Existing virtual worlds, game development platforms, and game engines only partly cater for the proliferation of such learning scenarios as they are often…
Three-Dimensional User Interfaces for Immersive Virtual Reality
NASA Technical Reports Server (NTRS)
vanDam, Andries
1997-01-01
The focus of this grant was to experiment with novel user interfaces for immersive Virtual Reality (VR) systems, and thus to advance the state of the art of user interface technology for this domain. Our primary test application was a scientific visualization application for viewing Computational Fluid Dynamics (CFD) datasets. This technology has been transferred to NASA via periodic status reports and papers relating to this grant that have been published in conference proceedings. This final report summarizes the research completed over the past year, and extends last year's final report of the first three years of the grant.
Smith, Jordan W.
2015-01-01
Immersive virtual environment (IVE) technology offers a wide range of potential benefits to research focused on understanding how individuals perceive and respond to built and natural environments. In an effort to broaden awareness and use of IVE technology in perception, preference and behavior research, this review paper describes how IVE technology can be used to complement more traditional methods commonly applied in public health research. The paper also describes a relatively simple workflow for creating and displaying 360° virtual environments of built and natural settings and presents two freely-available and customizable applications that scientists from a variety of disciplines, including public health, can use to advance their research into human preferences, perceptions and behaviors related to built and natural settings. PMID:26378565
Smith, Jordan W
2015-09-11
Immersive virtual environment (IVE) technology offers a wide range of potential benefits to research focused on understanding how individuals perceive and respond to built and natural environments. In an effort to broaden awareness and use of IVE technology in perception, preference and behavior research, this review paper describes how IVE technology can be used to complement more traditional methods commonly applied in public health research. The paper also describes a relatively simple workflow for creating and displaying 360° virtual environments of built and natural settings and presents two freely-available and customizable applications that scientists from a variety of disciplines, including public health, can use to advance their research into human preferences, perceptions and behaviors related to built and natural settings.
Truck driver fatigue assessment using a virtual reality system.
DOT National Transportation Integrated Search
2016-10-17
In this study, a fully immersive Virtual Reality (VR) based driving simulator was developed to serve as a proof-of-concept that VR can be utilized to assess the level of fatigue (or drowsiness) truck drivers typically experience during real...
ERIC Educational Resources Information Center
Panettieri, Joseph C.
2007-01-01
Across the globe, progressive universities are embracing any number of MUVEs (multi-user virtual environments), 3D environments, and "immersive" virtual reality tools. And within the next few months, several universities are expected to test so-called "telepresence" videoconferencing systems from Cisco Systems and other leading…
Marescaux, J; Clément, J M; Nord, M; Russier, Y; Tassetti, V; Mutter, D; Cotin, S; Ayache, N
1997-11-01
Surgical simulation increasingly appears to be an essential aspect of tomorrow's surgery. The development of a hepatic surgery simulator is an advanced concept calling for a new writing system which will transform the medical world: virtual reality. Virtual reality extends the perception of our five senses by representing more than the real state of things by means of computer science and robotics. It consists of three concepts: immersion, navigation and interaction. Three reasons have led us to develop this simulator: the first is to provide the surgeon with a comprehensive visualisation of the organ. The second reason is to allow for planning and surgical simulation that could be compared with the detailed flight plan for a commercial jet pilot. The third lies in the fact that virtual reality is an integrated part of the concept of computer-assisted surgical procedures. The project consists of a sophisticated simulator which has to meet five requirements: visual fidelity, interactivity, physical properties, physiological properties, and sensory input and output. In this report we describe how to obtain a realistic 3D model of the liver from two-dimensional (2D) medical images for anatomical and surgical training. The introduction of a tumor and the consequent planning and virtual resection are also described, as are force feedback and real-time interaction.
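A small sketch of the generic reconstruction pipeline the abstract alludes to (stack the 2D slices, segment the organ by thresholding, extract a surface with marching cubes) is given below in Python with scikit-image; the threshold, voxel spacing, and choice of library are assumptions for illustration, not the authors' method.

```python
# Hedged sketch: build a 3D surface mesh from a stack of 2D image slices.
import numpy as np
from skimage import measure

def surface_from_slices(slices, threshold, spacing=(1.0, 1.0, 1.0)):
    """slices: list of equally sized 2-D arrays (one per image plane)."""
    volume = np.stack(slices, axis=0).astype(np.float32)
    binary = volume > threshold                     # crude organ segmentation
    verts, faces, normals, _ = measure.marching_cubes(binary.astype(np.float32),
                                                      level=0.5, spacing=spacing)
    return verts, faces, normals                    # mesh for rendering or planning

# Usage with a synthetic sphere phantom standing in for 2-D medical images.
z, y, x = np.mgrid[0:64, 0:64, 0:64]
phantom = ((x - 32) ** 2 + (y - 32) ** 2 + (z - 32) ** 2 < 20 ** 2).astype(np.float32)
verts, faces, normals = surface_from_slices(list(phantom), threshold=0.5)
```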
Carbohydrates and sports practice: a Twitter virtual ethnography
Rodríguez-Martín, Beatriz; Castillo, Carlos Alberto
2017-02-01
Introduction: Although carbohydrates consumption is a key factor to enhance sport performance, intake levels seem questioned by some amateur athletes, leading to develop an irrational aversion to carbohydrate known as “carbophobia”. On the other hand, food is the origin of virtual communities erected as a source of knowledge and a way to exchange information. Despite this, very few studies have analysed the influence of social media in eating behaviours. Objectives: To know the conceptualizations about carbohydrates intake and eating patterns related to carbophobia expressed in amateur athletes’ Twitter accounts. Methods: Qualitative research designed from Hine’s Virtual Ethnography. Virtual immersion was used for data collection in Twitter open accounts in a theoretical sample of tweets from amateur athletes. Discourse analysis of narrative information of tweets was carried out through open, axial and selective coding process and the constant comparison method. Results: Data analysis revealed four main categories that offered a picture of conceptualizations of carbohydrates: carbohydrates as suspects or guilty from slowing down training, carbophobia as a lifestyle, carbophobia as a religion and finally the love/hate relationship with carbohydrates. Conclusions: Low-carbohydrate diet is considered a healthy lifestyle in some amateur athletes. The results of this study show the power of virtual communication tools such as Twitter to support, promote and maintain uncommon and not necessarily healthy eating behaviours. Future studies should focus on the context in which these practices appear.
NASA Technical Reports Server (NTRS)
Harm, Deborah L.; Taylor, L. C.; Bloomberg, J. J.
2007-01-01
Virtual environments (VE) offer unique training opportunities, particularly for training astronauts and preadapting them to the novel sensory conditions of microgravity. Sensorimotor aftereffects of VEs are often quite similar to adaptive sensorimotor responses observed in astronauts during and/or following space flight. The purpose of this research was to compare disturbances in sensorimotor coordination produced by a dome virtual environment display and to examine the effects of exposure duration and repeated exposures to VR systems. The current study examined disturbances in eye-head-hand (EHH) and eye-head coordination. Preliminary results will be presented. Eleven subjects have participated in the study to date. One training session was completed in order to achieve stable performance on the EHH coordination and VE tasks. Three experimental sessions were performed, each separated by one day. Subjects performed a navigation and pick and place task in a dome immersive display VE for 30 or 60 min. The subjects were asked to move objects from one set of 15 pedestals to the other set across a virtual square room through a random pathway as quickly and accurately as possible. EHH coordination was measured before, immediately after, and at 1 hr, 2 hr, 4 hr and 6 hr following exposure to VR. EHH coordination was measured as position errors and reaction time in a pointing task that included multiple horizontal and vertical LED targets. Repeated measures ANOVAs were used to analyze the data. In general, we observed significant increases in position errors for both horizontal and vertical targets. The largest decrements were observed immediately following exposure to VR and showed a fairly rapid recovery across test sessions, but not across days. Subjects generally showed faster RTs across days. Individuals recovered from the detrimental effects of exposure to the VE on position errors within 1-2 hours. The fact that subjects did not significantly improve across days suggests that achieving dual adaptation of EHH coordination may require more than three training sessions. These findings provide some direction for developing training schedules for VE users that facilitate adaptation, support the idea that preflight training of astronauts may serve as a useful countermeasure for the sensorimotor effects of space flight, and support the idea that VEs may serve as an analog for sensorimotor effects of spaceflight.
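The repeated-measures ANOVAs mentioned in this abstract could be run as in the brief Python sketch below (statsmodels); the long-format table, column names, and CSV file are hypothetical stand-ins, not the study's data.

    # Repeated-measures ANOVA sketch; column names are illustrative only.
    import pandas as pd
    from statsmodels.stats.anova import AnovaRM

    # Hypothetical long-format file with columns:
    # subject, session (day 1-3), time_point (pre, post, 1h, 2h, 4h, 6h), position_error
    df = pd.read_csv("ehh_position_errors.csv")

    result = AnovaRM(data=df,
                     depvar="position_error",
                     subject="subject",
                     within=["session", "time_point"]).fit()
    print(result)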
Fromberger, Peter; Meyer, Sabrina; Kempf, Christina; Jordan, Kirsten; Müller, Jürgen L.
2015-01-01
Virtual Reality (VR) has successfully been used in the research of human behavior for more than twenty years. The main advantage of VR is its capability to induce a high sense of presence. This results in emotions and behavior which are very close to those shown in real situations. In the context of sex research, only a few studies have used high-immersive VR so far. The ones that did can be found mostly in the field of forensic psychology. Nevertheless, the relationship between presence and sexual interest still remains unclear. The present study is the first to examine the advantages of high-immersive VR in comparison to a conventional standard desktop system regarding their capability to measure sexual interest. 25 gynephilic and 20 androphilic healthy men underwent three experimental conditions, which differed in their ability to induce a sense of presence. In each condition, participants were asked to rate ten male and ten female virtual human characters regarding their sexual attractiveness. Without their knowledge, the subjects’ viewing time was assessed throughout the rating. Subjects were then asked to rate the sense of presence they had experienced as well as their perceived realism of the characters. Results suggested that stereoscopic viewing can significantly enhance the subjective sexual attractiveness of sexually relevant characters. Furthermore, in all three conditions participants looked significantly longer at sexually relevant virtual characters than at sexually non-relevant ones. The high immersion condition provided the best discriminant validity. From a statistical point of view, however, the sense of presence had no significant influence on the discriminant validity of the viewing time task. The study showed that high-immersive virtual environments enhance realism ratings as well as ratings of sexual attractiveness of three-dimensional human stimuli in comparison to standard desktop systems. Results also show that viewing time seems to be influenced neither by sexual attractiveness nor by realism of stimuli. This indicates how important task specific mechanisms of the viewing time effect are. PMID:25992790
The impact of contextualization on immersion in healthcare simulation.
Engström, Henrik; Andersson Hagiwara, Magnus; Backlund, Per; Lebram, Mikael; Lundberg, Lars; Johannesson, Mikael; Sterner, Anders; Maurin Söderholm, Hanna
2016-01-01
The aim of this paper is to explore how contextualization of healthcare simulation scenarios impacts immersion, using a novel objective instrument, the Immersion Score Rating Instrument. This instrument consists of 10 triggers that indicate reduced or enhanced immersion among participants in a simulation scenario. Triggers refer to events such as jumps in time or space (sign of reduced immersion) and natural interaction with the manikin (sign of enhanced immersion) and can be used to calculate an immersion score. An experiment using a randomized controlled crossover design was conducted to compare immersion between two simulation training conditions for prehospital care: one basic and one contextualized. The Immersion Score Rating Instrument was used to compare the total immersion score for the whole scenario, the immersion score for individual mission phases, and to analyze differences in trigger occurrences. A paired t test was used to test for significance. The comparison shows that the overall immersion score for the simulation was higher in the contextualized condition. The average immersion score was 2.17 (sd = 1.67) in the contextualized condition and -0.77 (sd = 2.01) in the basic condition (p < .001). The immersion score was significantly higher in the contextualized condition in five out of six mission phases. Events that might be disruptive for the simulation participants' immersion, such as interventions of the instructor and illogical jumps in time or space, are present to a higher degree in the basic scenario condition, while events that signal enhanced immersion, such as natural interaction with the manikin, are more frequently observed in the contextualized condition. The results suggest that contextualization of simulation training with respect to increased equipment and environmental fidelity as well as functional task alignment might affect immersion positively and thus contribute to an improved training experience.
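As a rough illustration of how a trigger-based immersion score and the paired comparison could be computed, the Python sketch below tallies a score per scenario run and applies a paired t test (scipy); the trigger names, the +1/-1 scoring rule, and the example logs are assumptions, not the published instrument.

    from scipy.stats import ttest_rel

    # Assumed trigger names; the real instrument defines 10 specific triggers.
    ENHANCING = {"natural_manikin_interaction", "in_role_communication"}
    REDUCING = {"time_or_space_jump", "instructor_intervention"}

    def immersion_score(triggers):
        """Tally +1 for each immersion-enhancing and -1 for each immersion-reducing event."""
        return sum(1 if t in ENHANCING else -1 if t in REDUCING else 0 for t in triggers)

    # Hypothetical trigger logs, one list per team and condition (paired crossover design).
    basic_logs = [["time_or_space_jump", "instructor_intervention"],
                  ["time_or_space_jump"],
                  ["natural_manikin_interaction", "time_or_space_jump", "time_or_space_jump"]]
    contextualized_logs = [["natural_manikin_interaction", "in_role_communication"],
                           ["natural_manikin_interaction"],
                           ["in_role_communication", "natural_manikin_interaction"]]

    basic = [immersion_score(run) for run in basic_logs]
    contextualized = [immersion_score(run) for run in contextualized_logs]
    t, p = ttest_rel(contextualized, basic)
    print(f"paired t-test: t = {t:.2f}, p = {p:.3f}")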
What about the Firewall? Creating Virtual Worlds in a Public Primary School Using Sim-on-a-Stick
ERIC Educational Resources Information Center
Jacka, Lisa; Booth, Kate
2012-01-01
Virtual worlds are highly immersive, engaging and popular computer mediated environments being explored by children and adults. Why then aren't more teachers using virtual worlds in the classroom with primary and secondary school students? Reasons often cited are the learning required to master the technology, low-end graphics cards, poor…
ERIC Educational Resources Information Center
Chihak, Benjamin J.; Plumert, Jodie M.; Ziemer, Christine J.; Babu, Sabarish; Grechkin, Timofey; Cremer, James F.; Kearney, Joseph K.
2010-01-01
Two experiments examined how 10- and 12-year-old children and adults intercept moving gaps while bicycling in an immersive virtual environment. Participants rode an actual bicycle along a virtual roadway. At 12 test intersections, participants attempted to pass through a gap between 2 moving, car-sized blocks without stopping. The blocks were…
Possibilities and Determinants of Using Low-Cost Devices in Virtual Education Applications
ERIC Educational Resources Information Center
Bun, Pawel Kazimierz; Wichniarek, Radoslaw; Górski, Filip; Grajewski, Damian; Zawadzki, Przemyslaw; Hamrol, Adam
2017-01-01
Virtual reality (VR) may be used as an innovative educational tool. However, in order to fully exploit its potential, it is essential to achieve the effect of immersion. To more completely submerge the user in a virtual environment, it is necessary to ensure that the user's actions are directly translated into the image generated by the…
Can Virtual Science Foster Real Skills? A Study of Inquiry Skills in a Virtual World
ERIC Educational Resources Information Center
Dodds, Heather E.
2013-01-01
Online education has grown into a part of the educational market answering the demand for learning at the learner's choice of time and place. Inquiry skills such as observing, questioning, collecting data, and devising fair experiments are an essential element of 21st-century online science coursework. Virtual immersive worlds such as Second Life…
ERIC Educational Resources Information Center
Bailenson, Jeremy N.; Yee, Nick; Blascovich, Jim; Beall, Andrew C.; Lundblad, Nicole; Jin, Michael
2008-01-01
This article illustrates the utility of using virtual environments to transform social interaction via behavior and context, with the goal of improving learning in digital environments. We first describe the technology and theories behind virtual environments and then report data from 4 empirical studies. In Experiment 1, we demonstrated that…
ERIC Educational Resources Information Center
Ryan, Jenna; Porter, Marjorie; Miller, Rebecca
2010-01-01
Current literature on libraries is abundant with articles about the uses and the potential of new interactive communication technology, including Web 2.0 tools. Recently, the advent and use of virtual worlds have received top billing in these works. Many library institutions are exploring these virtual environments; this exploration and the…
Augmenting the Thermal Flux Experiment: A Mixed Reality Approach with the HoloLens
ERIC Educational Resources Information Center
Strzys, M. P.; Kapp, S.; Thees, M.; Kuhn, J.; Lukowicz, P.; Knierim, P.; Schmidt, A.
2017-01-01
In the field of Virtual Reality (VR) and Augmented Reality (AR), technologies have made huge progress during the last years and also reached the field of education. The virtuality continuum, ranging from pure virtuality on one side to the real world on the other, has been successfully covered by the use of immersive technologies like head-mounted…
Virtual Worlds; Real Learning: Design Principles for Engaging Immersive Environments
NASA Technical Reports Server (NTRS)
Wu (u. Sjarpm)
2012-01-01
The EMDT master's program at Full Sail University embarked on a small project to use a virtual environment to teach graduate students. The property used for this project has evolved over several iterations and has yielded some basic design principles and pedagogy for virtual spaces. As a result, students are emerging from the program with a better grasp of future possibilities.
Emerging Conceptual Understanding of Complex Astronomical Phenomena by Using a Virtual Solar System
ERIC Educational Resources Information Center
Gazit, Elhanan; Yair, Yoav; Chen, David
2005-01-01
This study describes high school students' conceptual development of the basic astronomical phenomena during real-time interactions with a Virtual Solar System (VSS). The VSS is a non-immersive virtual environment which has a dynamic frame of reference that can be altered by the user. Ten 10th grade students were given tasks containing a set of…
Embodying self-compassion within virtual reality and its effects on patients with depression
Falconer, Caroline J.; Rovira, Aitor; King, John A.; Gilbert, Paul; Antley, Angus; Fearon, Pasco; Ralph, Neil; Slater, Mel
2016-01-01
Background Self-criticism is a ubiquitous feature of psychopathology and can be combatted by increasing levels of self-compassion. However, some patients are resistant to self-compassion. Aims To investigate whether the effects of self-identification with virtual bodies within immersive virtual reality could be exploited to increase self-compassion in patients with depression. Method We developed an 8-minute scenario in which 15 patients practised delivering compassion in one virtual body and then experienced receiving it from themselves in another virtual body. Results In an open trial, three repetitions of this scenario led to significant reductions in depression severity and self-criticism, as well as to a significant increase in self-compassion, from baseline to 4-week follow-up. Four patients showed clinically significant improvement. Conclusions The results indicate that interventions using immersive virtual reality may have considerable clinical potential and that further development of these methods preparatory to a controlled trial is now warranted. Declaration of interest None. Copyright and usage © The Royal College of Psychiatrists 2016. This is an open access article distributed under the terms of the Creative Commons Attribution (CC BY) licence. PMID:27703757
Virtualized Traffic: reconstructing traffic flows from discrete spatiotemporal data.
Sewall, Jason; van den Berg, Jur; Lin, Ming C; Manocha, Dinesh
2011-01-01
We present a novel concept, Virtualized Traffic, to reconstruct and visualize continuous traffic flows from discrete spatiotemporal data provided by traffic sensors or generated artificially to enhance a sense of immersion in a dynamic virtual world. Given the positions of each car at two recorded locations on a highway and the corresponding time instances, our approach can reconstruct the traffic flows (i.e., the dynamic motions of multiple cars over time) between the two locations along the highway for immersive visualization of virtual cities or other environments. Our algorithm is applicable to high-density traffic on highways with an arbitrary number of lanes and takes into account the geometric, kinematic, and dynamic constraints on the cars. Our method reconstructs the car motion that automatically minimizes the number of lane changes, respects safety distance to other cars, and computes the acceleration necessary to obtain a smooth traffic flow subject to the given constraints. Furthermore, our framework can process a continuous stream of input data in real time, enabling the users to view virtualized traffic events in a virtual world as they occur. We demonstrate our reconstruction technique with both synthetic and real-world input. © 2011 IEEE Published by the IEEE Computer Society
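A minimal sketch of the kinematic side of such a reconstruction is shown below: one car's motion between two sensor readings is interpolated with a cubic Hermite segment that matches position and speed at both ends, and then checked against an acceleration bound. This is an illustration under simplifying assumptions, not the authors' lane-change or flow-optimization algorithm.

    # Illustrative single-car reconstruction between two sensor readings.
    import numpy as np

    def hermite_trajectory(x0, v0, t0, x1, v1, t1, n=50, a_max=3.0):
        T = t1 - t0
        s = np.linspace(0.0, 1.0, n)               # normalized time
        h00 = 2*s**3 - 3*s**2 + 1
        h10 = s**3 - 2*s**2 + s
        h01 = -2*s**3 + 3*s**2
        h11 = s**3 - s**2
        x = h00*x0 + h10*T*v0 + h01*x1 + h11*T*v1  # position along the lane (m)
        t = t0 + s*T
        accel = np.gradient(np.gradient(x, t), t)  # numeric second derivative
        feasible = np.all(np.abs(accel) <= a_max)  # simple dynamics/comfort check
        return t, x, feasible

    t, x, ok = hermite_trajectory(x0=0.0, v0=25.0, t0=0.0, x1=480.0, v1=20.0, t1=20.0)
    print("kinematically feasible:", ok)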
[Subjective sensations indicating simulator sickness and fatigue after exposure to virtual reality].
Malińska, Marzena; Zuzewicz, Krystyna; Bugajska, Joanna; Grabowski, Andrzej
2014-01-01
The study assessed the incidence and intensity of subjective symptoms indicating simulator sickness among persons with no inclination to motion sickness, immersed in virtual reality (VR) by watching an hour-long movie in the stereoscopic (three-dimensional - 3D) and non-stereoscopic (two-dimensional - 2D) versions and after an hour-long training using virtual reality, called sVR. The sample comprised 20 healthy young men with no inclination to motion sickness. The participants' subjective sensations, indicating symptoms of simulator sickness, were assessed using a questionnaire completed by the participants immediately, 20 min and 24 h following the test. Grandjean's scale was used to assess fatigue and mood. The symptoms were observed immediately after the exposure to sVR. Their intensity was higher than after watching the 2D and 3D movies. A significant relationship was found between eye pain and the type of exposure (2D, 3D and sVR) (χ²(2) = 6.225, p ≤ 0.05); a relationship between excessive perspiration and the exposure to the 3D movie and sVR was also noted (χ²(1) = 9.173, p ≤ 0.01). Some symptoms were still observed 20 min after exposure to sVR. The comparison of Grandjean's scale results before and after the training using sVR showed significant differences in 11 out of 14 subscales. Before and after exposure to the 3D movie, the differences were significant only for the "tired-fatigued" subscale (Z = 2.501, p ≤ 0.012), in favor of "fatigued". Based on the subjective sensation of discomfort after watching 2D and 3D movies it is impossible to predict symptoms of simulator sickness after training using sVR.
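The chi-square relationships reported above can be tested from a symptom-by-exposure contingency table, as in the short Python sketch below; the counts are invented for illustration, and only the 2D/3D/sVR structure follows the study description.

    # Chi-square test of symptom occurrence vs. type of exposure (illustrative counts).
    import numpy as np
    from scipy.stats import chi2_contingency

    # rows: eye pain reported yes/no; columns: 2D movie, 3D movie, sVR training
    table = np.array([[3, 6, 11],
                      [17, 14, 9]])

    chi2, p, dof, expected = chi2_contingency(table)
    print(f"chi2({dof}) = {chi2:.3f}, p = {p:.3f}")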
The Role of Immersive Media in Online Education
ERIC Educational Resources Information Center
Bronack, Stephen C.
2011-01-01
An increasing number of educators are integrating immersive media into core course offerings. Virtual worlds, serious games, simulations, and augmented reality are enabling students and instructors to connect with content and with one another in novel ways. As a result, many are investigating the new affordances these media provide and the impact…
Interactive Molecular Graphics for Augmented Reality Using HoloLens.
Müller, Christoph; Krone, Michael; Huber, Markus; Biener, Verena; Herr, Dominik; Koch, Steffen; Reina, Guido; Weiskopf, Daniel; Ertl, Thomas
2018-06-13
Immersive technologies like stereo rendering, virtual reality, or augmented reality (AR) are often used in the field of molecular visualisation. Modern, comparably lightweight and affordable AR headsets like Microsoft's HoloLens open up new possibilities for immersive analytics in molecular visualisation. A crucial factor for a comprehensive analysis of molecular data in AR is the rendering speed. HoloLens, however, has limited hardware capabilities due to requirements like battery life, fanless cooling and weight. Consequently, insights from best practises for powerful desktop hardware may not be transferable. Therefore, we evaluate the capabilities of the HoloLens hardware for modern, GPU-enabled, high-quality rendering methods for the space-filling model commonly used in molecular visualisation. We also assess the scalability for large molecular data sets. Based on the results, we discuss ideas and possibilities for immersive molecular analytics. Besides more obvious benefits like the stereoscopic rendering offered by the device, this specifically includes natural user interfaces that use physical navigation instead of the traditional virtual one. Furthermore, we consider different scenarios for such an immersive system, ranging from educational use to collaborative scenarios.
Kim, Aram; Zhou, Zixuan; Kretch, Kari S; Finley, James M
2017-07-01
The ability to successfully navigate obstacles in our environment requires integration of visual information about the environment with estimates of our body's state. Previous studies have used partial occlusion of the visual field to explore how information about the body and impending obstacles are integrated to mediate a successful clearance strategy. However, because these manipulations often remove information about both the body and obstacle, it remains to be seen how information about the lower extremities alone is utilized during obstacle crossing. Here, we used an immersive virtual reality (VR) interface to explore how visual feedback of the lower extremities influences obstacle crossing performance. Participants wore a head-mounted display while walking on a treadmill and were instructed to step over obstacles in a virtual corridor in four different feedback trials. The trials involved: (1) no visual feedback of the lower extremities, (2) an endpoint-only model, (3) a link-segment model, and (4) a volumetric multi-segment model. We found that with the volumetric model, participants had a higher success rate, placed their trailing foot before crossing and their leading foot after crossing more consistently, and placed their leading foot closer to the obstacle after crossing, compared to no model. This knowledge is critical for the design of obstacle negotiation tasks in immersive virtual environments as it may provide information about the fidelity necessary to reproduce ecologically valid practice environments.
Yu, Xunyi; Ganz, Aura
2011-01-01
In this paper we introduce a Mixed Reality Triage and Evacuation game, MiRTE, that is used in the development, testing and training of Mass Casualty Incident (MCI) information systems for first responders. Using the Source game engine from Valve software, MiRTE creates immersive virtual environments to simulate various incident scenarios, and enables interactions between multiple players/first responders. What distinguishes it from a pure computer simulation game is that it can interface with external mass casualty incident management systems, such as DIORAMA. The game will enable system developers to specify technical requirements of underlying technology, and test different alternatives of design. After the information system hardware and software are completed, the game can simulate various algorithms such as localization technologies, and interface with an actual user interface on PCs and Smartphones. We implemented and tested the game with the DIORAMA system.
How incorporation of scents could enhance immersive virtual experiences
Ischer, Matthieu; Baron, Naëm; Mermoud, Christophe; Cayeux, Isabelle; Porcherot, Christelle; Sander, David; Delplanque, Sylvain
2014-01-01
Under normal everyday conditions, senses all work together to create experiences that fill a typical person's life. Unfortunately for behavioral and cognitive researchers who investigate such experiences, standard laboratory tests are usually conducted in a nondescript room in front of a computer screen. They are very far from replicating the complexity of real world experiences. Recently, immersive virtual reality (IVR) environments became promising methods to immerse people into an almost real environment that involves more senses. IVR environments provide many similarities to the complexity of the real world and at the same time allow experimenters to constrain experimental parameters to obtain empirical data. This can eventually lead to better treatment options and/or new mechanistic hypotheses. The idea that increasing sensory modalities improve the realism of IVR environments has been empirically supported, but the senses used did not usually include olfaction. In this technology report, we will present an odor delivery system applied to a state-of-the-art IVR technology. The platform provides a three-dimensional, immersive, and fully interactive visualization environment called “Brain and Behavioral Laboratory—Immersive System” (BBL-IS). The solution we propose can reliably deliver various complex scents during different virtual scenarios, at a precise time and space and without contamination of the environment. The main features of this platform are: (i) the limited cross-contamination between odorant streams with a fast odor delivery (< 500 ms), (ii) the ease of use and control, and (iii) the possibility to synchronize the delivery of the odorant with pictures, videos or sounds. How this unique technology could be used to investigate typical research questions in olfaction (e.g., emotional elicitation, memory encoding or attentional capture by scents) will also be addressed. PMID:25101017
Li, Benjamin J.; Bailenson, Jeremy N.; Pines, Adam; Greenleaf, Walter J.; Williams, Leanne M.
2017-01-01
Virtual reality (VR) has been proposed as a methodological tool to study the basic science of psychology and other fields. One key advantage of VR is that sharing of virtual content can lead to more robust replication and representative sampling. A database of standardized content will help fulfill this vision. There are two objectives to this study. First, we seek to establish and allow public access to a database of immersive VR video clips that can act as a potential resource for studies on emotion induction using virtual reality. Second, given the large sample size of participants needed to get reliable valence and arousal ratings for our video, we were able to explore the possible links between the head movements of the observer and the emotions he or she feels while viewing immersive VR. To accomplish our goals, we sourced for and tested 73 immersive VR clips which participants rated on valence and arousal dimensions using self-assessment manikins. We also tracked participants' rotational head movements as they watched the clips, allowing us to correlate head movements and affect. Based on past research, we predicted relationships between the standard deviation of head yaw and valence and arousal ratings. Results showed that the stimuli varied reasonably well along the dimensions of valence and arousal, with a slight underrepresentation of clips that are of negative valence and highly arousing. The standard deviation of yaw positively correlated with valence, while a significant positive relationship was found between head pitch and arousal. The immersive VR clips tested are available online as supplemental material. PMID:29259571
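The head-movement analysis described here amounts to correlating, across clips, the standard deviation of head yaw with mean valence (and head pitch with arousal). A small Python sketch of that computation, on synthetic data with assumed shapes and scales, is given below.

    # Correlate per-clip head-yaw variability with mean valence ratings (synthetic data).
    import numpy as np
    from scipy.stats import pearsonr

    rng = np.random.default_rng(0)
    n_clips = 73

    # Hypothetical per-clip data: yaw traces (degrees over time) and mean valence (1-9 SAM scale).
    yaw_traces = [rng.normal(0, rng.uniform(5, 40), size=600) for _ in range(n_clips)]
    valence = rng.uniform(1, 9, size=n_clips)

    yaw_sd = np.array([np.std(trace) for trace in yaw_traces])
    r, p = pearsonr(yaw_sd, valence)
    print(f"r = {r:.3f}, p = {p:.3f}")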
Covarrubias, Mario; Bordegoni, Monica; Cugini, Umberto
2013-01-01
In this article, we present an approach that uses both two force sensitive handles (FSH) and a flexible capacitive touch sensor (FCTS) to drive a haptic-based immersive system. The immersive system has been developed as part of a multimodal interface for product design. The haptic interface consists of a strip that can be used by product designers to evaluate the quality of a 3D virtual shape by using touch, vision and hearing and, also, to interactively change the shape of the virtual object. Specifically, the user interacts with the FSH to move the virtual object and to appropriately position the haptic interface for retrieving the six degrees of freedom required for both manipulation and modification modalities. The FCTS allows the system to track the movement and position of the user's fingers on the strip, which is used for rendering visual and sound feedback. Two evaluation experiments are described, which involve both the evaluation and the modification of a 3D shape. Results show that the use of the haptic strip for the evaluation of aesthetic shapes is effective and supports product designers in the appreciation of the aesthetic qualities of the shape. PMID:24113680
Designers workbench: toward real-time immersive modeling
NASA Astrophysics Data System (ADS)
Kuester, Falko; Duchaineau, Mark A.; Hamann, Bernd; Joy, Kenneth I.; Ma, Kwan-Liu
2000-05-01
This paper introduces the Designers Workbench, a semi-immersive virtual environment for two-handed modeling, sculpting and analysis tasks. The paper outlines the fundamental tools, design metaphors and hardware components required for an intuitive real-time modeling system. As companies focus on streamlining productivity to cope with global competition, the migration to computer-aided design (CAD), computer-aided manufacturing, and computer-aided engineering systems has established a new backbone of modern industrial product development. However, traditionally a product design frequently originates from a clay model that, after digitization, forms the basis for the numerical description of CAD primitives. The Designers Workbench aims at closing this technology or 'digital gap' experienced by design and CAD engineers by transforming the classical design paradigm into its fully integrated digital and virtual analog, allowing collaborative development in a semi-immersive virtual environment. This project emphasizes two key components from the classical product design cycle: freeform modeling and analysis. In the freeform modeling stage, content creation in the form of two-handed sculpting of arbitrary objects using polygonal, volumetric or mathematically defined primitives is emphasized, whereas the analysis component provides the tools required for pre- and post-processing steps for finite element analysis tasks applied to the created models.
NASA Astrophysics Data System (ADS)
Demir, I.
2015-12-01
Recent developments in internet technologies make it possible to manage and visualize large data on the web. Novel visualization techniques and interactive user interfaces allow users to create realistic environments, and interact with data to gain insight from simulations and environmental observations. This presentation showcases information communication interfaces, games, and virtual and immersive reality applications for supporting teaching and learning of concepts in atmospheric and hydrological sciences. The information communication platforms utilize the latest web technologies and allow accessing and visualizing large-scale data on the web. The simulation system is a web-based 3D interactive learning environment for teaching hydrological and atmospheric processes and concepts. The simulation system provides a visually striking platform with realistic terrain and weather information, and water simulation. The web-based simulation system provides an environment for students to learn about earth science processes, and the effects of development and human activity on the terrain. Users can access the system in three visualization modes including virtual reality, augmented reality, and immersive reality using a heads-up display.
Development of an immersive virtual reality head-mounted display with high performance.
Wang, Yunqi; Liu, Weiqi; Meng, Xiangxiang; Fu, Hanyi; Zhang, Daliang; Kang, Yusi; Feng, Rui; Wei, Zhonglun; Zhu, Xiuqing; Jiang, Guohua
2016-09-01
To resolve the contradiction between large field of view and high resolution in immersive virtual reality (VR) head-mounted displays (HMDs), an HMD monocular optical system with a large field of view and high resolution was designed. The system was fabricated by adopting aspheric technology with CNC grinding and a high-resolution LCD as the image source. With this monocular optical system, an HMD binocular optical system with a wide-range continuously adjustable interpupillary distance was achieved in the form of partially overlapping fields of view (FOV) combined with a screw adjustment mechanism. A fast image processor-centered LCD driver circuit and an image preprocessing system were also built to address binocular vision inconsistency in the partially overlapping FOV binocular optical system. The distortions of the HMD optical system with a large field of view were measured. Meanwhile, the optical distortions in the display and the trapezoidal distortions introduced during image processing were corrected by a calibration model for reverse rotations and translations. A high-performance not-fully-transparent VR HMD device with high resolution (1920×1080) and large FOV [141.6°(H)×73.08°(V)] was developed. The full field-of-view average value of angular resolution is 18.6 pixels/degree. With the device, high-quality VR simulations can be completed under various scenarios, and the device can be utilized for simulated trainings in aeronautics, astronautics, and other fields with corresponding platforms. The developed device has positive practical significance.
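For orientation, the quoted pixels-per-degree figure is essentially the panel resolution divided by the angular extent covered per eye. The back-of-the-envelope Python sketch below illustrates that arithmetic; the binocular overlap value is an assumption, and this simple ratio is not expected to reproduce the measured 18.6 pixels/degree full-field average exactly.

    # Rough angular-resolution estimate for a partially overlapping binocular HMD.
    def pixels_per_degree(pixels, fov_deg):
        return pixels / fov_deg

    H_PIXELS, V_PIXELS = 1920, 1080      # per-eye LCD resolution (from the abstract)
    TOTAL_H_FOV, V_FOV = 141.6, 73.08    # binocular FOV in degrees (from the abstract)
    OVERLAP_DEG = 40.0                   # assumed binocular overlap (not stated in the abstract)

    per_eye_h_fov = (TOTAL_H_FOV + OVERLAP_DEG) / 2.0
    print(f"horizontal: {pixels_per_degree(H_PIXELS, per_eye_h_fov):.1f} px/deg")
    print(f"vertical:   {pixels_per_degree(V_PIXELS, V_FOV):.1f} px/deg")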
Pietrzak, Eva; Pullman, Stephen; McGuire, Annabel
2014-08-01
This article reviews the available literature about the use of novel methods of rehabilitation using virtual reality interventions for people living with posttraumatic brain injuries. The MEDLINE, EMBASE, SCOPUS, and Cochrane Library databases were searched using the terms "virtual reality" OR "video games" AND "traumatic brain injury." Included studies investigated therapeutic use of virtual reality in adults with a brain trauma resulting from acquired closed head injury, reported outcomes that included measures of motor or cognitive functionality, and were published in a peer-reviewed journal written in English. Eighteen articles fulfilled inclusion criteria. Eight were case studies, five studies had a quasi-experimental design with a pre-post comparison, and five were pilot randomized control trials or comparative studies. The virtual reality systems used were commercial or custom designed for the study and ranged from expensive, fully immersive systems to cheap online games or videogames. In before-after comparisons, improvements in balance were seen in four case studies and two small randomized control trials. Between-group comparisons in these randomized control trials showed no difference between virtual reality and traditional therapy. Post-training improvements were also seen for upper extremity functions (five small studies) and for various cognitive function measures (four case studies and one pilot randomized control trial). Attitudes of participants toward virtual reality interventions were more positive than for traditional therapy (three studies). The evidence that the use of virtual reality in rehabilitation of traumatic brain injury improves motor and cognitive functionality is currently very limited. However, this approach has the potential to provide alternative, possibly more affordable and available rehabilitation therapy for traumatic brain injury in settings where access to therapy is limited by geographical or financial constraints.
ERIC Educational Resources Information Center
Orman, Evelyn K.
2016-01-01
This study examined the effects of virtual reality immersion with audio on eye contact, directional focus and focus of attention for novice wind band conductors. Participants (N = 34) included a control group (n = 12) and two virtual reality groups with (n = 10) and without (n = 12) head tracking. Participants completed conducting/score study…
Constraint, Intelligence, and Control Hierarchy in Virtual Environments. Chapter 1
NASA Technical Reports Server (NTRS)
Sheridan, Thomas B.
2007-01-01
This paper seeks to deal directly with the question of what makes virtual actors and objects that are experienced in virtual environments seem real. (The term virtual reality, while more common in public usage, is an oxymoron; therefore virtual environment is the preferred term in this paper). Reality is a difficult topic, treated for centuries in those sub-fields of philosophy called ontology ("of or relating to being or existence") and epistemology ("the study of the method and grounds of knowledge, especially with reference to its limits and validity") (both from Webster's, 1965). Advances in recent decades in the technologies of computers, sensors and graphics software have permitted human users to feel present or experience immersion in computer-generated virtual environments. This has motivated a keen interest in probing this phenomenon of presence and immersion not only philosophically but also psychologically and physiologically in terms of the parameters of the senses and sensory stimulation that correlate with the experience (Ellis, 1991). The pages of the journal Presence: Teleoperators and Virtual Environments have seen much discussion of what makes virtual environments seem real (see, e.g., Slater, 1999; Slater et al. 1994; Sheridan, 1992, 2000). Stephen Ellis, when organizing the meeting that motivated this paper, suggested to invited authors that "We may adopt as an organizing principle for the meeting that the genesis of apparently intelligent interaction arises from an upwelling of constraints determined by a hierarchy of lower levels of behavioral interaction." My first reaction was "huh?" and my second was "yeah, that seems to make sense." Accordingly the paper seeks to explain, from the author's viewpoint, why Ellis's hypothesis makes sense. What is the connection of "presence" or "immersion" of an observer in a virtual environment to "constraints," and what types of constraints? What of "intelligent interaction," and is it the intelligence of the observer or the intelligence of the environment (whatever the latter may mean) that is salient? And finally, what might be relevant about "upwelling" of constraints as determined by a hierarchy of levels of interaction?
EXPLORING ENVIRONMENTAL DATA IN A HIGHLY IMMERSIVE VIRTUAL REALITY ENVIRONMENT
Geography inherently fills a 3D space and yet we struggle with displaying geography using, primarily, 2D display devices. Virtual environments offer a more realistically-dimensioned display space and this is being realized in the expanding area of research on 3D Geographic Infor...
PC-Based Virtual Reality for CAD Model Viewing
ERIC Educational Resources Information Center
Seth, Abhishek; Smith, Shana S.-F.
2004-01-01
Virtual reality (VR), as an emerging visualization technology, has introduced an unprecedented communication method for collaborative design. VR refers to an immersive, interactive, multisensory, viewer-centered, 3D computer-generated environment and the combination of technologies required to build such an environment. This article introduces the…
Seeing an Embodied Virtual Hand is Analgesic Contingent on Colocation.
Nierula, Birgit; Martini, Matteo; Matamala-Gomez, Marta; Slater, Mel; Sanchez-Vives, Maria V
2017-06-01
Seeing one's own body has been reported to have analgesic properties. Analgesia has also been described when seeing an embodied virtual body colocated with the real one. However, there is controversy regarding whether this effect holds true when seeing an illusory-owned body part, such as during the rubber-hand illusion. A critical difference between these paradigms is the distance between the real and surrogate body part. Colocation of the real and surrogate arm is possible in an immersive virtual environment, but not during illusory ownership of a rubber arm. The present study aimed at testing whether the distance between a real and a virtual arm can explain such differences in terms of pain modulation. Using a paradigm of embodiment of a virtual body allowed us to evaluate heat pain thresholds at colocation and at a 30-cm distance between the real and the virtual arm. We observed a significantly higher heat pain threshold at colocation than at a 30-cm distance. The analgesic effects of seeing a virtual colocated arm were eliminated by increasing the distance between the real and the virtual arm, which explains why seeing an illusorily owned rubber arm does not consistently result in analgesia. These findings are relevant for the use of virtual reality in pain management. Looking at a virtual body has analgesic properties similar to looking at one's real body. We identify the importance of colocation between a real and a surrogate body for this to occur and thereby resolve a scientific controversy. This information is useful for exploiting immersive virtual reality in pain management. Copyright © 2017. Published by Elsevier Inc.
Hands-on Learning in the Virtual World
ERIC Educational Resources Information Center
Branson, John; Thomson, Diane
2013-01-01
The U.S. military has long understood the value of immersive simulations in education. Before the Navy entrusts a ship to a crew, crew members must first practice and demonstrate their competency in a fully immersive, simulated environment. Why not teach students in the same way? K-12 educators in Pennsylvania, USA, recently did just that when…
Engagement with Electronic Screen Media among Students with Autism Spectrum Disorders
ERIC Educational Resources Information Center
Mineo, Beth A.; Ziegler, William; Gill, Susan; Salkin, Donna
2009-01-01
This study investigated the relative engagement potential of four types of electronic screen media (ESM): animated video, video of self, video of a familiar person engaged with an immersive virtual reality (VR) game, and immersion of self in the VR game. Forty-two students with autism, varying in age and expressive communication ability, were…
ERIC Educational Resources Information Center
Dunleavy, Matt; Dede, Chris; Mitchell, Rebecca
2009-01-01
The purpose of this study was to document how teachers and students describe and comprehend the ways in which participating in an augmented reality (AR) simulation aids or hinders teaching and learning. Like the multi-user virtual environment (MUVE) interface that underlies Internet games, AR is a good medium for immersive collaborative…
Cognitive Presence and Effect of Immersion in Virtual Learning Environment
ERIC Educational Resources Information Center
Katernyak, Ihor; Loboda, Viktoriya
2016-01-01
This paper presents the approach to successful application of two knowledge management techniques--community of practice and eLearning, in order to create and manage a competence-developing virtual learning environment. It explains how "4A" model of involving practitioners in eLearning process (through attention, actualization,…
Undergraduate Student Self-Efficacy and Perceptions of Virtual World Learning Experience
ERIC Educational Resources Information Center
Stanton, Lorraine May
2017-01-01
Virtual worlds are innovative teaching and learning methods that can provide immersive and engaging learning experiences (Lu, 2010). Though they have potential benefits, students sometimes experience a steep learning curve and discomfort with the technology (Warburton, 2009). This study explored how students in two American Studies classes using…
Virtual Environments and Autism: A Developmental Psychopathological Approach
ERIC Educational Resources Information Center
Rajendran, G.
2013-01-01
Individuals with autism spectrum disorders supposedly have an affinity with information and communication technology (ICT), making it an ideally suited media for this population. Virtual environments (VEs)--both two-dimensional and immersive--represent a particular kind of ICT that might be of special benefit. Specifically, this paper discusses…
Intelligent Tutors in Immersive Virtual Environments
ERIC Educational Resources Information Center
Yan, Peng; Slator, Brian M.; Vender, Bradley; Jin, Wei; Kariluoma, Matti; Borchert, Otto; Hokanson, Guy; Aggarwal, Vaibhav; Cosmano, Bob; Cox, Kathleen T.; Pilch, André; Marry, Andrew
2013-01-01
Research into virtual role-based learning has progressed over the past decade. Modern issues include gauging the difficulty of designing a goal system capable of meeting the requirements of students with different knowledge levels, and the reasonability and possibility of taking advantage of the well-designed formula and techniques served in other…
Exploring Moral Action Using Immersive Virtual Reality
2016-10-01
the Obedience. The Bar experimental scenario is in the context of sexual harassment and has two phases, all in immersive virtual reality. In... a paper for submission to a high impact journal (depending of course on the final results). 4. Conclusions: The original proposal set out the
Amaral, Carlos P; Simões, Marco A; Mouga, Susana; Andrade, João; Castelo-Branco, Miguel
2017-10-01
We present a novel virtual-reality P300-based Brain Computer Interface (BCI) paradigm using social cues to direct the focus of attention. We combined interactive immersive virtual-reality (VR) technology with the properties of P300 signals in a training tool which can be used in social attention disorders such as autism spectrum disorder (ASD). We tested the novel social attention training paradigm (P300-based BCI paradigm for rehabilitation of joint-attention skills) in 13 healthy participants, using 3 EEG systems. The most suitable setup was tested online with 4 ASD subjects. Statistical accuracy was assessed based on the detection of P300, using spatial filtering and a Naïve-Bayes classifier. We compared: 1 - g.Mobilab+ (active dry-electrodes, wireless transmission); 2 - g.Nautilus (active electrodes, wireless transmission); 3 - V-Amp with actiCAP Xpress dry-electrodes. Significant statistical classification was achieved in all systems. g.Nautilus proved to be the best performing system in terms of accuracy in the detection of P300, preparation time, speed and reported comfort. Proof of concept tests in ASD participants proved that this setup is feasible for training joint attention skills in ASD. This work provides a unique combination of 'easy-to-use' BCI systems with new technologies such as VR to train joint-attention skills in autism. Our P300 BCI paradigm is feasible for future Phase I/II clinical trials to train joint-attention skills, with successful classification within a few trials, online in ASD participants. The g.Nautilus system is the best performing one to use with the developed BCI setup. Copyright © 2017 Elsevier B.V. All rights reserved.
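A shape-compatible sketch of the described detection pipeline (spatial filtering of stimulus-locked epochs followed by a Naïve-Bayes classifier) is given below in Python with scikit-learn; the channel-average "spatial filter", epoch geometry, and synthetic data are assumptions, not the study's actual processing.

    # P300 detection sketch: spatial filtering of epochs, then Gaussian Naive Bayes.
    import numpy as np
    from sklearn.naive_bayes import GaussianNB
    from sklearn.model_selection import cross_val_score

    rng = np.random.default_rng(1)
    n_epochs, n_channels, n_samples = 200, 8, 150       # assumed epoch geometry

    X_raw = rng.normal(size=(n_epochs, n_channels, n_samples))
    y = rng.integers(0, 2, size=n_epochs)                # 1 = attended (P300 expected), 0 = non-target
    X_raw[y == 1, :, 50:90] += 0.5                       # inject a crude "P300-like" deflection

    spatial_weights = np.ones(n_channels) / n_channels   # stand-in spatial filter (channel average)
    X = np.einsum("c,ecs->es", spatial_weights, X_raw)   # (epochs, samples) feature matrix

    scores = cross_val_score(GaussianNB(), X, y, cv=5)
    print("cross-validated accuracy:", scores.mean().round(3))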
NASA Astrophysics Data System (ADS)
Miranda, Mateus R.; Costa, Henrik; Oliveira, Luiz; Bernardes, Thiago; Aguiar, Carla; Miosso, Cristiano; Oliveira, Alessandro B. S.; Diniz, Alberto C. G. C.; Domingues, Diana Maria G.
2015-03-01
This paper describes an experimental platform used to evaluate the performance of individuals training with immersive physiological games. The proposed platform is embedded in an immersive environment in a Virtual Reality CAVE and consists of a base frame with actuators with three degrees of freedom, a sensor array interface and physiological sensors. Physiological data on breathing, galvanic skin resistance (GSR) and pressure on the hand of the user, together with a subjective questionnaire, were collected during the experiments. The theoretical background draws on Software Engineering, Biomedical Engineering in the field of Ergonomics, and Creative Technologies in order to present this case study on the evaluation of a vehicular simulator located inside the CAVE. The analysis of the simulator uses physiological data of the drivers obtained in a period of rest and after the experience, with and without movements at the simulator. Images from the screen are also captured over time during the embedded experience, and data are collected through physiological data visualization (average frequency and RMS graphics); these are complemented by the subjective questionnaire capturing the strong lived experience provided by the technological apparatus. The immersion experience performed inside the CAVE allows behaviors from physical spaces to be replicated inside a data space enhanced by physiological properties. In this context, the biocybrid condition is expanded beyond art and entertainment, as it is applied to automotive engineering and biomedical engineering. In fact, the kinesthetic sensations amplified by synesthesia replicate the sensation of displacement in the interior of an automobile, as well as the sensations of vibration and vertical movements typical of a vehicle, different speeds, collisions, etc. The contribution of this work is the possibility of tracing a stress analysis protocol for drivers while operating a vehicle, deriving affective behaviors from physiological data combined with embedded simulation in Mixed Reality.
Temporally coherent 4D video segmentation for teleconferencing
NASA Astrophysics Data System (ADS)
Ehmann, Jana; Guleryuz, Onur G.
2013-09-01
We develop an algorithm for 4-D (RGB+Depth) video segmentation targeting immersive teleconferencing applications on emerging mobile devices. Our algorithm extracts users from their environments and places them onto virtual backgrounds similar to green-screening. The virtual backgrounds increase immersion and interactivity, relieving the users of the system from distractions caused by disparate environments. Commodity depth sensors, while providing useful information for segmentation, result in noisy depth maps with a large number of missing depth values. By combining depth and RGB information, our work significantly improves the otherwise very coarse segmentation. Further imposing temporal coherence yields compositions where the foregrounds seamlessly blend with the virtual backgrounds with minimal flicker and other artifacts. We achieve said improvements by correcting the missing information in depth maps before fast RGB-based segmentation, which operates in conjunction with temporal coherence. Simulation results indicate the efficacy of the proposed system in video conferencing scenarios.
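The general recipe described here (repair missing depth values, form a coarse foreground mask, and keep it temporally stable before compositing onto a virtual background) can be sketched with OpenCV as below; the data formats, thresholds, and exponential smoothing are assumptions, and the paper's RGB-based refinement step is omitted from this sketch.

    # Depth-repair + masking + temporal-smoothing sketch (not the authors' algorithm).
    import cv2
    import numpy as np

    prev_mask = None

    def segment_frame(bgr, depth_mm, fg_max_depth_mm=1500, alpha=0.7):
        global prev_mask
        missing = (depth_mm == 0).astype(np.uint8)
        depth8 = cv2.convertScaleAbs(depth_mm, alpha=255.0 / 4000.0)   # compress 0-4 m to 8-bit
        depth8 = cv2.inpaint(depth8, missing, 3, cv2.INPAINT_TELEA)    # fill missing depth values

        mask = (depth8 < fg_max_depth_mm * 255.0 / 4000.0).astype(np.uint8) * 255
        kernel = np.ones((5, 5), np.uint8)
        mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, kernel)          # drop speckle noise
        mask = cv2.morphologyEx(mask, cv2.MORPH_CLOSE, kernel)

        # Temporal coherence: blending with the previous mask reduces flicker.
        if prev_mask is not None:
            mask = cv2.addWeighted(mask, alpha, prev_mask, 1 - alpha, 0)
        prev_mask = mask

        virtual_bg = np.full_like(bgr, (40, 80, 40))                   # stand-in virtual background
        m = (mask > 127)[..., None]
        return np.where(m, bgr, virtual_bg)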
New tools for sculpting cranial implants in a shared haptic augmented reality environment.
Ai, Zhuming; Evenhouse, Ray; Leigh, Jason; Charbel, Fady; Rasmussen, Mary
2006-01-01
New volumetric tools were developed for the design and fabrication of high quality cranial implants from patient CT data. These virtual tools replace time consuming physical sculpting, mold making and casting steps. The implant is designed by medical professionals in tele-immersive collaboration. Virtual clay is added in the virtual defect area on the CT data using the adding tool. With force feedback the modeler can feel the edge of the defect and fill only the space where no bone is present. A carving tool and a smoothing tool are then used to sculpt and refine the implant. To make a physical evaluation, the skull with simulated defect and the implant are fabricated via stereolithography to allow neurosurgeons to evaluate the quality of the implant. Initial tests demonstrate a very high quality fit. These new haptic volumetric sculpting tools are a critical component of a comprehensive tele-immersive system.
New approaches to virtual environment surgery
NASA Technical Reports Server (NTRS)
Ross, M. D.; Twombly, A.; Lee, A. W.; Cheng, R.; Senger, S.
1999-01-01
This research focused on two main problems: 1) low cost, high fidelity stereoscopic imaging of complex tissues and organs; and 2) virtual cutting of tissue. A further objective was to develop these images and virtual tissue cutting methods for use in a telemedicine project that would connect remote sites using the Next Generation Internet. For goal one we used a CT scan of a human heart, a desktop PC with an OpenGL graphics accelerator card, and LCD stereoscopic glasses. Use of multiresolution meshes ranging from approximately 1,000,000 to 20,000 polygons speeded interactive rendering rates enormously while retaining the general topography of the dataset. For goal two, we used a CT scan of an infant skull with premature closure of the right coronal suture, a Silicon Graphics Onyx workstation, a Fakespace Immersive WorkBench and CrystalEyes LCD glasses. The high fidelity mesh of the skull was reduced from one million to 50,000 polygons. The cut path was automatically calculated as the shortest distance along the mesh between a small number of hand-selected vertices. The region outlined by the cut path was then separated from the skull and translated/rotated to assume a new position. The results indicate that widespread high fidelity imaging in virtual environments is possible using ordinary PC capabilities if appropriate mesh reduction methods are employed. The software cutting tool is applicable to the heart and other organs for surgery planning, for training surgeons in a virtual environment, and for telemedicine purposes.
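The cut-path step described above (the shortest distance along the mesh between hand-selected vertices) is essentially a shortest-path search over the mesh edge graph. A self-contained Dijkstra sketch in Python is given below; the vertex/edge input format is an assumption.

    # Shortest path along mesh edges between two selected vertices (Dijkstra).
    import heapq
    import numpy as np
    from collections import defaultdict

    def shortest_path_on_mesh(vertices, edges, start, goal):
        adj = defaultdict(list)
        for a, b in edges:
            w = float(np.linalg.norm(vertices[a] - vertices[b]))
            adj[a].append((b, w))
            adj[b].append((a, w))

        dist = {start: 0.0}
        prev = {}
        pq = [(0.0, start)]
        while pq:
            d, u = heapq.heappop(pq)
            if u == goal:
                break
            if d > dist.get(u, float("inf")):
                continue
            for v, w in adj[u]:
                nd = d + w
                if nd < dist.get(v, float("inf")):
                    dist[v], prev[v] = nd, u
                    heapq.heappush(pq, (nd, v))

        path = [goal]
        while path[-1] != start:
            path.append(prev[path[-1]])
        return path[::-1]

    # Tiny example: a unit square meshed as two triangles; the full cut path
    # would chain this search over consecutive hand-selected vertices.
    V = np.array([[0, 0, 0], [1, 0, 0], [1, 1, 0], [0, 1, 0]], dtype=float)
    E = [(0, 1), (1, 2), (2, 3), (3, 0), (0, 2)]
    print(shortest_path_on_mesh(V, E, start=0, goal=2))  # -> [0, 2]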
ERIC Educational Resources Information Center
Warburton, Steven
2009-01-01
"Second Life" (SL) is currently the most mature and popular multi-user virtual world platform being used in education. Through an in-depth examination of SL, this article explores its potential and the barriers that multi-user virtual environments present to educators wanting to use immersive 3-D spaces in their teaching. The context is set by…
The Adaptive Effects Of Virtual Interfaces: Vestibulo-Ocular Reflex and Simulator Sickness.
1998-08-07
rearrangement: a pattern of stimulation differing from that existing as a result of normal interactions with the real world. Stimulus rearrangements can... is immersive and interactive. virtual interface: a system of transducers, signal processors, computer hardware and software that create an... interactive medium through which: 1) information is transmitted to the senses in the form of two- and three-dimensional virtual images and 2) psychomotor
Kothgassner, Oswald D; Goreis, Andreas; Kafka, Johanna X; Hlavacs, Helmut; Beutl, Leon; Kryspin-Exner, Ilse; Felnhofer, Anna
2018-05-01
While virtual humans are increasingly used to benefit the elderly, considerably little is still known about older adults' virtual experiences. However, due to age-related changes, older adults' perceptions of virtual environments (VEs) may be unique. Hence, our objective was to examine possible gender differences in immersion, flow, and emotional states as well as physical and social presence in elderly males and females interacting either with a computer-controlled agent or a human-controlled avatar. Seventy-eight German-speaking older adults were randomly assigned to an avatar or an agent condition and were exposed to a brief social encounter in a virtual café. Results indicate no overall gender differences, but a significant effect of agency on social presence, physical presence, immersion, and flow. Participants in the avatar condition reported higher levels in all measures, except for involvement. Furthermore, significant gender × agency interactions were found, with females showing more social presence, spatial presence, and flow when interacting with a human-controlled avatar and more realism when conversing with an agent. Also, all participants showed significant changes in their affect post exposure. In sum, older adults' virtual experiences seem to follow unique patterns, yet, they do not preclude the elderly from successfully participating in VEs.
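The gender × agency interactions reported above are the kind of effect a two-way ANOVA can test. The Python sketch below (statsmodels) runs such a model on a hypothetical social-presence score; the data are synthetic, and only the factor structure (gender crossed with agent/avatar) mirrors the study.

    # Two-way ANOVA sketch for a gender x agency interaction on a synthetic outcome.
    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf
    from statsmodels.stats.anova import anova_lm

    rng = np.random.default_rng(2)
    df = pd.DataFrame({
        "gender": np.repeat(["female", "male"], 40),
        "agency": np.tile(np.repeat(["avatar", "agent"], 20), 2),
    })
    df["social_presence"] = rng.normal(3.0, 0.8, len(df))
    # Build in a small interaction: females with avatars score higher.
    df.loc[(df.gender == "female") & (df.agency == "avatar"), "social_presence"] += 0.6

    model = smf.ols("social_presence ~ C(gender) * C(agency)", data=df).fit()
    print(anova_lm(model, typ=2))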
Radiological tele-immersion for next generation networks.
Ai, Z; Dech, F; Rasmussen, M; Silverstein, J C
2000-01-01
Since the acquisition of high-resolution three-dimensional patient images has become widespread, medical volumetric datasets (CT or MR) larger than 100 MB and encompassing more than 250 slices are common. It is important to make this patient-specific data quickly available and usable to many specialists at different geographical sites. Web-based systems have been developed to provide volume or surface rendering of medical data over networks with low fidelity, but these cannot adequately handle stereoscopic visualization or huge datasets. State-of-the-art virtual reality techniques and high-speed networks have made it possible to create an environment in which geographically distributed clinicians can immersively share these massive datasets in real time. An object-oriented method for instantaneously importing medical volumetric data into Tele-Immersive environments has been developed at the Virtual Reality in Medicine Laboratory (VRMedLab) at the University of Illinois at Chicago (UIC). This networked-VR setup is based on LIMBO, an application framework or template that provides the basic capabilities of Tele-Immersion. We have developed a modular, general-purpose Tele-Immersion program that automatically combines 3D medical data with the methods for handling the data. For this purpose a DICOM loader for IRIS Performer has been developed. The loader was designed for SGI machines as a shared object, which is executed at LIMBO's runtime. The loader loads not only the selected DICOM dataset, but also methods for rendering, handling, and interacting with the data, bringing networked, real-time, stereoscopic interaction with radiological data to reality. Collaborative, interactive methods currently implemented in the loader include cutting planes and windowing. The Tele-Immersive environment has been tested on the UIC campus over an ATM network. We tested the environment with three nodes: one ImmersaDesk at the VRMedLab, one CAVE at the Electronic Visualization Laboratory (EVL) on east campus, and a CT scan machine in UIC Hospital. CT data was pulled directly from the scan machine to the Tele-Immersion server in our laboratory, and then the data was synchronously distributed by our Onyx2 Rack server to all the VR setups. Rather than confining medical volume visualization to a single VR device, the Tele-Immersive environment combines teleconferencing, tele-presence, and virtual reality to enable geographically distributed clinicians to intuitively interact with the same medical volumetric models, point, gesture, converse, and see each other. This environment will bring together clinicians at different geographic locations to participate in Tele-Immersive consultation and collaboration.
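The loader's core job, assembling a DICOM series into a volume that the rendering side can consume, can be sketched in a few lines. The example below is a Python sketch using the pydicom package (an assumption for illustration; the VRMedLab loader itself was an IRIS Performer shared object), with a hypothetical series directory path.

```python
# Sketch (assumes the pydicom package; not the VRMedLab loader itself):
# read a directory of CT slices into a single 3D intensity volume, the kind of
# array a tele-immersive renderer would texture or iso-surface.
import pathlib
import numpy as np
import pydicom

def load_ct_volume(series_dir):
    slices = [pydicom.dcmread(p) for p in pathlib.Path(series_dir).glob("*.dcm")]
    # Order slices along the scan axis using the z component of their position.
    slices.sort(key=lambda s: float(s.ImagePositionPatient[2]))
    volume = np.stack([s.pixel_array.astype(np.int16) for s in slices])
    # Convert stored values to Hounsfield units using the DICOM rescale tags.
    slope = float(getattr(slices[0], "RescaleSlope", 1.0))
    intercept = float(getattr(slices[0], "RescaleIntercept", 0.0))
    return volume * slope + intercept

# hu = load_ct_volume("/path/to/series")   # hypothetical path; shape: (slices, rows, cols)
```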
Zhang, Melvyn W B; Ho, Roger C M
2017-01-01
There have been rapid advances in technology over the past decade, and virtual reality technology is increasingly utilized as a healthcare intervention in many disciplines, including Medicine, Surgery and Psychiatry. In Psychiatry, most of the current interventions involving virtual reality technology are limited to its application for anxiety disorders. With the advances in technology, Internet addiction and Internet gaming disorders are increasingly prevalent. To date, these disorders are still being treated using conventional psychotherapy methods such as cognitive behavioural therapy. However, a growing number of studies combine various other therapies with cognitive behavioural therapy, in an attempt to reduce drop-out rates and to make such interventions more relevant to the targeted group of addicts, who are mostly adolescents. To date, a prior study in Korea has demonstrated the comparable efficacy of virtual reality therapy with that of cognitive behavioural therapy. However, the intervention requires the use of specialized screens and devices. It is thus the objective of the current article to highlight how smartphone applications could be designed and utilized for immersive virtual reality treatment, alongside low-cost wearables.
Seraglia, Bruno; Gamberini, Luciano; Priftis, Konstantinos; Scatturin, Pietro; Martinelli, Massimiliano; Cutini, Simone
2011-01-01
For over two decades, Virtual Reality (VR) has been a useful tool in several fields, from medical and psychological treatments to industrial and military applications. Only in recent years have researchers begun to study the neural correlates that subtend VR experiences. Although functional Magnetic Resonance Imaging (fMRI) is the most commonly used technique, it suffers from several limitations and problems. Here we present a methodology that involves the use of a new and growing brain imaging technique, functional Near-infrared Spectroscopy (fNIRS), while participants experience immersive VR. In order to allow proper fNIRS probe application, a custom-made VR helmet was created. To test the adapted helmet, a virtual version of the line bisection task was used. Participants could bisect the lines in a virtual peripersonal or extrapersonal space by manipulating a Nintendo Wiimote® controller to move a virtual laser pointer. Although no neural correlates of the dissociation between peripersonal and extrapersonal space were found, significant hemodynamic activity with respect to the baseline was present in the right parietal and occipital areas. Both advantages and disadvantages of the presented methodology are discussed.
Use of VR Technology and Passive Haptics for MANPADS Training System
2017-09-01
this setup also does not offer a variety of challenging scenarios needed for good training, as the aircraft are mostly flying in landing or take-off... customized high-fidelity immersive training facilities are limited. Moreover, low trainee throughput from such high-end facilities is an ongoing obstacle... opportunities allow few operators to fire during live exercises. Simulation training is effective, but customized high-fidelity immersive training...
Faber, Albertus W.; Patterson, David R.; Bremer, Marco
2012-01-01
Objective: The current study explored whether immersive virtual reality continues to reduce pain (via distraction) during more than one wound care session per patient. Patients: Thirty-six patients aged 8 to 57 years (mean age 27.7 years), with an average of 8.4% total body surface area (TBSA) burned (range 0.25 to 25.5% TBSA), received bandage changes and wound cleaning. Methods: Each patient received one baseline wound cleaning/debridement session with no VR (control condition) followed by one or more (up to seven) subsequent wound care sessions during VR. After each wound care session (one session per day), worst pain intensity was measured using a Visual Analogue Thermometer (VAT), the dependent variable. Using a within-subjects design, worst pain intensity VAT during wound care with no VR (baseline, Day 0) was compared to pain during wound care while using immersive virtual reality (up to seven days of wound care during VR). Results: Compared to pain during the no-VR baseline (Day 0), pain ratings during wound debridement were statistically lower when patients were in virtual reality on Days 1, 2 and 3; although not significant beyond Day 3, the pattern of results from Days 4, 5, and 6 is consistent with the notion that VR continues to reduce pain when used repeatedly. Conclusions: Results from the present study suggest that VR continues to be effective when used for three (or possibly more) treatments during severe burn wound debridement. PMID:23970314
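As a small illustration of the within-subjects comparison described above (baseline versus a VR day), the sketch below runs a paired nonparametric test in Python. The VAT values are made up for illustration, and the abstract does not state which test the authors actually used, so the Wilcoxon signed-rank test shown here is only an assumed stand-in.

```python
# Sketch of a within-subjects pain comparison like the one described above.
# The numbers are illustrative, not the study data; the abstract does not name
# the statistical test, so a Wilcoxon signed-rank test is shown as one option.
from scipy.stats import wilcoxon

baseline_vat = [78, 65, 90, 72, 55, 81, 60, 70]   # worst pain, no-VR Day 0
day1_vat     = [52, 40, 71, 50, 38, 66, 45, 58]   # worst pain, VR Day 1

stat, p = wilcoxon(baseline_vat, day1_vat)
print(f"Wilcoxon W = {stat:.1f}, p = {p:.3f}")
```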
Tidoni, Emmanuele; Abu-Alqumsan, Mohammad; Leonardis, Daniele; Kapeller, Christoph; Fusco, Gabriele; Guger, Cristoph; Hintermuller, Cristoph; Peer, Angelika; Frisoli, Antonio; Tecchia, Franco; Bergamasco, Massimo; Aglioti, Salvatore Maria
2017-09-01
The development of technological applications that allow people to control and embody external devices within social interaction settings represents a major goal for current and future brain-computer interface (BCI) systems. Prior research has suggested that embodied systems may ameliorate BCI end-users' experience and accuracy in controlling external devices. Along these lines, we developed an immersive P300-based BCI application with a head-mounted display for virtual-local and robotic-remote social interactions and explored, in a group of healthy participants, the role of proprioceptive feedback in the control of a virtual surrogate (Study 1). Moreover, we compared the performance of a small group of people with spinal cord injury (SCI) to a control group of healthy subjects during virtual and robotic social interactions (Study 2), where both groups received proprioceptive stimulation. Our attempt to combine immersive environments, BCI technologies and the neuroscience of body ownership suggests that providing realistic multisensory feedback still represents a challenge. Results have shown that both healthy participants and people living with SCI used the BCI within the immersive scenarios with good levels of performance (as indexed by task accuracy, optimization calls and Information Transfer Rate) and perceived control of the surrogates. Proprioceptive feedback did not alter performance measures or body ownership sensations. Further studies are necessary to test whether sensorimotor experience represents an opportunity to improve the use of future embodied BCI applications.
Effect of longitudinal physical training and water immersion on orthostatic tolerance in men
NASA Technical Reports Server (NTRS)
Greenleaf, J. E.; Dunn, E. R.; Nesvig, C.; Keil, L. C.; Harrison, M. H.
1988-01-01
The effect of six months of moderately intense aerobic training on 60-deg head-up tilt tolerance was assessed before and after 6 hrs of water-immersion deconditioning by comparing the orthostatic and fluid-electrolyte-endocrine responses of five male subjects before and after these tests. It was found that six months of training had no significant effect on 60-deg head-up tilt tolerance: during pretraining, tilt tolerance decreased from about 74 min before to 34 min after water immersion, while during posttraining it decreased from 74 min to 44 min. Fluid-electrolyte-endocrine responses were also essentially the same during all four tilts. Plasma volume decreased by 9.0 to 12.6 percent; plasma sodium and osmotic concentrations were unchanged; and serum protein and plasma renin activity increased.
De Luca, Rosaria; Russo, Margherita; Naro, Antonino; Tomasello, Provvidenza; Leonardi, Simona; Santamaria, Floriana; Desireè, Latella; Bramanti, Alessia; Silvestri, Giuseppe; Bramanti, Placido; Calabrò, Rocco Salvatore
2018-02-02
Cognitive impairment occurs frequently in post-stroke patients. This study aimed to determine the effects of virtual reality training (VRT) with BTs-Nirvana (BTsN) on the recovery of cognitive functions in stroke patients, using the Interactive Semi-Immersive Program (I-SIP). We enrolled 12 subjects, randomly divided into an experimental group (EG) and a control group (CG), who attended the Laboratory of Robotic and Cognitive Rehabilitation of IRCCS Neurolesi of Messina from January to June 2016. The EG underwent VRT with BTsN, whereas the CG received a standard cognitive treatment. Both groups underwent the same conventional physiotherapy program. Each treatment session lasted 45 minutes and was repeated three times a week for 8 weeks. All the patients were evaluated with a specific clinical-psychometric battery before (T0), immediately after (T1), and one month after (T2) the end of the training. At T1, the EG presented a greater improvement than the CG in the trunk control test (p = 0.03), the Montreal Cognitive Assessment (p = 0.01), the selective attention assessment scores (p = 0.01), verbal memory (p = 0.03), and visuospatial and constructive abilities (p = 0.01). Moreover, this amelioration persisted at T2 only in the EG. According to these preliminary data, VRT with I-SIP can be considered a useful complementary treatment to potentiate functional recovery, with regard to attention, visual-spatial deficits, and motor function, in patients affected by stroke.
[Thinking on the Training of Uniportal Video-assisted Thoracic Surgery].
Zhu, Yuming; Jiang, Gening
2018-04-20
Recently, uniportal video-assisted thoracic surgery (VATS) has developed rapidly and has become a major theme of global surgical development. Specific, standardized, and systematic training in this technique has therefore become an important topic. Specific training in the uniportal VATS approach is crucial to ensure safety and radical treatment. Such a training approach, including direct interaction with experienced surgeons in high-volume centers, is crucial and represents an indispensable step. Another form of training that usually occurs after preceptorship is proctorship: an experienced mentor can be invited to a trainee's own center to provide specific on-site tutelage. Videos published online are commonly used as training material. Technology has allowed the use of different simulator models for training. The most common model is animal wet-laboratory training. Other models, however, have been used most recently, such as 3D and VR technology, virtual reality simulators, and completely artificial models of the human thorax with synthetic lung, vessel, airway, and nodal tissues. Short-duration, high-volume clinical immersion training and long-term systematic training in high-volume centers are receiving more and more attention. Adopting a diversified training mode and targeting training to individual trainees, based on evaluation of their performance, helps to improve the training effect. We have done some work on systematic and standardized training in uniportal VATS in a single center. We believe such training is feasible and absolutely necessary.
ERIC Educational Resources Information Center
Kemp, Jeremy William
2011-01-01
This quantitative survey study examines the willingness of online students to adopt an immersive virtual environment as a classroom tool and compares this with their feelings about more traditional learning modes including our ANGEL learning management system and the Elluminate live Web conferencing tool. I surveyed 1,108 graduate students in…
Making Web3D Less Scary: Toward Easy-to-Use Web3D e-Learning Content Development Tools for Educators
ERIC Educational Resources Information Center
de Byl, Penny
2009-01-01
Penny de Byl argues that one of the biggest challenges facing educators today is the integration of rich and immersive three-dimensional environments with existing teaching and learning materials. To empower educators with the ability to embrace emerging Web3D technologies, the Advanced Learning and Immersive Virtual Environment (ALIVE) research…
Virtual reality 3D headset based on DMD light modulators
NASA Astrophysics Data System (ADS)
Bernacki, Bruce E.; Evans, Allan; Tang, Edward
2014-06-01
We present the design of an immersion-type 3D headset suitable for virtual reality applications based upon digital micromirror devices (DMD). Current methods for presenting information for virtual reality focus either on polarization-based modulators such as liquid crystal on silicon (LCoS) devices, or on miniature LCD or LED displays, often using lenses to place the image at infinity. LCoS modulators are an area of active research and development, and reduce the amount of viewing light by 50% due to the use of polarization. Viewable LCD or LED screens may suffer from low resolution, cause eye fatigue, and exhibit a "screen door" or pixelation effect due to the low pixel fill factor. Our approach leverages a mature technology based on silicon micromirrors delivering 720p resolution displays in a small form factor with high fill factor. Supporting chip sets allow rapid integration of these devices into wearable displays with high-definition resolution and low power consumption, and many of the design methods developed for DMD projector applications can be adapted to display use. Potential applications include night driving with natural depth perception, piloting of UAVs, fusion of multiple sensors for pilots, training, vision diagnostics and consumer gaming. We describe our design concept, in which light from the DMD is imaged to infinity and the user's own eye lens forms a real image on the retina, resulting in a virtual retinal display.
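The closing sentence describes a collimated design: the microdisplay sits at the focal plane of the viewing optics so its image appears at infinity, and the relaxed eye forms the retinal image. The short worked example below (Python) applies the thin-lens relation to a hypothetical display width and eyepiece focal length, neither of which is taken from the paper, to show how the apparent field of view and angular resolution follow.

```python
# Sketch: thin-lens collimation and apparent field of view for a microdisplay
# eyepiece. The display width and focal length below are hypothetical numbers,
# not the design parameters of the DMD headset described above.
import math

display_width_mm = 14.5   # assumed active width of a 720p microdisplay
focal_length_mm = 20.0    # assumed eyepiece focal length

# Placing the display exactly one focal length from the eyepiece sends its
# image to infinity (1/f = 1/d_o + 1/d_i with d_o = f gives d_i -> infinity),
# so the relaxed eye focuses it onto the retina.
horizontal_fov_deg = 2 * math.degrees(math.atan(display_width_mm / (2 * focal_length_mm)))
pixels_per_degree = 1280 / horizontal_fov_deg   # 720p: 1280 horizontal pixels

print(f"horizontal FOV ≈ {horizontal_fov_deg:.1f}°, ≈ {pixels_per_degree:.1f} px/deg")
```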
From stereoscopic recording to virtual reality headsets: Designing a new way to learn surgery.
Ros, M; Trives, J-V; Lonjon, N
2017-03-01
To improve surgical practice, there are several different approaches to simulation. Thanks to wearable technologies, recording 3D movies is now easy. The development of virtual reality headsets suggests a different way of watching these videos: using dedicated software to increase interactivity in a 3D immersive experience. The objective was to record 3D movies from the main surgeon's perspective, to watch the files using virtual reality headsets, and to validate their pedagogic interest. Surgical procedures were recorded using a system combining two side-by-side cameras placed on a helmet. We added two LEDs just below the cameras to enhance luminosity. Two files were obtained in mp4 format and edited using dedicated software to create 3D movies. The files obtained were then played using a virtual reality headset. Surgeons who tried the immersive experience completed a questionnaire to evaluate the interest of this procedure for surgical learning. Twenty surgical procedures were recorded. The movies capture a scene which extends 180° horizontally and 90° vertically. The immersive experience created by the device conveys a genuine feeling of being in the operating room and seeing the procedure first-hand through the eyes of the main surgeon. All surgeons indicated that they believe in the pedagogical interest of this method. We succeeded in recording the main surgeon's point of view in 3D and watching it on a virtual reality headset. This new approach enhances the understanding of surgery; most of the surgeons appreciated its pedagogic value. This method could be an effective learning tool in the future. Copyright © 2016. Published by Elsevier Masson SAS.
NASA Astrophysics Data System (ADS)
Basso Moro, Sara; Carrieri, Marika; Avola, Danilo; Brigadoi, Sabrina; Lancia, Stefania; Petracca, Andrea; Spezialetti, Matteo; Ferrari, Marco; Placidi, Giuseppe; Quaresima, Valentina
2016-06-01
Objective. In the last few years, interest in applying virtual reality systems to neurorehabilitation has been increasing. Their compatibility with neuroimaging techniques, such as functional near-infrared spectroscopy (fNIRS), allows for the investigation of brain reorganization with multimodal stimulation and real-time control of the changes occurring in brain activity. The present study was aimed at testing a novel semi-immersive visuo-motor task (VMT), which has features suited to adoption in the neurorehabilitation of upper limb motor function. Approach. A virtual environment was simulated through a three-dimensional hand-sensing device (the LEAP Motion Controller), and the concomitant VMT-related prefrontal cortex (PFC) response was monitored non-invasively by fNIRS. For the VMT, performed at three different levels of difficulty, it was hypothesized that the PFC would be activated, with an expected greater level of activation in the ventrolateral PFC (VLPFC), given its involvement in motor action planning and in the allocation of attentional resources to generate goals from current contexts. Twenty-one subjects were asked to move their right hand/forearm with the purpose of guiding a virtual sphere over a virtual path. A twenty-channel fNIRS system was employed for measuring changes in PFC oxygenated and deoxygenated hemoglobin (O2Hb and HHb, respectively). Main results. A VLPFC O2Hb increase and a concomitant HHb decrease were observed during the VMT performance, without any difference in relation to the task difficulty. Significance. The present study has revealed a particular involvement of the VLPFC in the execution of the novel proposed semi-immersive VMT adoptable in the neurorehabilitation field.
NASA Astrophysics Data System (ADS)
Klippel, A.; Zhao, J.; Masrur, A.; Wallgruen, J. O.; La Femina, P. C.
2017-12-01
We present work along the virtuality continuum showcasing both AR and VR environments for geoscience applications and research. The AR/VR project focuses on one of the most prominent landmarks on the Penn State campus which, at the same time, is a representation of the geology of Pennsylvania. The Penn State Obelisk is a 32-foot-high, 51-ton monument composed of 281 rocks collected from across Pennsylvania. While information about its origins and composition is scattered in articles and some web databases, we compiled all the available data from the web and archives and curated them as a basis for an immersive xR experience. Tabular data was amended by xR data such as 360° photos, videos, and 3D models (e.g., the Obelisk). Our xR (both AR and VR) prototype provides an immersive analytical environment that supports interactive data visualization and virtual navigation in a natural environment (a campus model of today and of 1896, the year of the Obelisk's installation). This work-in-progress project can provide an interactive immersive learning platform (specifically, for K-12 and introductory-level geosciences students) where the learning process is enhanced through seamless navigation between 3D data space and physical space. The second, VR-focused application creates and empirically evaluates virtual reality (VR) experiences for geosciences research, specifically an interactive volcano experience based on LiDAR and image data of Iceland's Thrihnukar volcano. The prototype addresses the lack of content and tools for immersive virtual reality (iVR) in geoscientific education and research and how to make it easier to integrate iVR into research and classroom experiences. It makes use of environmentally sensed data such that interaction and linked content can be integrated into a single experience. We discuss our workflows as well as methods and authoring tools for iVR analysis and creation of virtual experiences. These methods and tools aim to enhance the utility of geospatial data from repositories such as OpenTopography.org by unlocking treasure-troves of geospatial data for VR applications. Their enhanced accessibility in education and research for the geosciences and beyond will benefit geoscientists and educators who cannot be expected to be VR and 3D application experts.
Water immersion in neonatal bereavement photography.
Duffey, Heather
2014-01-01
Water immersion in neonatal bereavement photography is a new technique intended to enhance the quality of the photographs provided to families following their loss. Water immersion appears to be most helpful following a second trimester fetal demise. This technique can be used by nurses, professional photographers and others in addition to more traditional neonatal bereavement photography. It does not require special skills or equipment and can be implemented in virtually any perinatal setting. The enhanced quality of photographs produced with this method can potentially provide a source of comfort to grieving families. © 2014 AWHONN.
Heat Acclimation and Water-Immersion Deconditioning: Responses to Exercise
NASA Technical Reports Server (NTRS)
Shvartz, E.; Bhattacharya, A.; Sperinde, S. J.; Brock, P. J.; Sciaraffa, D.; Haines, R. F.; Greenleaf, J. E.
1977-01-01
Simulated subgravity conditions, such as bed rest and water immersion, cause a decrease in acceleration tolerance (3, 4), tilt tolerance (3, 9, 10), work capacity (5, 7), and plasma volume (1, 8-10). Moderate exercise training performed during bed rest (4) and prior to water immersion (5) provides some protection against the adverse effects of deconditioning, but the relationship between exercise and changes due to deconditioning remains unclear. Heat acclimation increases plasma and interstitial volumes, total body water, stroke volume (11), and tilt tolerance (6) and may, therefore, be a more efficient method of ameliorating deconditioning than physical training alone. The present study was undertaken to determine the effects of heat acclimation and moderate physical training, performed in cool conditions, on water-immersion deconditioning.
Modulation of Excitability in the Temporoparietal Junction Relieves Virtual Reality Sickness.
Takeuchi, Naoyuki; Mori, Takayuki; Suzukamo, Yoshimi; Izumi, Shin-Ichi
2018-06-01
Virtual reality (VR) immersion often provokes subjective discomfort and postural instability, so-called VR sickness. The neural mechanism of VR sickness is speculated to be related to visual-vestibular information mismatch and/or postural instability. However, approaches to relieving VR sickness through modulation of brain activity are poorly understood. Using transcranial direct current stimulation (tDCS), we aimed to investigate whether VR sickness could be relieved by modulating cortical excitability in the temporoparietal junction (TPJ), which is known to be involved in processing both vestibular and visual information. Twenty healthy subjects received tDCS over the right TPJ before VR immersion. The order of the three types of tDCS (anodal, cathodal, and sham) was counterbalanced across subjects. We evaluated subjective symptoms, heart rate, and center of pressure at baseline, after tDCS, and after VR immersion. VR immersion using head-mounted displays provoked subjective discomfort and postural instability. However, anodal tDCS over the right TPJ ameliorated the subjective disorientation symptoms and postural instability induced by VR immersion compared with the sham condition. The amelioration of VR sickness by anodal tDCS over the right TPJ might result from relief of the sensory conflict and/or facilitation of vestibular function. Our result not only has potential clinical implications for the neuromodulation approach to VR sickness but also implies a causal role of the TPJ in VR sickness.
A Virtual Walk through London: Culture Learning through a Cultural Immersion Experience
ERIC Educational Resources Information Center
Shih, Ya-Chun
2015-01-01
Integrating Google Street View into a three-dimensional virtual environment in which users control personal avatars provides users with access to an innovative, interactive, and real-world context for communication and culture learning. We have selected London, a city famous for its rich historical, architectural, and artistic heritage,…
ERIC Educational Resources Information Center
Grenfell, Janette
2013-01-01
Selected ubiquitous technologies encourage collaborative participation between higher education students and educators within a virtual socially networked e-learning landscape. Multiple modes of teaching and learning, ranging from real world experiences, to text and digital images accessed within the Deakin studies online learning management…
2009-01-01
Bowerly, T., Buckwalter, J.G., Rizzo, A.A.: A controlled clinical comparison of attention performance in children with ADHD in a virtual reality... classroom compared to standard neuropsychological methods. Child Neuropsychology 13, 363–381 (2007) 7. Parsons, T.D., Rizzo, A.A
Assessment in Immersive Virtual Environments: Cases for Learning, of Learning, and as Learning
ERIC Educational Resources Information Center
Code, Jillianne; Zap, Nick
2017-01-01
The key to education reform lies in exploring alternative forms of assessment. Alternative performance assessments provide a more valid measure than multiple-choice tests of students' conceptual understanding and higher-level skills such as problem solving and inquiry. Advances in game-based and virtual environment technologies are creating new…
13 Tips for Virtual World Teaching
ERIC Educational Resources Information Center
Villano, Matt
2008-01-01
Multi-user virtual environments (MUVEs) are gaining momentum as the latest and greatest learning tool in the world of education technology. How does one get started with them? How do they work? This article shares 13 secrets from immersive education experts and educators on how to have success in implementing these new tools and technologies on…
Learning to Teach in Second Life: A Novice Adventure in Virtual Reality
ERIC Educational Resources Information Center
Ellis, Maureen; Anderson, Patricia
2011-01-01
Second Life (SL) is a social virtual world, which emphasizes the general use of immersive worlds for supporting a variety of human activities and interactions, presenting a plethora of new opportunities and challenges for enriching how we learn, work and play (Boulos, Hetherington & Wheeler, 2007; Prasolova-Førland, Sourin & Sourina,…
Evidence of Blocking with Geometric Cues in a Virtual Watermaze
ERIC Educational Resources Information Center
Redhead, Edward S.; Hamilton, Derek A.
2009-01-01
Three computer based experiments, testing human participants in a non-immersive virtual watermaze task, used a blocking design to assess whether two sets of geometric cues would compete in a manner described by associative models of learning. In stage 1, participants were required to discriminate between visually distinct platforms. In stage 2,…
Road-Crossing Safety in Virtual Reality: A Comparison of Adolescents With and Without ADHD
ERIC Educational Resources Information Center
Clancy, Tamera A.; Rucklidge, Julia J.; Owen, Dean
2006-01-01
This study investigated the potential accident-proneness of adolescents with attention deficit hyperactivity disorder (ADHD) in a hazardous road-crossing environment. An immersive virtual reality traffic gap-choice task was used to determine whether ADHD adolescents show more unsafe road-crossing behavior than controls. Participants (ages 13 to…
ERIC Educational Resources Information Center
Scoresby, Jon; Shelton, Brett E.
2011-01-01
The mis-categorizing of cognitive states involved in learning within virtual environments has complicated instructional technology research. Further, most educational computer game research does not account for how learning activity is influenced by factors of game content and differences in viewing perspectives. This study is a qualitative…
When VR really hits the streets
NASA Astrophysics Data System (ADS)
Morie, Jacquelyn F.
2014-02-01
Immersive Virtual Reality (VR) technology, while popular in the late part of the 20th Century, seemed to disappear from public view as social media took its place and captured the attention of millions. Now that a new generation of entrepreneurs and crowd-sourced funding campaigns have arrived, perhaps virtual reality is poised for a resurgence.
Physics Education in Virtual Reality: An Example
ERIC Educational Resources Information Center
Kaufmann, Hannes; Meyer, Bernd
2009-01-01
We present an immersive virtual reality (VR) application for physics education. It utilizes a recent physics engine developed for the PC gaming market to simulate physical experiments correctly and accurately. Students are enabled to actively build their own experiments and study them. A variety of tools are provided to analyze forces, mass, paths…
IMMERSE: Interactive Mentoring for Multimodal Experiences in Realistic Social Encounters
2015-08-28
undergraduates funded by your agreement who graduated during this period and will receive scholarships or fellowships for further studies in science... [table-of-contents fragment: 9. Interaction with Virtual Characters; 9.1 Player Locomotion; 9.2 Interacting with Real and Virtual Objects; 9.3 Animation Combinations and Stage Management; 10. Recommendations on the Way Ahead]
Using CLIPS to represent knowledge in a VR simulation
NASA Technical Reports Server (NTRS)
Engelberg, Mark L.
1994-01-01
Virtual reality (VR) is an exciting use of advanced hardware and software technologies to achieve an immersive simulation. Until recently, the majority of virtual environments were merely 'fly-throughs' in which a user could freely explore a 3-dimensional world or a visualized dataset. Now that the underlying technologies are reaching a level of maturity, programmers are seeking ways to increase the complexity and interactivity of immersive simulations. In most cases, interactivity in a virtual environment can be specified in the form 'whenever such-and-such happens to object X, it reacts in the following manner.' CLIPS and COOL provide a simple and elegant framework for representing this knowledge base in an efficient manner that can be extended incrementally. The complexity of a detailed simulation becomes more manageable when the control flow is governed by CLIPS' rule-based inference engine rather than by traditional procedural mechanisms. Examples in this paper will illustrate an effective way to represent VR information in CLIPS, and to tie this knowledge base to the input and output C routines of a typical virtual environment.
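To make the "whenever such-and-such happens to object X, it reacts in the following manner" pattern concrete, the sketch below shows a minimal Python analogue of rule-based event handling (deliberately not CLIPS syntax and not the paper's code): rules pair a condition over asserted facts with a reaction, and asserting a fact fires every matching rule, so control flow lives in the rule base rather than in procedural code.

```python
# Python analogue (not CLIPS syntax) of the rule pattern described above:
# each rule pairs a condition over asserted facts with a reaction.
rules = []

def rule(condition):
    """Register an action to fire whenever its condition matches an asserted fact."""
    def register(action):
        rules.append((condition, action))
        return action
    return register

@rule(lambda fact: fact.get("event") == "grabbed" and fact.get("object") == "door-handle")
def open_door(fact):
    print("door swings open")

@rule(lambda fact: fact.get("event") == "collision" and fact.get("object") == "alarm-button")
def sound_alarm(fact):
    print("alarm sounds")

def assert_fact(fact):
    """Analogue of asserting a fact: fire every rule whose condition matches."""
    for condition, action in rules:
        if condition(fact):
            action(fact)

assert_fact({"event": "grabbed", "object": "door-handle"})
```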
Pavone, Enea Francesco; Tieri, Gaetano; Rizza, Giulia; Tidoni, Emmanuele; Grisoni, Luigi; Aglioti, Salvatore Maria
2016-01-13
Brain monitoring of errors in one's own and others' actions is crucial for a variety of processes, ranging from the fine-tuning of motor skill learning to important social functions, such as reading out and anticipating the intentions of others. Here, we combined immersive virtual reality and EEG recording to explore whether embodying the errors of an avatar by seeing it from a first-person perspective may activate the error monitoring system in the brain of an onlooker. We asked healthy participants to observe, from a first- or third-person perspective, an avatar performing a correct or an incorrect reach-to-grasp movement toward one of two virtual mugs placed on a table. At the end of each trial, participants reported verbally how much they embodied the avatar's arm. Ratings were maximal in the first-person perspective, indicating that immersive virtual reality can be a powerful tool to induce embodiment of an artificial agent, even through mere visual perception and in the absence of any cross-modal boosting. Observation of erroneous grasping from a first-person perspective enhanced error-related negativity and medial-frontal theta power in the trials where human onlookers embodied the virtual character, hinting at the tight link between early, automatic coding of error detection and sense of embodiment. Error positivity was similar in the first-person (1PP) and third-person (3PP) perspectives, suggesting that conscious coding of errors is similar for self and other. Thus, embodiment plays an important role in activating specific components of the action monitoring system when others' errors are coded as if they are one's own errors. Detecting errors in others' actions is crucial for social functions, such as reading out and anticipating the intentions of others. Using immersive virtual reality and EEG recording, we explored how the brain of an onlooker reacted to the errors of an avatar seen from a first-person perspective. We found that mere observation of erroneous actions enhances electrocortical markers of error detection in the trials where human onlookers embodied the virtual character. Thus, the cerebral system for action monitoring is maximally activated when others' errors are coded as if they are one's own errors. The results have important implications for understanding how the brain can control the external world and thus for creating new brain-computer interfaces. Copyright © 2016 the authors.
Designers Workbench: Towards Real-Time Immersive Modeling
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kuester, F; Duchaineau, M A; Hamann, B
2001-10-03
This paper introduces the DesignersWorkbench, a semi-immersive virtual environment for two-handed modeling, sculpting and analysis tasks. The paper outlines the fundamental tools, design metaphors and hardware components required for an intuitive real-time modeling system. As companies focus on streamlining productivity to cope with global competition, the migration to computer-aided design (CAD), computer-aided manufacturing (CAM), and computer-aided engineering (CAE) systems has established a new backbone of modern industrial product development. However, traditionally a product design frequently originates from a clay model that, after digitization, forms the basis for the numerical description of CAD primitives. The DesignersWorkbench aims at closing this technology or 'digital gap' experienced by design and CAD engineers by transforming the classical design paradigm into its fully integrated digital and virtual analog, allowing collaborative development in a semi-immersive virtual environment. This project emphasizes two key components from the classical product design cycle: freeform modeling and analysis. In the freeform modeling stage, content creation in the form of two-handed sculpting of arbitrary objects using polygonal, volumetric or mathematically defined primitives is emphasized, whereas the analysis component provides the tools required for pre- and post-processing steps for finite element analysis tasks applied to the created models.
Immersive Earth Science: Data Visualization in Virtual Reality
NASA Astrophysics Data System (ADS)
Skolnik, S.; Ramirez-Linan, R.
2017-12-01
Utilizing next-generation technology, Navteca's exploration of 3D and volumetric temporal data in Virtual Reality (VR) takes advantage of immersive user experiences where stakeholders are literally inside the data. No longer restricted by the edges of a screen, VR provides an innovative way of viewing spatially distributed 2D and 3D data that leverages a 360° field of view and positional-tracking input, allowing users to see and experience data differently. These concepts are relevant to many sectors, industries, and fields of study, as real-time collaboration in VR can enhance understanding and mission with VR visualizations that display temporally-aware 3D, meteorological, and other volumetric datasets. The ability to view data that is traditionally "difficult" to visualize, such as subsurface features or air columns, is a particularly compelling use of the technology. Various development iterations have resulted in Navteca's proof of concept that imports and renders volumetric point-cloud data in the virtual reality environment by interfacing PC-based VR hardware to a back-end server and popular GIS software. The integration of the geo-located data in VR and subsequent display of changeable basemaps, overlaid datasets, and the ability to zoom, navigate, and select specific areas show the potential for immersive VR to revolutionize the way Earth data is viewed, analyzed, and communicated.
Sounds of silence: How to animate virtual worlds with sound
NASA Technical Reports Server (NTRS)
Astheimer, Peter
1993-01-01
Sounds are an integral and sometimes annoying part of our daily life. Virtual worlds which imitate natural environments gain a lot of authenticity from fast, high quality visualization combined with sound effects. Sounds help to increase the degree of immersion for human dwellers in imaginary worlds significantly. The virtual reality toolkit of IGD (Institute for Computer Graphics) features a broad range of standard visual and advanced real-time audio components which interpret an object-oriented definition of the scene. The virtual reality system 'Virtual Design' realized with the toolkit enables the designer of virtual worlds to create a true audiovisual environment. Several examples on video demonstrate the usage of the audio features in Virtual Design.
Educational Uses of Virtual Reality Technology.
1998-01-01
technology. It is affordable in that a basic level of technology can be achieved on most existing personal computers at either no cost or some minimal... actually present in a virtual environment is termed "presence" and is an artifact of being visually immersed in the computer-generated virtual world... Carolina University, VREL, Teachers 1996 onward... VR in Education, University of Illinois, National Center for Supercomputing Applications
Interactive 3D Models and Simulations for Nuclear Security Education, Training, and Analysis.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Warner, David K.; Dickens, Brian Scott; Heimer, Donovan J.
By providing examples of products that have been produced in the past, the authors hope that the audience will gain a more thorough understanding of 3D modeling tools, potential applications, and the capabilities that they can provide. Truly, the applications and capabilities of these types of tools are limited only by one's imagination. The future of three-dimensional models lies in the expansion into the world of virtual reality, where one will experience a fully immersive first-person environment. The use of headsets and hand tools will allow students and instructors to have a more thorough spatial understanding of facilities and scenarios that they will encounter in the real world.
Deconditioning-induced exercise responses as influenced by heat acclimation
NASA Technical Reports Server (NTRS)
Shvartz, E.; Bhattacharya, A.; Sperinde, S. J.; Brock, P. J.; Sciaraffa, D.; Haines, R. F.; Greenleaf, J. E.
1979-01-01
A study to determine the effect of heat acclimation and physical training in temperate conditions on changes in exercise tolerance following water-immersion deconditioning is presented. Five young men were tested on a bicycle ergometer before and after heat acclimation and after water immersion. The subjects and the experimental procedure, heat acclimation and exercise training, water immersion, and exercise tolerance are discussed. Heat acclimation resulted in the usual decreases in exercise heart rate and rectal temperature and an increase in sweat rate. Water immersion resulted in substantial diuresis despite water consumed. The results show that heat acclimation provides an effective method of preventing the adverse effects of water-immersion deconditioning on exercise tolerance.
Tieri, Gaetano; Gioia, Annamaria; Scandola, Michele; Pavone, Enea F; Aglioti, Salvatore M
2017-05-01
To explore the link between Sense of Embodiment (SoE) over a virtual hand and physiological regulation of skin temperature, 24 healthy participants were immersed in virtual reality through a Head Mounted Display and had their real limb temperature recorded by means of a high-sensitivity infrared camera. Participants observed a virtual right upper limb (appearing either normally, or with the hand detached from the forearm) or limb-shaped non-corporeal control objects (continuous or discontinuous wooden blocks) from a first-person perspective. Subjective ratings of SoE were collected in each observation condition, as well as temperatures of the right and left hand, wrist and forearm. The observation of these complex, body and body-related virtual scenes resulted in increased real hand temperature when compared to a baseline condition in which a 3D virtual ball was presented. Crucially, observation of non-natural appearances of the virtual limb (discontinuous limb) and limb-shaped non-corporeal objects elicited a high increase in real hand temperature and low SoE. In contrast, observation of the full virtual limb caused high SoE and low temperature changes in the real hand with respect to the other conditions. Interestingly, the temperature difference across the different conditions occurred according to a topographic rule that included both hands. Our study sheds new light on the role of an external hand's visual appearance and suggests a tight link between higher-order bodily self-representations and topographic regulation of skin temperature. © 2017 Federation of European Neuroscience Societies and John Wiley & Sons Ltd.
Ouellet, Émilie; Boller, Benjamin; Corriveau-Lecavalier, Nick; Cloutier, Simon; Belleville, Sylvie
2018-06-01
Assessing and predicting memory performance in everyday life is a common assignment for neuropsychologists. However, most traditional neuropsychological tasks are not conceived to capture everyday memory performance. The Virtual Shop is a fully immersive task developed to assess memory in a more ecological way than traditional neuropsychological assessments. Two studies were undertaken to assess the feasibility of the Virtual Shop and to appraise its ecological and construct validity. In study 1, 20 younger and 19 older adults completed the Virtual Shop task to evaluate its level of difficulty and the way the participants interacted with the VR material. The construct validity was examined with the contrasted-group method, by comparing the performance of younger and older adults. In study 2, 35 individuals with subjective cognitive decline completed the Virtual Shop task. Performance was correlated with an existing questionnaire evaluating everyday memory in order to appraise its ecological validity. To add further support to its construct validity, performance was correlated with traditional episodic memory and executive tasks. All participants successfully completed the Virtual Shop. The task had an appropriate level of difficulty that helped differentiate younger and older adults, supporting the feasibility and construct validity of the task. The performance on the Virtual Shop was significantly and moderately correlated with the performance on the questionnaire and on the traditional memory and executive tasks. Results support the feasibility and both the ecological and construct validity of the Virtual Shop. Copyright © 2018 The Authors. Published by Elsevier B.V. All rights reserved.
Duarte, Ricardo Jordão; Cury, José; Oliveira, Luis Carlos Neves; Srougi, Miguel
2013-01-01
The medical literature offers scarce information for defining a basic skills training program for laparoscopic surgery (peg and transfer, cutting, clipping). The aim of this study was to determine the minimal number of simulator sessions of basic laparoscopic tasks necessary to elaborate an optimal virtual reality training curriculum. Eleven medical students with no previous laparoscopic experience volunteered and were enrolled. They underwent simulator training sessions starting at level 1 (Immersion Lap VR, San Jose, CA), sequentially covering camera handling, peg and transfer, clipping and cutting. Each student trained twice a week until 10 sessions were completed. The score indexes were recorded and analyzed. The total number of errors in the evaluation sequences (camera, peg and transfer, clipping and cutting) was computed and then related to the total number of items evaluated in each step, yielding a percent success rate for each student in each set of each completed session. We then computed the cumulative success rate over the 10 sessions to analyze the learning process. The learning curve was analyzed by non-linear regression, yielding r2 = 0.73 (p < 0.001), with 4.26 sessions (approximately five) necessary to reach the plateau of 80% of the estimated acquired knowledge; 100% of the students reached this level of skill. From the fifth session to the 10th, the gain in knowledge was not significant, although some students reached 96% of the expected improvement. This study revealed that after five sequential simulator training sessions the students' learning curve reaches a plateau. Further sessions at the same difficulty level do not promote any improvement in basic laparoscopic surgical skills, and the students should be introduced to a more difficult level of training tasks.
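The plateau estimate quoted above comes from fitting a non-linear learning curve to per-session success rates. The sketch below (Python with SciPy) shows how such a fit can be done under the assumption of an exponential rise-to-plateau model; the abstract does not state the authors' exact model, and the success-rate values are invented for illustration.

```python
# Sketch: fit an exponential rise-to-plateau learning curve and find the session
# at which 80% of the plateau is reached. The model choice and the success-rate
# values below are illustrative assumptions; the abstract does not give either.
import numpy as np
from scipy.optimize import curve_fit

def learning_curve(session, plateau, rate):
    return plateau * (1.0 - np.exp(-rate * session))

sessions = np.arange(1, 11)
success = np.array([38, 55, 66, 74, 80, 83, 85, 86, 87, 88])  # % correct, made up

(plateau, rate), _ = curve_fit(learning_curve, sessions, success, p0=(90.0, 0.5))

# Session count needed to reach 80% of the fitted plateau:
# plateau*(1 - exp(-rate*s)) = 0.8*plateau  =>  s = -ln(0.2)/rate
s_80 = -np.log(0.2) / rate
print(f"plateau ≈ {plateau:.1f}%, 80% of plateau reached after ≈ {s_80:.2f} sessions")
```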
iVFTs - immersive virtual field trips for interactive learning about Earth's environment.
NASA Astrophysics Data System (ADS)
Bruce, G.; Anbar, A. D.; Semken, S. C.; Summons, R. E.; Oliver, C.; Buxner, S.
2014-12-01
Innovations in immersive interactive technologies are changing the way students explore Earth and its environment. State-of-the-art hardware has given developers the tools needed to capture high-resolution spherical content, 360° panoramic video, giga-pixel imagery, and unique viewpoints via unmanned aerial vehicles as they explore remote and physically challenging regions of our planet. Advanced software enables integration of these data into seamless, dynamic, immersive, interactive, content-rich, and learner-driven virtual field explorations, experienced online via HTML5. These surpass conventional online exercises that use 2-D static imagery and enable students to engage in virtual environments that are more like games than like lectures. Grounded in the active learning of exploration, inquiry, and application of knowledge as it is acquired, users interact non-linearly in conjunction with an intelligent tutoring system (ITS). The integration of this system allows the educational experience to be adapted to each individual student as they interact within the program. Such explorations, which we term "immersive virtual field trips" (iVFTs), are being integrated into cyber-learning, allowing science teachers to take students to scientifically significant but inaccessible environments. Our team and collaborators are producing a diverse suite of freely accessible iVFTs to teach key concepts in geology, astrobiology, ecology, and anthropology. Topics include Early Life, Biodiversity, Impact Craters, Photosynthesis, Geologic Time, Stratigraphy, Tectonics, Volcanism, Surface Processes, The Rise of Oxygen, Origin of Water, Early Civilizations, Early Multicellular Organisms, and Bioarcheology. These diverse topics allow students to experience field sites all over the world, including Grand Canyon (USA), Flinders Ranges (Australia), Shark Bay (Australia), Rainforests (Panama), Teotihuacan (Mexico), Upheaval Dome (USA), Pilbara (Australia), Mid-Atlantic Ridge (Iceland), and Mauna Kea (Hawaii). iVFTs are being beta-tested and used at ASU in several large-enrollment courses to assess their usability and effectiveness in meeting specific learning objectives. We invite geoscience educators to partake of this resource and find new applications to their own teaching.
Simulation laboratories for training in obstetrics and gynecology.
Macedonia, Christian R; Gherman, Robert B; Satin, Andrew J
2003-08-01
Simulations have been used by the military, airline industry, and our colleagues in other medical specialties to educate, evaluate, and prepare for rare but life-threatening scenarios. Work hour limits for residents in obstetrics and gynecology and decreased patient availability for teaching of students and residents require us to think creatively and practically on how to optimize their education. Medical simulations may address scenarios in clinical practice that are considered important to know or understand. Simulations can take many forms, including computer programs, models or mannequins, virtual reality data immersion caves, and a combination of formats. The purpose of this commentary is to call attention to a potential role for medical simulation in obstetrics and gynecology. We briefly describe an example of how simulation may be incorporated into obstetric and gynecologic residency training. It is our contention that educators in obstetrics and gynecology should be aware of the potential for simulation in education. We hope this commentary will stimulate interest in the field, lead to validation studies, and improve training in and the practice of obstetrics and gynecology.
Immersed boundary-simplified lattice Boltzmann method for incompressible viscous flows
NASA Astrophysics Data System (ADS)
Chen, Z.; Shu, C.; Tan, D.
2018-05-01
An immersed boundary-simplified lattice Boltzmann method is developed in this paper for simulations of two-dimensional incompressible viscous flows with immersed objects. Assisted by the fractional step technique, the problem is resolved in a predictor-corrector scheme. The predictor step solves the flow field without considering immersed objects, and the corrector step imposes the effect of immersed boundaries on the velocity field. Unlike the previous immersed boundary-lattice Boltzmann method, which adopts the standard lattice Boltzmann method (LBM) as the flow solver in the predictor step, a recently developed simplified lattice Boltzmann method (SLBM) is applied in the present method to evaluate intermediate flow variables. Compared to the standard LBM, SLBM requires less virtual memory, facilitates the implementation of physical boundary conditions, and shows better numerical stability. The boundary condition-enforced immersed boundary method, which accurately ensures no-slip boundary conditions, is implemented as the boundary solver in the corrector step. Four typical numerical examples are presented to demonstrate the stability, the flexibility, and the accuracy of the present method.
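The corrector step described above hinges on transferring velocities between the Eulerian grid and Lagrangian boundary markers through a discrete delta function. The Python sketch below illustrates only that interpolation/spreading machinery with a simple direct correction; it is not the paper's boundary-condition-enforced solver nor the SLBM predictor, and it assumes unit grid spacing and illustrative marker data.

```python
# Sketch of the corrector-step machinery common to immersed boundary methods:
# interpolate the predicted velocity to Lagrangian boundary markers, form a
# velocity correction enforcing no-slip, and spread it back to the grid with a
# discrete delta function. Not the paper's solver; unit grid spacing assumed.
import numpy as np

def delta_1d(r):
    """Peskin's 4-point discrete delta function (1D factor)."""
    r = np.abs(r)
    out = np.zeros_like(r)
    near = r < 1.0
    mid = (r >= 1.0) & (r < 2.0)
    out[near] = (3 - 2 * r[near] + np.sqrt(1 + 4 * r[near] - 4 * r[near] ** 2)) / 8
    out[mid] = (5 - 2 * r[mid] - np.sqrt(-7 + 12 * r[mid] - 4 * r[mid] ** 2)) / 8
    return out

def correct_velocity(u, v, markers, u_boundary):
    """Impose the boundary velocity u_boundary at each marker on fields u, v."""
    ny, nx = u.shape
    X, Y = np.meshgrid(np.arange(nx), np.arange(ny))
    for (xm, ym), (ub, vb) in zip(markers, u_boundary):
        w = delta_1d(X - xm) * delta_1d(Y - ym)      # 2D delta weights
        u_m = np.sum(w * u)                          # interpolate to marker
        v_m = np.sum(w * v)
        u += w * (ub - u_m)                          # spread the correction back
        v += w * (vb - v_m)
    return u, v

# Example: a stationary circular boundary inside a uniform predicted flow.
u = np.ones((64, 64)); v = np.zeros((64, 64))
theta = np.linspace(0, 2 * np.pi, 60, endpoint=False)
markers = np.c_[32 + 8 * np.cos(theta), 32 + 8 * np.sin(theta)]
u, v = correct_velocity(u, v, markers, np.zeros((60, 2)))
```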
ERIC Educational Resources Information Center
Savin-Baden, Maggi
2010-01-01
This paper presents a study that used narrative inquiry to explore staff experiences of learning and teaching in immersive worlds. The findings introduced issues relating to identity play, the relationship between pedagogy and play and the ways in which learning, play and fun were managed (or not). At the same time there was a sense of imposed or…
Psychometric Assessment of Stereoscopic Head-Mounted Displays
2016-06-29
Journal article, Jan 2015 - Dec 2015. ...to render an immersive three-dimensional constructive environment. The purpose of this effort was to quantify the impact of aircrew vision on... simulated tasks requiring precise depth discrimination. This work will provide an example validation method for future stereoscopic virtual immersive...
Inertial Motion-Tracking Technology for Virtual 3-D
NASA Technical Reports Server (NTRS)
2005-01-01
In the 1990s, NASA pioneered virtual reality research. The concept was present long before, but prior to this the technology did not exist to make a viable virtual reality system. Scientists had theories and ideas, and they knew that the concept had potential, but the computers of the 1970s and 1980s were not fast enough, sensors were heavy and cumbersome, and people had difficulty blending fluidly with the machines. Scientists at Ames Research Center built upon the research of previous decades and put the necessary technology behind them, making the theories of virtual reality a reality. Virtual reality systems depend on complex motion-tracking sensors to convey information between the user and the computer, giving the user the feeling of operating in the real world. These motion-tracking sensors measure and report an object's position and orientation as it changes. A simple example of motion tracking would be the cursor on a computer screen moving in correspondence to the shifting of the mouse. Tracking in 3-D, necessary to create virtual reality, however, is much more complex. To be successful, the perspective of the virtual image seen on the computer must be an accurate representation of what is seen in the real world. As the user's head or camera moves, turns, or tilts, the computer-generated environment must change accordingly with no noticeable lag, jitter, or distortion. Historically, the lack of smooth and rapid tracking of the user's motion has thwarted the widespread use of immersive 3-D computer graphics. NASA uses virtual reality technology for a variety of purposes, mostly training of astronauts. The actual missions are costly and dangerous, so any opportunity the crews have to practice their maneuvering in accurate situations before the mission is valuable and instructive. For that purpose, NASA has funded a great deal of virtual reality research, and benefited from the results.
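Orientation tracking of the kind described above is commonly built from inertial sensor fusion. The sketch below shows one standard approach in Python, a complementary filter blending gyroscope integration with accelerometer tilt; it is an illustrative assumption, not a description of the specific NASA or spinoff hardware, and the sample data are made up.

```python
# Sketch of a common inertial orientation approach (not necessarily the one used
# in the systems described above): a complementary filter that fuses fast but
# drifting gyroscope integration with noisy but drift-free accelerometer tilt.
import math

def complementary_filter(gyro_rates, accels, dt, alpha=0.98):
    """Estimate roll and pitch (radians) from gyro (rad/s) and accel (g) samples."""
    roll = pitch = 0.0
    estimates = []
    for (gx, gy, _gz), (ax, ay, az) in zip(gyro_rates, accels):
        # Gyro path: integrate angular rate (fast, but drifts over time).
        roll_gyro = roll + gx * dt
        pitch_gyro = pitch + gy * dt
        # Accelerometer path: tilt from the gravity vector (noisy, no drift).
        roll_acc = math.atan2(ay, az)
        pitch_acc = math.atan2(-ax, math.hypot(ay, az))
        # Blend: trust the gyro short-term, the accelerometer long-term.
        roll = alpha * roll_gyro + (1 - alpha) * roll_acc
        pitch = alpha * pitch_gyro + (1 - alpha) * pitch_acc
        estimates.append((roll, pitch))
    return estimates

# Example: 1 s of samples at 100 Hz for a sensor tilted 10 degrees about x.
gyro = [(0.0, 0.0, 0.0)] * 100
accel = [(0.0, math.sin(math.radians(10)), math.cos(math.radians(10)))] * 100
print(complementary_filter(gyro, accel, dt=0.01)[-1])
```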
WeaVR: a self-contained and wearable immersive virtual environment simulation system.
Hodgson, Eric; Bachmann, Eric R; Vincent, David; Zmuda, Michael; Waller, David; Calusdian, James
2015-03-01
We describe WeaVR, a computer simulation system that takes virtual reality technology beyond specialized laboratories and research sites and makes it available in any open space, such as a gymnasium or a public park. Novel hardware and software systems enable HMD-based immersive virtual reality simulations to be conducted in any arbitrary location, with no external infrastructure and little-to-no setup or site preparation. The ability of the WeaVR system to provide realistic motion-tracked navigation for users, to improve the study of large-scale navigation, and to generate usable behavioral data is shown in three demonstrations. First, participants navigated through a full-scale virtual grocery store while physically situated in an open grass field. Trajectory data are presented for both normal tracking and for tracking during the use of redirected walking that constrained users to a predefined area. Second, users followed a straight path within a virtual world for distances of up to 2 km while walking naturally and being redirected to stay within the field, demonstrating the ability of the system to study large-scale navigation by simulating virtual worlds that are potentially unlimited in extent. Finally, the portability and pedagogical implications of this system were demonstrated by taking it to a regional high school for live use by a computer science class on their own school campus.
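The redirection idea can be illustrated with a generic curvature-gain sketch (an assumption; the WeaVR system's actual redirection controller is not reproduced here): injecting a small, imperceptible scene rotation per meter walked makes a virtually straight path map onto a physical circle, so a multi-kilometer virtual walk can stay inside a bounded field.

```python
import math

def physical_path(virtual_distance_m, curvature_gain_rad_per_m, step_m=0.5):
    """Physical positions traced while the user walks 'straight' in the virtual world."""
    x = y = heading = 0.0
    points = [(x, y)]
    for _ in range(int(virtual_distance_m / step_m)):
        heading += curvature_gain_rad_per_m * step_m  # small per-step turn injected by redirection
        x += step_m * math.cos(heading)
        y += step_m * math.sin(heading)
        points.append((x, y))
    return points

# A 2 km virtual walk with a gain of 1/22 rad per meter keeps the user on a
# physical circle of roughly 22 m radius, i.e. within a large open field.
path = physical_path(2000.0, 1 / 22.0)
print(max(math.hypot(px, py) for px, py in path))  # maximum distance from the start point
```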
The relationship between virtual body ownership and temperature sensitivity
Llobera, Joan; Sanchez-Vives, M. V.; Slater, Mel
2013-01-01
In the rubber hand illusion, tactile stimulation seen on a rubber hand, that is synchronous with tactile stimulation felt on the hidden real hand, can lead to an illusion of ownership over the rubber hand. This illusion has been shown to produce a temperature decrease in the hidden hand, suggesting that such illusory ownership produces disownership of the real hand. Here, we apply immersive virtual reality (VR) to experimentally investigate this with respect to sensitivity to temperature change. Forty participants experienced immersion in a VR with a virtual body (VB) seen from a first-person perspective. For half the participants, the VB was consistent in posture and movement with their own body, and in the other half, there was inconsistency. Temperature sensitivity on the palm of the hand was measured before and during the virtual experience. The results show that temperature sensitivity decreased in the consistent compared with the inconsistent condition. Moreover, the change in sensitivity was significantly correlated with the subjective illusion of virtual arm ownership but modulated by the illusion of ownership over the full VB. This suggests that a full body ownership illusion results in a unification of the virtual and real bodies into one overall entity—with proprioception and tactile sensations on the real body integrated with the visual presence of the VB. The results are interpreted in the framework of a ‘body matrix’ recently introduced into the literature. PMID:23720537
Revolutionizing Education: The Promise of Virtual Reality
ERIC Educational Resources Information Center
Gadelha, Rene
2018-01-01
Virtual reality (VR) has the potential to revolutionize education, as it immerses students in their learning more than any other available medium. By blocking out visual and auditory distractions in the classroom, it has the potential to help students deeply connect with the material they are learning in a way that has never been possible before.…
Designing a Virtual Social Space for Language Acquisition
ERIC Educational Resources Information Center
Woolson, Maria Alessandra
2012-01-01
Middleverse de Español (MdE) is an evolving platform for foreign language (FL) study, aligned to the goals of ACTFL's National Standards and 2007 MLA report. The project simulates an immersive environment in a virtual 3-D space for the acquisition of translingual and transcultural competence in Spanish meant to support content-based and…
SciEthics Interactive: Science and Ethics Learning in a Virtual Environment
ERIC Educational Resources Information Center
Nadolny, Larysa; Woolfrey, Joan; Pierlott, Matthew; Kahn, Seth
2013-01-01
Learning in immersive 3D environments allows students to collaborate, build, and interact with difficult course concepts. This case study examines the design and development of the TransGen Island within the SciEthics Interactive project, a National Science Foundation-funded, 3D virtual world emphasizing learning science content in the context of…
ERIC Educational Resources Information Center
Huang, Hsiu-Mei; Rauch, Ulrich; Liaw, Shu-Sheng
2010-01-01
The use of animation and multimedia for learning is now further extended by the provision of entire Virtual Reality Learning Environments (VRLE). This highlights a shift in Web-based learning from a conventional multimedia to a more immersive, interactive, intuitive and exciting VR learning environment. VRLEs simulate the real world through the…
Towards General Models of Effective Science Inquiry in Virtual Performance Assessments
ERIC Educational Resources Information Center
Baker, R. S.; Clarke-Midura, J.; Ocumpaugh, J.
2016-01-01
Recent interest in online assessment of scientific inquiry has led to several new online systems that attempt to assess these skills, but producing models that detect when students are successfully practising these skills can be challenging. In this paper, we study models that assess student inquiry in an immersive virtual environment, where a…
ERIC Educational Resources Information Center
Ausburn, Lynna J.; Ausburn, Floyd B.
2004-01-01
Virtual Reality has been defined in many different ways and now means different things in various contexts. VR can range from simple environments presented on a desktop computer to fully immersive multisensory environments experienced through complex headgear and bodysuits. In all of its manifestations, VR is basically a way of simulating or…
ERIC Educational Resources Information Center
Barab, Sasha A.; Hay, Kenneth E.; Squire, Kurt; Barnett, Michael; Schmidt, Rae; Karrigan, Kristen; Yamagata-Lynch, Lisa; Johnson, Christine
2000-01-01
Describes an introductory undergraduate astronomy course in which the large-lecture format was moved to one in which students were immersed in a technologically-rich, inquiry-based, participatory learning environment. Finds that virtual reality can be used effectively in regular undergraduate university courses as a tool through which students can…
Virtual reality, immersion, and the unforgettable experience
NASA Astrophysics Data System (ADS)
Morie, Jacquelyn F.
2006-02-01
Virtual reality has been in the public eye for nearly forty years. Its early promise was vast: worlds we could visit and live in, if we could bend the technology to our desires. Progress was made, but along the way the original directions and challenges of fully immersive VR took a back seat to more ubiquitous technology such as games that provided many of the same functions. What was lost in this transition was the potential for VR to become a stage for encounters that are meaningful, those experiences that tap into what it means to be human. This paper describes examples of such experiences using VR technology and puts forward several avenues of thought concerning how we might reinvigorate these types of VR explorations.
NASA Technical Reports Server (NTRS)
1997-01-01
I-FORCE, a computer peripheral technology from Immersion Corporation, was derived from virtual environment and human factors research at the Advanced Displays and Spatial Perception Laboratory at Ames Research Center in collaboration with the Stanford University Center for Design Research. Entrepreneur Louis Rosenberg, a former Stanford researcher and now president of Immersion, collaborated with Dr. Bernard Adelstein at Ames on studies of perception in virtual reality. The result was an inexpensive way to incorporate motors and a sophisticated microprocessor into joysticks and other game controllers. These devices can emulate the feel of a car in a skid, a crashing plane, the bounce of a ball, compressed springs, and other physical phenomena. The first products incorporating I-FORCE technology include CH Products' line of FlightStick and CombatStick controllers.
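As a hypothetical illustration of the kind of effect such a device computes, the sketch below evaluates a simple spring-damper force law each control cycle; the gains and deadband are invented for illustration and are unrelated to the actual I-FORCE firmware.

```python
def spring_damper_force(position, velocity, k=2.5, c=0.08, deadband=0.02):
    """Return the motor force opposing displacement of the stick from its center.

    position and velocity are normalized stick deflection and its rate of change;
    k and c are illustrative spring and damping gains (assumptions).
    """
    if abs(position) < deadband:   # no force near center, so the stick feels loose there
        return 0.0
    return -k * position - c * velocity

# Example: stick pushed to 40% of full deflection and still moving outward
print(spring_damper_force(0.4, 1.2))
```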
Salimi, Zohreh; Ferguson-Pell, Martin
2018-06-01
Although wheelchair ergometers provide a safe and controlled environment for studying or training wheelchair users, until recently they had a major disadvantage in being capable of simulating only straight-line wheelchair propulsion. Virtual reality has helped overcome this problem and broaden the usability of wheelchair ergometers. However, for a wheelchair ergometer to be validly used in research studies, it needs to be able to simulate the biomechanics of real-world wheelchair propulsion. In this paper, three versions of a wheelchair simulator were developed. They provide a sophisticated wheelchair ergometer in an immersive virtual reality environment. They are intended for manual wheelchair propulsion, and all are able to simulate simple translational inertia. In addition, each of the systems reported uses a different approach to simulate wheelchair rotation and accommodate rotational inertial effects. The first system does not provide extra resistance against rotation and relies merely on linear inertia, hypothesizing that this can provide an acceptable replication of the biomechanics of wheelchair maneuvers. The second and third systems, however, are designed to simulate rotational inertia: System II uses mechanical compensation, and System III uses visual compensation, simulating the influence that rotational inertia has on the visual perception of wheelchair movement in response to rotation at different speeds.
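The sketch below shows one way such a simulator might map the ergometer's two wheel speeds onto forward and yaw motion, together with the System III idea of lagging the displayed yaw as if rotational inertia resisted the turn. The wheel radius, track width, and first-order inertia model are assumptions for illustration, not the authors' specification.

```python
WHEEL_RADIUS = 0.3   # m, assumed
TRACK_WIDTH = 0.55   # m, distance between rear wheels, assumed

def chair_motion(omega_left, omega_right):
    """Differential-drive kinematics: wheel angular speeds -> forward speed and yaw rate."""
    v_left = omega_left * WHEEL_RADIUS
    v_right = omega_right * WHEEL_RADIUS
    forward = 0.5 * (v_left + v_right)
    yaw_rate = (v_right - v_left) / TRACK_WIDTH
    return forward, yaw_rate

def visually_compensated_yaw(yaw_rate_cmd, yaw_rate_prev, dt, tau=0.4):
    """System III idea: lag the displayed yaw rate as if rotational inertia resisted
    the turn (first-order model with assumed time constant tau)."""
    alpha = dt / (tau + dt)
    return yaw_rate_prev + alpha * (yaw_rate_cmd - yaw_rate_prev)

# Example frame: right wheel spun faster than the left, so the chair turns left
forward, yaw_cmd = chair_motion(omega_left=4.0, omega_right=5.0)  # rad/s
yaw_shown = visually_compensated_yaw(yaw_cmd, yaw_rate_prev=0.0, dt=1 / 60)
print(forward, yaw_cmd, yaw_shown)
```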
Connors, Erin C.; Chrastil, Elizabeth R.; Sánchez, Jaime; Merabet, Lotfi B.
2014-01-01
For profoundly blind individuals, navigating an unfamiliar building can represent a significant challenge. We investigated the use of an audio-based virtual environment called the Audio-based Environment Simulator (AbES), which can be explored for the purpose of learning the layout of an unfamiliar, complex indoor environment. Furthermore, we compared two modes of interaction with AbES. In one group, blind participants implicitly learned the layout of a target environment while playing an exploratory, goal-directed video game. By comparison, a second group was explicitly taught the same layout following a standard route and instructions provided by a sighted facilitator. As a control, a third group interacted with AbES while playing an exploratory, goal-directed video game; however, the explored environment did not correspond to the target layout. Following interaction with AbES, a series of route navigation tasks were carried out in the virtual and physical building represented in the training environment to assess the transfer of acquired spatial information. We found that participants from both modes of interaction were able to transfer the spatial knowledge gained, as indexed by their successful route navigation performance. This transfer was not apparent in the control participants. Most notably, the game-based learning strategy was also associated with enhanced performance when participants were required to find alternate routes and shortcuts within the target building, suggesting that a ludic-based training approach may provide a more flexible mental representation of the environment. Furthermore, outcome comparisons between early and late blind individuals suggested that greater prior visual experience did not have a significant effect on overall navigation performance following training. Finally, performance did not appear to be associated with other factors of interest such as age, gender, and verbal memory recall. We conclude that the highly interactive and immersive exploration of the virtual environment greatly engages a blind user to develop skills akin to positive near transfer of learning. Learning through a game-play strategy appears to confer certain behavioral advantages with respect to how spatial information is acquired and ultimately manipulated for navigation. PMID:24822044