Sample records for virtual immersive environment

  1. Learning Relative Motion Concepts in Immersive and Non-Immersive Virtual Environments

    ERIC Educational Resources Information Center

    Kozhevnikov, Michael; Gurlitt, Johannes; Kozhevnikov, Maria

    2013-01-01

    The focus of the current study is to understand which unique features of an immersive virtual reality environment have the potential to improve learning relative motion concepts. Thirty-seven undergraduate students learned relative motion concepts using computer simulation either in immersive virtual environment (IVE) or non-immersive desktop…

  2. Social Interaction Development through Immersive Virtual Environments

    ERIC Educational Resources Information Center

    Beach, Jason; Wendt, Jeremy

    2014-01-01

    The purpose of this pilot study was to determine if participants could improve their social interaction skills by participating in a virtual immersive environment. The participants used a developing virtual reality head-mounted display to engage themselves in a fully-immersive environment. While in the environment, participants had an opportunity…

  3. Measurement Tools for the Immersive Visualization Environment: Steps Toward the Virtual Laboratory.

    PubMed

    Hagedorn, John G; Dunkers, Joy P; Satterfield, Steven G; Peskin, Adele P; Kelso, John T; Terrill, Judith E

    2007-01-01

    This paper describes a set of tools for performing measurements of objects in a virtual reality-based immersive visualization environment. These tools enable the use of the immersive environment as an instrument for extracting quantitative information from data representations that hitherto had been used solely for qualitative examination. We provide, within the virtual environment, ways for the user to analyze and interact with the quantitative data generated. We describe results generated by these methods to obtain dimensional descriptors of tissue engineered medical products. We regard this toolbox as our first step in the implementation of a virtual measurement laboratory within an immersive visualization environment.
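
    The measurement workflow described here reduces, at its core, to geometry on points the user marks inside the scene: pick positions in a common world frame, then derive descriptors such as path length or end-to-end distance. The minimal Python sketch below illustrates only that step; the function name, units, and point format are assumptions for illustration and are not taken from the published toolbox.

    ```python
    import numpy as np

    def measure_path(points_mm):
        """Derive simple dimensional descriptors from points marked in an
        immersive scene. `points_mm` is a sequence of (x, y, z) positions,
        assumed here to be expressed in millimetres in a common world frame
        (an illustrative convention, not the toolbox's actual interface)."""
        pts = np.asarray(points_mm, dtype=float)
        seg_lengths = np.linalg.norm(np.diff(pts, axis=0), axis=1)
        return {
            "path_length_mm": float(seg_lengths.sum()),               # along the marked polyline
            "end_to_end_mm": float(np.linalg.norm(pts[-1] - pts[0])),
        }

    # Example: three points marked along a fibre of a tissue-engineering scaffold.
    print(measure_path([(0.0, 0.0, 0.0), (1.2, 0.4, 0.0), (2.5, 0.4, 0.3)]))
    ```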

  4. The Efficacy of an Immersive 3D Virtual versus 2D Web Environment in Intercultural Sensitivity Acquisition

    ERIC Educational Resources Information Center

    Coffey, Amy Jo; Kamhawi, Rasha; Fishwick, Paul; Henderson, Julie

    2017-01-01

    Relatively few studies have empirically tested computer-based immersive virtual environments' efficacy in teaching or enhancing pro-social attitudes, such as intercultural sensitivity. This channel study experiment was conducted (N = 159) to compare what effects, if any, an immersive 3D virtual environment would have upon subjects' intercultural…

  5. The ALIVE Project: Astronomy Learning in Immersive Virtual Environments

    NASA Astrophysics Data System (ADS)

    Yu, K. C.; Sahami, K.; Denn, G.

    2008-06-01

    The Astronomy Learning in Immersive Virtual Environments (ALIVE) project seeks to discover learning modes and optimal teaching strategies using immersive virtual environments (VEs). VEs are computer-generated, three-dimensional environments that can be navigated to provide multiple perspectives. Immersive VEs provide the additional benefit of surrounding a viewer with the simulated reality. ALIVE evaluates the incorporation of an interactive, real-time "virtual universe" into formal college astronomy education. In the experiment, pre-course, post-course, and curriculum tests will be used to determine the efficacy of immersive visualizations presented in a digital planetarium versus the same visual simulations in the non-immersive setting of a normal classroom, as well as a control case using traditional classroom multimedia. To normalize for inter-instructor variability, each ALIVE instructor will teach at least one class in each of the three test groups.

  6. Declarative Knowledge Acquisition in Immersive Virtual Learning Environments

    ERIC Educational Resources Information Center

    Webster, Rustin

    2016-01-01

    The author investigated the interaction effect of immersive virtual reality (VR) in the classroom. The objective of the project was to develop and provide a low-cost, scalable, and portable VR system containing purposely designed and developed immersive virtual learning environments for the US Army. The purpose of the mixed design experiment was…

  7. Active Learning through the Use of Virtual Environments

    ERIC Educational Resources Information Center

    Mayrose, James

    2012-01-01

    Immersive Virtual Reality (VR) has seen explosive growth over the last decade. Immersive VR attempts to give users the sensation of being fully immersed in a synthetic environment by providing them with 3D hardware, and allowing them to interact with objects in virtual worlds. The technology is extremely effective for learning and exploration, and…

  8. Coercive Narratives, Motivation and Role Playing in Virtual Worlds

    DTIC Science & Technology

    2002-01-01

    resource for making immersive virtual environments highly engaging. Interaction also appeals to our natural desire to discover. Reading a book contains...participation in an open-ended Virtual Environment (VE). I intend to take advantage of a participant's natural tendency to prefer interaction when possible...I hope this work will expand the potential of experience within virtual worlds. Keywords: Immersive Environments, Virtual Environments

  9. Digital Immersive Virtual Environments and Instructional Computing

    ERIC Educational Resources Information Center

    Blascovich, Jim; Beall, Andrew C.

    2010-01-01

    This article reviews theory and research relevant to the development of digital immersive virtual environment-based instructional computing systems. The review is organized within the context of a multidimensional model of social influence and interaction within virtual environments that models the interaction of four theoretical factors: theory…

  10. Inclusion of Immersive Virtual Learning Environments and Visual Control Systems to Support the Learning of Students with Asperger Syndrome

    ERIC Educational Resources Information Center

    Lorenzo, Gonzalo; Pomares, Jorge; Lledo, Asuncion

    2013-01-01

    This paper presents the use of immersive virtual reality systems in the educational intervention with Asperger students. The starting point of this study is the features of these students' cognitive style, which require an explicit teaching style supported by visual aids and highly structured environments. The proposed immersive virtual reality…

  11. Full Immersive Virtual Environment Cave[TM] in Chemistry Education

    ERIC Educational Resources Information Center

    Limniou, M.; Roberts, D.; Papadopoulos, N.

    2008-01-01

    By comparing two-dimensional (2D) chemical animations designed for the computer desktop with three-dimensional (3D) chemical animations designed for the fully immersive virtual reality environment CAVE[TM], we studied how virtual reality environments could raise students' interest and motivation for learning. By using 3ds max[TM], we can visualize…

  12. Using Virtual Reality to Help Students with Social Interaction Skills

    ERIC Educational Resources Information Center

    Beach, Jason; Wendt, Jeremy

    2015-01-01

    The purpose of this study was to determine if participants could improve their social interaction skills by participating in a virtual immersive environment. The participants used a developing virtual reality head-mounted display to engage themselves in a fully-immersive environment. While in the environment, participants had an opportunity to…

  13. Immersive Virtual Moon Scene System Based on Panoramic Camera Data of Chang'E-3

    NASA Astrophysics Data System (ADS)

    Gao, X.; Liu, J.; Mu, L.; Yan, W.; Zeng, X.; Zhang, X.; Li, C.

    2014-12-01

    The system "Immersive Virtual Moon Scene" is used to show the virtual environment of Moon surface in immersive environment. Utilizing stereo 360-degree imagery from panoramic camera of Yutu rover, the system enables the operator to visualize the terrain and the celestial background from the rover's point of view in 3D. To avoid image distortion, stereo 360-degree panorama stitched by 112 images is projected onto inside surface of sphere according to panorama orientation coordinates and camera parameters to build the virtual scene. Stars can be seen from the Moon at any time. So we render the sun, planets and stars according to time and rover's location based on Hipparcos catalogue as the background on the sphere. Immersing in the stereo virtual environment created by this imaged-based rendering technique, the operator can zoom, pan to interact with the virtual Moon scene and mark interesting objects. Hardware of the immersive virtual Moon system is made up of four high lumen projectors and a huge curve screen which is 31 meters long and 5.5 meters high. This system which take all panoramic camera data available and use it to create an immersive environment, enable operator to interact with the environment and mark interesting objects contributed heavily to establishment of science mission goals in Chang'E-3 mission. After Chang'E-3 mission, the lab with this system will be open to public. Besides this application, Moon terrain stereo animations based on Chang'E-1 and Chang'E-2 data will be showed to public on the huge screen in the lab. Based on the data of lunar exploration,we will made more immersive virtual moon scenes and animations to help the public understand more about the Moon in the future.

  14. The Evolution of Constructivist Learning Environments: Immersion in Distributed, Virtual Worlds.

    ERIC Educational Resources Information Center

    Dede, Chris

    1995-01-01

    Discusses the evolution of constructivist learning environments and examines the collaboration of simulated software models, virtual environments, and evolving mental models via immersion in artificial realities. A sidebar gives a realistic example of a student navigating through cyberspace. (JMV)

  15. Employing immersive virtual environments for innovative experiments in health care communication.

    PubMed

    Persky, Susan

    2011-03-01

    This report reviews the literature for studies that employ immersive virtual environment technology to conduct experimental studies in health care communication. Advantages and challenges of using these tools for research in this area are also discussed. A literature search was conducted using the Scopus database. Results were hand-searched to identify the body of studies, conducted since 1995, that are related to the report objective. The review identified four relevant studies that stem from two unique projects. One project focused on the impact of a clinician's characteristics and behavior on health care communication; the other focused on the characteristics of the patient. Both projects illustrate key methodological advantages conferred by immersive virtual environments, including the ability to maintain simultaneously high experimental control and realism, the ability to manipulate variables in new ways, and unique behavioral measurement opportunities. Though implementation challenges exist for immersive virtual environment-based research methods, given the technology's unique capabilities, the benefits can outweigh the costs in many instances. Immersive virtual environments may therefore prove an important addition to the array of tools available for advancing our understanding of communication in health care. Published by Elsevier Ireland Ltd.

  16. 'Putting it on the table': direct-manipulative interaction and multi-user display technologies for semi-immersive environments and augmented reality applications.

    PubMed

    Encarnação, L Miguel; Bimber, Oliver

    2002-01-01

    Collaborative virtual environments for diagnosis and treatment planning are increasingly gaining importance in our global society. Virtual and Augmented Reality approaches have promised to provide valuable means for the interactive data analysis involved, but the underlying technologies still create a cumbersome work environment that is inadequate for clinical employment. This paper addresses two of the shortcomings of such technology: intuitive interaction with multi-dimensional data in immersive and semi-immersive environments, and stereoscopic multi-user displays combining the advantages of Virtual and Augmented Reality technology.

  17. Evaluation of knowledge transfer in an immersive virtual learning environment for the transportation community.

    DOT National Transportation Integrated Search

    2014-05-01

    Immersive Virtual Learning Environments (IVLEs) are extensively used in training, but few rigorous scientific investigations regarding the transfer of learning have been conducted. Measurement of learning transfer through evaluative methods is key ...

  18. The Immersive Virtual Reality Experience: A Typology of Users Revealed Through Multiple Correspondence Analysis Combined with Cluster Analysis Technique.

    PubMed

    Rosa, Pedro J; Morais, Diogo; Gamito, Pedro; Oliveira, Jorge; Saraiva, Tomaz

    2016-03-01

    Immersive virtual reality is thought to be advantageous by leading to higher levels of presence. However, and despite users getting actively involved in immersive three-dimensional virtual environments that incorporate sound and motion, there are individual factors, such as age, video game knowledge, and the predisposition to immersion, that may be associated with the quality of the virtual reality experience. Moreover, one particular concern for users engaged in immersive virtual reality environments (VREs) is the possibility of side effects, such as cybersickness. The literature suggests that at least 60% of virtual reality users report having felt symptoms of cybersickness, which reduces the quality of the virtual reality experience. The aim of this study was thus to profile the right user to be involved in a VRE through a head-mounted display. To examine which user characteristics are associated with the most effective virtual reality experience (lower cybersickness), a multiple correspondence analysis combined with a cluster analysis technique was performed. Results revealed three distinct profiles, showing that the PC gamer profile is more strongly associated with virtual reality effectiveness, that is, a higher predisposition to be immersed and reduced cybersickness symptoms in the VRE, than the console gamer and nongamer profiles. These findings can be a useful orientation in clinical practice and future research, as they help identify which users are more predisposed to benefit from immersive VREs.
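
    The profiling method named in this abstract, multiple correspondence analysis (MCA) followed by clustering, amounts to a correspondence analysis of the one-hot (complete disjunctive) table of categorical answers, with the resulting row coordinates then clustered. The Python sketch below shows the mechanics on hypothetical stand-in variables and random data; it is not the authors' analysis or dataset.

    ```python
    import numpy as np
    import pandas as pd
    from sklearn.cluster import KMeans

    # Hypothetical categorical survey data standing in for the study's variables.
    rng = np.random.default_rng(0)
    df = pd.DataFrame({
        "age_group": rng.choice(["18-25", "26-35", "36+"], 150),
        "gamer_type": rng.choice(["pc", "console", "nongamer"], 150),
        "immersion": rng.choice(["low", "high"], 150),
        "cybersickness": rng.choice(["none", "mild", "strong"], 150),
    })

    # MCA = correspondence analysis of the complete disjunctive (one-hot) table.
    Z = pd.get_dummies(df).to_numpy(float)
    P = Z / Z.sum()
    r, c = P.sum(axis=1), P.sum(axis=0)
    S = np.diag(r ** -0.5) @ (P - np.outer(r, c)) @ np.diag(c ** -0.5)
    U, sigma, _ = np.linalg.svd(S, full_matrices=False)
    row_coords = (U * sigma) / np.sqrt(r)[:, None]   # principal row coordinates

    # Cluster participants on the first two MCA dimensions to reveal profiles.
    labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(row_coords[:, :2])
    print(pd.crosstab(labels, df["gamer_type"]))
    ```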

  19. Immersive virtual reality for visualization of abdominal CT

    NASA Astrophysics Data System (ADS)

    Lin, Qiufeng; Xu, Zhoubing; Li, Bo; Baucom, Rebeccah; Poulose, Benjamin; Landman, Bennett A.; Bodenheimer, Robert E.

    2013-03-01

    Immersive virtual environments use a stereoscopic head-mounted display and data glove to create high fidelity virtual experiences in which users can interact with three-dimensional models and perceive relationships at their true scale. This stands in stark contrast to traditional PACS-based infrastructure in which images are viewed as stacks of two-dimensional slices, or, at best, disembodied renderings. Although there has been substantial innovation in immersive virtual environments for entertainment and consumer media, these technologies have not been widely applied in clinical applications. Here, we consider potential applications of immersive virtual environments for ventral hernia patients with abdominal computed tomography imaging data. Nearly a half million ventral hernias occur in the United States each year, and hernia repair is the most commonly performed general surgery operation worldwide. A significant problem in these conditions is communicating the urgency, degree of severity, and impact of a hernia (and potential repair) on patient quality of life. Hernias are defined by ruptures in the abdominal wall (i.e., the absence of healthy tissues) rather than a growth (e.g., cancer); therefore, understanding a hernia necessitates understanding the entire abdomen. Our environment allows surgeons and patients to view body scans at scale and interact with these virtual models using a data glove. This visualization and interaction allows users to perceive the relationship between physical structures and medical imaging data. The system provides close integration of PACS-based CT data with immersive virtual environments and creates opportunities to study and optimize interfaces for patient communication, operative planning, and medical education.

  20. Immersive Virtual Reality for Visualization of Abdominal CT.

    PubMed

    Lin, Qiufeng; Xu, Zhoubing; Li, Bo; Baucom, Rebeccah; Poulose, Benjamin; Landman, Bennett A; Bodenheimer, Robert E

    2013-03-28

    Immersive virtual environments use a stereoscopic head-mounted display and data glove to create high fidelity virtual experiences in which users can interact with three-dimensional models and perceive relationships at their true scale. This stands in stark contrast to traditional PACS-based infrastructure in which images are viewed as stacks of two-dimensional slices, or, at best, disembodied renderings. Although there has been substantial innovation in immersive virtual environments for entertainment and consumer media, these technologies have not been widely applied in clinical applications. Here, we consider potential applications of immersive virtual environments for ventral hernia patients with abdominal computed tomography imaging data. Nearly a half million ventral hernias occur in the United States each year, and hernia repair is the most commonly performed general surgery operation worldwide. A significant problem in these conditions is communicating the urgency, degree of severity, and impact of a hernia (and potential repair) on patient quality of life. Hernias are defined by ruptures in the abdominal wall (i.e., the absence of healthy tissues) rather than a growth (e.g., cancer); therefore, understanding a hernia necessitates understanding the entire abdomen. Our environment allows surgeons and patients to view body scans at scale and interact with these virtual models using a data glove. This visualization and interaction allows users to perceive the relationship between physical structures and medical imaging data. The system provides close integration of PACS-based CT data with immersive virtual environments and creates opportunities to study and optimize interfaces for patient communication, operative planning, and medical education.

  1. The Effects of Instructor-Avatar Immediacy in Second Life, an Immersive and Interactive Three-Dimensional Virtual Environment

    ERIC Educational Resources Information Center

    Lawless-Reljic, Sabine Karine

    2010-01-01

    Growing interest of educational institutions in desktop 3D graphic virtual environments for hybrid and distance education prompts questions on the efficacy of such tools. Virtual worlds, such as Second Life[R], enable computer-mediated immersion and interactions encompassing multimodal communication channels including audio, video, and text…

  2. Immersive realities: articulating the shift from VR to mobile AR through artistic practice

    NASA Astrophysics Data System (ADS)

    Margolis, Todd; Cornish, Tracy; Berry, Rodney; DeFanti, Thomas A.

    2012-03-01

    Our contemporary imaginings of technological engagement with digital environments have transitioned from flying through Virtual Reality to mobile interactions with the physical world through personal media devices. Experiences technologically mediated through social interactivity within physical environments are now being preferred over isolated environments such as CAVEs or HMDs. Examples of this trend can be seen in early tele-collaborative artworks, which strove to use advanced networking to join multiple participants in shared virtual environments. Recent developments in mobile AR allow untethered access to such shared realities in places far removed from labs and home entertainment environments, and without the bulky and expensive technologies attached to our bodies that accompany most VR. This paper addresses the emerging trend favoring socially immersive artworks via mobile Augmented Reality rather than sensorially immersive Virtual Reality installations. With particular focus on AR as a mobile, locative technology, we will discuss how concepts of immersion and interactivity are evolving with this new medium. Immersion in the context of mobile AR can be redefined to describe socially interactive experiences. Having distinctly different sensory, spatial, and situational properties, mobile AR offers a new form for remixing elements from traditional virtual reality with physically based social experiences. This type of immersion offers a wide array of potential for mobile AR art forms. We are beginning to see examples of how artists can use mobile AR to create socially immersive and interactive experiences.

  3. Learning Relative Motion Concepts in Immersive and Non-immersive Virtual Environments

    NASA Astrophysics Data System (ADS)

    Kozhevnikov, Michael; Gurlitt, Johannes; Kozhevnikov, Maria

    2013-12-01

    The focus of the current study is to understand which unique features of an immersive virtual reality environment have the potential to improve learning relative motion concepts. Thirty-seven undergraduate students learned relative motion concepts using computer simulation either in immersive virtual environment (IVE) or non-immersive desktop virtual environment (DVE) conditions. Our results show that after the simulation activities, both IVE and DVE groups exhibited a significant shift toward a scientific understanding in their conceptual models and epistemological beliefs about the nature of relative motion, and also a significant improvement on relative motion problem-solving tests. In addition, we analyzed students' performance on one-dimensional and two-dimensional questions in the relative motion problem-solving test separately and found that after training in the simulation, the IVE group performed significantly better than the DVE group on solving two-dimensional relative motion problems. We suggest that egocentric encoding of the scene in IVE (where the learner constitutes a part of a scene they are immersed in), as compared to allocentric encoding on a computer screen in DVE (where the learner is looking at the scene from "outside"), is more beneficial for studying more complex (two-dimensional) relative motion problems. Overall, our findings suggest that such aspects of virtual realities as immersivity, first-hand experience, and the possibility of changing different frames of reference can facilitate understanding abstract scientific phenomena and help in displacing intuitive misconceptions with more accurate mental models.

  4. Immersive Training Systems: Virtual Reality and Education and Training.

    ERIC Educational Resources Information Center

    Psotka, Joseph

    1995-01-01

    Describes virtual reality (VR) technology and VR research on education and training. Focuses on immersion as the key added value of VR, analyzes cognitive variables connected to immersion, how it is generated in synthetic environments and its benefits. Discusses value of tracked, immersive visual displays over nonimmersive simulations. Contains 78…

  5. The effects of immersiveness on physiology.

    PubMed

    Wiederhold, B K; Davis, R; Wiederhold, M D

    1998-01-01

    The effects of varying levels of immersion in virtual reality environments on participants' heart rate, respiration rate, peripheral skin temperature, and skin resistance levels were examined. Subjective reports of presence were also noted. Participants were presented with a virtual environment of an airplane flight, both as seen on a two-dimensional computer screen and as seen from within a head-mounted display. Subjects were randomly assigned to different orders of the conditions presented, but all subjects received both conditions. Differences between the non-phobics' and the phobics' physiological responses when placed in a virtual environment related to the phobia were noted. Also noted were changes in physiology based on degree of immersion.

  6. CliniSpace: a multiperson 3D online immersive training environment accessible through a browser.

    PubMed

    Dev, Parvati; Heinrichs, W LeRoy; Youngblood, Patricia

    2011-01-01

    Immersive online medical environments, with dynamic virtual patients, have been shown to be effective for scenario-based learning (1). However, ease of use and ease of access have been barriers to their use. We used feedback from prior evaluation of these projects to design and develop CliniSpace. To improve usability, we retained the richness of prior virtual environments but modified the user interface. To improve access, we used a Software-as-a-Service (SaaS) approach to present a richly immersive 3D environment within a web browser.

  7. LVC interaction within a mixed-reality training system

    NASA Astrophysics Data System (ADS)

    Pollock, Brice; Winer, Eliot; Gilbert, Stephen; de la Cruz, Julio

    2012-03-01

    The United States military is increasingly pursuing advanced live, virtual, and constructive (LVC) training systems for reduced cost, greater training flexibility, and decreased training times. Combining the advantages of realistic training environments and virtual worlds, mixed reality LVC training systems can enable live and virtual trainee interaction as if co-located. However, LVC interaction in these systems often requires constructing immersive environments, developing hardware for live-virtual interaction, tracking in occluded environments, and an architecture that supports real-time transfer of entity information across many systems. This paper discusses a system that overcomes these challenges to empower LVC interaction in a reconfigurable, mixed reality environment. This system was developed and tested in an immersive, reconfigurable, and mixed reality LVC training system for the dismounted warfighter at ISU, known as the Veldt, to overcome LVC interaction challenges and as a test bed for cutting-edge technology to meet future U.S. Army battlefield requirements. Trainees interact physically in the Veldt and virtually through commercial and developed game engines. Evaluation involving military trained personnel found this system to be effective, immersive, and useful for developing the critical decision-making skills necessary for the battlefield. Procedural terrain modeling, model-matching database techniques, and a central communication server process all live and virtual entity data from system components to create a cohesive virtual world across all distributed simulators and game engines in real-time. This system achieves rare LVC interaction within multiple physical and virtual immersive environments for training in real-time across many distributed systems.
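
    The central-communication-server idea mentioned in this abstract (each simulator or game engine reports entity state to one hub, which relays the latest state to every other endpoint so the distributed systems share a cohesive world) can be sketched as a small UDP relay in Python. The port, message format, and field names below are assumptions for illustration, not the Veldt's actual protocol.

    ```python
    import json
    import socket

    # Minimal entity-state relay: simulators send pose updates, the server
    # stores the latest state per entity and rebroadcasts each update to all
    # other registered endpoints.
    SERVER_ADDR = ("0.0.0.0", 50000)
    subscribers = set()   # endpoints that asked to receive state updates
    entities = {}         # entity_id -> last known state

    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.bind(SERVER_ADDR)

    while True:
        data, addr = sock.recvfrom(4096)
        msg = json.loads(data.decode())
        if msg.get("type") == "subscribe":
            subscribers.add(addr)
            continue
        # e.g. {"type": "update", "id": "trainee_1", "pos": [1.0, 0.0, 2.5], "yaw": 1.57}
        entities[msg["id"]] = msg
        packet = json.dumps(msg).encode()
        for sub in subscribers:
            if sub != addr:            # do not echo the update back to its sender
                sock.sendto(packet, sub)
    ```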

  8. A Virtual World for Collaboration: The AETZone

    ERIC Educational Resources Information Center

    Cheney, Amelia W.; Sanders, Robert L.; Matzen, Nita J.; Bronack, Stephen C.; Riedl, Richard E.; Tashner, John H.

    2009-01-01

    Participation in learning communities, and the construction of knowledge in communities of practice, are important considerations in the use of 3D immersive worlds. This article describes the creation of this type of learning environment in AETZone, an immersive virtual environment in use within graduate programs at Appalachian State University…

  9. Feasibility of Using an Augmented Immersive Virtual Reality Learning Environment to Enhance Music Conducting Skills

    ERIC Educational Resources Information Center

    Orman, Evelyn K.; Price, Harry E.; Russell, Christine R.

    2017-01-01

    Acquiring nonverbal skills necessary to appropriately communicate and educate members of performing ensembles is essential for wind band conductors. Virtual reality learning environments (VRLEs) provide a unique setting for developing these proficiencies. For this feasibility study, we used an augmented immersive VRLE to enhance eye contact, torso…

  10. Measuring Flow Experience in an Immersive Virtual Environment for Collaborative Learning

    ERIC Educational Resources Information Center

    van Schaik, P.; Martin, S.; Vallance, M.

    2012-01-01

    In contexts other than immersive virtual environments, theoretical and empirical work has identified flow experience as a major factor in learning and human-computer interaction. Flow is defined as a "holistic sensation that people feel when they act with total involvement". We applied the concept of flow to modeling the experience of…

  11. Using Immersive Virtual Environments for Certification

    NASA Technical Reports Server (NTRS)

    Lutz, R.; Cruz-Neira, C.

    1998-01-01

    Immersive virtual environment (VE) technology has matured to the point where it can be utilized as a scientific and engineering problem-solving tool. In particular, VEs are starting to be used to design and evaluate safety-critical systems that involve human operators, such as flight and driving simulators, complex machinery training, and emergency rescue strategies.

  12. A Practical Guide, with Theoretical Underpinnings, for Creating Effective Virtual Reality Learning Environments

    ERIC Educational Resources Information Center

    O'Connor, Eileen A.; Domingo, Jelia

    2017-01-01

    With the advent of open source virtual environments, the associated cost reductions, and the more flexible options, avatar-based virtual reality environments are within reach of educators. By using and repurposing readily available virtual environments, instructors can bring engaging, community-building, and immersive learning opportunities to…

  13. Evaluating the Effects of Immersive Embodied Interaction on Cognition in Virtual Reality

    NASA Astrophysics Data System (ADS)

    Parmar, Dhaval

    Virtual reality is on the verge of becoming mainstream household technology, as technologies such as head-mounted displays, trackers, and interaction devices are becoming affordable and easily available. Virtual reality (VR) has immense potential in enhancing the fields of education and training, and its power can be used to spark interest and enthusiasm among learners. It is, therefore, imperative to evaluate the risks and benefits that immersive virtual reality poses to the field of education. Research suggests that learning is an embodied process. Learning depends on grounded aspects of the body including action, perception, and interactions with the environment. This research aims to study if immersive embodiment through the means of virtual reality facilitates embodied cognition. A pedagogical VR solution which takes advantage of embodied cognition can lead to enhanced learning benefits. Towards achieving this goal, this research presents a linear continuum for immersive embodied interaction within virtual reality. This research evaluates the effects of three levels of immersive embodied interactions on cognitive thinking, presence, usability, and satisfaction among users in the fields of science, technology, engineering, and mathematics (STEM) education. Results from the presented experiments show that immersive virtual reality is greatly effective in knowledge acquisition and retention, and highly enhances user satisfaction, interest and enthusiasm. Users experience high levels of presence and are profoundly engaged in the learning activities within the immersive virtual environments. The studies presented in this research evaluate pedagogical VR software to train and motivate students in STEM education, and provide an empirical analysis comparing desktop VR (DVR), immersive VR (IVR), and immersive embodied VR (IEVR) conditions for learning. This research also proposes a fully immersive embodied interaction metaphor (IEIVR) for learning of computational concepts as a future direction, and presents the challenges faced in implementing the IEIVR metaphor due to extended periods of immersion. Results from the conducted studies help in formulating guidelines for virtual reality and education researchers working in STEM education and training, and for educators and curriculum developers seeking to improve student engagement in the STEM fields.

  14. Correcting Distance Estimates by Interacting With Immersive Virtual Environments: Effects of Task and Available Sensory Information

    ERIC Educational Resources Information Center

    Waller, David; Richardson, Adam R.

    2008-01-01

    The tendency to underestimate egocentric distances in immersive virtual environments (VEs) is not well understood. However, previous research (A. R. Richardson & D. Waller, 2007) has demonstrated that a brief period of interaction with the VE prior to making distance judgments can effectively eliminate subsequent underestimation. Here the authors…

  15. Productive confusions: learning from simulations of pandemic virus outbreaks in Second Life

    NASA Astrophysics Data System (ADS)

    Cárdenas, Micha; Greci, Laura S.; Hurst, Samantha; Garman, Karen; Hoffman, Helene; Huang, Ricky; Gates, Michael; Kho, Kristen; Mehrmand, Elle; Porteous, Todd; Calvitti, Alan; Higginbotham, Erin; Agha, Zia

    2011-03-01

    Users of immersive virtual reality environments have reported a wide variety of side and after effects including the confusion of characteristics of the real and virtual worlds. Perhaps this side effect of confusing the virtual and real can be turned around to explore the possibilities for immersion with minimal technological support in virtual world group training simulations. This paper will describe observations from my time working as an artist/researcher with the UCSD School of Medicine (SoM) and Veterans Administration San Diego Healthcare System (VASDHS) to develop trainings for nurses, doctors and Hospital Incident Command staff that simulate pandemic virus outbreaks. By examining moments of slippage between realities, both into and out of the virtual environment, moments of the confusion of boundaries between real and virtual, we can better understand methods for creating immersion. I will use the mixing of realities as a transversal line of inquiry, borrowing from virtual reality studies, game studies, and anthropological studies to better understand the mechanisms of immersion in virtual worlds. Focusing on drills conducted in Second Life, I will examine moments of training to learn the software interface, moments within the drill and interviews after the drill.

  16. Effects of Exercise in Immersive Virtual Environments on Cortical Neural Oscillations and Mental State

    PubMed Central

    Vogt, Tobias; Herpers, Rainer; Askew, Christopher D.; Scherfgen, David; Strüder, Heiko K.; Schneider, Stefan

    2015-01-01

    Virtual reality environments are increasingly being used to encourage individuals to exercise more regularly, including as part of treatment for those with mental health or neurological disorders. The success of virtual environments likely depends on whether a sense of presence can be established, where participants become fully immersed in the virtual environment. Exposure to virtual environments is associated with physiological responses, including cortical activation changes. Whether the addition of a real exercise within a virtual environment alters sense of presence perception, or the accompanying physiological changes, is not known. In a randomized and controlled study design, moderate-intensity Exercise (i.e., self-paced cycling) and No-Exercise (i.e., automatic propulsion) trials were performed within three levels of virtual environment exposure. Each trial was 5 minutes in duration and was followed by posttrial assessments of heart rate, perceived sense of presence, EEG, and mental state. Changes in psychological strain and physical state were generally mirrored by neural activation patterns. Furthermore, these changes indicated that exercise augments the demands of virtual environment exposures and this likely contributed to an enhanced sense of presence. PMID:26366305

  17. Effects of Exercise in Immersive Virtual Environments on Cortical Neural Oscillations and Mental State.

    PubMed

    Vogt, Tobias; Herpers, Rainer; Askew, Christopher D; Scherfgen, David; Strüder, Heiko K; Schneider, Stefan

    2015-01-01

    Virtual reality environments are increasingly being used to encourage individuals to exercise more regularly, including as part of treatment for those with mental health or neurological disorders. The success of virtual environments likely depends on whether a sense of presence can be established, where participants become fully immersed in the virtual environment. Exposure to virtual environments is associated with physiological responses, including cortical activation changes. Whether the addition of a real exercise within a virtual environment alters sense of presence perception, or the accompanying physiological changes, is not known. In a randomized and controlled study design, moderate-intensity Exercise (i.e., self-paced cycling) and No-Exercise (i.e., automatic propulsion) trials were performed within three levels of virtual environment exposure. Each trial was 5 minutes in duration and was followed by posttrial assessments of heart rate, perceived sense of presence, EEG, and mental state. Changes in psychological strain and physical state were generally mirrored by neural activation patterns. Furthermore, these changes indicated that exercise augments the demands of virtual environment exposures and this likely contributed to an enhanced sense of presence.

  18. A Framework for Aligning Instructional Design Strategies with Affordances of CAVE Immersive Virtual Reality Systems

    ERIC Educational Resources Information Center

    Ritz, Leah T.; Buss, Alan R.

    2016-01-01

    Increasing availability of immersive virtual reality (IVR) systems, such as the Cave Automatic Virtual Environment (CAVE) and head-mounted displays, for use in education contexts is providing new opportunities and challenges for instructional designers. By highlighting the affordances of IVR specific to the CAVE, the authors emphasize the…

  19. Student Responses to Their Immersion in a Virtual Environment.

    ERIC Educational Resources Information Center

    Taylor, Wayne

    Undertaken in conjunction with a larger study that investigated the educational efficacy of students building their own virtual worlds, this study measures the reactions of students in grades 4-12 to the experience of being immersed in virtual reality (VR). The study investigated the sense of "presence" experienced by the students, the…

  20. Children's Perception of Gap Affordances: Bicycling Across Traffic-Filled Intersections in an Immersive Virtual Environment

    ERIC Educational Resources Information Center

    Plumert, Jodie M.; Kearney, Joseph K.; Cremer, James F.

    2004-01-01

    This study examined gap choices and crossing behavior in children and adults using an immersive, interactive bicycling simulator. Ten- and 12-year-olds and adults rode a bicycle mounted on a stationary trainer through a virtual environment consisting of a street with 6 intersections. Participants faced continuous cross traffic traveling at 25mph…

  1. Cognitive factors associated with immersion in virtual environments

    NASA Technical Reports Server (NTRS)

    Psotka, Joseph; Davison, Sharon

    1993-01-01

    Immersion into the dataspace provided by a computer, and the feeling of really being there or 'presence', are commonly acknowledged as the uniquely important features of virtual reality environments. How immersed one feels appears to be determined by a complex set of physical components and affordances of the environment, and as yet poorly understood psychological processes. Pimentel and Teixeira say that the experience of being immersed in a computer-generated world involves the same mental shift of 'suspending your disbelief for a period of time' as 'when you get wrapped up in a good novel or become absorbed in playing a computer game'. That sounds as if it could be right, but it would be good to get some evidence for these important conclusions. It might be even better to try to connect these statements with theoretical positions that try to do justice to complex cognitive processes. The basic precondition for understanding Virtual Reality (VR) is understanding the spatial representation systems that localize our bodies or egocenters in space. The effort to understand these cognitive processes is being driven with new energy by the pragmatic demands of successful virtual reality environments, but the literature is largely sparse and anecdotal.

  2. The Fidelity of 'Feel': Emotional Affordance in Virtual Environments

    DTIC Science & Technology

    2005-07-01

    The Fidelity of "Feel": Emotional Affordance in Virtual Environments Jacquelyn Ford Morie, Josh Williams, Aimee Dozois, Donat-Pierre Luigi... environment but also the participant. We do this with the focus on what emotional affordances this manipulation will provide. Our first evaluation scenario...emotionally affective VEs. Keywords: Immersive Environments, Virtual Environments, VEs, Virtual Reality, emotion, affordance, fidelity, presence

  3. Simulation of Physical Experiments in Immersive Virtual Environments

    NASA Technical Reports Server (NTRS)

    Noor, Ahmed K.; Wasfy, Tamer M.

    2001-01-01

    An object-oriented, event-driven immersive virtual environment is described for the creation of virtual labs (VLs) for simulating physical experiments. Discussion focuses on a number of aspects of the VLs, including interface devices, software objects, and various applications. The VLs interface with output devices, including immersive stereoscopic screen(s) and stereo speakers, and a variety of input devices, including body tracking (head and hands), haptic gloves, wand, joystick, mouse, microphone, and keyboard. The VL incorporates the following types of primitive software objects: interface objects, support objects, geometric entities, and finite elements. Each object encapsulates a set of properties, methods, and events that define its behavior, appearance, and functions. A container object allows grouping of several objects. Applications of the VLs include viewing the results of the physical experiment, viewing a computer simulation of the physical experiment, simulation of the experiment's procedure, computational steering, and remote control of the physical experiment. In addition, the VL can be used as a risk-free (safe) environment for training. The implementation of virtual structures testing machines, virtual wind tunnels, and a virtual acoustic testing facility is described.
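
    The object model described in this abstract (objects that encapsulate properties, methods, and events, plus a container object that groups several objects) can be sketched in Python as follows. The class names, event names, and usage example are assumptions for illustration; the actual VL implementation is not given in the abstract.

    ```python
    from typing import Callable, Dict, List

    class VLObject:
        """An object bundling properties, methods, and events."""
        def __init__(self, name: str, **properties):
            self.name = name
            self.properties = dict(properties)
            self._handlers: Dict[str, List[Callable]] = {}

        def on(self, event: str, handler: Callable) -> None:
            """Register an event handler (event-driven behavior)."""
            self._handlers.setdefault(event, []).append(handler)

        def fire(self, event: str, **payload) -> None:
            """Dispatch an event to every registered handler."""
            for handler in self._handlers.get(event, []):
                handler(self, **payload)

    class Container(VLObject):
        """Groups several objects and forwards events to its children."""
        def __init__(self, name: str):
            super().__init__(name)
            self.children: List[VLObject] = []

        def add(self, obj: VLObject) -> None:
            self.children.append(obj)

        def fire(self, event: str, **payload) -> None:
            super().fire(event, **payload)
            for child in self.children:
                child.fire(event, **payload)

    # Usage: a wand "grab" event changes a property of a geometric entity.
    beam = VLObject("test_beam", color="gray", material="steel")
    beam.on("grab", lambda obj, **kw: obj.properties.update(color="yellow"))
    lab = Container("virtual_lab")
    lab.add(beam)
    lab.fire("grab")
    print(beam.properties["color"])   # -> yellow
    ```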

  4. VILLAGE--Virtual Immersive Language Learning and Gaming Environment: Immersion and Presence

    ERIC Educational Resources Information Center

    Wang, Yi Fei; Petrina, Stephen; Feng, Francis

    2017-01-01

    3D virtual worlds are promising for immersive learning in English as a Foreign Language (EFL). Unlike English as a Second Language (ESL), EFL typically takes place in the learners' home countries, and the potential of the language is limited by geography. Although learning contexts where English is spoken is important, in most EFL courses at the…

  5. Level of Immersion in Virtual Environments Impacts the Ability to Assess and Teach Social Skills in Autism Spectrum Disorder

    PubMed Central

    Bugnariu, Nicoleta L.

    2016-01-01

    Virtual environments (VEs) may be useful for delivering social skills interventions to individuals with autism spectrum disorder (ASD). Immersive VEs provide opportunities for individuals with ASD to learn and practice skills in a controlled, replicable setting. However, not all VEs are delivered using the same technology, and the level of immersion differs across settings. We group studies into low-, moderate-, and high-immersion categories by examining five aspects of immersion. In doing so, we draw conclusions regarding the influence of this technical manipulation on the efficacy of VEs as a tool for assessing and teaching social skills. We also highlight ways in which future studies can advance our understanding of how manipulating aspects of immersion may impact intervention success. PMID:26919157

  6. Knowledge Acquisition and Job Training for Advanced Technical Skills Using Immersive Virtual Environment

    NASA Astrophysics Data System (ADS)

    Watanuki, Keiichi; Kojima, Kazuyuki

    The environment in which Japanese industry has achieved great respect is changing tremendously due to the globalization of world economies, while Asian countries are undergoing economic and technical development as well as benefiting from the advances in information technology. For example, in the design of custom-made casting products, a designer who lacks knowledge of casting may not be able to produce a good design. In order to obtain a good design and manufacturing result, it is necessary to equip the designer and manufacturer with a support system related to casting design, or a so-called knowledge transfer and creation system. This paper proposes a new virtual reality based knowledge acquisition and job training system for casting design, which is composed of the explicit and tacit knowledge transfer systems using synchronized multimedia and the knowledge internalization system using portable virtual environment. In our proposed system, the education content is displayed in the immersive virtual environment, whereby a trainee may experience work in the virtual site operation. Provided that the trainee has gained explicit and tacit knowledge of casting through the multimedia-based knowledge transfer system, the immersive virtual environment catalyzes the internalization of knowledge and also enables the trainee to gain tacit knowledge before undergoing on-the-job training at a real-time operation site.

  7. Immersive Virtual Reality Therapy with Myoelectric Control for Treatment-resistant Phantom Limb Pain: Case Report.

    PubMed

    Chau, Brian; Phelan, Ivan; Ta, Phillip; Humbert, Sarah; Hata, Justin; Tran, Duc

    2017-01-01

    Objective: Phantom limb pain is a condition frequently experienced after amputation. One treatment for phantom limb pain is traditional mirror therapy, yet some patients do not respond to this intervention, and immersive virtual reality mirror therapy offers some potential advantages. We report the case of a patient with severe phantom limb pain following an upper limb amputation and successful treatment with therapy in a custom virtual reality environment. Methods: An interactive 3-D kitchen environment was developed based on the principles of mirror therapy to allow for control of virtual hands while wearing a motion-tracked, head-mounted virtual reality display. The patient used myoelectric control of a virtual hand as well as motion-tracking control in this setting for five therapy sessions. Pain scale measurements and subjective feedback were elicited at each session. Results: Analysis of the measured pain scales showed statistically significant decreases per session [Visual Analog Scale, Short Form McGill Pain Questionnaire, and Wong-Baker FACES pain scores decreased by 55 percent (p=0.0143), 60 percent (p=0.023), and 90 percent (p=0.0024), respectively]. Significant subjective pain relief persisting between sessions was also reported, as well as marked immersion within the virtual environments. On follow-up at six weeks, the patient noted a continued decrease in phantom limb pain symptoms. Conclusions: Currently available immersive virtual reality technology with myoelectric and motion-tracking control may represent a possible therapy option for treatment-resistant phantom limb pain.

  8. Immersive Virtual Reality Therapy with Myoelectric Control for Treatment-resistant Phantom Limb Pain: Case Report

    PubMed Central

    Phelan, Ivan; Ta, Phillip; Humbert, Sarah; Hata, Justin; Tran, Duc

    2017-01-01

    Objective: Phantom limb pain is a condition frequently experienced after amputation. One treatment for phantom limb pain is traditional mirror therapy, yet some patients do not respond to this intervention, and immersive virtual reality mirror therapy offers some potential advantages. We report the case of a patient with severe phantom limb pain following an upper limb amputation and successful treatment with therapy in a custom virtual reality environment. Methods: An interactive 3-D kitchen environment was developed based on the principles of mirror therapy to allow for control of virtual hands while wearing a motion-tracked, head-mounted virtual reality display. The patient used myoelectric control of a virtual hand as well as motion-tracking control in this setting for five therapy sessions. Pain scale measurements and subjective feedback were elicited at each session. Results: Analysis of the measured pain scales showed statistically significant decreases per session [Visual Analog Scale, Short Form McGill Pain Questionnaire, and Wong-Baker FACES pain scores decreased by 55 percent (p=0.0143), 60 percent (p=0.023), and 90 percent (p=0.0024), respectively]. Significant subjective pain relief persisting between sessions was also reported, as well as marked immersion within the virtual environments. On follow-up at six weeks, the patient noted a continued decrease in phantom limb pain symptoms. Conclusions: Currently available immersive virtual reality technology with myoelectric and motion-tracking control may represent a possible therapy option for treatment-resistant phantom limb pain. PMID:29616149

  9. Designing the Self: The Transformation of the Relational Self-Concept through Social Encounters in a Virtual Immersive Environment

    ERIC Educational Resources Information Center

    Knutzen, K. Brant; Kennedy, David M.

    2012-01-01

    This article describes the findings of a 3-month study on how social encounters mediated by an online Virtual Immersive Environment (VIE) impacted on the relational self-concept of adolescents. The study gathered data from two groups of students as they took an Introduction to Design and Programming class. Students in group 1 undertook course…

  10. Immersive virtual reality improves movement patterns in patients after ACL reconstruction: implications for enhanced criteria-based return-to-sport rehabilitation.

    PubMed

    Gokeler, Alli; Bisschop, Marsha; Myer, Gregory D; Benjaminse, Anne; Dijkstra, Pieter U; van Keeken, Helco G; van Raay, Jos J A M; Burgerhof, Johannes G M; Otten, Egbert

    2016-07-01

    The purpose of this study was to evaluate the influence of immersion in a virtual reality environment on knee biomechanics in patients after ACL reconstruction (ACLR). It was hypothesized that virtual reality techniques aimed at changing attentional focus would influence altered knee flexion angle, knee extension moment, and peak vertical ground reaction force (vGRF) in patients following ACLR. Twenty athletes following ACLR and 20 healthy controls (CTRL) performed a step-down task in both a non-virtual reality environment and a virtual reality environment displaying a pedestrian traffic scene. A motion analysis system and force plates were used to measure kinematics and kinetics during the step-down task to analyse each single-leg landing. No significant main effect of environment was found for knee flexion excursion (n.s.). Significant interactions between environment and group were found for vGRF (P = 0.004), knee moment (P < 0.001), knee angle at peak vGRF (P = 0.01), and knee flexion excursion (P = 0.03). There was a larger effect of the virtual reality environment on knee biomechanics in patients after ACLR compared with controls. Patients after ACLR immersed in the virtual reality environment demonstrated knee joint biomechanics that approximate those of CTRL. The results of this study indicate that a realistic virtual reality scenario may distract patients after ACLR from conscious motor control. Application of clinically available technology may aid current rehabilitation programmes in targeting altered movement patterns after ACLR. Diagnostic study, Level III.

  11. Object Creation and Human Factors Evaluation for Virtual Environments

    NASA Technical Reports Server (NTRS)

    Lindsey, Patricia F.

    1998-01-01

    The main objective of this project is to provide test objects for simulated environments utilized by the recently established Army/NASA Virtual Innovations Lab (ANVIL) at Marshall Space Flight Center, Huntsville, Al. The objective of the ANVIL lab is to provide virtual reality (VR) models and environments and to provide visualization and manipulation methods for the purpose of training and testing. Visualization equipment used in the ANVIL lab includes head-mounted and boom-mounted immersive virtual reality display devices. Objects in the environment are manipulated using data glove, hand controller, or mouse. These simulated objects are solid or surfaced three dimensional models. They may be viewed or manipulated from any location within the environment and may be viewed on-screen or via immersive VR. The objects are created using various CAD modeling packages and are converted into the virtual environment using dVise. This enables the object or environment to be viewed from any angle or distance for training or testing purposes.

  12. Get immersed in the Soil Sciences: the first community of avatars in the EGU Assembly 2015!

    NASA Astrophysics Data System (ADS)

    Castillo, Sebastian; Alarcón, Purificación; Beato, Mamen; Emilio Guerrero, José; José Martínez, Juan; Pérez, Cristina; Ortiz, Leovigilda; Taguas, Encarnación V.

    2015-04-01

    Virtual reality and immersive worlds refer to artificial computer-generated environments with which users act and interact as in a known environment, through the use of figurative virtual individuals (avatars). Virtual environments will be the technology of the early twenty-first century that will most dramatically change the way we live, particularly in the areas of training and education, product development and entertainment (Schmorrow, 2009). The usefulness of immersive worlds has been proved in different fields. They reduce geographic and social barriers between different stakeholders and create virtual social spaces which can positively impact learning and discussion outcomes (Lorenzo et al., 2012). In this work we present a series of interactive meetings in a virtual building to celebrate the International Year of Soils and to promote the importance of soil functions and soil conservation. In a virtual room, the avatars of different senior researchers will meet the avatars of young scientists to talk about: 1) what remains to be done in Soil Sciences; 2) what the main current limitations and difficulties are; and 3) what the future hot research lines are. Interactive participation does not require physical attendance at the EGU Assembly 2015. In addition, this virtual building inspired by Soil Sciences can be completed with different teaching resources from different locations around the world, and it will be used to improve the learning of Soil Sciences in a multicultural context. REFERENCES: Lorenzo C.M., Sicilia M.A., Sánchez S. 2012. Studying the effectiveness of multi-user immersive environments for collaborative evaluation tasks. Computers & Education 59, 1361-1376. Schmorrow D.D. 2009. "Why virtual?" Theoretical Issues in Ergonomics Science 10(3): 279-282.

  13. NASA Virtual Glovebox: An Immersive Virtual Desktop Environment for Training Astronauts in Life Science Experiments

    NASA Technical Reports Server (NTRS)

    Twombly, I. Alexander; Smith, Jeffrey; Bruyns, Cynthia; Montgomery, Kevin; Boyle, Richard

    2003-01-01

    The International Space Station will soon provide an unparalleled research facility for studying the near- and longer-term effects of microgravity on living systems. Using the Space Station Glovebox Facility - a compact, fully contained reach-in environment - astronauts will conduct technically challenging life sciences experiments. Virtual environment technologies are being developed at NASA Ames Research Center to help realize the scientific potential of this unique resource by facilitating the experimental hardware and protocol designs and by assisting the astronauts in training. The Virtual GloveboX (VGX) integrates high-fidelity graphics, force-feedback devices and real-time computer simulation engines to achieve an immersive training environment. Here, we describe the prototype VGX system, the distributed processing architecture used in the simulation environment, and modifications to the visualization pipeline required to accommodate the display configuration.

  14. Studying social interactions through immersive virtual environment technology: virtues, pitfalls, and future challenges

    PubMed Central

    Bombari, Dario; Schmid Mast, Marianne; Canadas, Elena; Bachmann, Manuel

    2015-01-01

    The goal of the present review is to explain how immersive virtual environment technology (IVET) can be used for the study of social interactions and how the use of virtual humans in immersive virtual environments can advance research and application in many different fields. Researchers studying individual differences in social interactions are typically interested in keeping the behavior and the appearance of the interaction partner constant across participants. With IVET, researchers have full control over the interaction partners and can standardize them while still keeping the simulation realistic. Virtual simulations are valid: growing evidence shows that studies conducted with IVET can indeed replicate some well-known findings of social psychology. Moreover, IVET allows researchers to subtly manipulate characteristics of the environment (e.g., visual cues to prime participants) or of the social partner (e.g., his/her race) to investigate their influences on participants’ behavior and cognition. Furthermore, manipulations that would be difficult or impossible in real life (e.g., changing participants’ height) can be easily obtained with IVET. Besides the advantages for theoretical research, we explore the most recent training and clinical applications of IVET, its integration with other technologies (e.g., social sensing) and future challenges for researchers (e.g., making the communication between virtual humans and participants smoother). PMID:26157414

  15. Studying social interactions through immersive virtual environment technology: virtues, pitfalls, and future challenges.

    PubMed

    Bombari, Dario; Schmid Mast, Marianne; Canadas, Elena; Bachmann, Manuel

    2015-01-01

    The goal of the present review is to explain how immersive virtual environment technology (IVET) can be used for the study of social interactions and how the use of virtual humans in immersive virtual environments can advance research and application in many different fields. Researchers studying individual differences in social interactions are typically interested in keeping the behavior and the appearance of the interaction partner constant across participants. With IVET, researchers have full control over the interaction partners and can standardize them while still keeping the simulation realistic. Virtual simulations are valid: growing evidence shows that studies conducted with IVET can indeed replicate some well-known findings of social psychology. Moreover, IVET allows researchers to subtly manipulate characteristics of the environment (e.g., visual cues to prime participants) or of the social partner (e.g., his/her race) to investigate their influences on participants' behavior and cognition. Furthermore, manipulations that would be difficult or impossible in real life (e.g., changing participants' height) can be easily obtained with IVET. Besides the advantages for theoretical research, we explore the most recent training and clinical applications of IVET, its integration with other technologies (e.g., social sensing) and future challenges for researchers (e.g., making the communication between virtual humans and participants smoother).

  16. New Desktop Virtual Reality Technology in Technical Education

    ERIC Educational Resources Information Center

    Ausburn, Lynna J.; Ausburn, Floyd B.

    2008-01-01

    Virtual reality (VR) that immerses users in a 3D environment through use of headwear, body suits, and data gloves has demonstrated effectiveness in technical and professional education. Immersive VR is highly engaging and appealing to technically skilled young Net Generation learners. However, technical difficulty and very high costs have kept…

  17. Immersive Environments - A Connectivist Approach

    NASA Astrophysics Data System (ADS)

    Loureiro, Ana; Bettencourt, Teresa

    We are conducting a research project with the aim of achieving better and more efficient ways to facilitate teaching and learning in Higher Education. We have chosen virtual environments, with particular emphasis on the Second Life® platform augmented by web 2.0 tools, to develop the study. The Second Life® environment has some interesting characteristics that captured our attention: it is immersive; it is a real-world simulator; it is a social network; it allows real-time communication, cooperation, collaboration and interaction; and it is a safe and controlled environment. We specifically chose web 2.0 tools that enable sharing and a collaborative way of learning. Through understanding the characteristics of this learning environment, we believe that immersive learning, along with other virtual tools, can be integrated into today's pedagogical practices.

  18. Crowd behaviour during high-stress evacuations in an immersive virtual environment

    PubMed Central

    Kapadia, Mubbasir; Thrash, Tyler; Sumner, Robert W.; Gross, Markus; Helbing, Dirk; Hölscher, Christoph

    2016-01-01

    Understanding the collective dynamics of crowd movements during stressful emergency situations is central to reducing the risk of deadly crowd disasters. Yet, their systematic experimental study remains a challenging open problem due to ethical and methodological constraints. In this paper, we demonstrate the viability of shared three-dimensional virtual environments as an experimental platform for conducting crowd experiments with real people. In particular, we show that crowds of real human subjects moving and interacting in an immersive three-dimensional virtual environment exhibit typical patterns of real crowds as observed in real-life crowded situations. These include the manifestation of social conventions and the emergence of self-organized patterns during egress scenarios. High-stress evacuation experiments conducted in this virtual environment reveal movements characterized by mass herding and dangerous overcrowding as they occur in crowd disasters. We describe the behavioural mechanisms at play under such extreme conditions and identify critical zones where overcrowding may occur. Furthermore, we show that herding spontaneously emerges from a density effect without the need to assume an increase of the individual tendency to imitate peers. Our experiments reveal the promise of immersive virtual environments as an ethical, cost-efficient, yet accurate platform for exploring crowd behaviour in high-risk situations with real human subjects. PMID:27605166

  19. Crowd behaviour during high-stress evacuations in an immersive virtual environment.

    PubMed

    Moussaïd, Mehdi; Kapadia, Mubbasir; Thrash, Tyler; Sumner, Robert W; Gross, Markus; Helbing, Dirk; Hölscher, Christoph

    2016-09-01

    Understanding the collective dynamics of crowd movements during stressful emergency situations is central to reducing the risk of deadly crowd disasters. Yet, their systematic experimental study remains a challenging open problem due to ethical and methodological constraints. In this paper, we demonstrate the viability of shared three-dimensional virtual environments as an experimental platform for conducting crowd experiments with real people. In particular, we show that crowds of real human subjects moving and interacting in an immersive three-dimensional virtual environment exhibit typical patterns of real crowds as observed in real-life crowded situations. These include the manifestation of social conventions and the emergence of self-organized patterns during egress scenarios. High-stress evacuation experiments conducted in this virtual environment reveal movements characterized by mass herding and dangerous overcrowding as they occur in crowd disasters. We describe the behavioural mechanisms at play under such extreme conditions and identify critical zones where overcrowding may occur. Furthermore, we show that herding spontaneously emerges from a density effect without the need to assume an increase of the individual tendency to imitate peers. Our experiments reveal the promise of immersive virtual environments as an ethical, cost-efficient, yet accurate platform for exploring crowd behaviour in high-risk situations with real human subjects. © 2016 The Authors.

  20. Foreign language learning in immersive virtual environments

    NASA Astrophysics Data System (ADS)

    Chang, Benjamin; Sheldon, Lee; Si, Mei; Hand, Anton

    2012-03-01

    Virtual reality has long been used for training simulations in fields from medicine to welding to vehicular operation, but simulations involving more complex cognitive skills present new design challenges. Foreign language learning, for example, is increasingly vital in the global economy, but computer-assisted education is still in its early stages. Immersive virtual reality is a promising avenue for language learning as a way of dynamically creating believable scenes for conversational training and role-play simulation. Visual immersion alone, however, only provides a starting point. We suggest that the addition of social interactions and motivated engagement through narrative gameplay can lead to truly effective language learning in virtual environments. In this paper, we describe the development of a novel application for teaching Mandarin using CAVE-like VR, physical props, human actors and intelligent virtual agents, all within a semester-long multiplayer mystery game. Students travel (virtually) to China on a class field trip, which soon becomes complicated with intrigue and mystery surrounding the lost manuscript of an early Chinese literary classic. Virtual reality environments such as the Forbidden City and a Beijing teahouse provide the setting for learning language, cultural traditions, and social customs, as well as the discovery of clues through conversation in Mandarin with characters in the game.

  1. Walking in fully immersive virtual environments: an evaluation of potential adverse effects in older adults and individuals with Parkinson's disease.

    PubMed

    Kim, Aram; Darakjian, Nora; Finley, James M

    2017-02-21

    Virtual reality (VR) has recently been explored as a tool for neurorehabilitation to enable individuals with Parkinson's disease (PD) to practice challenging skills in a safe environment. Current technological advances have enabled the use of affordable, fully immersive head-mounted displays (HMDs) for potential therapeutic applications. However, while previous studies have used HMDs in individuals with PD, these were only used for short bouts of walking. Clinical applications of VR for gait training would likely involve an extended exposure to the virtual environment, which has the potential to cause individuals with PD to experience simulator-related adverse effects due to their age or pathology. Thus, our objective was to evaluate the safety of using an HMD for longer bouts of walking in fully immersive VR for older adults and individuals with PD. Thirty-three participants (11 healthy young adults, 11 healthy older adults, and 11 individuals with PD) were recruited for this study. Participants walked for 20 min while viewing a virtual city scene through an HMD (Oculus Rift DK2). Safety was evaluated using the mini-BESTest, measures of center of pressure (CoP) excursion, and questionnaires addressing symptoms of simulator sickness (SSQ) and measures of stress and arousal. Most participants successfully completed all trials without any discomfort. There were no significant changes for any of our groups in symptoms of simulator sickness or measures of static and dynamic balance after exposure to the virtual environment. Surprisingly, measures of stress decreased in all groups, while the PD group also showed an increased level of arousal after exposure. Older adults and individuals with PD were able to successfully use immersive VR during walking without adverse effects. This provides systematic evidence supporting the safety of immersive VR for gait training in these populations.
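
    The abstract above reports center of pressure (CoP) excursion as one of its balance measures but does not spell out the computation. As a hedged illustration of how such excursion metrics are commonly derived from a force-plate time series (not necessarily the authors' exact pipeline), the Python sketch below computes path length, range and RMS distance from synthetic CoP data; the sampling rate and signal are hypothetical.

        # Illustrative CoP excursion metrics from a force-plate recording.
        # Assumes cop_x (medio-lateral) and cop_y (antero-posterior) in centimeters,
        # sampled at FS Hz; these names and values are hypothetical.
        import numpy as np

        FS = 100                                   # hypothetical sampling rate (Hz)
        t = np.arange(0, 30, 1.0 / FS)             # 30 s quiet-standing trial
        cop_x = 0.3 * np.sin(2 * np.pi * 0.4 * t)  # synthetic ML sway (cm)
        cop_y = 0.5 * np.sin(2 * np.pi * 0.2 * t)  # synthetic AP sway (cm)

        # Total CoP path length: sum of point-to-point displacements.
        path_length = np.sum(np.hypot(np.diff(cop_x), np.diff(cop_y)))

        # Excursion range in each direction and RMS distance from the mean CoP.
        ml_range = cop_x.max() - cop_x.min()
        ap_range = cop_y.max() - cop_y.min()
        rms_distance = np.sqrt(np.mean((cop_x - cop_x.mean()) ** 2 +
                                       (cop_y - cop_y.mean()) ** 2))

        print(f"path length: {path_length:.1f} cm, ML range: {ml_range:.2f} cm, "
              f"AP range: {ap_range:.2f} cm, RMS distance: {rms_distance:.2f} cm")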

  2. Envisioning the future of home care: applications of immersive virtual reality.

    PubMed

    Brennan, Patricia Flatley; Arnott Smith, Catherine; Ponto, Kevin; Radwin, Robert; Kreutz, Kendra

    2013-01-01

    Accelerating the design of technologies to support health in the home requires (1) better understanding of how the household context shapes consumer health behaviors and (2) the opportunity for engineers, designers, and health professionals to systematically study the home environment. We developed the Living Environments Laboratory (LEL) with a fully immersive, six-sided virtual reality CAVE to enable recreation of a broad range of household environments. We have successfully developed a virtual apartment, including a kitchen, living space, and bathroom. Over 2000 people have visited the LEL CAVE. Participants use an electronic wand to activate common household affordances such as opening a refrigerator door or lifting a cup. Challenges currently being explored include creating natural gestures to interface with virtual objects, developing robust, simple procedures to capture actual living environments and render them in a 3D visualization, and devising systematic, stable terminologies to characterize home environments.

  3. The Utility of Using Immersive Virtual Environments for the Assessment of Science Inquiry Learning

    ERIC Educational Resources Information Center

    Code, Jillianne; Clarke-Midura, Jody; Zap, Nick; Dede, Chris

    2013-01-01

    Determining the effectiveness of any educational technology depends upon teachers' and learners' perception of the functional utility of that tool for teaching, learning, and assessment. The Virtual Performance project at Harvard University is developing and studying the feasibility of using immersive technology to develop performance…

  4. The Design, Development and Evaluation of a Virtual Reality Based Learning Environment

    ERIC Educational Resources Information Center

    Chen, Chwen Jen

    2006-01-01

    Many researchers and instructional designers increasingly recognise the benefits of utilising three dimensional virtual reality (VR) technology in instruction. In general, there are two types of VR system, the immersive system and the non-immersive system. This article focuses on the latter system that merely uses the conventional personal…

  5. Ontological implications of being in immersive virtual environments

    NASA Astrophysics Data System (ADS)

    Morie, Jacquelyn F.

    2008-02-01

    The idea of Virtual Reality once conjured up visions of new territories to explore, and expectations of awaiting worlds of wonder. VR has matured to become a practical tool for therapy, medicine and commercial interests, yet artists, in particular, continue to expand the possibilities for the medium. Artistic virtual environments created over the past two decades probe the phenomenological nature of these virtual environments. When we inhabit a fully immersive virtual environment, we have entered into a new form of Being. Not only does our body continue to exist in the real, physical world, but we are also embodied within the virtual by means of technology that translates our bodied actions into interactions with the virtual environment. Very few states in human existence allow this bifurcation of our Being, where we can exist in two spaces at once, with the possible exception of meta-physical states such as shamanistic trance and out-of-body experiences. This paper discusses the nature of this simultaneous Being, how we enter the virtual space, what forms of persona we can don there, what forms of spaces we can inhabit, and what type of wondrous experiences we can both hope for and expect.

  6. Exploring the Relationship Between Distributed Training, Integrated Learning Environments, and Immersive Training Environments

    DTIC Science & Technology

    2007-01-01

    educating and training (O’Keefe IV & McIntyre III, 2006). Topics vary widely from standard educational topics such as teaching kids physics, mechanics... Winn, W., & Yu, R. (1997). The Impact of Three Dimensional Immersive Virtual Environments on Modern Pedagogy: Global Change, VR and Learning

  7. Training wheelchair navigation in immersive virtual environments for patients with spinal cord injury - end-user input to design an effective system.

    PubMed

    Nunnerley, Joanne; Gupta, Swati; Snell, Deborah; King, Marcus

    2017-05-01

    A user-centred design was used to develop and test the feasibility of an immersive 3D virtual reality wheelchair training tool for people with spinal cord injury (SCI). A Wheelchair Training System was designed and modelled using the Oculus Rift headset and a Dynamic Control wheelchair joystick. The system was tested by clinicians and expert wheelchair users with SCI. Data from focus groups and individual interviews were analysed using a general inductive approach to thematic analysis. Four themes emerged: Realistic System, which described the advantages of a realistic virtual environment; a Wheelchair Training System, which described participants' thoughts on the wheelchair training applications; Overcoming Resistance to Technology, the obstacles to introducing technology within the clinical setting; and Working outside the Rehabilitation Bubble which described the protective hospital environment. The Oculus Rift Wheelchair Training System has the potential to provide a virtual rehabilitation setting which could allow wheelchair users to learn valuable community wheelchair use in a safe environment. Nausea appears to be a side effect of the system, which will need to be resolved before this can be a viable clinical tool. Implications for Rehabilitation Immersive virtual reality shows promising benefit for wheelchair training in a rehabilitation setting. Early engagement with consumers can improve product development.

  8. Emerging CAE technologies and their role in Future Ambient Intelligence Environments

    NASA Astrophysics Data System (ADS)

    Noor, Ahmed K.

    2011-03-01

    Dramatic improvements are on the horizon in Computer Aided Engineering (CAE) and various simulation technologies. The improvements are due, in part, to developments in a number of leading-edge technologies and their synergistic combinations/convergence. The technologies include ubiquitous, cloud, and petascale computing; ultra high-bandwidth networks and pervasive wireless communication; knowledge-based engineering; networked immersive virtual environments and virtual worlds; novel human-computer interfaces; and powerful game engines and facilities. This paper describes the frontiers and emerging simulation technologies, and their role in future virtual product creation and learning/training environments. These will be ambient intelligence environments, incorporating a synergistic combination of novel agent-supported visual simulations (with cognitive learning and understanding abilities); immersive 3D virtual world facilities; development chain management systems and facilities (incorporating a synergistic combination of intelligent engineering and management tools); nontraditional methods; intelligent, multimodal and human-like interfaces; and mobile wireless devices. The virtual product creation environment will significantly enhance productivity and stimulate creativity and innovation in future global virtual collaborative enterprises. The facilities in the learning/training environment will provide timely, engaging, personalized/collaborative and tailored visual learning.

  9. Augmented Virtuality: A Real-time Process for Presenting Real-world Visual Sensory Information in an Immersive Virtual Environment for Planetary Exploration

    NASA Astrophysics Data System (ADS)

    McFadden, D.; Tavakkoli, A.; Regenbrecht, J.; Wilson, B.

    2017-12-01

    Virtual Reality (VR) and Augmented Reality (AR) applications have recently seen impressive growth, thanks to the advent of commercial Head Mounted Displays (HMDs). This new visualization era has opened the possibility of presenting researchers from multiple disciplines with data visualization techniques not possible via traditional 2D screens. In a purely VR environment, researchers are presented with the visual data in a virtual environment, whereas in a purely AR application, a virtual object is projected into the real world, where researchers can interact with it. Purely VR or AR applications have several limitations in the context of remote planetary exploration. For example, in a purely VR environment, the contents of the planet surface (e.g. rocks, terrain, or other features) must be created off-line from a multitude of images using image processing techniques to generate 3D mesh data that will populate the virtual surface of the planet. This process usually requires a tremendous amount of computational resources and cannot be delivered in real time. As an alternative, video frames may be superimposed on the virtual environment to save processing time. However, such rendered video frames lack 3D visual information, i.e. depth information. In this paper, we present a technique to utilize a remotely situated robot's stereoscopic cameras to provide a live visual feed from the real world into the virtual environment in which planetary scientists are immersed. Moreover, the proposed technique blends the virtual environment with the real world in such a way as to preserve both the depth and visual information from the real world while allowing for the sensation of immersion when the entire sequence is viewed via an HMD such as the Oculus Rift. The figure shows the virtual environment with an overlay of the real-world stereoscopic video presented in real time into the virtual environment. Notice the preservation of the object's shape, shadows, and depth information. The distortions shown in the image are due to the rendering of the stereoscopic data into a 2D image for the purpose of taking screenshots.
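
    The technique above hinges on preserving depth information from the robot's stereoscopic cameras, though the abstract does not describe the depth-recovery step. As an illustration of one standard approach (OpenCV block matching; not necessarily the authors' method), the sketch below recovers disparity from a synthetic stereo pair and converts it to depth; the image pair, focal length and baseline are hypothetical.

        # Illustrative stereo depth recovery with OpenCV block matching (StereoBM).
        # The synthetic image pair and camera parameters below are hypothetical.
        import cv2
        import numpy as np

        rng = np.random.default_rng(1)
        left = rng.integers(0, 255, size=(240, 320), dtype=np.uint8)   # textured scene
        right = np.roll(left, -8, axis=1)                              # 8-pixel disparity

        # Block matcher: numDisparities must be a multiple of 16, blockSize odd.
        stereo = cv2.StereoBM_create(numDisparities=64, blockSize=15)
        disparity = stereo.compute(left, right).astype(np.float32) / 16.0  # fixed point -> pixels

        FOCAL_PX = 700.0      # hypothetical focal length in pixels
        BASELINE_M = 0.12     # hypothetical stereo baseline in meters

        # Depth in meters from the standard relation Z = f * B / d, where d > 0.
        valid = disparity > 0
        depth = FOCAL_PX * BASELINE_M / disparity[valid]
        print("median disparity (px):", float(np.median(disparity[valid])))
        print("median depth (m):", float(np.median(depth)))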

  10. VERS: a virtual environment for reconstructive surgery planning

    NASA Astrophysics Data System (ADS)

    Montgomery, Kevin N.

    1997-05-01

    The virtual environment for reconstructive surgery (VERS) project at the NASA Ames Biocomputation Center is applying virtual reality technology to aid surgeons in planning surgeries. We are working with a craniofacial surgeon at Stanford to assemble and visualize the bone structure of patients requiring reconstructive surgery due to either developmental abnormalities or trauma. This project is an extension of our previous work in 3D reconstruction, mesh generation, and immersive visualization. The current VR system, consisting of an SGI Onyx RE2, a FakeSpace BOOM and ImmersiveWorkbench, a Virtual Technologies CyberGlove, and an Ascension Technologies tracker, is in development and has already been used to visualize defects preoperatively. In the near future it will be used to more fully plan the surgery and to compute the projected result for the soft tissue structure. This paper presents the work in progress and details the production of a high-performance, collaborative, and networked virtual environment.

  11. Presence Relates to Distinct Outcomes in Two Virtual Environments Employing Different Learning Modalities

    PubMed Central

    Persky, Susan; Kaphingst, Kimberly A.; McCall, Cade; Lachance, Christina; Beall, Andrew C.; Blascovich, Jim

    2009-01-01

    Presence in virtual learning environments (VLEs) has been associated with a number of outcome factors related to a user’s ability and motivation to learn. The extant but relatively small body of research suggests that a high level of presence is related to better performance on learning outcomes in VLEs. Different configurations of form and content variables such as those associated with active (self-driven, interactive activities) versus didactic (reading or lecture) learning may, however, influence how presence operates and on what content it operates. We compared the influence of presence between two types of immersive VLEs (i.e., active versus didactic techniques) on comprehension and engagement-related outcomes. The findings revealed that the active VLE promoted greater presence. Although we found no relationship between presence and learning comprehension outcomes for either virtual environment, presence was related to information engagement variables in the didactic immersive VLE but not the active environment. Results demonstrate that presence is not uniformly elicited or effective across immersive VLEs. Educational delivery mode and environment complexity may influence the impact of presence on engagement. PMID:19366319

  12. Presence relates to distinct outcomes in two virtual environments employing different learning modalities.

    PubMed

    Persky, Susan; Kaphingst, Kimberly A; McCall, Cade; Lachance, Christina; Beall, Andrew C; Blascovich, Jim

    2009-06-01

    Presence in virtual learning environments (VLEs) has been associated with a number of outcome factors related to a user's ability and motivation to learn. The extant but relatively small body of research suggests that a high level of presence is related to better performance on learning outcomes in VLEs. Different configurations of form and content variables such as those associated with active (self-driven, interactive activities) versus didactic (reading or lecture) learning may, however, influence how presence operates and on what content it operates. We compared the influence of presence between two types of immersive VLEs (i.e., active versus didactic techniques) on comprehension and engagement-related outcomes. The findings revealed that the active VLE promoted greater presence. Although we found no relationship between presence and learning comprehension outcomes for either virtual environment, presence was related to information engagement variables in the didactic immersive VLE but not the active environment. Results demonstrate that presence is not uniformly elicited or effective across immersive VLEs. Educational delivery mode and environment complexity may influence the impact of presence on engagement.

  13. Collaborative Science Learning in Three-Dimensional Immersive Virtual Worlds: Pre-Service Teachers' Experiences in Second Life

    ERIC Educational Resources Information Center

    Nussli, Natalie; Oh, Kevin; McCandless, Kevin

    2014-01-01

    The purpose of this mixed methods study was to help pre-service teachers experience and evaluate the potential of Second Life, a three-dimensional immersive virtual environment, for potential integration into their future teaching. By completing collaborative assignments in Second Life, nineteen pre-service general education teachers explored an…

  14. Computer-Assisted Culture Learning in an Online Augmented Reality Environment Based on Free-Hand Gesture Interaction

    ERIC Educational Resources Information Center

    Yang, Mau-Tsuen; Liao, Wan-Che

    2014-01-01

    The physical-virtual immersion and real-time interaction play an essential role in cultural and language learning. Augmented reality (AR) technology can be used to seamlessly merge virtual objects with real-world images to realize immersions. Additionally, computer vision (CV) technology can recognize free-hand gestures from live images to enable…

  15. Cranial implant design using augmented reality immersive system.

    PubMed

    Ai, Zhuming; Evenhouse, Ray; Leigh, Jason; Charbel, Fady; Rasmussen, Mary

    2007-01-01

    Software tools that use haptics to sculpt precisely fitting cranial implants are employed in an augmented reality immersive system to create a virtual working environment for the modelers. The virtual environment is designed to mimic the traditional working environment as closely as possible, while providing more functionality for the users. The implant design process uses patient CT data of a defective area. This volumetric data is displayed in an implant modeling tele-immersive augmented reality system where the modeler can build a patient-specific implant that precisely fits the defect. To mimic the traditional sculpting workspace, the implant modeling augmented reality system includes stereo vision, viewer-centered perspective, a sense of touch, and collaboration. To achieve optimized performance, this system includes a dual-processor PC, fast volume rendering with three-dimensional texture mapping, a fast haptic rendering algorithm, and a multi-threading architecture. The system replaces expensive and time-consuming traditional sculpting steps such as physical sculpting, mold making, and defect stereolithography. This augmented reality system is part of a comprehensive tele-immersive system that includes a conference-room-sized system for tele-immersive small group consultation and an inexpensive, easily deployable networked desktop virtual reality system for surgical consultation, evaluation and collaboration. This system has been used to design patient-specific cranial implants with precise fit.
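
    The system described above combines a fast haptic rendering algorithm with volume rendering in a multi-threading architecture. The paper's actual implementation is not reproduced here; the sketch below only illustrates the common pattern of a high-rate haptic thread and a slower display thread sharing state under a lock, with hypothetical rates and names.

        # Illustrative two-thread pattern: a fast "haptic" loop and a slower "render"
        # loop sharing one piece of state under a lock. Rates and names are hypothetical.
        import threading
        import time

        state = {"tool_depth": 0.0}       # stands in for the sculpting tool's position
        lock = threading.Lock()
        stop = threading.Event()

        def haptic_loop(rate_hz=1000.0):
            """Update shared state at a high rate (haptics typically run near 1 kHz)."""
            while not stop.is_set():
                with lock:
                    state["tool_depth"] += 0.001
                time.sleep(1.0 / rate_hz)

        def render_loop(rate_hz=30.0):
            """Read shared state at display rate and 'draw' it."""
            while not stop.is_set():
                with lock:
                    depth = state["tool_depth"]
                print(f"render frame: tool depth = {depth:.3f}")
                time.sleep(1.0 / rate_hz)

        threads = [threading.Thread(target=haptic_loop), threading.Thread(target=render_loop)]
        for t in threads:
            t.start()
        time.sleep(0.2)                   # run briefly for demonstration
        stop.set()
        for t in threads:
            t.join()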

  16. The Use of Immersive Virtual Reality in the Learning Sciences: Digital Transformations of Teachers, Students, and Social Context

    ERIC Educational Resources Information Center

    Bailenson, Jeremy N.; Yee, Nick; Blascovich, Jim; Beall, Andrew C.; Lundblad, Nicole; Jin, Michael

    2008-01-01

    This article illustrates the utility of using virtual environments to transform social interaction via behavior and context, with the goal of improving learning in digital environments. We first describe the technology and theories behind virtual environments and then report data from 4 empirical studies. In Experiment 1, we demonstrated that…

  17. Immersive Virtual Environment Technology to Supplement Environmental Perception, Preference and Behavior Research: A Review with Applications

    PubMed Central

    Smith, Jordan W.

    2015-01-01

    Immersive virtual environment (IVE) technology offers a wide range of potential benefits to research focused on understanding how individuals perceive and respond to built and natural environments. In an effort to broaden awareness and use of IVE technology in perception, preference and behavior research, this review paper describes how IVE technology can be used to complement more traditional methods commonly applied in public health research. The paper also describes a relatively simple workflow for creating and displaying 360° virtual environments of built and natural settings and presents two freely-available and customizable applications that scientists from a variety of disciplines, including public health, can use to advance their research into human preferences, perceptions and behaviors related to built and natural settings. PMID:26378565

  18. Immersive Virtual Environment Technology to Supplement Environmental Perception, Preference and Behavior Research: A Review with Applications.

    PubMed

    Smith, Jordan W

    2015-09-11

    Immersive virtual environment (IVE) technology offers a wide range of potential benefits to research focused on understanding how individuals perceive and respond to built and natural environments. In an effort to broaden awareness and use of IVE technology in perception, preference and behavior research, this review paper describes how IVE technology can be used to complement more traditional methods commonly applied in public health research. The paper also describes a relatively simple workflow for creating and displaying 360° virtual environments of built and natural settings and presents two freely-available and customizable applications that scientists from a variety of disciplines, including public health, can use to advance their research into human preferences, perceptions and behaviors related to built and natural settings.
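
    The workflow mentioned above for creating and displaying 360° virtual environments is not detailed in the abstract. As a hedged illustration of the core operation behind most 360° viewers, the sketch below maps an equirectangular image pixel to a unit viewing direction on the sphere; the resolution used in the example is hypothetical.

        # Illustrative mapping from equirectangular pixel coordinates (u, v) to a
        # unit viewing direction, the core operation of most 360-degree viewers.
        import numpy as np

        def pixel_to_direction(u, v, width, height):
            """Map pixel (u, v) of a width x height equirectangular image to a unit vector."""
            lon = (u / width) * 2.0 * np.pi - np.pi        # longitude in [-pi, pi)
            lat = np.pi / 2.0 - (v / height) * np.pi       # latitude in [-pi/2, pi/2]
            x = np.cos(lat) * np.sin(lon)
            y = np.sin(lat)
            z = np.cos(lat) * np.cos(lon)
            return np.array([x, y, z])

        # Example: the center pixel of a 4096 x 2048 panorama looks straight ahead (+z).
        print(pixel_to_direction(2048, 1024, 4096, 2048))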

  19. Study on Collaborative Object Manipulation in Virtual Environment

    NASA Astrophysics Data System (ADS)

    Mayangsari, Maria Niken; Yong-Moo, Kwon

    This paper presents a comparative study of networked collaboration performance under different degrees of immersion. In particular, the relationship between user collaboration performance and the degree of immersion provided by the system is addressed and compared on the basis of several experiments. The user tests on our system include several cases: 1) comparison between non-haptic and haptic collaborative interaction over a LAN, 2) comparison between non-haptic and haptic collaborative interaction over the Internet, and 3) analysis of collaborative interaction between non-immersive and immersive display environments.

  20. Exploring Learner Acceptance of the Use of Virtual Reality in Medical Education: A Case Study of Desktop and Projection-Based Display Systems

    ERIC Educational Resources Information Center

    Huang, Hsiu-Mei; Liaw, Shu-Sheng; Lai, Chung-Min

    2016-01-01

    Advanced technologies have been widely applied in medical education, including human-patient simulators, immersive virtual reality Cave Automatic Virtual Environment systems, and video conferencing. Evaluating learner acceptance of such virtual reality (VR) learning environments is a critical issue for ensuring that such technologies are used to…

  1. Taking Science Online: Evaluating Presence and Immersion through a Laboratory Experience in a Virtual Learning Environment for Entomology Students

    ERIC Educational Resources Information Center

    Annetta, Leonard; Klesath, Marta; Meyer, John

    2009-01-01

    A 3-D virtual field trip was integrated into an online college entomology course and developed as a trial for the possible incorporation of future virtual environments to supplement online higher education laboratories. This article provides an explanation of the rationale behind creating the virtual experience, the Bug Farm; the method and…

  2. Hybrid Reality Lab Capabilities - Video 2

    NASA Technical Reports Server (NTRS)

    Delgado, Francisco J.; Noyes, Matthew

    2016-01-01

    Our Hybrid Reality and Advanced Operations Lab is developing highly realistic and immersive systems that could be used to provide training, support engineering analysis, and augment data collection for various human performance metrics at NASA. To get a better understanding of what Hybrid Reality is, let's go through the two most commonly known types of immersive realities: Virtual Reality and Augmented Reality. Virtual Reality creates immersive scenes that are completely made up of digital information. This technology has been used to train astronauts at NASA, during teleoperation of remote assets (arms, rovers, robots, etc.), and in other activities. One challenge with Virtual Reality is that if you are using it for real-time applications (like landing an airplane), then the information used to create the virtual scenes can be old (i.e., visualized long after physical objects moved in the scene) and not accurate enough to land the airplane safely. This is where Augmented Reality comes in. Augmented Reality takes real-time environment information (from a camera or a see-through window) and places digitally created information into the scene so that it matches the video/glass information. Augmented Reality enhances real environment information collected with a live sensor or viewport (e.g., camera, window, etc.) with the information-rich visualization provided by Virtual Reality. Hybrid Reality takes Augmented Reality even further, by creating a higher level of immersion where interactivity can take place. Hybrid Reality takes Virtual Reality objects and a trackable, physical representation of those objects, places them in the same coordinate system, and allows people to interact with both objects' representations (virtual and physical) simultaneously. After a short period of adjustment, individuals begin to interact with all the objects in the scene as if they were real-life objects. The ability to physically touch and interact with digitally created objects that have the same shape, size, and location as their physical counterparts in the virtual reality environment can be a game changer when it comes to training, planning, engineering analysis, science, entertainment, etc. Our project is developing such capabilities for various types of environments. The video accompanying this abstract is a representation of an ISS Hybrid Reality experience. In the video you can see various Hybrid Reality elements that provide immersion beyond standard Virtual Reality or Augmented Reality.
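
    Hybrid Reality, as described above, places a virtual object and a tracked physical counterpart "in the same coordinate system". How the lab performs that registration is not stated, so the sketch below illustrates one common way to do it, a Kabsch-style least-squares rigid fit between corresponding tracked points; the marker positions and names are hypothetical.

        # Illustrative rigid alignment (rotation + translation) between corresponding
        # tracked points on a physical object and its virtual model, via SVD (Kabsch).
        import numpy as np

        def rigid_transform(physical, virtual):
            """Return R (3x3) and t (3,) such that R @ p + t ~= v for paired points."""
            p_cent = physical.mean(axis=0)
            v_cent = virtual.mean(axis=0)
            H = (physical - p_cent).T @ (virtual - v_cent)   # 3x3 cross-covariance
            U, _, Vt = np.linalg.svd(H)
            d = np.sign(np.linalg.det(Vt.T @ U.T))           # guard against reflections
            R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
            t = v_cent - R @ p_cent
            return R, t

        # Hypothetical example: four tracked markers and their virtual-model positions.
        physical_pts = np.array([[0, 0, 0], [1, 0, 0], [0, 1, 0], [0, 0, 1]], dtype=float)
        true_R = np.array([[0, -1, 0], [1, 0, 0], [0, 0, 1]], dtype=float)  # 90 deg about z
        virtual_pts = physical_pts @ true_R.T + np.array([0.5, -0.2, 1.0])

        R, t = rigid_transform(physical_pts, virtual_pts)
        print(np.allclose(R, true_R), np.round(t, 3))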

  3. Exploring "Magic Cottage": A Virtual Reality Environment for Stimulating Children's Imaginative Writing

    ERIC Educational Resources Information Center

    Patera, Marianne; Draper, Steve; Naef, Martin

    2008-01-01

    This paper presents an exploratory study that created a virtual reality environment (VRE) to stimulate motivation and creativity in imaginative writing at primary school level. The main aim of the study was to investigate if an interactive, semi-immersive virtual reality world could increase motivation and stimulate pupils' imagination in the…

  4. Pre-Service Teachers Designing Virtual World Learning Environments

    ERIC Educational Resources Information Center

    Jacka, Lisa; Booth, Kate

    2012-01-01

    Integrating Information Technology Communications in the classroom has been an important part of pre-service teacher education for over a decade. The advent of virtual worlds provides the pre-service teacher with an opportunity to study teaching and learning in a highly immersive 3D computer-based environment. Virtual worlds also provide a place…

  5. Immersive telepresence system using high-resolution omnidirectional movies and a locomotion interface

    NASA Astrophysics Data System (ADS)

    Ikeda, Sei; Sato, Tomokazu; Kanbara, Masayuki; Yokoya, Naokazu

    2004-05-01

    Technology that enables users to experience a remote site virtually is called telepresence. A telepresence system using real environment images is expected to be used in the fields of entertainment, medicine, education and so on. This paper describes a novel telepresence system which enables users to walk through a photorealistic virtualized environment by actually walking. To realize such a system, a wide-angle high-resolution movie is projected on an immersive multi-screen display to present users with the virtualized environment, and a treadmill is controlled according to the user's detected locomotion. In this study, we use an omnidirectional multi-camera system to acquire images of a real outdoor scene. The proposed system provides users with a rich sense of walking in a remote site.

  6. Using immersive simulation for training first responders for mass casualty incidents.

    PubMed

    Wilkerson, William; Avstreih, Dan; Gruppen, Larry; Beier, Klaus-Peter; Woolliscroft, James

    2008-11-01

    A descriptive study was performed to better understand the possible utility of immersive virtual reality simulation for training first responders in a mass casualty event. Utilizing a virtual reality cave automatic virtual environment (CAVE) and a high-fidelity human patient simulator (HPS), a group of experts modeled a football stadium that experienced a terrorist explosion during a football game. Avatars (virtual patients) demonstrating a spectrum of injuries ranging from death to minor lacerations were developed by expert consensus. A group of paramedics was assessed by observation for decisions made and actions taken. A critical action checklist was created and used for direct observation and viewing of videotaped recordings. Of the 12 participants, only 35.7% identified the type of incident they encountered. None identified a secondary device that was easily visible. All participants were enthusiastic about the simulation and provided valuable comments and insights. Learner feedback and expert performance review suggest that immersive training in a virtual environment has the potential to be a powerful tool to train first responders for high-acuity, low-frequency events, such as a terrorist attack.

  7. Assessment of radiation awareness training in immersive virtual environments

    NASA Astrophysics Data System (ADS)

    Whisker, Vaughn E., III

    The prospect of new nuclear power plant orders in the near future and the graying of the current workforce create a need to train new personnel faster and better. Immersive virtual reality (VR) may offer a solution to the training challenge. VR technology presented in a CAVE Automatic Virtual Environment (CAVE) provides a high-fidelity, one-to-one scale environment where areas of the power plant can be recreated and virtual radiation environments can be simulated, making it possible to safely expose workers to virtual radiation in the context of the actual work environment. The use of virtual reality for training is supported by many educational theories; constructivism and discovery learning, in particular. Educational theory describes the importance of matching the training to the task. Plant access training and radiation worker training, common forms of training in the nuclear industry, rely on computer-based training methods in most cases, which effectively transfer declarative knowledge, but are poor at transferring skills. If an activity were to be added, the training would provide personnel with the opportunity to develop skills and apply their knowledge so they could be more effective when working in the radiation environment. An experiment was developed to test immersive virtual reality's suitability for training radiation awareness. Using a mixed methodology of quantitative and qualitative measures, the subjects' performances before and after training were assessed. First, subjects completed a pre-test to measure their knowledge prior to completing any training. Next they completed unsupervised computer-based training, which consisted of a PowerPoint presentation and a PDF document. After completing a brief orientation activity in the virtual environment, one group of participants received supplemental radiation awareness training in a simulated radiation environment presented in the CAVE, while a second group, the control group, moved directly to the assessment phase of the experiment. The CAVE supplied an activity-based training environment where learners were able to use a virtual survey meter to explore the properties of radiation sources and the effects of time and distance on radiation exposure. Once the training stage had ended, the subjects completed an assessment activity where they were asked to complete four tasks in a simulated radiation environment in the CAVE, which was designed to provide a more authentic assessment than simply testing understanding using a quiz. After the practicum, the subjects completed a post-test. Survey information was also collected to assist the researcher with interpretation of the collected data. Response to the training was measured by completion time, radiation exposure received, successful completion of the four tasks in the practicum, and scores on the post-test. These results were combined to create a radiation awareness score. In addition, observational data was collected as the subjects completed the tasks. The radiation awareness scores of the control group and the group that received supplemental training in the virtual environment were compared. T-tests showed that the effect of the supplemental training was not significant; however, calculation of the effect size showed a small-to-medium effect of the training. The CAVE group received significantly less radiation exposure during the assessment activity, and they completed the activities on an average of one minute faster. 
    These results indicate that the training was effective, primarily for instilling radiation sensitivity. Observational data collected during the assessment support this conclusion. The training environment provided by immersive virtual reality recreated a radiation environment where learners could apply knowledge they had been taught by computer-based training. Activity-based training has been shown to be a more effective way to transfer skills because of the similarity between the training environment and the application environment. Virtual reality enables the training environment to look and feel like the application environment. For these reasons, and supported by the results of this experiment, radiation awareness training in an immersive virtual environment should be considered by the nuclear industry.
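
    The virtual survey meter described above lets learners explore the effects of time and distance on radiation exposure. Such simulations typically rest on the inverse-square law (dose rate proportional to 1/r^2) with accumulated dose equal to dose rate multiplied by time. The sketch below illustrates only that textbook relationship, not the dissertation's actual simulation; source strengths, positions and the trainee's path are hypothetical.

        # Illustrative virtual-dose accumulation using the inverse-square law.
        # Source strengths, positions, and the trainee's path are hypothetical.
        import numpy as np

        # Each source: (x, y, z) position in meters and dose-rate constant at 1 m (mSv/h).
        sources = [((2.0, 0.0, 0.0), 5.0),
                   ((0.0, 4.0, 1.0), 2.0)]

        def dose_rate(point):
            """Total dose rate (mSv/h) at a point: sum of k / r^2 over all sources."""
            total = 0.0
            for position, k_at_1m in sources:
                r2 = float(np.sum((point - np.asarray(position)) ** 2))
                total += k_at_1m / max(r2, 0.01)   # clamp to avoid the singularity at r = 0
            return total

        # Trainee walks in a straight line past the sources, one step per second.
        path = np.linspace([-5.0, 1.0, 0.0], [5.0, 1.0, 0.0], num=61)
        step_hours = 1.0 / 3600.0
        accumulated = sum(dose_rate(p) * step_hours for p in path)
        print(f"accumulated dose over 61 s: {accumulated * 1000:.3f} microSv")

        # Doubling the distance from a point source cuts its dose rate to one quarter.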

  8. Evaluation of knowledge transfer in an immersive virtual learning environment for the transportation community : [tech summary].

    DOT National Transportation Integrated Search

    2014-05-01

    Immersive Virtual Learning Environments (IVLEs) are extensively used in training, but few rigorous scientific investigations regarding the transfer of learning have been conducted. Measurement of learning transfer through evaluative methods is key...

  9. Brave New World

    ERIC Educational Resources Information Center

    Panettieri, Joseph C.

    2007-01-01

    Across the globe, progressive universities are embracing any number of MUVEs (multi-user virtual environments), 3D environments, and "immersive" virtual reality tools. And within the next few months, several universities are expected to test so-called "telepresence" videoconferencing systems from Cisco Systems and other leading…

  10. Accessible virtual reality therapy using portable media devices.

    PubMed

    Bruck, Susan; Watters, Paul A

    2010-01-01

    Simulated immersive environments displayed on large screens are a valuable therapeutic asset in the treatment of a range of psychological disorders. Permanent environments are expensive to build and maintain, require specialized clinician training and technical support, and often have limited accessibility for clients. Ideally, virtual reality exposure therapy (VRET) could be accessible to the broader community if we could use inexpensive hardware with specifically designed software. This study tested whether watching a handheld non-immersive media device causes nausea and other cybersickness responses. Using a repeated-measures design, we found significant increases in nausea, general discomfort, eyestrain, blurred vision and salivation in response to handheld non-immersive media device exposure.

  11. Recent developments in virtual experience design and production

    NASA Astrophysics Data System (ADS)

    Fisher, Scott S.

    1995-03-01

    Today, the media of VR and Telepresence are in their infancy and the emphasis is still on technology and engineering. But, it is not the hardware people might use that will determine whether VR becomes a powerful medium--instead, it will be the experiences that they are able to have that will drive its acceptance and impact. A critical challenge in the elaboration of these telepresence capabilities will be the development of environments that are as unpredictable and rich in interconnected processes as an actual location or experience. This paper will describe the recent development of several Virtual Experiences including: `Menagerie', an immersive Virtual Environment inhabited by virtual characters designed to respond to and interact with its users; and `The Virtual Brewery', an immersive public VR installation that provides multiple levels of interaction in an artistic interpretation of the brewing process.

  12. Multisensory Integration in the Virtual Hand Illusion with Active Movement

    PubMed Central

    Satoh, Satoru; Hachimura, Kozaburo

    2016-01-01

    Improving the sense of immersion is one of the core issues in virtual reality. Perceptual illusions of ownership can be perceived over a virtual body in a multisensory virtual reality environment. Rubber Hand and Virtual Hand Illusions showed that body ownership can be manipulated by applying suitable visual and tactile stimulation. In this study, we investigate the effects of multisensory integration in the Virtual Hand Illusion with active movement. A virtual xylophone playing system which can interactively provide synchronous visual, tactile, and auditory stimulation was constructed. We conducted two experiments regarding different movement conditions and different sensory stimulations. Our results demonstrate that multisensory integration with free active movement can improve the sense of immersion in virtual reality. PMID:27847822

  13. The interplays among technology and content, immersant and VE

    NASA Astrophysics Data System (ADS)

    Song, Meehae; Gromala, Diane; Shaw, Chris; Barnes, Steven J.

    2010-01-01

    The research program aims to explore and examine the fine balance necessary for maintaining the interplays between technology and the immersant, including identifying qualities that contribute to creating and maintaining a sense of "presence" and "immersion" in an immersive virtual reality (IVR) experience. Building upon and extending previous work, we compare sitting meditation with walking meditation in a virtual environment (VE). The Virtual Meditative Walk, a new work-in-progress, integrates VR and biofeedback technologies with a self-directed, uni-directional treadmill. As immersants learn how to meditate while walking, robust, real-time biofeedback technology continuously measures breathing, skin conductance and heart rate. The physiological states of the immersant will in turn affect the audio and stereoscopic visual media through shutter glasses. We plan to test the potential benefits and limitations of this physically active form of meditation with data from a sitting form of meditation. A mixed-methods approach to testing user outcomes parallels the knowledge bases of the collaborative team: a physician, computer scientists and artists.
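
    The abstract above describes breathing, skin conductance and heart rate continuously modulating the audio and visual media, but the specific mapping is not given. The sketch below illustrates only the general idea of normalizing a physiological reading and mapping it to a rendering parameter; the ranges, the fog-density parameter and all names are hypothetical.

        # Illustrative biofeedback-to-rendering mapping: a slower breathing rate
        # thins out virtual fog. Ranges and parameter names are hypothetical.

        def normalize(value, low, high):
            """Clamp value to [low, high] and scale it to [0, 1]."""
            return min(max((value - low) / (high - low), 0.0), 1.0)

        def fog_density_from_breathing(breaths_per_min):
            """Map breathing rate (6-20 breaths/min assumed) to fog density in [0.05, 0.8]."""
            stress = normalize(breaths_per_min, 6.0, 20.0)   # 0 = calm, 1 = agitated
            return 0.05 + 0.75 * stress

        for bpm in (6, 12, 20):
            print(bpm, "breaths/min ->", round(fog_density_from_breathing(bpm), 2))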

  14. Working Collaboratively in Virtual Learning Environments: Using Second Life with Korean High School Students in History Class

    ERIC Educational Resources Information Center

    Kim, Mi Hwa

    2013-01-01

    The purpose of this experimental study was to investigate the impact of the use of a virtual environment for learning Korean history on high school students' learning outcomes and attitudes toward virtual worlds (collaboration, engagement, general use of SL [Second Life], and immersion). In addition, this experiment examined the relationships…

  15. Virtual reality interface devices in the reorganization of neural networks in the brain of patients with neurological diseases.

    PubMed

    Gatica-Rojas, Valeska; Méndez-Rebolledo, Guillermo

    2014-04-15

    Two key characteristics of all virtual reality applications are interaction and immersion. Systemic interaction is achieved through a variety of multisensory channels (hearing, sight, touch, and smell), permitting the user to interact with the virtual world in real time. Immersion is the degree to which a person can feel wrapped in the virtual world through a defined interface. Virtual reality interface devices such as the Nintendo® Wii with its nunchuk and balance board peripherals, head-mounted displays, and joysticks allow interaction and immersion in unreal environments created from computer software. Virtual environments are highly interactive, generating great activation of the visual, vestibular and proprioceptive systems during the execution of a video game. In addition, they are entertaining and safe for the user. Recently, incorporating therapeutic purposes into virtual reality interface devices has allowed them to be used for the rehabilitation of neurological patients, e.g., balance training in older adults and dynamic stability in healthy participants. The improvements observed in neurological diseases (chronic stroke and cerebral palsy) have been shown by changes in the reorganization of neural networks in patients' brains, along with better hand function and other skills, contributing to their quality of life. The data generated by such studies could substantially contribute to physical rehabilitation strategies.

  16. Virtual reality interface devices in the reorganization of neural networks in the brain of patients with neurological diseases

    PubMed Central

    Gatica-Rojas, Valeska; Méndez-Rebolledo, Guillermo

    2014-01-01

    Two key characteristics of all virtual reality applications are interaction and immersion. Systemic interaction is achieved through a variety of multisensory channels (hearing, sight, touch, and smell), permitting the user to interact with the virtual world in real time. Immersion is the degree to which a person can feel wrapped in the virtual world through a defined interface. Virtual reality interface devices such as the Nintendo® Wii with its nunchuk and balance board peripherals, head-mounted displays, and joysticks allow interaction and immersion in unreal environments created from computer software. Virtual environments are highly interactive, generating great activation of the visual, vestibular and proprioceptive systems during the execution of a video game. In addition, they are entertaining and safe for the user. Recently, incorporating therapeutic purposes into virtual reality interface devices has allowed them to be used for the rehabilitation of neurological patients, e.g., balance training in older adults and dynamic stability in healthy participants. The improvements observed in neurological diseases (chronic stroke and cerebral palsy) have been shown by changes in the reorganization of neural networks in patients’ brains, along with better hand function and other skills, contributing to their quality of life. The data generated by such studies could substantially contribute to physical rehabilitation strategies. PMID:25206907

  17. Constraint, Intelligence, and Control Hierarchy in Virtual Environments. Chapter 1

    NASA Technical Reports Server (NTRS)

    Sheridan, Thomas B.

    2007-01-01

    This paper seeks to deal directly with the question of what makes virtual actors and objects that are experienced in virtual environments seem real. (The term virtual reality, while more common in public usage, is an oxymoron; therefore virtual environment is the preferred term in this paper). Reality is a difficult topic, treated for centuries in those sub-fields of philosophy called ontology ("of or relating to being or existence") and epistemology ("the study of the method and grounds of knowledge, especially with reference to its limits and validity") (both from Webster's, 1965). Advances in recent decades in the technologies of computers, sensors and graphics software have permitted human users to feel present or experience immersion in computer-generated virtual environments. This has motivated a keen interest in probing this phenomenon of presence and immersion not only philosophically but also psychologically and physiologically in terms of the parameters of the senses and sensory stimulation that correlate with the experience (Ellis, 1991). The pages of the journal Presence: Teleoperators and Virtual Environments have seen much discussion of what makes virtual environments seem real (see, e.g., Slater, 1999; Slater et al. 1994; Sheridan, 1992, 2000). Stephen Ellis, when organizing the meeting that motivated this paper, suggested to invited authors that "We may adopt as an organizing principle for the meeting that the genesis of apparently intelligent interaction arises from an upwelling of constraints determined by a hierarchy of lower levels of behavioral interaction." My first reaction was "huh?" and my second was "yeah, that seems to make sense." Accordingly, the paper seeks to explain, from the author's viewpoint, why Ellis's hypothesis makes sense. What is the connection of "presence" or "immersion" of an observer in a virtual environment to "constraints," and what types of constraints? What of "intelligent interaction," and is it the intelligence of the observer or the intelligence of the environment (whatever the latter may mean) that is salient? And finally, what might be relevant about "upwelling" of constraints as determined by a hierarchy of levels of interaction?

  18. KinImmerse: Macromolecular VR for NMR ensembles

    PubMed Central

    Block, Jeremy N; Zielinski, David J; Chen, Vincent B; Davis, Ian W; Vinson, E Claire; Brady, Rachael; Richardson, Jane S; Richardson, David C

    2009-01-01

    Background: In molecular applications, virtual reality (VR) and immersive virtual environments have generally been used and valued for the visual and interactive experience – to enhance intuition and communicate excitement – rather than as part of the actual research process. In contrast, this work develops a software infrastructure for research use and illustrates such use on a specific case. Methods: The Syzygy open-source toolkit for VR software was used to write the KinImmerse program, which translates the molecular capabilities of the kinemage graphics format into software for display and manipulation in the DiVE (Duke immersive Virtual Environment) or other VR system. KinImmerse is supported by the flexible display construction and editing features in the KiNG kinemage viewer and it implements new forms of user interaction in the DiVE. Results: In addition to molecular visualizations and navigation, KinImmerse provides a set of research tools for manipulation, identification, co-centering of multiple models, free-form 3D annotation, and output of results. The molecular research test case analyzes the local neighborhood around an individual atom within an ensemble of nuclear magnetic resonance (NMR) models, enabling immersive visual comparison of the local conformation with the local NMR experimental data, including target curves for residual dipolar couplings (RDCs). Conclusion: The promise of KinImmerse for production-level molecular research in the DiVE is shown by the locally co-centered RDC visualization developed there, which gave new insights now being pursued in wider data analysis. PMID:19222844
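
    One of the research tools listed above is co-centering of multiple models on an individual atom. As a hedged illustration of what co-centering amounts to geometrically (not KinImmerse's own code), the sketch below translates every model of a synthetic ensemble so that a chosen atom sits at the origin; the coordinates and atom index are hypothetical.

        # Illustrative co-centering of an NMR-style ensemble: translate every model so
        # that one chosen atom lands at the origin. Data below are hypothetical.
        import numpy as np

        rng = np.random.default_rng(0)
        n_models, n_atoms = 10, 50
        ensemble = rng.normal(size=(n_models, n_atoms, 3))   # fake ensemble coordinates

        CENTER_ATOM = 17                                     # hypothetical atom index

        def co_center(models, atom_index):
            """Shift each model so models[:, atom_index, :] becomes (0, 0, 0)."""
            offsets = models[:, atom_index:atom_index + 1, :]  # shape (n_models, 1, 3)
            return models - offsets

        centered = co_center(ensemble, CENTER_ATOM)
        # After co-centering, the chosen atom coincides across all models,
        # so its local neighborhood can be compared directly.
        print(np.allclose(centered[:, CENTER_ATOM, :], 0.0))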

  19. Exploring Learning through Audience Interaction in Virtual Reality Dome Theaters

    NASA Astrophysics Data System (ADS)

    Apostolellis, Panagiotis; Daradoumis, Thanasis

    Informal learning in public spaces like museums, science centers, and planetariums has become increasingly popular in recent years. Recent advancements in large-scale displays have allowed contemporary technology-enhanced museums to be equipped with digital domes, some with real-time capabilities like Virtual Reality systems. By conducting an extensive literature review, we have come to the conclusion that little to no research has been carried out on the learning outcomes that the combination of VR and audience interaction can provide in the immersive environments of dome theaters. Thus, we propose that audience collaboration in immersive virtual reality environments presents a promising approach to support effective learning in groups of school-aged children.

  20. Comparative study on collaborative interaction in non-immersive and immersive systems

    NASA Astrophysics Data System (ADS)

    Shahab, Qonita M.; Kwon, Yong-Moo; Ko, Heedong; Mayangsari, Maria N.; Yamasaki, Shoko; Nishino, Hiroaki

    2007-09-01

    This research studies Virtual Reality simulation for collaborative interaction, so that different people in different places can interact with one object concurrently. Our focus is the real-time handling of inputs from multiple users, where an object's behavior is determined by the combination of the multiple inputs. Issues addressed in this research are: 1) the effects of using haptics on a collaborative interaction, and 2) the possibilities of collaboration between users from different environments. We conducted user tests on our system in several cases: 1) comparison between non-haptics and haptics collaborative interaction over LAN, 2) comparison between non-haptics and haptics collaborative interaction over the Internet, and 3) analysis of collaborative interaction between non-immersive and immersive display environments. The case studies are the interaction of users in two applications: collaborative authoring of a 3D model by two users, and collaborative haptic interaction by multiple users. In Virtual Dollhouse, users can observe physical laws while constructing a dollhouse using existing building blocks, under gravity effects. In Virtual Stretcher, multiple users can collaborate on moving a stretcher together while feeling each other's haptic motions.
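    As a rough illustration of the shared-object mechanism described above, the sketch below combines concurrent inputs from several users by summing their force vectors before integrating the object's motion each frame. The physics constants and function name are assumptions for illustration; this is not the authors' code.

```python
# Minimal sketch (assumed physics, not the authors' system): the shared object's
# motion each frame is determined by combining concurrent inputs from several users.
import numpy as np

def step_shared_object(position, velocity, user_forces,
                       mass=1.0, dt=1/60, gravity=(0.0, -9.81, 0.0)):
    """Advance a jointly manipulated object by one frame under summed user forces."""
    total_force = np.sum(np.asarray(user_forces, dtype=float), axis=0) + mass * np.asarray(gravity)
    velocity = velocity + (total_force / mass) * dt
    position = position + velocity * dt
    return position, velocity

pos, vel = np.zeros(3), np.zeros(3)
# Two users push the object at the same time; gravity also acts on it.
pos, vel = step_shared_object(pos, vel, [(2.0, 12.0, 0.0), (-1.5, 0.0, 0.0)])
print(pos, vel)
```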

  1. A succinct overview of virtual reality technology use in Alzheimer's disease.

    PubMed

    García-Betances, Rebeca I; Arredondo Waldmeyer, María Teresa; Fico, Giuseppe; Cabrera-Umpiérrez, María Fernanda

    2015-01-01

    We provide a brief review and appraisal of recent and current virtual reality (VR) technology for Alzheimer's disease (AD) applications. We categorize them according to their intended purpose (e.g., diagnosis, patient cognitive training, caregivers' education, etc.), focus feature (e.g., spatial impairment, memory deficit, etc.), methodology employed (e.g., tasks, games, etc.), immersion level, and passive or active interaction. Critical assessment indicates that most of them do not yet take full advantage of virtual environments with high levels of immersion and interaction. Many still rely on conventional 2D graphic displays to create non-immersive or semi-immersive VR scenarios. Important improvements are needed to make VR a better and more versatile assessment and training tool for AD. The use of the latest display technologies available, such as emerging head-mounted displays and 3D smart TV technologies, together with realistic multi-sensorial interaction devices, and neuro-physiological feedback capacity, are some of the most beneficial improvements this mini-review suggests. Additionally, it would be desirable that such VR applications for AD be easily and affordably transferable to in-home and nursing home environments.

  2. How incorporation of scents could enhance immersive virtual experiences

    PubMed Central

    Ischer, Matthieu; Baron, Naëm; Mermoud, Christophe; Cayeux, Isabelle; Porcherot, Christelle; Sander, David; Delplanque, Sylvain

    2014-01-01

    Under normal everyday conditions, the senses all work together to create experiences that fill a typical person's life. Unfortunately for behavioral and cognitive researchers who investigate such experiences, standard laboratory tests are usually conducted in a nondescript room in front of a computer screen, very far from replicating the complexity of real-world experiences. Recently, immersive virtual reality (IVR) environments have become promising methods for immersing people in an almost-real environment that involves more senses. IVR environments provide many similarities to the complexity of the real world and at the same time allow experimenters to constrain experimental parameters to obtain empirical data. This can eventually lead to better treatment options and/or new mechanistic hypotheses. The idea that increasing sensory modalities improves the realism of IVR environments has been empirically supported, but the senses used have not usually included olfaction. In this technology report, we present an odor delivery system applied to a state-of-the-art IVR technology. The platform provides a three-dimensional, immersive, and fully interactive visualization environment called “Brain and Behavioral Laboratory—Immersive System” (BBL-IS). The solution we propose can reliably deliver various complex scents during different virtual scenarios, at a precise time and place and without contamination of the environment. The main features of this platform are: (i) limited cross-contamination between odorant streams with fast odor delivery (< 500 ms), (ii) ease of use and control, and (iii) the possibility to synchronize the delivery of the odorant with pictures, videos or sounds. How this unique technology could be used to investigate typical research questions in olfaction (e.g., emotional elicitation, memory encoding or attentional capture by scents) will also be addressed. PMID:25101017
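    The synchronization feature is the part that is easiest to sketch in code. The snippet below is purely illustrative: the fixed transport latency and the trigger_valve placeholder (standing in for the real hardware command, such as a serial or DAQ write) are assumptions, not the BBL-IS interface. It simply opens an odorant channel early enough that delivery coincides with a visual event.

```python
# Hedged sketch of odor/visual synchronization; the latency value and trigger_valve
# placeholder are assumptions, not the BBL-IS hardware interface.
import threading
import time

DELIVERY_LATENCY_S = 0.4  # assumed transport delay from valve to nose

def trigger_valve(channel, duration_s):
    # Placeholder for the real hardware call.
    print(f"[{time.monotonic():.3f}] channel {channel} open for {duration_s}s")

def schedule_odor(channel, event_time_s, duration_s=1.0):
    """Open `channel` so that odor onset coincides with a visual event time."""
    delay = max(0.0, event_time_s - time.monotonic() - DELIVERY_LATENCY_S)
    threading.Timer(delay, trigger_valve, args=(channel, duration_s)).start()

# Example: a virtual scene event 2 s from now, paired with odorant channel 3.
schedule_odor(channel=3, event_time_s=time.monotonic() + 2.0)
time.sleep(2.5)  # keep the script alive long enough for the timer to fire
```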

  3. Advanced Maintenance Simulation by Means of Hand-Based Haptic Interfaces

    NASA Astrophysics Data System (ADS)

    Nappi, Michele; Paolino, Luca; Ricciardi, Stefano; Sebillo, Monica; Vitiello, Giuliana

    The aerospace industry has been involved in virtual simulation for design and testing since the birth of virtual reality. Today this industry is showing a growing interest in the development of haptic-based maintenance training applications, which represent the most advanced way to simulate maintenance and repair tasks within a virtual environment by means of a visual-haptic approach. The goal is to allow the trainee to experience the service procedures not only as a workflow reproduced at a visual level but also in terms of the kinaesthetic feedback involved in the manipulation of tools and components. This study, conducted in collaboration with aerospace industry specialists, is aimed at the development of an immersive system capable of immersing trainees in a virtual environment where mechanics and technicians can perform maintenance simulation or training tasks by directly manipulating 3D virtual models of aircraft parts while perceiving force feedback through the haptic interface. The proposed system is based on ViRstperson, a virtual reality engine under development at the Italian Center for Aerospace Research (CIRA) to support engineering and technical activities such as design-time maintenance procedure validation and maintenance training. This engine has been extended to support haptic-based interaction, enabling a more complete level of interaction, also in terms of impedance control, and thus fostering the development of haptic knowledge in the user. The user’s “sense of touch” within the immersive virtual environment is simulated through an Immersion CyberForce® hand-based force-feedback device. Preliminary testing of the proposed system seems encouraging.

  4. Virtual Reality: Emerging Applications and Future Directions

    ERIC Educational Resources Information Center

    Ludlow, Barbara L.

    2015-01-01

    Virtual reality is an emerging technology that has resulted in rapid expansion in the development of virtual immersive environments for use as educational simulations in schools, colleges and universities. This article presents an overview of virtual reality, describes a number of applications currently being used by special educators for…

  5. Experience with V-STORE: considerations on presence in virtual environments for effective neuropsychological rehabilitation of executive functions.

    PubMed

    Lo Priore, Corrado; Castelnuovo, Gianluca; Liccione, Diego; Liccione, Davide

    2003-06-01

    The paper discusses the use of immersive virtual reality systems for the cognitive rehabilitation of dysexecutive syndrome, usually caused by prefrontal brain injuries. With respect to classical paper-and-pencil (P&P) and flat-screen computer rehabilitative tools, IVR systems might prove capable of evoking a more intense and compelling sense of presence, thanks to the highly naturalistic subject-environment interaction allowed. Within a constructivist framework applied to holistic rehabilitation, we suggest that this difference might enhance the ecological validity of cognitive training, partly overcoming the implicit limits of a lab setting, which seem to affect non-immersive procedures especially when applied to dysexecutive symptoms. We tested presence in a pilot study of a new VR-based rehabilitation tool for executive functions, V-Store; it allows patients to explore a virtual environment where they solve six series of tasks, ordered by complexity and designed to stimulate executive functions, programming, categorical abstraction, short-term memory and attention. We compared the sense of presence experienced by unskilled normal subjects, randomly assigned to immersive or non-immersive (flat screen) sessions of V-Store, through four different indexes: a self-report questionnaire, psychophysiological measures (GSR, skin conductance), a neuropsychological measure (an incidental recall memory test related to auditory information coming from the "real" environment) and a count of breaks in presence (BIPs). Preliminary results show in the immersive group a significantly higher GSR response during tasks; neuropsychological data (fewer recalled elements from "reality") and fewer BIPs show a congruent but not yet significant advantage for the immersive condition; no differences were evident from the self-report questionnaire. A larger experimental group is currently under examination to evaluate the significance of these data, which also might prove interesting with respect to the question of objective versus subjective measures of presence.

  6. Synchronizing Self and Object Movement: How Child and Adult Cyclists Intercept Moving Gaps in a Virtual Environment

    ERIC Educational Resources Information Center

    Chihak, Benjamin J.; Plumert, Jodie M.; Ziemer, Christine J.; Babu, Sabarish; Grechkin, Timofey; Cremer, James F.; Kearney, Joseph K.

    2010-01-01

    Two experiments examined how 10- and 12-year-old children and adults intercept moving gaps while bicycling in an immersive virtual environment. Participants rode an actual bicycle along a virtual roadway. At 12 test intersections, participants attempted to pass through a gap between 2 moving, car-sized blocks without stopping. The blocks were…

  7. Academic Library Services in Virtual Worlds: An Examination of the Potential for Library Services in Immersive Environments

    ERIC Educational Resources Information Center

    Ryan, Jenna; Porter, Marjorie; Miller, Rebecca

    2010-01-01

    Current literature on libraries is abundant with articles about the uses and the potential of new interactive communication technology, including Web 2.0 tools. Recently, the advent and use of virtual worlds have received top billing in these works. Many library institutions are exploring these virtual environments; this exploration and the…

  8. Virtual Worlds; Real Learning: Design Principles for Engaging Immersive Environments

    NASA Technical Reports Server (NTRS)

    Wu (u. Sjarpm)

    2012-01-01

    The EMDT master's program at Full Sail University embarked on a small project to use a virtual environment to teach graduate students. The property used for this project has evolved over several iterations and has yielded some basic design principles and pedagogy for virtual spaces. As a result, students are emerging from the program with a better grasp of future possibilities.

  9. EXPLORING ENVIRONMENTAL DATA IN A HIGHLY IMMERSIVE VIRTUAL REALITY ENVIRONMENT

    EPA Science Inventory

    Geography inherently fills a 3D space and yet we struggle with displaying geography using, primarily, 2D display devices. Virtual environments offer a more realistically-dimensioned display space and this is being realized in the expanding area of research on 3D Geographic Infor...

  10. PC-Based Virtual Reality for CAD Model Viewing

    ERIC Educational Resources Information Center

    Seth, Abhishek; Smith, Shana S.-F.

    2004-01-01

    Virtual reality (VR), as an emerging visualization technology, has introduced an unprecedented communication method for collaborative design. VR refers to an immersive, interactive, multisensory, viewer-centered, 3D computer-generated environment and the combination of technologies required to build such an environment. This article introduces the…

  11. Use of Immersive Simulations to Enhance Graduate Student Learning: Implications for Educational Leadership Programs

    ERIC Educational Resources Information Center

    Voelkel, Robert H.; Johnson, Christie W.; Gilbert, Kristen A.

    2016-01-01

    The purpose of this article is to present how one university incorporates immersive simulations through platforms which employ avatars to enhance graduate student understanding and learning in educational leadership programs. While using simulations and immersive virtual environments continues to grow, the literature suggests limited evidence of…

  12. Research on evaluation techniques for immersive multimedia

    NASA Astrophysics Data System (ADS)

    Hashim, Aslinda M.; Romli, Fakaruddin Fahmi; Zainal Osman, Zosipha

    2013-03-01

    Nowadays, Immersive Multimedia is used in a tremendous range of areas, such as healthcare/surgery, military, architecture, art, entertainment, education, business, media, sport, rehabilitation/treatment and training. Moreover, because meeting the needs of end users, clients and customers for a diversity of features and purposes depends on the assembly of multiple elements that drive effective Immersive Multimedia system design, evaluation techniques are crucial for Immersive Multimedia environments. A brief general idea of the virtual environment (VE) context and the 'realism' concept that underlie Immersive Multimedia environments is then provided. This is followed by a concise summary of the elements of VE assessment technique that are applied in Immersive Multimedia system design, which outlines the classification space for evaluation techniques for Immersive Multimedia environments and gives an overview of the types of results reported. A particular focus is placed on the implications of these evaluation techniques in relation to the elements of VE assessment technique, which is the primary purpose of producing this research. The paper will then conclude with an extensive overview of the recommendations emanating from the research.

  13. Investigating Learners' Attitudes toward Virtual Reality Learning Environments: Based on a Constructivist Approach

    ERIC Educational Resources Information Center

    Huang, Hsiu-Mei; Rauch, Ulrich; Liaw, Shu-Sheng

    2010-01-01

    The use of animation and multimedia for learning is now further extended by the provision of entire Virtual Reality Learning Environments (VRLE). This highlights a shift in Web-based learning from a conventional multimedia to a more immersive, interactive, intuitive and exciting VR learning environment. VRLEs simulate the real world through the…

  14. Cognitive Presence and Effect of Immersion in Virtual Learning Environment

    ERIC Educational Resources Information Center

    Katernyak, Ihor; Loboda, Viktoriya

    2016-01-01

    This paper presents the approach to successful application of two knowledge management techniques--community of practice and eLearning, in order to create and manage a competence-developing virtual learning environment. It explains how "4A" model of involving practitioners in eLearning process (through attention, actualization,…

  15. Virtual Environments and Autism: A Developmental Psychopathological Approach

    ERIC Educational Resources Information Center

    Rajendran, G.

    2013-01-01

    Individuals with autism spectrum disorders supposedly have an affinity with information and communication technology (ICT), making it an ideally suited media for this population. Virtual environments (VEs)--both two-dimensional and immersive--represent a particular kind of ICT that might be of special benefit. Specifically, this paper discusses…

  16. Framing the magic

    NASA Astrophysics Data System (ADS)

    Tsoupikova, Daria

    2006-02-01

    This paper explores how the aesthetics of the virtual world affects, transforms, and enhances the immersive emotional experience of the user. What we see and what we do upon entering the virtual environment influences our feelings, mental state, physiological changes and sensibility. To create a unique virtual experience, an important component of the design is the beauty of the virtual world, based on the aesthetics of graphical objects such as textures, models, animation, and special effects. The aesthetic potency of the images that comprise the virtual environment can make the immersive experience much stronger and more compelling. The aesthetic qualities of the virtual world, as borne out through images and graphics, can influence the user's state of mind. Particular changes and effects on the user can be induced through the application of techniques derived from the research fields of psychology, anthropology, biology, color theory, education, art therapy, music, and art history. Many contemporary artists and developers derive much inspiration for their work from their experience with traditional arts such as painting, sculpture, design, architecture and music. This knowledge helps them create a higher quality of images and stereo graphics in the virtual world. The understanding of the close relation between the aesthetic quality of the virtual environment and the resulting human perception is the key to developing an impressive virtual experience.

  17. Manipulating the fidelity of lower extremity visual feedback to identify obstacle negotiation strategies in immersive virtual reality.

    PubMed

    Kim, Aram; Zhou, Zixuan; Kretch, Kari S; Finley, James M

    2017-07-01

    The ability to successfully navigate obstacles in our environment requires integration of visual information about the environment with estimates of our body's state. Previous studies have used partial occlusion of the visual field to explore how information about the body and impending obstacles is integrated to mediate a successful clearance strategy. However, because these manipulations often remove information about both the body and the obstacle, it remains to be seen how information about the lower extremities alone is utilized during obstacle crossing. Here, we used an immersive virtual reality (VR) interface to explore how visual feedback of the lower extremities influences obstacle crossing performance. Participants wore a head-mounted display while walking on a treadmill and were instructed to step over obstacles in a virtual corridor in four different feedback trials. The trials involved: (1) no visual feedback of the lower extremities, (2) an endpoint-only model, (3) a link-segment model, and (4) a volumetric multi-segment model. We found that, compared to no model, the volumetric model improved success rate and led participants to place their trailing foot before crossing and their leading foot after crossing more consistently, and to place their leading foot closer to the obstacle after crossing. This knowledge is critical for the design of obstacle negotiation tasks in immersive virtual environments, as it may provide information about the fidelity necessary to reproduce ecologically valid practice environments.

  18. Simulation Exploration through Immersive Parallel Planes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Brunhart-Lupo, Nicholas J; Bush, Brian W; Gruchalla, Kenny M

    We present a visualization-driven simulation system that tightly couples systems dynamics simulations with an immersive virtual environment to allow analysts to rapidly develop and test hypotheses in a high-dimensional parameter space. To accomplish this, we generalize the two-dimensional parallel-coordinates statistical graphic as an immersive 'parallel-planes' visualization for multivariate time series emitted by simulations running in parallel with the visualization. In contrast to traditional parallel coordinates, which map the multivariate dimensions onto coordinate axes represented by a series of parallel lines, we map pairs of the multivariate dimensions onto a series of parallel rectangles. As in the case of parallel coordinates, each individual observation in the dataset is mapped to a polyline whose vertices coincide with its coordinate values. Regions of the rectangles can be 'brushed' to highlight and select observations of interest; a 'slider' control allows the user to filter the observations by their time coordinate. In an immersive virtual environment, users interact with the parallel planes using a joystick that can select regions on the planes, manipulate the selection, and filter time. The brushing and selection actions are used both to explore existing data and to launch additional simulations corresponding to the visually selected portions of the input parameter space. As soon as the new simulations complete, their resulting observations are displayed in the virtual environment. This tight feedback loop between simulation and immersive analytics accelerates users' realization of insights about the simulation and its output.
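    The core mapping described above can be captured in a few lines. The sketch below is not the authors' implementation; the pairing convention, plane spacing, and function names are assumptions for illustration. It maps an observation with an even number of dimensions onto vertices on a stack of parallel planes and shows a simple 'brush' selection on one plane.

```python
# Sketch of the parallel-planes mapping: pairs of dimensions become (x, y) positions
# on parallel planes, and each observation becomes a polyline through those positions.
import numpy as np

def observation_to_polyline(values, plane_spacing=1.0):
    """Map a 1D observation of 2k values onto k parallel planes stacked along z."""
    values = np.asarray(values, dtype=float)
    assert values.size % 2 == 0, "pair up dimensions: an even number of values is needed"
    pairs = values.reshape(-1, 2)              # (dim_2i, dim_2i+1) drawn on plane i
    z = np.arange(len(pairs)) * plane_spacing  # depth of each plane
    return np.column_stack([pairs, z])         # polyline vertices (x, y, z)

def brush(polylines, plane_index, x_range, y_range):
    """Select observations whose vertex on one plane falls inside a 2D region."""
    selected = []
    for i, poly in enumerate(polylines):
        x, y, _ = poly[plane_index]
        if x_range[0] <= x <= x_range[1] and y_range[0] <= y <= y_range[1]:
            selected.append(i)
    return selected

data = np.random.rand(100, 6)                  # 100 observations, 3 planes
polys = [observation_to_polyline(row) for row in data]
print(brush(polys, plane_index=1, x_range=(0.4, 0.6), y_range=(0.0, 0.5)))
```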

  19. Simulation Exploration through Immersive Parallel Planes: Preprint

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Brunhart-Lupo, Nicholas; Bush, Brian W.; Gruchalla, Kenny

    We present a visualization-driven simulation system that tightly couples systems dynamics simulations with an immersive virtual environment to allow analysts to rapidly develop and test hypotheses in a high-dimensional parameter space. To accomplish this, we generalize the two-dimensional parallel-coordinates statistical graphic as an immersive 'parallel-planes' visualization for multivariate time series emitted by simulations running in parallel with the visualization. In contrast to traditional parallel coordinates, which map the multivariate dimensions onto coordinate axes represented by a series of parallel lines, we map pairs of the multivariate dimensions onto a series of parallel rectangles. As in the case of parallel coordinates, each individual observation in the dataset is mapped to a polyline whose vertices coincide with its coordinate values. Regions of the rectangles can be 'brushed' to highlight and select observations of interest; a 'slider' control allows the user to filter the observations by their time coordinate. In an immersive virtual environment, users interact with the parallel planes using a joystick that can select regions on the planes, manipulate the selection, and filter time. The brushing and selection actions are used both to explore existing data and to launch additional simulations corresponding to the visually selected portions of the input parameter space. As soon as the new simulations complete, their resulting observations are displayed in the virtual environment. This tight feedback loop between simulation and immersive analytics accelerates users' realization of insights about the simulation and its output.

  20. The effect on lower spine muscle activation of walking on a narrow beam in virtual reality.

    PubMed

    Antley, Angus; Slater, Mel

    2011-02-01

    To what extent do people behave in immersive virtual environments as they would in similar situations in a physical environment? There are many ways to address this question, ranging from questionnaires to behavioral studies and the use of physiological measures. Here, we compare the onsets of muscle activity using surface electromyography (EMG) while participants were walking under three different conditions: on a normal floor surface, on a narrow ribbon along the floor, and on a narrow platform raised off the floor. The same situation was rendered in an immersive virtual environment (IVE) Cave-like system, and 12 participants did the three types of walking in a counter-balanced within-groups design. The mean number of EMG activity onsets per unit time followed the same pattern in the virtual environment as in the physical environment: significantly higher for walking on the platform than for walking on the floor. Even though participants knew that they were in fact really walking at floor level in the virtual environment condition, the visual illusion of walking on a raised platform was sufficient to influence their behavior in a measurable way. This opens the door for this technique to be used in gait- and posture-related scenarios, including rehabilitation.
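    The dependent measure, onsets of EMG activity per unit time, can be illustrated with a simple analysis sketch. The filtering and threshold choices below are assumptions, not the authors' exact pipeline; the idea is just to count upward threshold crossings of the rectified, smoothed signal and normalize by recording duration.

```python
# Illustrative EMG onset-rate computation; smoothing window and threshold are assumed.
import numpy as np

def emg_onsets_per_second(emg, fs, threshold_sd=2.0, smooth_ms=50):
    """Return the onset rate for one EMG channel sampled at `fs` Hz."""
    rectified = np.abs(emg - np.mean(emg))
    win = max(1, int(fs * smooth_ms / 1000))
    envelope = np.convolve(rectified, np.ones(win) / win, mode="same")
    threshold = np.mean(envelope) + threshold_sd * np.std(envelope)
    above = envelope > threshold
    onsets = np.count_nonzero(above[1:] & ~above[:-1])  # rising edges only
    return onsets / (len(emg) / fs)

fs = 1000
t = np.arange(0, 10, 1 / fs)
signal = 0.05 * np.random.randn(t.size)
signal[2000:2300] += 0.5 * np.sin(2 * np.pi * 80 * t[2000:2300])  # one activity burst
print(round(emg_onsets_per_second(signal, fs), 2))
```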

  1. Immersive virtual reality platform for medical training: a "killer-application".

    PubMed

    2000-01-01

    The Medical Readiness Trainer (MRT) integrates fully immersive Virtual Reality (VR), highly advanced medical simulation technologies, and medical data to enable unprecedented medical education and training. The flexibility offered by the MRT environment makes it a practical teaching tool today, and in the near future it will serve as an ideal vehicle for facilitating the transition to the next level of medical practice, i.e., telepresence and next-generation Internet-based collaborative learning.

  2. The Selimiye Mosque of Edirne, Turkey - AN Immersive and Interactive Virtual Reality Experience Using Htc Vive

    NASA Astrophysics Data System (ADS)

    Kersten, T. P.; Büyüksalih, G.; Tschirschwitz, F.; Kan, T.; Deggim, S.; Kaya, Y.; Baskaraca, A. P.

    2017-05-01

    Recent advances in contemporary Virtual Reality (VR) technologies are going to have a significant impact on everyday life. Through VR it is possible to virtually explore a computer-generated environment as a different reality, and to immerse oneself in the past or in a virtual museum without leaving the current real-life situation. For the ultimate VR experience, the user should see only the virtual world. Currently, the user must wear a VR headset which fits around the head and over the eyes to visually separate themselves from the physical world. Via the headset, images are fed to the eyes through two small lenses. Cultural heritage monuments are ideally suited both for thorough multi-dimensional geometric documentation and for realistic interactive visualisation in immersive VR applications. Additionally, the game industry offers tools for interactive visualisation of objects to motivate users to virtually visit objects and places. In this paper, the generation of a virtual 3D model of the Selimiye mosque in the city of Edirne, Turkey, and its processing for data integration into the game engine Unity are presented. The project has been carried out as a co-operation between BİMTAŞ, a company of the Greater Municipality of Istanbul, Turkey, and the Photogrammetry & Laser Scanning Lab of the HafenCity University Hamburg, Germany, to demonstrate an immersive and interactive visualisation using the new VR system HTC Vive. The workflow from data acquisition to VR visualisation, including the necessary programming for navigation, is described. Furthermore, the possible use (including simultaneous multi-user environments) of such a VR visualisation for a CH monument is discussed in this contribution.

  3. Female artists and the VR crucible: expanding the aesthetic vocabulary

    NASA Astrophysics Data System (ADS)

    Morie, Jacquelyn Ford

    2012-03-01

    Virtual Reality was a technological wonder in its early days, and it was widely held to be a domain where men were the main practitioners. However, a survey done in 2007 of VR Artworks (Immersive Virtual Environments or VEs) showed that women have actually created the majority of artistic immersive works. This argues against the popular idea that the field has been totally dominated by men. While men have made great contributions in advancing the field, especially technologically, it appears most artistic works emerge from a decidedly feminine approach. Such an approach seems well suited to immersive environments as it incorporates aspects of inclusion, wholeness, and a blending of the body and the spirit. Female attention to holistic concerns fits the gestalt approach needed to create in a fully functional yet open-ended virtual world, which focuses not so much on producing a finished object (like a text or a sculpture) but rather on creating a possibility for becoming, like bringing a child into the world. Immersive VEs are not objective works of art to be hung on a wall and critiqued. They are vehicles for experience, vessels to live within for a piece of time.

  4. A Succinct Overview of Virtual Reality Technology Use in Alzheimer’s Disease

    PubMed Central

    García-Betances, Rebeca I.; Arredondo Waldmeyer, María Teresa; Fico, Giuseppe; Cabrera-Umpiérrez, María Fernanda

    2015-01-01

    We provide a brief review and appraisal of recent and current virtual reality (VR) technology for Alzheimer’s disease (AD) applications. We categorize them according to their intended purpose (e.g., diagnosis, patient cognitive training, caregivers’ education, etc.), focus feature (e.g., spatial impairment, memory deficit, etc.), methodology employed (e.g., tasks, games, etc.), immersion level, and passive or active interaction. Critical assessment indicates that most of them do not yet take full advantage of virtual environments with high levels of immersion and interaction. Many still rely on conventional 2D graphic displays to create non-immersive or semi-immersive VR scenarios. Important improvements are needed to make VR a better and more versatile assessment and training tool for AD. The use of the latest display technologies available, such as emerging head-mounted displays and 3D smart TV technologies, together with realistic multi-sensorial interaction devices, and neuro-physiological feedback capacity, are some of the most beneficial improvements this mini-review suggests. Additionally, it would be desirable that such VR applications for AD be easily and affordably transferable to in-home and nursing home environments. PMID:26029101

  5. Assessment in Immersive Virtual Environments: Cases for Learning, of Learning, and as Learning

    ERIC Educational Resources Information Center

    Code, Jillianne; Zap, Nick

    2017-01-01

    The key to education reform lies in exploring alternative forms of assessment. Alternative performance assessments provide a more valid measure than multiple-choice tests of students' conceptual understanding and higher-level skills such as problem solving and inquiry. Advances in game-based and virtual environment technologies are creating new…

  6. Visual Perspectives within Educational Computer Games: Effects on Presence and Flow within Virtual Immersive Learning Environments

    ERIC Educational Resources Information Center

    Scoresby, Jon; Shelton, Brett E.

    2011-01-01

    The mis-categorizing of cognitive states involved in learning within virtual environments has complicated instructional technology research. Further, most educational computer game research does not account for how learning activity is influenced by factors of game content and differences in viewing perspectives. This study is a qualitative…

  7. Making Web3D Less Scary: Toward Easy-to-Use Web3D e-Learning Content Development Tools for Educators

    ERIC Educational Resources Information Center

    de Byl, Penny

    2009-01-01

    Penny de Byl argues that one of the biggest challenges facing educators today is the integration of rich and immersive three-dimensional environments with existing teaching and learning materials. To empower educators with the ability to embrace emerging Web3D technologies, the Advanced Learning and Immersive Virtual Environment (ALIVE) research…

  8. Using Virtual Worlds to Identify Multidimensional Student Engagement in High School Foreign Language Learning Classrooms

    ERIC Educational Resources Information Center

    Jacob, Laura Beth

    2012-01-01

    Virtual world environments have evolved from object-oriented, text-based online games to complex three-dimensional immersive social spaces where the lines between reality and computer-generated begin to blur. Educators use virtual worlds to create engaging three-dimensional learning spaces for students, but the impact of virtual worlds in…

  9. Using Immersive Virtual Reality for Electrical Substation Training

    ERIC Educational Resources Information Center

    Tanaka, Eduardo H.; Paludo, Juliana A.; Cordeiro, Carlúcio S.; Domingues, Leonardo R.; Gadbem, Edgar V.; Euflausino, Adriana

    2015-01-01

    Usually, distribution electricians are called upon to solve technical problems found in electrical substations. In this project, we apply problem-based learning to a training program for electricians, with the help of a virtual reality environment that simulates a real substation. Using this virtual substation, users may safely practice maneuvers…

  10. Scenario-Based Spoken Interaction with Virtual Agents

    ERIC Educational Resources Information Center

    Morton, Hazel; Jack, Mervyn A.

    2005-01-01

    This paper describes a CALL approach which integrates software for speaker independent continuous speech recognition with embodied virtual agents and virtual worlds to create an immersive environment in which learners can converse in the target language in contextualised scenarios. The result is a self-access learning package: SPELL (Spoken…

  11. Learning through Place-Making: Virtual Environments and Future Literacies

    ERIC Educational Resources Information Center

    Berry, Maryanne Susan

    2010-01-01

    This study examines a project through which elementary school and high school students collaborated with university Architecture/New Media students in building models of virtual, immersive libraries. It presents the project in the context of multiple and cross-disciplinary fields currently investigating the use of virtual and immersive…

  12. The effect of degree of immersion upon learning performance in virtual reality simulations for medical education.

    PubMed

    Gutiérrez, Fátima; Pierce, Jennifer; Vergara, Víctor M; Coulter, Robert; Saland, Linda; Caudell, Thomas P; Goldsmith, Timothy E; Alverson, Dale C

    2007-01-01

    Simulations are being used in education and training to enhance understanding, improve performance, and assess competence. However, it is important to measure the performance of these simulations as learning and training tools. This study examined and compared knowledge acquisition using a knowledge structure design. The subjects were first-year medical students at The University of New Mexico School of Medicine. One group used a fully immersive virtual reality (VR) environment with a head-mounted display (HMD) and another group used a partially immersive (computer screen) VR environment. The study aims were to determine whether there were significant differences between the two groups as measured by changes in knowledge structure before and after the VR simulation experience. The results showed that both groups benefited from the VR simulation training, as measured by the significantly increased similarity to the expert knowledge network after the training experience. However, the immersive group showed a significantly higher gain than the partially immersive group. This study demonstrated a positive effect of VR simulation on learning as reflected by improvements in knowledge structure, with an enhanced effect of full immersion using an HMD versus a screen-based VR system.
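    The "similarity to the expert knowledge network" measure can be illustrated with a small sketch. The exact metric used in the study is not specified in the abstract; the code below shows one common way to score such similarity, as the overlap between the concept links of a learner network and an expert network, and is offered only as an illustration.

```python
# Illustrative link-overlap similarity between knowledge networks; the study's
# actual metric may differ. Links are undirected concept pairs.
def network_similarity(learner_links, expert_links):
    norm = lambda links: {frozenset(pair) for pair in links}
    a, b = norm(learner_links), norm(expert_links)
    return len(a & b) / len(a | b) if (a | b) else 1.0

expert = [("hypoxia", "cyanosis"), ("cyanosis", "oxygen therapy"), ("hypoxia", "tachycardia")]
before = [("hypoxia", "cyanosis")]
after = [("hypoxia", "cyanosis"), ("hypoxia", "tachycardia")]
print(network_similarity(before, expert), network_similarity(after, expert))  # 0.33 -> 0.67
```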

  13. The effectiveness of virtual reality distraction for pain reduction: a systematic review.

    PubMed

    Malloy, Kevin M; Milling, Leonard S

    2010-12-01

    Virtual reality technology enables people to become immersed in a computer-simulated, three-dimensional environment. This article provides a comprehensive review of controlled research on the effectiveness of virtual reality (VR) distraction for reducing pain. To be included in the review, studies were required to use a between-subjects or mixed model design in which VR distraction was compared with a control condition or an alternative intervention in relieving pain. An exhaustive search identified 11 studies satisfying these criteria. VR distraction was shown to be effective for reducing experimental pain, as well as the discomfort associated with burn injury care. Studies of needle-related pain provided less consistent findings. Use of more sophisticated virtual reality technology capable of fully immersing the individual in a virtual environment was associated with greater relief. Overall, controlled research suggests that VR distraction may be a useful tool for clinicians who work with a variety of pain problems. Copyright © 2010 Elsevier Ltd. All rights reserved.

  14. Exploring Design Requirements for Repurposing Dental Virtual Patients From the Web to Second Life: A Focus Group Study

    PubMed Central

    Antoniou, Panagiotis E; Athanasopoulou, Christina A; Dafli, Eleni

    2014-01-01

    Background: Since their inception, virtual patients have provided health care educators with a way to engage learners in an experience simulating the clinician’s environment without danger to learners and patients. This has led this learning modality to be accepted as an essential component of medical education. With the advent of the visually and audio-rich 3-dimensional multi-user virtual environment (MUVE), a new deployment platform has emerged for educational content. Immersive, highly interactive, multimedia-rich MUVEs that seamlessly foster collaboration provide a new hotbed for the deployment of medical education content. Objective: This work aims to assess the suitability of the Second Life MUVE as a virtual patient deployment platform for undergraduate dental education, and to explore the requirements and specifications needed to meaningfully repurpose Web-based virtual patients in MUVEs. Methods: Through the scripting capabilities and available art assets in Second Life, we repurposed an existing Web-based periodontology virtual patient into Second Life. Through a series of point-and-click interactions and multiple-choice queries, the user experienced a specific periodontology case and was asked to provide the optimal responses for each of the challenges of the case. A focus group of 9 undergraduate dentistry students experienced both the Web-based and the Second Life version of this virtual patient. The group convened 3 times and discussed relevant issues such as the group’s computer literacy, the assessment of Second Life as a virtual patient deployment platform, and a comparison of the Web-based and MUVE-deployed virtual patients. Results: A comparison between the Web-based and the Second Life virtual patient revealed the inherent advantages of the more experiential and immersive Second Life virtual environment. However, several challenges for the successful repurposing of virtual patients from the Web to the MUVE were identified: (1) increased case complexity to accommodate the user's gaming preconception in a MUVE, (2) the necessity to decrease textual narration and provide the pertinent information in a more immersive sensory way, and (3) the requirement to allow the user to actuate the solutions of problems instead of describing them through narration. Conclusions: For a successful systematic repurposing effort of virtual patients to MUVEs such as Second Life, the best practices of experiential and immersive game design should be organically incorporated in the repurposing workflow (automated or not). These findings are pivotal in an era in which open educational content is transferred to and shared among users, learners, and educators of various open repositories/environments. PMID:24927470

  15. Will Anything Useful Come Out of Virtual Reality? Examination of a Naval Application

    DTIC Science & Technology

    1993-05-01

    The term virtual reality can encompass varying meanings, but some generally accepted attributes of a virtual environment are that it is immersive... technology, but at present there are few practical applications which are utilizing the broad range of virtual reality technology. This paper will discuss an... Keywords: Operability, operator functions, Virtual reality, Man-machine interface, Decision aids/decision making, Decision support, ASW.

  16. Understanding Immersivity: Image Generation and Transformation Processes in 3D Immersive Environments

    PubMed Central

    Kozhevnikov, Maria; Dhond, Rupali P.

    2012-01-01

    Most research on three-dimensional (3D) visual-spatial processing has been conducted using traditional non-immersive 2D displays. Here we investigated how individuals generate and transform mental images within 3D immersive (3DI) virtual environments, in which the viewers perceive themselves as being surrounded by a 3D world. In Experiment 1, we compared participants’ performance on the Shepard and Metzler (1971) mental rotation (MR) task across the following three types of visual presentation environments: traditional 2D non-immersive (2DNI), 3D non-immersive (3DNI – anaglyphic glasses), and 3DI (head mounted display with position and head orientation tracking). In Experiment 2, we examined how the use of different backgrounds affected MR processes within the 3DI environment. In Experiment 3, we compared electroencephalogram data recorded while participants were mentally rotating visual-spatial images presented in 3DI vs. 2DNI environments. Overall, the findings of the three experiments suggest that visual-spatial processing is different in immersive and non-immersive environments, and that immersive environments may require different image encoding and transformation strategies than the two other non-immersive environments. Specifically, in a non-immersive environment, participants may utilize a scene-based frame of reference and allocentric encoding whereas immersive environments may encourage the use of a viewer-centered frame of reference and egocentric encoding. These findings also suggest that MR performed in laboratory conditions using a traditional 2D computer screen may not reflect spatial processing as it would occur in the real world. PMID:22908003

  17. Novel Web-based Education Platforms for Information Communication utilizing Gamification, Virtual and Immersive Reality

    NASA Astrophysics Data System (ADS)

    Demir, I.

    2015-12-01

    Recent developments in internet technologies make it possible to manage and visualize large data on the web. Novel visualization techniques and interactive user interfaces allow users to create realistic environments, and to interact with data to gain insight from simulations and environmental observations. This presentation showcases information communication interfaces, games, and virtual and immersive reality applications for supporting teaching and learning of concepts in atmospheric and hydrological sciences. The information communication platforms utilize the latest web technologies and allow users to access and visualize large-scale data on the web. The simulation system is a web-based 3D interactive learning environment for teaching hydrological and atmospheric processes and concepts. It provides a visually striking platform with realistic terrain, weather information, and water simulation, and offers an environment for students to learn about earth science processes and the effects of development and human activity on the terrain. Users can access the system in three visualization modes: virtual reality, augmented reality, and immersive reality using a heads-up display. The system provides various scenarios customized to fit the age and education level of various users.

  18. Stage Cylindrical Immersive Display

    NASA Technical Reports Server (NTRS)

    Abramyan, Lucy; Norris, Jeffrey S.; Powell, Mark W.; Mittman, David S.; Shams, Khawaja S.

    2011-01-01

    Panoramic images with a wide field of view intend to provide a better understanding of an environment by placing objects of the environment on one seamless image. However, understanding the sizes and relative positions of the objects in a panorama is not intuitive and prone to errors because the field of view is unnatural to human perception. Scientists are often faced with the difficult task of interpreting the sizes and relative positions of objects in an environment when viewing an image of the environment on computer monitors or prints. A panorama can display an object that appears to be to the right of the viewer when it is, in fact, behind the viewer. This misinterpretation can be very costly, especially when the environment is remote and/or only accessible by unmanned vehicles. A 270° cylindrical display has been developed that surrounds the viewer with carefully calibrated panoramic imagery that correctly engages their natural kinesthetic senses and provides a more accurate awareness of the environment. The cylindrical immersive display offers a more natural window to the environment than a standard cubic CAVE (Cave Automatic Virtual Environment), and the geometry allows multiple collocated users to simultaneously view data and share important decision-making tasks. A CAVE is an immersive virtual reality environment that allows one or more users to immerse themselves in a virtual environment. A common CAVE setup is a room-sized cube where the cube sides act as projection planes. By nature, all cubic CAVEs face a problem with edge matching at edges and corners of the display. Modern immersive displays have found ways to minimize seams by creating very tight edges, and rely on the user to ignore the seam. One significant deficiency of flat-walled CAVEs is that the sense of orientation and perspective within the scene is broken across adjacent walls. On any single wall, parallel lines properly converge at their vanishing point as they should, and the sense of perspective within the scene contained on only one wall has integrity. Unfortunately, parallel lines that lie on adjacent walls do not necessarily remain parallel. This results in inaccuracies in the scene that can distract the viewer and detract from the immersive experience of the CAVE.
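    The geometric advantage described above, that a cylindrical screen keeps bearings consistent all the way around the viewer, can be sketched briefly. The field of view, resolution, and function name below are assumptions for illustration, not the system described above: an object's horizontal screen position follows directly from its bearing relative to a viewer at the cylinder's center.

```python
# Geometry sketch for a 270-degree cylindrical display (assumed parameters):
# map an object's bearing relative to the viewer to a horizontal pixel column.
import math

FIELD_OF_VIEW_DEG = 270.0
SCREEN_WIDTH_PX = 8192          # assumed total panoramic resolution

def bearing_to_pixel(viewer_xy, object_xy, heading_deg=0.0):
    """Return the pixel column for an object, or None if it falls in the open 90-degree arc."""
    dx, dy = object_xy[0] - viewer_xy[0], object_xy[1] - viewer_xy[1]
    bearing = (math.degrees(math.atan2(dx, dy)) - heading_deg + 180.0) % 360.0 - 180.0
    if abs(bearing) > FIELD_OF_VIEW_DEG / 2:
        return None                              # object is behind the viewer
    return (bearing + FIELD_OF_VIEW_DEG / 2) / FIELD_OF_VIEW_DEG * SCREEN_WIDTH_PX

print(bearing_to_pixel((0, 0), (10, 10)))        # 45 degrees to the viewer's right
print(bearing_to_pixel((0, 0), (0, -10)))        # directly behind -> None
```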

  19. Ames Lab 101: C6: Virtual Engineering

    ScienceCinema

    McCorkle, Doug

    2018-01-01

    Ames Laboratory scientist Doug McCorkle explains the importance of virtual engineering and talks about the C6. The C6 is a three-dimensional, fully-immersive synthetic environment residing in the center atrium of Iowa State University's Howe Hall.

  20. Radiological tele-immersion for next generation networks.

    PubMed

    Ai, Z; Dech, F; Rasmussen, M; Silverstein, J C

    2000-01-01

    Since the acquisition of high-resolution three-dimensional patient images has become widespread, medical volumetric datasets (CT or MR) larger than 100 MB and encompassing more than 250 slices are common. It is important to make this patient-specific data quickly available and usable to many specialists at different geographical sites. Web-based systems have been developed to provide volume or surface rendering of medical data over networks with low fidelity, but these cannot adequately handle stereoscopic visualization or huge datasets. State-of-the-art virtual reality techniques and high-speed networks have made it possible to create an environment in which geographically distributed clinicians can immersively share these massive datasets in real time. An object-oriented method for instantaneously importing medical volumetric data into Tele-Immersive environments has been developed at the Virtual Reality in Medicine Laboratory (VRMedLab) at the University of Illinois at Chicago (UIC). This networked-VR setup is based on LIMBO, an application framework or template that provides the basic capabilities of Tele-Immersion. We have developed a modular general purpose Tele-Immersion program that automatically combines 3D medical data with the methods for handling the data. For this purpose a DICOM loader for IRIS Performer has been developed. The loader was designed for SGI machines as a shared object, which is executed at LIMBO's runtime. The loader loads not only the selected DICOM dataset, but also methods for rendering, handling, and interacting with the data, bringing networked, real-time, stereoscopic interaction with radiological data to reality. Collaborative, interactive methods currently implemented in the loader include cutting planes and windowing. The Tele-Immersive environment has been tested on the UIC campus over an ATM network. We tested the environment with three nodes: one ImmersaDesk at the VRMedLab, one CAVE at the Electronic Visualization Laboratory (EVL) on east campus, and a CT scan machine in UIC Hospital. CT data was pulled directly from the scan machine to the Tele-Immersion server in our laboratory, and then the data was synchronously distributed by our Onyx2 Rack server to all the VR setups. Rather than permitting medical volume visualization at a single VR device, the Tele-Immersive environment combines teleconferencing, tele-presence, and virtual reality to enable geographically distributed clinicians to intuitively interact with the same medical volumetric models, point, gesture, converse, and see each other. This environment will bring together clinicians at different geographic locations to participate in Tele-Immersive consultation and collaboration.
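    The volumetric loading step generalizes well beyond the IRIS Performer loader described above. The sketch below uses the pydicom library to read a CT series into a single 3D array ready for volume rendering; the directory path is hypothetical and this is a generic illustration, not the VRMedLab DICOM loader.

```python
# Generic sketch of loading a DICOM series into a 3D volume (not the Performer loader).
import glob

import numpy as np
import pydicom

def load_dicom_volume(directory):
    """Read every .dcm slice in `directory` and return a (slices, rows, cols) array."""
    slices = [pydicom.dcmread(path) for path in glob.glob(f"{directory}/*.dcm")]
    # Order slices along the scan axis using the patient-position tag.
    slices.sort(key=lambda ds: float(ds.ImagePositionPatient[2]))
    return np.stack([ds.pixel_array for ds in slices]).astype(np.int16)

if __name__ == "__main__":
    ct = load_dicom_volume("/path/to/ct_series")  # hypothetical directory
    print(ct.shape, ct.min(), ct.max())
```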

  1. SciEthics Interactive: Science and Ethics Learning in a Virtual Environment

    ERIC Educational Resources Information Center

    Nadolny, Larysa; Woolfrey, Joan; Pierlott, Matthew; Kahn, Seth

    2013-01-01

    Learning in immersive 3D environments allows students to collaborate, build, and interact with difficult course concepts. This case study examines the design and development of the TransGen Island within the SciEthics Interactive project, a National Science Foundation-funded, 3D virtual world emphasizing learning science content in the context of…

  2. Desktop Virtual Reality: A Powerful New Technology for Teaching and Research in Industrial Teacher Education

    ERIC Educational Resources Information Center

    Ausburn, Lynna J.; Ausburn, Floyd B.

    2004-01-01

    Virtual Reality has been defined in many different ways and now means different things in various contexts. VR can range from simple environments presented on a desktop computer to fully immersive multisensory environments experienced through complex headgear and bodysuits. In all of its manifestations, VR is basically a way of simulating or…

  3. Virtual Solar System Project: Learning through a Technology-Rich, Inquiry-Based, Participatory Learning Environment.

    ERIC Educational Resources Information Center

    Barab, Sasha A.; Hay, Kenneth E.; Squire, Kurt; Barnett, Michael; Schmidt, Rae; Karrigan, Kristen; Yamagata-Lynch, Lisa; Johnson, Christine

    2000-01-01

    Describes an introductory undergraduate astronomy course in which the large-lecture format was moved to one in which students were immersed in a technologically-rich, inquiry-based, participatory learning environment. Finds that virtual reality can be used effectively in regular undergraduate university courses as a tool through which students can…

  4. History Educators and the Challenge of Immersive Pasts: A Critical Review of Virtual Reality "Tools" and History Pedagogy

    ERIC Educational Resources Information Center

    Allison, John

    2008-01-01

    This paper will undertake a critical review of the impact of virtual reality tools on the teaching of history. Virtual reality is useful in several different ways. History educators, elementary and secondary school teachers and professors, can all profit from the digital environment. Challenges arise quickly however. Virtual reality technologies…

  5. The Pixelated Professor: Faculty in Immersive Virtual Worlds

    ERIC Educational Resources Information Center

    Blackmon, Stephanie

    2015-01-01

    Online environments, particularly virtual worlds, can sometimes complicate issues of self expression. For example, the faculty member who loves punk rock has an opportunity, through hairstyle and attire choices in the virtual world, to share that part of herself with students. However, deciding to share that part of the self can depend on a number…

  6. An Investigation into Cooperative Learning in a Virtual World Using Problem-Based Learning

    ERIC Educational Resources Information Center

    Parson, Vanessa; Bignell, Simon

    2017-01-01

    Three-dimensional multi-user virtual environments (MUVEs) have the potential to provide experiential learning qualitatively similar to that found in the real world. MUVEs offer a pedagogically-driven immersive learning opportunity for educationalists that is cost-effective and enjoyable. A family of digital virtual avatars was created within…

  7. Virtual community centre for power wheelchair training: Experience of children and clinicians.

    PubMed

    Torkia, Caryne; Ryan, Stephen E; Reid, Denise; Boissy, Patrick; Lemay, Martin; Routhier, François; Contardo, Resi; Woodhouse, Janet; Archambault, Phillipe S

    2017-11-02

    The objectives were to: 1) characterize the overall experience of using the McGill immersive wheelchair - community centre (miWe-CC) simulator; and 2) investigate the experience of presence (i.e., the sense of being in the virtual rather than in the real, physical environment) while driving a power wheelchair (PW) in the miWe-CC. A qualitative research design with structured interviews was used. Fifteen clinicians and 11 children were interviewed after driving a PW in the miWe-CC simulator. Data were analyzed using conventional and directed content analysis approaches. Overall, participants enjoyed using the simulator and experienced a sense of presence in the virtual space. They felt a sense of being in the virtual environment, involved and focused on driving the virtual PW rather than on the surroundings of the actual room where they were. Participants reported several similarities between the virtual community centre layout and activities of the miWe-CC and the day-to-day reality of paediatric PW users. The simulator replicated participants' expectations of real-life PW use and promises to have an effect on improving the driving skills of new PW users. Implications for rehabilitation: Among young users, the McGill immersive wheelchair (miWe) simulator provides an experience of presence within the virtual environment. This experience of presence is generated by a sense of being in the virtual scene, a sense of being involved, engaged, and focused on interacting within the virtual environment, and by the perception that the virtual environment is consistent with the real world. The miWe is a relevant and accessible approach, complementary to real-world power wheelchair training for young users.

  8. The Optokinetic Cervical Reflex (OKCR) in Pilots of High-Performance Aircraft.

    DTIC Science & Technology

    1997-04-01

    Coupled System virtual reality - the attempt to create a realistic, three-dimensional environment or synthetic immersive environment in which the user... factors interface between the pilot and the flight environment. The final section is a case study of head- and helmet-mounted displays (HMD) and the impact... themselves as actually moving (flying) through a virtual environment. However, in the studies of Held, et al. (1975) and Young, et al. (1975) the

  9. A Study on Immersion and Presence of a Portable Hand Haptic System for Immersive Virtual Reality

    PubMed Central

    Kim, Mingyu; Jeon, Changyu; Kim, Jinmo

    2017-01-01

    This paper proposes a portable hand haptic system using Leap Motion as a haptic interface that can be used in various virtual reality (VR) applications. The proposed hand haptic system was designed as an Arduino-based sensor architecture to enable a variety of tactile senses at low cost, and is also equipped with a portable wristband. As a haptic system designed for tactile feedback, the proposed system first identifies the left and right hands and then sends tactile senses (vibration and heat) to each fingertip (thumb and index finger). It is incorporated into a wearable band-type system, making its use easy and convenient. Next, hand motion is accurately captured using the sensor of the hand tracking system and is used for virtual object control, thus achieving interaction that enhances immersion. A VR application was designed with the purpose of testing the immersion and presence aspects of the proposed system. Lastly, technical and statistical tests were carried out to assess whether the proposed haptic system can provide a new immersive presence to users. According to the results of the presence questionnaire and the simulator sickness questionnaire, we confirmed that the proposed hand haptic system, in comparison to the existing interaction that uses only the hand tracking system, provided greater presence and a more immersive environment in the virtual reality. PMID:28513545
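
    The record above describes the system only at a high level; as a hedged illustration (not the authors' code), the sketch below shows how per-fingertip vibrotactile commands might be sent from a host PC to an Arduino-style wristband over a serial link using pyserial. The one-byte header plus one-byte level protocol, the finger IDs, and the port name are assumptions invented for the example.

    ```python
    # Hypothetical sketch: drive per-fingertip vibration from a host PC to an
    # Arduino-style wristband over a serial link. The command protocol and the
    # port name are assumptions, not the authors' design.
    import serial  # pyserial

    FINGER_IDS = {"thumb": 0, "index": 1}

    def send_vibration(port: serial.Serial, hand: str, finger: str, level: int) -> None:
        """Encode hand (left/right), finger, and vibration intensity (0-255)."""
        hand_bit = 1 if hand == "right" else 0
        header = 0x80 | (hand_bit << 4) | FINGER_IDS[finger]
        port.write(bytes([header, max(0, min(level, 255))]))

    if __name__ == "__main__":
        with serial.Serial("/dev/ttyUSB0", 115200, timeout=1) as wristband:
            # Example: a virtual contact detected on the right index fingertip.
            send_vibration(wristband, "right", "index", 180)
    ```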

  10. A Study on Immersion and Presence of a Portable Hand Haptic System for Immersive Virtual Reality.

    PubMed

    Kim, Mingyu; Jeon, Changyu; Kim, Jinmo

    2017-05-17

    This paper proposes a portable hand haptic system using Leap Motion as a haptic interface that can be used in various virtual reality (VR) applications. The proposed hand haptic system was designed as an Arduino-based sensor architecture to enable a variety of tactile senses at low cost, and is also equipped with a portable wristband. As a haptic system designed for tactile feedback, the proposed system first identifies the left and right hands and then sends tactile senses (vibration and heat) to each fingertip (thumb and index finger). It is incorporated into a wearable band-type system, making its use easy and convenient. Next, hand motion is accurately captured using the sensor of the hand tracking system and is used for virtual object control, thus achieving interaction that enhances immersion. A VR application was designed with the purpose of testing the immersion and presence aspects of the proposed system. Lastly, technical and statistical tests were carried out to assess whether the proposed haptic system can provide a new immersive presence to users. According to the results of the presence questionnaire and the simulator sickness questionnaire, we confirmed that the proposed hand haptic system, in comparison to the existing interaction that uses only the hand tracking system, provided greater presence and a more immersive environment in the virtual reality.

  11. Skill training in multimodal virtual environments.

    PubMed

    Gopher, Daniel

    2012-01-01

    Multimodal, immersive, virtual reality (VR) techniques open new perspectives for perceptual-motor skill trainers. They also introduce new risks and dangers. This paper describes the benefits and pitfalls of multimodal training and the cognitive building blocks of multimodal VR training simulators.

  12. [Virtual reality therapy in anxiety disorders].

    PubMed

    Mitrousia, V; Giotakos, O

    2016-01-01

    During the last decade a number of studies have been conducted to examine whether virtual reality exposure therapy can be an alternative form of treatment for mental disorders, particularly anxiety disorders. Imaginal exposure therapy, one of the components of Cognitive Behavioral Therapy, cannot be easily applied to all patients, and in such cases virtual reality can be used as an alternative or supportive psychotherapeutic technique. Most studies using virtual reality have focused on anxiety disorders, mainly specific phobias, but some extend to other conditions such as eating disorders, drug dependence, pain control, palliative care and rehabilitation. The main characteristics of virtual reality therapy are "interaction", "immersion", and "presence". High levels of "immersion" and "presence" are associated with increased response to exposure therapy in virtual environments, as well as better therapeutic outcomes and sustained therapeutic gains. Typical devices used to achieve the patient's immersion are Head-Mounted Displays (HMDs), which are for individual use, and the Cave Automatic Virtual Environment (CAVE), which is multiuser. The disadvantages of virtual reality therapy lie in the specialized technology skills required, the cost of the devices, and their side effects. Therapists need training in order to operate the software and hardware and to adjust them to each case's needs. Device costs are high, but they decrease steadily as the technology improves. Immersion during virtual reality therapy can induce mild and temporary side effects such as nausea, dizziness or headache. To date, however, experience shows that virtual reality offers several advantages. The patient's avoidance of exposure to phobic stimuli is reduced, since the patient can be exposed to them as many times as needed and under the supervision of the therapist. The technique takes place in the therapist's office, which ensures confidentiality and privacy. The therapist is able to control unpredicted events that can occur during exposure in real environments and, above all, can control the intensity of exposure and adapt it to the patient's needs. Virtual reality can prove particularly useful in some specific clinical situations, for instance in patients with post-traumatic stress disorder (PTSD), who tend to avoid reminders of the traumatic event. Exposure in virtual reality addresses this problem by providing the patient with a large number of stimuli that activate the senses and elicit the necessary physiological and psychological anxiety reactions, regardless of the patient's willingness or ability to recall the traumatic event in imagination.

  13. Virtually numbed: immersive video gaming alters real-life experience.

    PubMed

    Weger, Ulrich W; Loughnan, Stephen

    2014-04-01

    As actors in a highly mechanized environment, we are citizens of a world populated not only by fellow humans, but also by virtual characters (avatars). Does immersive video gaming, during which the player takes on the mantle of an avatar, prompt people to adopt the coldness and rigidity associated with robotic behavior and desensitize them to real-life experience? In one study, we correlated participants' reported video-gaming behavior with their emotional rigidity (as indicated by the number of paperclips that they removed from ice-cold water). In a second experiment, we manipulated immersive and nonimmersive gaming behavior and then likewise measured the extent of the participants' emotional rigidity. Both studies yielded reliable impacts, and thus suggest that immersion into a robotic viewpoint desensitizes people to real-life experiences in oneself and others.

  14. An Analysis of VR Technology Used in Immersive Simulations with a Serious Game Perspective.

    PubMed

    Menin, Aline; Torchelsen, Rafael; Nedel, Luciana

    2018-03-01

    Using virtual environments (VEs) is a safer and more cost-effective alternative to executing dangerous tasks, such as training firefighters and industrial operators. Immersive virtual reality (VR) combined with game aspects has the potential to improve the user experience in the VE by increasing realism, engagement, and motivation. This article investigates the impact of VR technology on 46 immersive gamified simulations with serious purposes and classifies them into a taxonomy. Our findings suggest that immersive VR improves simulation outcomes, such as increasing learning gain and knowledge retention and improving clinical outcomes for rehabilitation. However, it also has limitations such as motion sickness and restricted access to VR hardware. Our contributions are to provide a better understanding of the benefits and limitations of using VR in immersive simulations with serious purposes, to propose a taxonomy that classifies them, and to discuss whether methods and participant profiles influence results.

  15. Using CLIPS to represent knowledge in a VR simulation

    NASA Technical Reports Server (NTRS)

    Engelberg, Mark L.

    1994-01-01

    Virtual reality (VR) is an exciting use of advanced hardware and software technologies to achieve an immersive simulation. Until recently, the majority of virtual environments were merely 'fly-throughs' in which a user could freely explore a 3-dimensional world or a visualized dataset. Now that the underlying technologies are reaching a level of maturity, programmers are seeking ways to increase the complexity and interactivity of immersive simulations. In most cases, interactivity in a virtual environment can be specified in the form 'whenever such-and-such happens to object X, it reacts in the following manner.' CLIPS and COOL provide a simple and elegant framework for representing this knowledge-base in an efficient manner that can be extended incrementally. The complexity of a detailed simulation becomes more manageable when the control flow is governed by CLIPS' rule-based inference engine as opposed to by traditional procedural mechanisms. Examples in this paper will illustrate an effective way to represent VR information in CLIPS, and to tie this knowledge base to the input and output C routines of a typical virtual environment.
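
    As a rough illustration of the rule pattern quoted above ("whenever such-and-such happens to object X, it reacts in the following manner"), the following Python sketch emulates the idea with a small event-rule dispatcher. It is not CLIPS/COOL syntax, and the object names and events are hypothetical.

    ```python
    # Illustrative Python analogue (not CLIPS/COOL syntax) of the rule pattern
    # "whenever such-and-such happens to object X, it reacts in the following
    # manner". The object names and events below are hypothetical.
    from collections import defaultdict

    class RuleEngine:
        def __init__(self):
            # (object_name, event) -> list of reaction callables
            self._rules = defaultdict(list)

        def add_rule(self, object_name, event, reaction):
            self._rules[(object_name, event)].append(reaction)

        def assert_event(self, object_name, event, **facts):
            # Fire every reaction registered for this object/event pair.
            for reaction in self._rules[(object_name, event)]:
                reaction(**facts)

    engine = RuleEngine()
    engine.add_rule("door", "touched", lambda **f: print("door swings open"))
    engine.add_rule("lamp", "grabbed", lambda **f: print("lamp toggles its light"))

    engine.assert_event("door", "touched")  # -> door swings open
    ```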

  16. Navigating Massively Multiplayer Online Games: Evaluating 21st Century Skills for Learning within Virtual Environments

    ERIC Educational Resources Information Center

    McCreery, Michael P.; Schrader, P. G.; Krach, S. Kathleen

    2011-01-01

    There is a substantial and growing interest in immersive virtual spaces as contexts for 21st century skills like problem solving, communication, and collaboration. However, little consideration has been given to the ways in which users become proficient in these environments or what types of target behaviors are associated with 21st century…

  17. A Fully Immersive Set-Up for Remote Interaction and Neurorehabilitation Based on Virtual Body Ownership

    PubMed Central

    Perez-Marcos, Daniel; Solazzi, Massimiliano; Steptoe, William; Oyekoya, Oyewole; Frisoli, Antonio; Weyrich, Tim; Steed, Anthony; Tecchia, Franco; Slater, Mel; Sanchez-Vives, Maria V.

    2012-01-01

    Although telerehabilitation systems represent one of the most technologically appealing clinical solutions for the immediate future, they still present limitations that prevent their standardization. Here we propose an integrated approach that includes three key and novel factors: (a) fully immersive virtual environments, including virtual body representation and ownership; (b) multimodal interaction with remote people and virtual objects including haptic interaction; and (c) a physical representation of the patient at the hospital through embodiment agents (e.g., as a physical robot). The importance of secure and rapid communication between the nodes is also stressed and an example implemented solution is described. Finally, we discuss the proposed approach with reference to the existing literature and systems. PMID:22787454

  18. Sculpting 3D worlds with music: advanced texturing techniques

    NASA Astrophysics Data System (ADS)

    Greuel, Christian; Bolas, Mark T.; Bolas, Niko; McDowall, Ian E.

    1996-04-01

    Sound within the virtual environment is often considered to be secondary to the graphics. In a typical scenario, either audio cues are locally associated with specific 3D objects or a general aural ambiance is supplied in order to alleviate the sterility of an artificial experience. This paper discusses a completely different approach, in which cues are extracted from live or recorded music in order to create geometry and control object behaviors within a computer-generated environment. Advanced texturing techniques used to generate complex stereoscopic images are also discussed. By analyzing music for standard audio characteristics such as rhythm and frequency, information is extracted and repackaged for processing. With the Soundsculpt Toolkit, this data is mapped onto individual objects within the virtual environment, along with one or more predetermined behaviors. Mapping decisions are implemented with a user-definable schedule and are based on the aesthetic requirements of directors and designers. This provides for visually active, immersive environments in which virtual objects behave in real-time correlation with the music. The resulting music-driven virtual reality opens up several possibilities for new types of artistic and entertainment experiences, such as fully immersive 3D 'music videos' and interactive landscapes for live performance.
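
    As a hedged sketch of the kind of audio-to-behavior mapping described here (not the Soundsculpt Toolkit itself), the Python fragment below estimates low-frequency band energy from an audio frame and maps it onto an object's scale parameter; the frame size, band edges, and mapping constants are illustrative assumptions.

    ```python
    # Hedged sketch of an audio-to-behavior mapping: estimate low-frequency
    # (rhythm) energy per frame and map it onto an object's scale parameter.
    # Frame size, band edges, and mapping constants are illustrative assumptions.
    import numpy as np

    def band_energy(frame: np.ndarray, sample_rate: int, lo: float, hi: float) -> float:
        spectrum = np.abs(np.fft.rfft(frame * np.hanning(len(frame))))
        freqs = np.fft.rfftfreq(len(frame), d=1.0 / sample_rate)
        band = (freqs >= lo) & (freqs < hi)
        return float(np.sum(spectrum[band] ** 2))

    def scale_from_music(frame: np.ndarray, sample_rate: int = 44100) -> float:
        bass = band_energy(frame, sample_rate, 20.0, 200.0)
        # Compress the dynamic range and cap the scale so objects stay visible.
        return 1.0 + min(2.0, 0.1 * np.log1p(bass))

    # Example: a silent 1024-sample frame leaves the object at unit scale.
    print(scale_from_music(np.zeros(1024)))  # 1.0
    ```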

  19. Altered Perspectives: Immersive Environments

    NASA Astrophysics Data System (ADS)

    Shipman, J. S.; Webley, P. W.

    2016-12-01

    Immersive environments provide an exciting experiential technology to visualize the natural world. Given the increasing accessibility of 360° cameras and virtual reality headsets, we are now able to visualize artistic principles and scientific concepts in a fully immersive environment. The technology has become popular for photographers as well as designers, industry, educational groups, and museums. Here we show a sci-art perspective on the use of optics and light in the capture and manipulation of 360° images and video of geologic phenomena and cultural heritage sites in Alaska, England, and France. Additionally, we will generate intentionally altered perspectives to lend a surrealistic quality to the landscapes. Locations include the Catacombs of Paris, the Palace of Versailles, and the Northern Lights over Fairbanks, Alaska. Some 360° view cameras now use small portable dual-lens technology extending beyond the 180° fisheye lens previously used, providing better coverage and image quality. Virtual reality headsets range in level of sophistication and cost, with the most affordable versions using smart phones and Google Cardboard viewers. The equipment used in this presentation includes a Ricoh Theta S spherical imaging camera. Here we will demonstrate the use of 360° imaging, with attendees able to become part of the immersive environment and experience our locations as if they were visiting themselves.

  20. Discovering new methods of data fusion, visualization, and analysis in 3D immersive environments for hyperspectral and laser altimetry data

    NASA Astrophysics Data System (ADS)

    Moore, C. A.; Gertman, V.; Olsoy, P.; Mitchell, J.; Glenn, N. F.; Joshi, A.; Norpchen, D.; Shrestha, R.; Pernice, M.; Spaete, L.; Grover, S.; Whiting, E.; Lee, R.

    2011-12-01

    Immersive virtual reality environments such as the IQ-Station or CAVE (Cave Automatic Virtual Environment) offer new and exciting ways to visualize and explore scientific data and are powerful research and educational tools. Combining remote sensing data from a range of sensor platforms in immersive 3D environments can enhance the spectral, textural, spatial, and temporal attributes of the data, which enables scientists to interact with and analyze the data in ways never before possible. Visualization and analysis of large remote sensing datasets in immersive environments requires software customization for integrating LiDAR point cloud data with hyperspectral raster imagery, the generation of quantitative tools for multidimensional analysis, and the development of methods to capture 3D visualizations for stereographic playback. This study uses hyperspectral and LiDAR data acquired over the China Hat geologic study area near Soda Springs, Idaho, USA. The data are fused into a 3D image cube for interactive data exploration and several methods of recording and playback are investigated that include: 1) creating and implementing a Virtual Reality User Interface (VRUI) patch configuration file to enable recording and playback of VRUI interactive sessions within the CAVE and 2) using the LiDAR and hyperspectral remote sensing data and GIS data to create an ArcScene 3D animated flyover, where left- and right-eye visuals are captured from two independent monitors for playback in a stereoscopic player. These visualizations can be used as outreach tools to demonstrate how integrated data and geotechnology techniques can help scientists see, explore, and more adequately comprehend scientific phenomena, both real and abstract.
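
    A minimal sketch of the fusion step described above, under the assumption that the hyperspectral imagery is available as a georeferenced NumPy cube with a simple north-up affine (an origin and a square cell size): each LiDAR point is attributed with the spectrum of the raster cell it falls in. This illustrates the general idea only, not the authors' processing chain.

    ```python
    # Minimal fusion sketch (assumptions: the hyperspectral cube is a NumPy array
    # with a north-up affine given by an origin and square cell size). Each LiDAR
    # point is attributed with the spectrum of the raster cell it falls in.
    import numpy as np

    def fuse_points_with_raster(points_xyz, cube, origin_xy, cell_size):
        """points_xyz: (N, 3) x/y/z; cube: (rows, cols, bands) raster."""
        cols = ((points_xyz[:, 0] - origin_xy[0]) / cell_size).astype(int)
        rows = ((origin_xy[1] - points_xyz[:, 1]) / cell_size).astype(int)
        inside = (rows >= 0) & (rows < cube.shape[0]) & \
                 (cols >= 0) & (cols < cube.shape[1])
        spectra = np.full((len(points_xyz), cube.shape[2]), np.nan)
        spectra[inside] = cube[rows[inside], cols[inside]]
        return np.hstack([points_xyz, spectra])  # (N, 3 + bands) fused records

    # Tiny synthetic example: two points over a 10 x 10 raster with five bands.
    fused = fuse_points_with_raster(
        np.array([[5.0, 5.0, 100.0], [50.0, 50.0, 101.0]]),
        np.random.rand(10, 10, 5), origin_xy=(0.0, 10.0), cell_size=1.0)
    print(fused.shape)  # (2, 8); the second point falls outside and gets NaNs
    ```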

  1. Using smartphone technology to deliver a virtual pedestrian environment: usability and validation.

    PubMed

    Schwebel, David C; Severson, Joan; He, Yefei

    2017-09-01

    Various programs effectively teach children to cross streets more safely, but all are labor- and cost-intensive. Recent developments in mobile phone technology offer the opportunity to deliver virtual reality pedestrian environments to mobile smartphone platforms. Such an environment may offer a cost- and labor-effective strategy to teach children to cross streets safely. This study evaluated the usability, feasibility, and validity of a smartphone-based virtual pedestrian environment. A total of 68 adults completed 12 virtual crossings within each of two virtual pedestrian environments, one delivered by smartphone and the other a semi-immersive kiosk virtual environment. Participants completed self-report measures of perceived realism and simulator sickness experienced in each virtual environment, plus self-reported demographic and personality characteristics. All participants followed system instructions and used the smartphone-based virtual environment without difficulty. No significant simulator sickness was reported or observed. Users rated the smartphone virtual environment as highly realistic. Convergent validity was detected, with many aspects of pedestrian behavior in the smartphone-based virtual environment matching behavior in the kiosk virtual environment. Anticipated correlations between personality and kiosk virtual reality pedestrian behavior emerged for the smartphone-based system. A smartphone-based virtual environment can be usable and valid. Future research should develop and evaluate such a training system.

  2. Virtual Heritage Tours: Developing Interactive Narrative-Based Environments for Historical Sites

    NASA Astrophysics Data System (ADS)

    Tuck, Deborah; Kuksa, Iryna

    In the last decade there has been a noticeable growth in the use of virtual reality (VR) technologies for reconstructing cultural heritage sites. However, many of these virtual reconstructions evidence little of the sites' social histories. Narrating the Past is a research project that aims to re-address this issue by investigating methods for embedding social histories within cultural heritage sites and by creating narrative-based virtual environments (VEs) within them. The project aims to enhance the visitor's knowledge and understanding by developing a navigable 3D story space in which participants are immersed. This has the potential to create a malleable virtual environment allowing visitors to configure their own narrative paths.

  3. Virtually Ostracized: Studying Ostracism in Immersive Virtual Environments

    PubMed Central

    Wesselmann, Eric D.; Law, Alvin Ty; Williams, Kipling D.

    2012-01-01

    Electronic-based communication (such as Immersive Virtual Environments; IVEs) may offer new ways of satisfying the need for social connection, but they also provide ways this need can be thwarted. Ostracism, being ignored and excluded, is a common social experience that threatens fundamental human needs (i.e., belonging, control, self-esteem, and meaningful existence). Previous ostracism research has made use of a variety of paradigms, including minimal electronic-based interactions (e.g., Cyberball) and communication (e.g., chatrooms and Short Message Services). These paradigms, however, lack the mundane realism that many IVEs now offer. Further, IVE paradigms designed to measure ostracism may allow researchers to test more nuanced hypotheses about the effects of ostracism. We created an IVE in which ostracism could be manipulated experimentally, emulating a previously validated minimal ostracism paradigm. We found that participants who were ostracized in this IVE experienced the same negative effects demonstrated in other ostracism paradigms, providing, to our knowledge, the first evidence of the negative effects of ostracism in virtual environments. Though further research directly exploring these effects in online virtual environments is needed, this research suggests that individuals encountering ostracism in other virtual environments (such as massively multiplayer online role playing games; MMORPGs) may experience negative effects similar to those of being ostracized in real life. This possibility may have serious implications for individuals who are marginalized in their real life and turn to IVEs to satisfy their need for social connection. PMID:22897472

  4. Virtually ostracized: studying ostracism in immersive virtual environments.

    PubMed

    Kassner, Matthew P; Wesselmann, Eric D; Law, Alvin Ty; Williams, Kipling D

    2012-08-01

    Electronic-based communication (such as Immersive Virtual Environments; IVEs) may offer new ways of satisfying the need for social connection, but they also provide ways this need can be thwarted. Ostracism, being ignored and excluded, is a common social experience that threatens fundamental human needs (i.e., belonging, control, self-esteem, and meaningful existence). Previous ostracism research has made use of a variety of paradigms, including minimal electronic-based interactions (e.g., Cyberball) and communication (e.g., chatrooms and Short Message Services). These paradigms, however, lack the mundane realism that many IVEs now offer. Further, IVE paradigms designed to measure ostracism may allow researchers to test more nuanced hypotheses about the effects of ostracism. We created an IVE in which ostracism could be manipulated experimentally, emulating a previously validated minimal ostracism paradigm. We found that participants who were ostracized in this IVE experienced the same negative effects demonstrated in other ostracism paradigms, providing, to our knowledge, the first evidence of the negative effects of ostracism in virtual environments. Though further research directly exploring these effects in online virtual environments is needed, this research suggests that individuals encountering ostracism in other virtual environments (such as massively multiplayer online role playing games; MMORPGs) may experience negative effects similar to those of being ostracized in real life. This possibility may have serious implications for individuals who are marginalized in their real life and turn to IVEs to satisfy their need for social connection.

  5. Headphone and Head-Mounted Visual Displays for Virtual Environments

    NASA Technical Reports Server (NTRS)

    Begault, Durand R.; Ellis, Stephen R.; Wenzel, Elizabeth M.; Trejo, Leonard J. (Technical Monitor)

    1998-01-01

    A realistic auditory environment can contribute to both the overall subjective sense of presence in a virtual display, and to a quantitative metric predicting human performance. Here, the role of audio in a virtual display and the importance of auditory-visual interaction are examined. Conjectures are proposed regarding the effectiveness of audio compared to visual information for creating a sensation of immersion, the frame of reference within a virtual display, and the compensation of visual fidelity by supplying auditory information. Future areas of research are outlined for improving simulations of virtual visual and acoustic spaces. This paper will describe some of the intersensory phenomena that arise during operator interaction within combined visual and auditory virtual environments. Conjectures regarding audio-visual interaction will be proposed.

  6. Contextual EFL Learning in a 3D Virtual Environment

    ERIC Educational Resources Information Center

    Lan, Yu-Ju

    2015-01-01

    The purposes of the current study are to develop virtually immersive EFL learning contexts for EFL learners in Taiwan to preview and review English materials beyond the regular English class schedule. A 2-iteration action research lasting for one semester was conducted to evaluate the effects of virtual contexts on learners' EFL learning. 132…

  7. Enhance Learning on Software Project Management through a Role-Play Game in a Virtual World

    ERIC Educational Resources Information Center

    Maratou, Vicky; Chatzidaki, Eleni; Xenos, Michalis

    2016-01-01

    This article presents a role-play game for software project management (SPM) in a three-dimensional online multiuser virtual world. The Opensimulator platform is used for the creation of an immersive virtual environment that facilitates students' collaboration and realistic interaction, in order to manage unexpected events occurring during the…

  8. Development, Implementation, and Assessment of General Chemistry Lab Experiments Performed in the Virtual World of Second Life

    ERIC Educational Resources Information Center

    Winkelmann, Kurt; Keeney-Kennicutt, Wendy; Fowler, Debra; Macik, Maria

    2017-01-01

    Virtual worlds are a potential medium for teaching college-level chemistry laboratory courses. To determine the feasibility of conducting chemistry experiments in such an environment, undergraduate students performed two experiments in the immersive virtual world of Second Life (SL) as part of their regular General Chemistry 2 laboratory course.…

  9. Teaching Physics to Deaf College Students in a 3-D Virtual Lab

    ERIC Educational Resources Information Center

    Robinson, Vicki

    2013-01-01

    Virtual worlds are used in many educational and business applications. At the National Technical Institute for the Deaf at Rochester Institute of Technology (NTID/RIT), deaf college students are introduced to the virtual world of Second Life, which is a 3-D immersive, interactive environment, accessed through computer software. NTID students use…

  10. L2 Immersion in 3D Virtual Worlds: The Next Thing to Being There?

    ERIC Educational Resources Information Center

    Paillat, Edith

    2014-01-01

    Second Life is one of the many three-dimensional virtual environments accessible through a computer and a fast broadband connection. Thousands of participants connect to this platform to interact virtually with the world, join international communities of practice and, for some, role play groups. Unlike online role play games however, Second Life…

  11. The Use of Virtual Reality for Creating Unusual Environmental Stimulation to Motivate Students to Explore Creative Ideas

    ERIC Educational Resources Information Center

    Lau, Kung Wong; Lee, Pui Yuen

    2015-01-01

    This paper discusses the roles of simulation in creativity education and how to apply immersive virtual environments to enhance students' learning experiences in university, through the provision of interactive simulations. An empirical study of a simulated virtual reality was carried out in order to investigate the effectiveness of providing…

  12. Cue-exposure software for the treatment of bulimia nervosa and binge eating disorder.

    PubMed

    Gutiérrez-Maldonado, José; Pla-Sanjuanelo, Joana; Ferrer-García, Marta

    2016-11-01

    Cue-exposure therapy (CET) has proven its efficacy in treating patients with bulimia nervosa and binge eating disorder who are resistant to standard treatment. Furthermore, incorporating virtual reality (VR) technology is increasingly considered a valid exposure method that may help to increase the efficacy of standard treatments in a variety of eating disorders. Although immersive displays improve the beneficial effects, expensive technology is not always necessary. We aimed to assess whether exposure to food related virtual environments could decrease food craving in a non-clinical sample. In addition, we specifically compared the effects of two VR systems (one non-immersive and one immersive) during CET. We therefore applied a one-session CET to 113 undergraduate students. Decreased food craving was found during exposure to both VR environments compared with pre-treatment levels, supporting the efficacy of VR-CET in reducing food craving. We found no significant differences in craving between immersive and non-immersive systems. Low-cost non-immersive systems applied through 3D laptops can improve the accessibility of this technique. By reducing the costs and improving the usability, VR-CET on 3D laptops may become a viable option that can be readily applied in a greater range of clinical contexts.

  13. A standardized set of 3-D objects for virtual reality research and applications.

    PubMed

    Peeters, David

    2018-06-01

    The use of immersive virtual reality as a research tool is rapidly increasing in numerous scientific disciplines. By combining ecological validity with strict experimental control, immersive virtual reality provides the potential to develop and test scientific theories in rich environments that closely resemble everyday settings. This article introduces the first standardized database of colored three-dimensional (3-D) objects that can be used in virtual reality and augmented reality research and applications. The 147 objects have been normed for name agreement, image agreement, familiarity, visual complexity, and corresponding lexical characteristics of the modal object names. The availability of standardized 3-D objects for virtual reality research is important, because reaching valid theoretical conclusions hinges critically on the use of well-controlled experimental stimuli. Sharing standardized 3-D objects across different virtual reality labs will allow for science to move forward more quickly.

  14. Virtual reality environments for post-stroke arm rehabilitation.

    PubMed

    Subramanian, Sandeep; Knaut, Luiz A; Beaudoin, Christian; McFadyen, Bradford J; Feldman, Anatol G; Levin, Mindy F

    2007-06-22

    Optimal practice and feedback elements are essential requirements for maximal motor recovery in patients with motor deficits due to central nervous system lesions. A virtual environment (VE) was created that incorporates practice and feedback elements necessary for maximal motor recovery. It permits varied and challenging practice in a motivating environment that provides salient feedback. The VE gives the user knowledge of results feedback about motor behavior and knowledge of performance feedback about the quality of pointing movements made in a virtual elevator. Movement distances are related to length of body segments. We describe an immersive and interactive experimental protocol developed in a virtual reality environment using the CAREN system. The VE can be used as a training environment for the upper limb in patients with motor impairments.

  15. Sounds of silence: How to animate virtual worlds with sound

    NASA Technical Reports Server (NTRS)

    Astheimer, Peter

    1993-01-01

    Sounds are an integral and sometimes annoying part of our daily life. Virtual worlds which imitate natural environments gain a lot of authenticity from fast, high quality visualization combined with sound effects. Sounds help to increase the degree of immersion for human dwellers in imaginary worlds significantly. The virtual reality toolkit of IGD (Institute for Computer Graphics) features a broad range of standard visual and advanced real-time audio components which interpret an object-oriented definition of the scene. The virtual reality system 'Virtual Design' realized with the toolkit enables the designer of virtual worlds to create a true audiovisual environment. Several examples on video demonstrate the usage of the audio features in Virtual Design.

  16. Immersive Learning Technologies: Realism and Online Authentic Learning

    ERIC Educational Resources Information Center

    Herrington, Jan; Reeves, Thomas C.; Oliver, Ron

    2007-01-01

    The development of immersive learning technologies in the form of virtual reality and advanced computer applications has meant that realistic creations of simulated environments are now possible. Such simulations have been used to great effect in training in the military, air force, and in medical training. But how realistic do problems need to be…

  17. iVFTs - immersive virtual field trips for interactive learning about Earth's environment.

    NASA Astrophysics Data System (ADS)

    Bruce, G.; Anbar, A. D.; Semken, S. C.; Summons, R. E.; Oliver, C.; Buxner, S.

    2014-12-01

    Innovations in immersive interactive technologies are changing the way students explore Earth and its environment. State-of-the-art hardware has given developers the tools needed to capture high-resolution spherical content, 360° panoramic video, giga-pixel imagery, and unique viewpoints via unmanned aerial vehicles as they explore remote and physically challenging regions of our planet. Advanced software enables integration of these data into seamless, dynamic, immersive, interactive, content-rich, and learner-driven virtual field explorations, experienced online via HTML5. These surpass conventional online exercises that use 2-D static imagery, enabling students to engage with virtual environments that feel more like games than lectures. Grounded in the active learning of exploration, inquiry, and application of knowledge as it is acquired, users interact non-linearly in conjunction with an intelligent tutoring system (ITS). The integration of this system allows the educational experience to be adapted to each individual student as they interact within the program. Such explorations, which we term "immersive virtual field trips" (iVFTs), are being integrated into cyber-learning, allowing science teachers to take students to scientifically significant but inaccessible environments. Our team and collaborators are producing a diverse suite of freely accessible iVFTs to teach key concepts in geology, astrobiology, ecology, and anthropology. Topics include Early Life, Biodiversity, Impact craters, Photosynthesis, Geologic Time, Stratigraphy, Tectonics, Volcanism, Surface Processes, The Rise of Oxygen, Origin of Water, Early Civilizations, Early Multicellular Organisms, and Bioarcheology. These diverse topics allow students to experience field sites all over the world, including Grand Canyon (USA), Flinders Ranges (Australia), Shark Bay (Australia), Rainforests (Panama), Teotihuacan (Mexico), Upheaval Dome (USA), Pilbara (Australia), Mid-Atlantic Ridge (Iceland), and Mauna Kea (Hawaii). iVFTs are being beta-tested and used at ASU in several large-enrollment courses to assess their usability and effectiveness in meeting specific learning objectives. We invite geoscience educators to partake of this resource and find new applications in their own teaching.

  18. Eye movements, visual search and scene memory, in an immersive virtual environment.

    PubMed

    Kit, Dmitry; Katz, Leor; Sullivan, Brian; Snyder, Kat; Ballard, Dana; Hayhoe, Mary

    2014-01-01

    Visual memory has been demonstrated to play a role in both visual search and attentional prioritization in natural scenes. However, it has been studied predominantly in experimental paradigms using multiple two-dimensional images. Natural experience, in contrast, entails prolonged immersion in a limited number of three-dimensional environments. The goal of the present experiment was to recreate circumstances comparable to natural visual experience in order to evaluate the role of scene memory in guiding eye movements in a natural environment. Subjects performed a continuous visual-search task within an immersive virtual-reality environment over three days. We found that, similar to two-dimensional contexts, viewers rapidly learn the location of objects in the environment over time, and use spatial memory to guide search. Incidental fixations did not provide obvious benefit to subsequent search, suggesting that semantic contextual cues may often be just as efficient, or that many incidentally fixated items are not held in memory in the absence of a specific task. On the third day of the experience in the environment, previous search items changed in color. These items were fixated upon with increased probability relative to control objects, suggesting that memory-guided prioritization (or Surprise) may be a robust mechanism for attracting gaze to novel features of natural environments, in addition to task factors and simple spatial saliency.

  19. Temporally coherent 4D video segmentation for teleconferencing

    NASA Astrophysics Data System (ADS)

    Ehmann, Jana; Guleryuz, Onur G.

    2013-09-01

    We develop an algorithm for 4-D (RGB+Depth) video segmentation targeting immersive teleconferencing applications on emerging mobile devices. Our algorithm extracts users from their environments and places them onto virtual backgrounds similar to green-screening. The virtual backgrounds increase immersion and interactivity, relieving the users of the system from distractions caused by disparate environments. Commodity depth sensors, while providing useful information for segmentation, result in noisy depth maps with a large number of missing depth values. By combining depth and RGB information, our work significantly improves the otherwise very coarse segmentation. Further imposing temporal coherence yields compositions where the foregrounds seamlessly blend with the virtual backgrounds with minimal flicker and other artifacts. We achieve said improvements by correcting the missing information in depth maps before fast RGB-based segmentation, which operates in conjunction with temporal coherence. Simulation results indicate the efficacy of the proposed system in video conferencing scenarios.
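
    A rough Python/NumPy sketch of the general approach the abstract outlines (fill missing depth values, take a coarse depth-based foreground mask, smooth it over time, and composite onto a virtual background); the fill strategy, depth thresholds, and blending weight are assumptions, not the authors' algorithm.

    ```python
    # Rough sketch, not the authors' algorithm: fill missing depth, take a coarse
    # depth-threshold foreground mask, smooth it over time, and composite the
    # user onto a virtual background. Thresholds and weights are assumptions.
    import numpy as np

    def fill_missing_depth(depth: np.ndarray, missing: float = 0.0) -> np.ndarray:
        valid = depth != missing
        filled = depth.copy()
        if valid.any():
            filled[~valid] = np.median(depth[valid])  # crude hole filling
        return filled

    def segment_frame(depth, prev_mask=None, near=0.5, far=1.5, alpha=0.7):
        d = fill_missing_depth(depth)
        mask = ((d >= near) & (d <= far)).astype(float)
        if prev_mask is not None:
            mask = alpha * mask + (1.0 - alpha) * prev_mask  # temporal coherence
        return mask

    def composite(rgb, mask, background):
        # Per-pixel alpha blend of the extracted foreground over a virtual scene.
        return mask[..., None] * rgb + (1.0 - mask[..., None]) * background

    depth = np.array([[1.0, 1.0, 3.0, 0.0],
                      [1.0, 1.0, 3.0, 0.0]])
    print(segment_frame(depth))  # missing value is filled and kept as foreground
    ```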

  20. A Conceptual Framework for Mediated Environments

    ERIC Educational Resources Information Center

    Childs, Mark

    2010-01-01

    Background: Immersive virtual worlds are one of a range of different platforms that can be grouped under the concept of mediated environments, i.e. environments that create a metaphorical space in which participants can position themselves and be embodied. Synthesising the literatures concerning the various mediated environment technologies…

  1. Learning immersion without getting wet

    NASA Astrophysics Data System (ADS)

    Aguilera, Julieta C.

    2012-03-01

    This paper describes the teaching of an immersive environments class in the spring of 2011. The class had students from undergraduate as well as graduate art-related majors. Their digital backgrounds and interests were also diverse. These variables were channeled into different approaches throughout the semester. Class components included fundamentals of stereoscopic computer graphics to explore spatial depth, 3D modeling and skeleton animation to explore presence, exposure to formats such as a stereo projection wall and dome environments to compare field of view across devices, and finally interaction and tracking to explore issues of embodiment. All these components were supported by theoretical readings discussed in class. Guest artists presented their work in virtual reality, dome environments and other immersive formats. Museum professionals also introduced students to space science visualizations, which utilize immersive formats. Here I present the assignments and their outcomes, together with insights into how the creation of immersive environments can be learned through constraints that expose students to situations of embodied cognition.

  2. What about the Firewall? Creating Virtual Worlds in a Public Primary School Using Sim-on-a-Stick

    ERIC Educational Resources Information Center

    Jacka, Lisa; Booth, Kate

    2012-01-01

    Virtual worlds are highly immersive, engaging and popular computer mediated environments being explored by children and adults. Why then aren't more teachers using virtual worlds in the classroom with primary and secondary school students? Reasons often cited are the learning required to master the technology, low-end graphics cards, poor…

  3. Possibilities and Determinants of Using Low-Cost Devices in Virtual Education Applications

    ERIC Educational Resources Information Center

    Bun, Pawel Kazimierz; Wichniarek, Radoslaw; Górski, Filip; Grajewski, Damian; Zawadzki, Przemyslaw; Hamrol, Adam

    2017-01-01

    Virtual reality (VR) may be used as an innovative educational tool. However, in order to fully exploit its potential, it is essential to achieve the effect of immersion. To more completely submerge the user in a virtual environment, it is necessary to ensure that the user's actions are directly translated into the image generated by the…

  4. Emerging Conceptual Understanding of Complex Astronomical Phenomena by Using a Virtual Solar System

    ERIC Educational Resources Information Center

    Gazit, Elhanan; Yair, Yoav; Chen, David

    2005-01-01

    This study describes high school students' conceptual development of the basic astronomical phenomena during real-time interactions with a Virtual Solar System (VSS). The VSS is a non-immersive virtual environment which has a dynamic frame of reference that can be altered by the user. Ten 10th grade students were given tasks containing a set of…

  5. Intelligent Tutors in Immersive Virtual Environments

    ERIC Educational Resources Information Center

    Yan, Peng; Slator, Brian M.; Vender, Bradley; Jin, Wei; Kariluoma, Matti; Borchert, Otto; Hokanson, Guy; Aggarwal, Vaibhav; Cosmano, Bob; Cox, Kathleen T.; Pilch, André; Marry, Andrew

    2013-01-01

    Research into virtual role-based learning has progressed over the past decade. Modern issues include gauging the difficulty of designing a goal system capable of meeting the requirements of students with different knowledge levels, and the reasonability and possibility of taking advantage of the well-designed formula and techniques served in other…

  6. Walking through doorways causes forgetting: Further explorations.

    PubMed

    Radvansky, Gabriel A; Krawietz, Sabine A; Tamplin, Andrea K

    2011-08-01

    Previous research using virtual environments has revealed a location-updating effect in which there is a decline in memory when people move from one location to another. Here we assess whether this effect reflects the influence of the experienced context, in terms of the degree of immersion of a person in an environment, as suggested by some work in spatial cognition, or a shift in context. In Experiment 1, the degree of immersion was reduced by using smaller displays. In comparison, in Experiment 2 an actual, rather than a virtual, environment was used to maximize immersion. Location-updating effects were observed under both of these conditions. In Experiment 3, the original encoding context was reinstated by having a person return to the original room in which objects were first encoded. However, inconsistent with an encoding-specificity account, memory did not improve when this context was reinstated. Finally, we performed a further analysis of the results of this and previous experiments to assess the differential influence of foregrounding and retrieval interference. Overall, these data are interpreted in terms of the event horizon model of event cognition and memory.

  7. Designers workbench: toward real-time immersive modeling

    NASA Astrophysics Data System (ADS)

    Kuester, Falko; Duchaineau, Mark A.; Hamann, Bernd; Joy, Kenneth I.; Ma, Kwan-Liu

    2000-05-01

    This paper introduces the Designers Workbench, a semi-immersive virtual environment for two-handed modeling, sculpting and analysis tasks. The paper outlines the fundamental tools, design metaphors and hardware components required for an intuitive real-time modeling system. As companies focus on streamlining productivity to cope with global competition, the migration to computer-aided design (CAD), computer-aided manufacturing, and computer-aided engineering systems has established a new backbone of modern industrial product development. However, traditionally a product design frequently originates from a clay model that, after digitization, forms the basis for the numerical description of CAD primitives. The Designers Workbench aims at closing this technology or 'digital' gap experienced by design and CAD engineers by transforming the classical design paradigm into its fully integrated digital and virtual analog, allowing collaborative development in a semi-immersive virtual environment. This project emphasizes two key components from the classical product design cycle: freeform modeling and analysis. In the freeform modeling stage, the emphasis is on content creation in the form of two-handed sculpting of arbitrary objects using polygonal, volumetric or mathematically defined primitives, whereas the analysis component provides the tools required for pre- and post-processing steps of finite element analysis tasks applied to the created models.

  8. Low cost heads-up virtual reality (HUVR) with optical tracking and haptic feedback

    NASA Astrophysics Data System (ADS)

    Margolis, Todd; DeFanti, Thomas A.; Dawe, Greg; Prudhomme, Andrew; Schulze, Jurgen P.; Cutchin, Steve

    2011-03-01

    Researchers at the University of California, San Diego, have created a new, relatively low-cost augmented reality system that enables users to touch the virtual environment they are immersed in. The Heads-Up Virtual Reality device (HUVR) couples a consumer 3D HD flat screen TV with a half-silvered mirror to project any graphic image onto the user's hands and into the space surrounding them. With his or her head position optically tracked to generate the correct perspective view, the user maneuvers a force-feedback (haptic) device to interact with the 3D image, literally 'touching' the object's angles and contours as if it were a tangible physical object. HUVR can be used for training and education in structural and mechanical engineering, archaeology and medicine, as well as other tasks that require hand-eye coordination. One unique characteristic of HUVR is that users can place their hands inside the virtual environment without occluding the 3D image. Built using open-source software and consumer-level hardware, HUVR offers users a tactile experience in an immersive environment that is functional, affordable and scalable.

  9. The Effects of Virtual Weather on Presence

    NASA Astrophysics Data System (ADS)

    Wissmath, Bartholomäus; Weibel, David; Mast, Fred W.

    In modern societies people tend to spend more time in front of computer screens than outdoors. Along with the increasing degree of realism of digital environments, simulated weather appears more and more realistic and is more often implemented in digital environments. Research has found that the actual weather influences behavior and mood. In this paper we experimentally examine the effects of virtual weather on the sense of presence. We found that individuals (N=30) immerse more deeply in digital environments displaying fair weather conditions than in environments displaying bad weather. We also investigate whether virtual weather can influence behavior. The possible implications of these findings for presence theory as well as for digital environment designers are discussed.

  10. Generating Contextual Descriptions of Virtual Reality (VR) Spaces

    NASA Astrophysics Data System (ADS)

    Olson, D. M.; Zaman, C. H.; Sutherland, A.

    2017-12-01

    Virtual reality holds great potential for science communication, education, and research. However, interfaces for manipulating data and environments in virtual worlds are limited and idiosyncratic. Furthermore, speech and vision are the primary modalities by which humans collect information about the world, but the linking of visual and natural language domains is a relatively new pursuit in computer vision. Machine learning techniques have been shown to be effective at image and speech classification, as well as at describing images with language (Karpathy 2016), but have not yet been used to describe potential actions. We propose a technique for creating a library of possible context-specific actions associated with 3D objects in immersive virtual worlds based on a novel dataset generated natively in virtual reality containing speech, image, gaze, and acceleration data. We will discuss the design and execution of a user study in virtual reality that enabled the collection and the development of this dataset. We will also discuss the development of a hybrid machine learning algorithm linking vision data with environmental affordances in natural language. Our findings demonstrate that it is possible to develop a model which can generate interpretable verbal descriptions of possible actions associated with recognized 3D objects within immersive VR environments. This suggests promising applications for more intuitive user interfaces through voice interaction within 3D environments. It also demonstrates the potential to apply vast bodies of embodied and semantic knowledge to enrich user interaction within VR environments. This technology would allow for applications such as expert knowledge annotation of 3D environments, complex verbal data querying and object manipulation in virtual spaces, and computer-generated, dynamic 3D object affordances and functionality during simulations.
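
    As a hypothetical illustration of the "library of possible context-specific actions" idea (the paper's actual model is a learned hybrid vision-language system), the sketch below maps recognized object labels to templated action descriptions and filters them by gaze dwell time; all labels, actions, and thresholds are invented for the example.

    ```python
    # Hypothetical illustration of an "affordance library": recognized 3D object
    # labels map to templated action phrases, filtered by gaze dwell time. The
    # labels, actions, and threshold are invented for the example; they are not
    # the authors' dataset or model.
    AFFORDANCES = {
        "valve": ["turn the {obj} clockwise", "inspect the {obj} for leaks"],
        "lever": ["pull the {obj} down", "lock the {obj} in place"],
        "sample_case": ["open the {obj}", "store a specimen in the {obj}"],
    }

    def describe_actions(detected_label: str, gaze_dwell_s: float) -> list[str]:
        """Return verbal action suggestions, expanded when the user dwells longer."""
        actions = [template.format(obj=detected_label.replace("_", " "))
                   for template in AFFORDANCES.get(detected_label, [])]
        return actions if gaze_dwell_s >= 0.5 else actions[:1]

    print(describe_actions("valve", gaze_dwell_s=0.8))
    # ['turn the valve clockwise', 'inspect the valve for leaks']
    ```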

  11. CAVE2: a hybrid reality environment for immersive simulation and information analysis

    NASA Astrophysics Data System (ADS)

    Febretti, Alessandro; Nishimoto, Arthur; Thigpen, Terrance; Talandis, Jonas; Long, Lance; Pirtle, J. D.; Peterka, Tom; Verlo, Alan; Brown, Maxine; Plepys, Dana; Sandin, Dan; Renambot, Luc; Johnson, Andrew; Leigh, Jason

    2013-03-01

    Hybrid Reality Environments represent a new kind of visualization space that blurs the line between virtual environments and high-resolution tiled display walls. This paper outlines the design and implementation of the CAVE2™ Hybrid Reality Environment. CAVE2 is the world's first near-seamless flat-panel-based, surround-screen immersive system. Unique to CAVE2 is that it enables users to simultaneously view both 2D and 3D information, providing more flexibility for mixed-media applications. CAVE2 is a cylindrical system 24 feet in diameter and 8 feet tall, and consists of 72 near-seamless, off-axis-optimized passive stereo LCD panels, creating an approximately 320 degree panoramic environment for displaying information at 37 Megapixels (in stereoscopic 3D) or 74 Megapixels in 2D, at a horizontal visual acuity of 20/20. Custom LCD panels with shifted polarizers were built so that the images in the top and bottom rows of LCDs are optimized for vertical off-center viewing, allowing viewers to come closer to the displays while minimizing ghosting. CAVE2 is designed to support multiple operating modes. In the Fully Immersive mode, the entire room can be dedicated to one virtual simulation. In 2D mode, the room can operate like a traditional tiled display wall, enabling users to work with large numbers of documents at the same time. In the Hybrid mode, a mixture of both 2D and 3D applications can be supported simultaneously. The ability to treat immersive work spaces in this hybrid way has never been achieved before, and it leverages the special abilities of CAVE2 to enable researchers to seamlessly interact with large collections of 2D and 3D data. To realize this hybrid ability, we merged the Scalable Adaptive Graphics Environment (SAGE), a system for supporting 2D tiled displays, with Omegalib, a virtual reality middleware supporting OpenGL, OpenSceneGraph and VTK applications.

  12. Hands-on Learning in the Virtual World

    ERIC Educational Resources Information Center

    Branson, John; Thomson, Diane

    2013-01-01

    The U.S. military has long understood the value of immersive simulations in education. Before the Navy entrusts a ship to a crew, crew members must first practice and demonstrate their competency in a fully immersive, simulated environment. Why not teach students in the same way? K-12 educators in Pennsylvania, USA, recently did just that when…

  13. Affordances and Limitations of Immersive Participatory Augmented Reality Simulations for Teaching and Learning

    ERIC Educational Resources Information Center

    Dunleavy, Matt; Dede, Chris; Mitchell, Rebecca

    2009-01-01

    The purpose of this study was to document how teachers and students describe and comprehend the ways in which participating in an augmented reality (AR) simulation aids or hinders teaching and learning. Like the multi-user virtual environment (MUVE) interface that underlies Internet games, AR is a good medium for immersive collaborative…

  14. Virtual Reality to Train Diagnostic Skills in Eating Disorders. Comparison of two Low Cost Systems.

    PubMed

    Gutiérrez-Maldonado, José; Ferrer-García, Marta; Pla-Sanjuanelo, Joana; Andrés-Pueyo, Antonio; Talarn-Caparrós, Antoni

    2015-01-01

    Enhancing the ability to perform differential diagnosis and psychopathological exploration is important for students who wish to work in the clinical field, as well as for professionals already working in this area. Virtual reality (VR) simulations can fully immerse students in educational experiences in a way that is not possible using other methods. Learning in a VR environment can also be more effective and motivating than usual classroom practices. Traditionally, immersion has been considered central to the quality of a VR system; immersive VR is considered a special and unique experience that cannot be achieved by three-dimensional (3D) interactions on desktop PCs. However, some authors have suggested that if the content design is emotionally engaging, immersive systems are not always necessary. The main purpose of this study is to compare the efficacy and usability of two low-cost VR systems, offering different levels of immersion, in order to develop the ability to conduct diagnostic interviews in eating disorders by means of simulated psychopathological explorations.

  15. Virtual reality enhanced mannequin (VREM) that is well received by resuscitation experts.

    PubMed

    Semeraro, Federico; Frisoli, Antonio; Bergamasco, Massimo; Cerchiari, Erga L

    2009-04-01

    The objective of this study was to test acceptance of, and interest in, a newly developed prototype of a virtual reality enhanced mannequin (VREM) on a sample of congress attendees who volunteered to participate in the evaluation session and to respond to a specifically designed questionnaire. A commercial Laerdal HeartSim 4000 mannequin was developed to integrate virtual reality (VR) technologies with specially developed virtual reality software to increase the immersive perception of emergency scenarios. To evaluate the acceptance of the virtual reality enhanced mannequin (VREM), we presented it to a sample of 39 possible users. Each evaluation session involved one trainee and two instructors with a standardized procedure and scenario: the operator was invited by the instructor to wear the data gloves and the head-mounted display and was briefly introduced to the scope of the simulation. The instructor helped the operator familiarize himself with the environment. After the patient's collapse, the operator was asked to check the patient's clinical condition and start CPR. Finally, the patient started to recover signs of circulation and the evaluation session was concluded. Each participant was then asked to respond to a questionnaire designed to explore the trainee's perception in the areas of user-friendliness, realism, and interaction/immersion. Overall, the evaluation of the system was very positive, as was the feeling of immersion and realism of the environment and simulation. Overall, 84.6% of the participants judged the virtual reality experience as interesting and believed that its development could be very useful for healthcare training. The prototype of the virtual reality enhanced mannequin was well liked, without interference from the interaction devices, and deserves full technological development and validation in emergency medical training.

  16. Generation IV Nuclear Energy Systems Construction Cost Reductions through the Use of Virtual Environments - Task 4 Report: Virtual Mockup Maintenance Task Evaluation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Timothy Shaw; Anthony Baratta; Vaughn Whisker

    2005-02-28

    Task 4 report of a 3-year DOE NERI-sponsored effort evaluating immersive virtual reality (CAVE) technology for design review, construction planning, and maintenance planning and training for next-generation nuclear power plants. The program covers development of full-scale virtual mockups generated from 3D CAD data and presented in a CAVE visualization facility. This report focuses on using full-scale virtual mockups for nuclear power plant training applications.

  17. Validation of virtual reality as a tool to understand and prevent child pedestrian injury.

    PubMed

    Schwebel, David C; Gaines, Joanna; Severson, Joan

    2008-07-01

    In recent years, virtual reality has emerged as an innovative tool for health-related education and training. Among the many benefits of virtual reality is the opportunity for novice users to engage unsupervised in a safe environment when the real environment might be dangerous. Virtual environments are only useful for health-related research, however, if behavior in the virtual world validly matches behavior in the real world. This study was designed to test the validity of an immersive, interactive virtual pedestrian environment. A sample of 102 children and 74 adults was recruited to complete simulated road-crossings in both the virtual environment and the identical real environment. In both the child and adult samples, construct validity was demonstrated via significant correlations between behavior in the virtual and real worlds. Results also indicate construct validity through developmental differences in behavior; convergent validity by showing correlations between parent-reported child temperament and behavior in the virtual world; internal reliability of various measures of pedestrian safety in the virtual world; and face validity, as measured by users' self-reported perception of realism in the virtual world. We discuss issues of generalizability to other virtual environments, and the implications for application of virtual reality to understanding and preventing pediatric pedestrian injuries.

  18. Modulation of cortical activity in 2D versus 3D virtual reality environments: an EEG study.

    PubMed

    Slobounov, Semyon M; Ray, William; Johnson, Brian; Slobounov, Elena; Newell, Karl M

    2015-03-01

    There is growing empirical evidence that virtual reality (VR) is valuable for education, training, entertainment and medical rehabilitation due to its capacity to represent real-life events and situations. However, the neural mechanisms underlying behavioral confounds in VR environments are still poorly understood. In two experiments, we examined the effect of fully immersive 3D stereoscopic presentations and less immersive 2D VR environments on brain functions and behavioral outcomes. In Experiment 1, we examined behavioral and neural underpinnings of spatial navigation tasks using electroencephalography (EEG). In Experiment 2, we examined EEG correlates of postural stability and balance. Our major findings showed that fully immersive 3D VR induced a higher subjective sense of presence along with an enhanced success rate of spatial navigation compared to 2D. In Experiment 1, frontal midline theta (FM-theta) EEG power was significantly higher during the encoding phase of route presentation in the 3D VR. In Experiment 2, the 3D VR resulted in greater postural instability and modulation of EEG patterns as a function of 3D versus 2D environments. The findings support the inference that the fully immersive 3D enriched environment requires allocation of more brain and sensory resources for cognitive/motor control during both tasks than 2D presentations. This is further evidence that 3D VR tasks using EEG may be a promising approach for performance enhancement and potential applications in clinical/rehabilitation settings. Copyright © 2014 Elsevier B.V. All rights reserved.

  19. Linking Immersive Virtual Field Trips with an Adaptive Learning Platform

    NASA Astrophysics Data System (ADS)

    Bruce, G.; Taylor, W.; Anbar, A. D.; Semken, S. C.; Buxner, S.; Mead, C.; El-Moujaber, E.; Summons, R. E.; Oliver, C.

    2016-12-01

    The use of virtual environments in science education has been constrained by the difficulty of guiding a learner's actions within those environments. In this work, we demonstrate how advances in education software technology allow educators to create interactive learning experiences that respond and adapt intelligently to learner input within the virtual environment. This innovative technology provides a far greater capacity for delivering authentic inquiry-driven educational experiences in unique settings from around the world. Our immersive virtual field trips (iVFT) bring students virtually to geologically significant but inaccessible environments, where they learn through authentic practices of scientific inquiry. In one recent example, students explore the fossil beds in Nilpena, South Australia, to learn about the Ediacaran fauna. Students interactively engage in 360° recreations of the environment, uncover the nature of the historical ecosystem by identifying fossils with a dichotomous key, explore actual fossil beds in high resolution imagery, and reconstruct what an ecosystem might have looked like millions of years ago in an interactive simulation. With the new capacity to connect actions within the iVFT to an intelligent tutoring system, these learning experiences can be tracked, guided, and tailored individually to the immediate actions of the student. This new capacity also has great potential for learning designers to take a data-driven approach to lesson improvement and for education researchers to study learning in virtual environments. Thus, we expect iVFT will be fertile ground for novel research. Such iVFT are currently in use in several introductory classes offered online at Arizona State University in anthropology, introductory biology, and astrobiology, reaching thousands of students to date. Drawing from these experiences, we are designing a curriculum for historical geology that will be built around iVFT-based exploration of Earth history.

  20. Vroom: designing an augmented environment for remote collaboration in digital cinema production

    NASA Astrophysics Data System (ADS)

    Margolis, Todd; Cornish, Tracy

    2013-03-01

    As media technologies become increasingly affordable, compact and inherently networked, new generations of telecollaborative platforms continue to arise which integrate these new affordances. Virtual reality has been primarily concerned with creating simulations of environments that can transport participants to real or imagined spaces that replace the "real world". Meanwhile Augmented Reality systems have evolved to interleave objects from Virtual Reality environments into the physical landscape. Perhaps now there is a new class of systems that reverse this precept to enhance dynamic media landscapes and immersive physical display environments to enable intuitive data exploration through collaboration. Vroom (Virtual Room) is a next-generation reconfigurable tiled display environment in development at the California Institute for Telecommunications and Information Technology (Calit2) at the University of California, San Diego. Vroom enables freely scalable digital collaboratories, connecting distributed, high-resolution visualization resources for collaborative work in the sciences, engineering and the arts. Vroom transforms a physical space into an immersive media environment with large format interactive display surfaces, video teleconferencing and spatialized audio built on a high-speed optical network backbone. Vroom enables group collaboration for local and remote participants to share knowledge and experiences. Possible applications include: remote learning, command and control, storyboarding, post-production editorial review, high resolution video playback, 3D visualization, screencasting and image, video and multimedia file sharing. To support these various scenarios, Vroom features support for multiple user interfaces (optical tracking, touch UI, gesture interface, etc.), support for directional and spatialized audio, giga-pixel image interactivity, 4K video streaming, 3D visualization and telematic production. This paper explains the design process that has been utilized to make Vroom an accessible and intuitive immersive environment for remote collaboration specifically for digital cinema production.

  1. Research on Intelligent Synthesis Environments

    NASA Technical Reports Server (NTRS)

    Noor, Ahmed K.; Lobeck, William E.

    2002-01-01

    Four research activities related to Intelligent Synthesis Environment (ISE) have been performed under this grant. The four activities are: 1) non-deterministic approaches that incorporate technologies such as intelligent software agents, visual simulations and other ISE technologies; 2) virtual labs that leverage modeling, simulation and information technologies to create an immersive, highly interactive virtual environment tailored to the needs of researchers and learners; 3) advanced learning modules that incorporate advanced instructional, user interface and intelligent agent technologies; and 4) assessment and continuous improvement of engineering team effectiveness in distributed collaborative environments.

  2. Research on Intelligent Synthesis Environments

    NASA Astrophysics Data System (ADS)

    Noor, Ahmed K.; Loftin, R. Bowen

    2002-12-01

    Four research activities related to Intelligent Synthesis Environment (ISE) have been performed under this grant. The four activities are: 1) non-deterministic approaches that incorporate technologies such as intelligent software agents, visual simulations and other ISE technologies; 2) virtual labs that leverage modeling, simulation and information technologies to create an immersive, highly interactive virtual environment tailored to the needs of researchers and learners; 3) advanced learning modules that incorporate advanced instructional, user interface and intelligent agent technologies; and 4) assessment and continuous improvement of engineering team effectiveness in distributed collaborative environments.

  3. Future Evolution of Virtual Worlds as Communication Environments

    NASA Astrophysics Data System (ADS)

    Prisco, Giulio

    Extensive experience creating locations and activities inside virtual worlds provides the basis for contemplating their future. Users of virtual worlds are diverse in their goals for these online environments; for example, immersionists want them to be alternative realities disconnected from real life, whereas augmentationists want them to be communication media supporting real-life activities. As the technology improves, the diversity of virtual worlds will increase along with their significance. Many will incorporate more advanced virtual reality, or serve as major media for long-distance collaboration, or become the venues for futurist social movements. Key issues are how people can create their own virtual worlds, travel across worlds, and experience a variety of multimedia immersive environments. This chapter concludes by noting the view among some computer scientists that future technologies will permit uploading human personalities to artificial intelligence avatars, thereby enhancing human beings and rendering the virtual worlds entirely real.

  4. A Virtual Walk through London: Culture Learning through a Cultural Immersion Experience

    ERIC Educational Resources Information Center

    Shih, Ya-Chun

    2015-01-01

    Integrating Google Street View into a three-dimensional virtual environment in which users control personal avatars provides these said users with access to an innovative, interactive, and real-world context for communication and culture learning. We have selected London, a city famous for its rich historical, architectural, and artistic heritage,…

  5. The Best of All Worlds: Immersive Interfaces for Art Education in Virtual and Real World Teaching and Learning Environments

    ERIC Educational Resources Information Center

    Grenfell, Janette

    2013-01-01

    Selected ubiquitous technologies encourage collaborative participation between higher education students and educators within a virtual socially networked e-learning landscape. Multiple modes of teaching and learning, ranging from real world experiences, to text and digital images accessed within the Deakin studies online learning management…

  6. Assessment of Psychophysiological Differences of West Point Cadets and Civilian Controls Immersed within a Virtual Environment

    DTIC Science & Technology

    2009-01-01

    Bowerly, T., Buckwalter, J.G., Rizzo, A.A.: A controlled clinical comparison of attention performance in children with ADHD in a virtual reality... classroom compared to standard neuropsychological methods. Child Neuropsychology 13, 363–381 (2007) 7. Parsons, T.D., Rizzo, A.A

  7. 13 Tips for Virtual World Teaching

    ERIC Educational Resources Information Center

    Villano, Matt

    2008-01-01

    Multi-user virtual environments (MUVEs) are gaining momentum as the latest and greatest learning tool in the world of education technology. How does one get started with them? How do they work? This article shares 13 secrets from immersive education experts and educators on how to have success in implementing these new tools and technologies on…

  8. Road-Crossing Safety in Virtual Reality: A Comparison of Adolescents With and Without ADHD

    ERIC Educational Resources Information Center

    Clancy, Tamera A.; Rucklidge, Julia J.; Owen, Dean

    2006-01-01

    This study investigated the potential accident-proneness of adolescents with attention deficit hyperactivity disorder (ADHD) in a hazardous road-crossing environment. An immersive virtual reality traffic gap-choice task was used to determine whether ADHD adolescents show more unsafe road-crossing behavior than controls. Participants (ages 13 to…

  9. Eye Movements, Visual Search and Scene Memory, in an Immersive Virtual Environment

    PubMed Central

    Sullivan, Brian; Snyder, Kat; Ballard, Dana; Hayhoe, Mary

    2014-01-01

    Visual memory has been demonstrated to play a role in both visual search and attentional prioritization in natural scenes. However, it has been studied predominantly in experimental paradigms using multiple two-dimensional images. Natural experience, by contrast, entails prolonged immersion in a limited number of three-dimensional environments. The goal of the present experiment was to recreate circumstances comparable to natural visual experience in order to evaluate the role of scene memory in guiding eye movements in a natural environment. Subjects performed a continuous visual-search task within an immersive virtual-reality environment over three days. We found that, similar to two-dimensional contexts, viewers rapidly learn the location of objects in the environment over time, and use spatial memory to guide search. Incidental fixations did not provide obvious benefit to subsequent search, suggesting that semantic contextual cues may often be just as efficient, or that many incidentally fixated items are not held in memory in the absence of a specific task. On the third day of the experience in the environment, previous search items changed in color. These items were fixated upon with increased probability relative to control objects, suggesting that memory-guided prioritization (or Surprise) may be a robust mechanism for attracting gaze to novel features of natural environments, in addition to task factors and simple spatial saliency. PMID:24759905

  10. Scientific Inquiry Self-Efficacy and Computer Game Self-Efficacy as Predictors and Outcomes of Middle School Boys' and Girls' Performance in a Science Assessment in a Virtual Environment

    ERIC Educational Resources Information Center

    Bergey, Bradley W.; Ketelhut, Diane Jass; Liang, Senfeng; Natarajan, Uma; Karakus, Melissa

    2015-01-01

    The primary aim of the study was to examine whether performance on a science assessment in an immersive virtual environment was associated with changes in scientific inquiry self-efficacy. A secondary aim of the study was to examine whether performance on the science assessment was equitable for students with different levels of computer game…

  11. IQ-Station: A Low Cost Portable Immersive Environment

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Eric Whiting; Patrick O'Leary; William Sherman

    2010-11-01

    The emergence of inexpensive 3D TVs, affordable input and rendering hardware and open-source software has created a yeasty atmosphere for the development of low-cost immersive environments (IE). A low cost IE system, or IQ-station, fashioned from commercial off the shelf technology (COTS), coupled with a targeted immersive application can be a viable laboratory instrument for enhancing scientific workflow for exploration and analysis. The use of an IQ-station in a laboratory setting also has the potential of quickening the adoption of a more sophisticated immersive environment as a critical enabler in modern scientific and engineering workflows. Prior work in immersive environments generally required either a head mounted display (HMD) system or a large projector-based implementation, both of which have limitations in terms of cost, usability, or space requirements. The solution presented here provides an alternative platform providing a reasonable immersive experience that addresses those limitations. Our work brings together the needed hardware and software to create a fully integrated immersive display and interface system that can be readily deployed in laboratories and common workspaces. By doing so, it is now feasible for immersive technologies to be included in researchers' day-to-day workflows. The IQ-Station sets the stage for much wider adoption of immersive environments outside the small communities of virtual reality centers.

  12. Introducing an Avatar Acceptance Model: Student Intention to Use 3D Immersive Learning Tools in an Online Learning Classroom

    ERIC Educational Resources Information Center

    Kemp, Jeremy William

    2011-01-01

    This quantitative survey study examines the willingness of online students to adopt an immersive virtual environment as a classroom tool and compares this with their feelings about more traditional learning modes including our ANGEL learning management system and the Elluminate live Web conferencing tool. I surveyed 1,108 graduate students in…

  13. Designing Virtual Museum Using Web3D Technology

    NASA Astrophysics Data System (ADS)

    Zhao, Jianghai

    Virtual reality technology (VRT) has the potential to create effective learning environments thanks to its 3I characteristics: Interaction, Immersion and Imagination. As VRT develops, it is being applied to education in increasingly profound ways, and the Virtual Museum is one such application. The Virtual Museum is built on Web3D technology, with extensibility as the most important design factor. Weighing the advantages and disadvantages of the available Web3D technologies, VRML, Cult3D and Viewpoint were chosen. A web chatroom based on Flash and ASP technology has also been created in order to make the Virtual Museum an interactive learning environment.

  14. Eye height scaling of absolute size in immersive and nonimmersive displays

    NASA Technical Reports Server (NTRS)

    Dixon, M. W.; Wraga, M.; Proffitt, D. R.; Williams, G. C.; Kaiser, M. K. (Principal Investigator)

    2000-01-01

    Eye-height (EH) scaling of absolute height was investigated in three experiments. In Experiment 1, standing observers viewed cubes in an immersive virtual environment. Observers' center of projection was placed at actual EH and at 0.7 times actual EH. Observers' size judgments revealed that the EH manipulation was 76.8% effective. In Experiment 2, seated observers viewed the same cubes on an interactive desktop display; however, no effect of EH was found in response to the simulated EH manipulation. Experiment 3 tested standing observers in the immersive environment with the field of view reduced to match that of the desktop. Comparable to Experiment 1, the effect of EH was 77%. These results suggest that EH scaling is not generally used when people view an interactive desktop display because the altitude of the center of projection is indeterminate. EH scaling is spontaneously evoked, however, in immersive environments.

  15. Second Life in Higher Education: Assessing the Potential for and the Barriers to Deploying Virtual Worlds in Learning and Teaching

    ERIC Educational Resources Information Center

    Warburton, Steven

    2009-01-01

    "Second Life" (SL) is currently the most mature and popular multi-user virtual world platform being used in education. Through an in-depth examination of SL, this article explores its potential and the barriers that multi-user virtual environments present to educators wanting to use immersive 3-D spaces in their teaching. The context is set by…

  16. Utility of virtual reality environments to examine physiological reactivity and subjective distress in adults who stutter.

    PubMed

    Brundage, Shelley B; Brinton, James M; Hancock, Adrienne B

    2016-12-01

    Virtual reality environments (VREs) allow for immersion in speaking environments that mimic real-life interactions while maintaining researcher control. VREs have been used successfully to engender arousal in other disorders. The purpose of this study was to investigate the utility of virtual reality environments to examine physiological reactivity and subjective ratings of distress in persons who stutter (PWS). Subjective and objective measures of arousal were collected from 10 PWS during four-minute speeches to a virtual audience and to a virtual empty room. Stuttering frequency and physiological measures (skin conductance level and heart rate) did not differ across speaking conditions, but subjective ratings of distress were significantly higher in the virtual audience condition compared to the virtual empty room. VREs have utility in elevating subjective ratings of distress in PWS. VREs have the potential to be useful tools for practicing treatment targets in a safe, controlled, and systematic manner. Copyright © 2016 Elsevier Inc. All rights reserved.

  17. Agency and Gender Influence Older Adults' Presence-Related Experiences in an Interactive Virtual Environment.

    PubMed

    Kothgassner, Oswald D; Goreis, Andreas; Kafka, Johanna X; Hlavacs, Helmut; Beutl, Leon; Kryspin-Exner, Ilse; Felnhofer, Anna

    2018-05-01

    While virtual humans are increasingly used to benefit the elderly, comparatively little is known about older adults' virtual experiences. However, due to age-related changes, older adults' perceptions of virtual environments (VEs) may be unique. Hence, our objective was to examine possible gender differences in immersion, flow, and emotional states as well as physical and social presence in elderly males and females interacting either with a computer-controlled agent or a human-controlled avatar. Seventy-eight German-speaking older adults were randomly assigned to an avatar or an agent condition and were exposed to a brief social encounter in a virtual café. Results indicate no overall gender differences, but a significant effect of agency on social presence, physical presence, immersion, and flow. Participants in the avatar condition reported higher levels in all measures, except for involvement. Furthermore, significant gender × agency interactions were found, with females showing more social presence, spatial presence, and flow when interacting with a human-controlled avatar and more realism when conversing with an agent. Also, all participants showed significant changes in their affect post exposure. In sum, older adults' virtual experiences seem to follow unique patterns; yet they do not preclude the elderly from successfully participating in VEs.

  18. Sensorimotor Learning during a Marksmanship Task in Immersive Virtual Reality

    PubMed Central

    Rao, Hrishikesh M.; Khanna, Rajan; Zielinski, David J.; Lu, Yvonne; Clements, Jillian M.; Potter, Nicholas D.; Sommer, Marc A.; Kopper, Regis; Appelbaum, Lawrence G.

    2018-01-01

    Sensorimotor learning refers to improvements that occur through practice in the performance of sensory-guided motor behaviors. Leveraging novel technical capabilities of an immersive virtual environment, we probed the component kinematic processes that mediate sensorimotor learning. Twenty naïve subjects performed a simulated marksmanship task modeled after Olympic Trap Shooting standards. We measured movement kinematics and shooting performance as participants practiced 350 trials while receiving trial-by-trial feedback about shooting success. Spatiotemporal analysis of motion tracking elucidated the ballistic and refinement phases of hand movements. We found systematic changes in movement kinematics that accompanied improvements in shot accuracy during training, though reaction and response times did not change over blocks. In particular, we observed longer, slower, and more precise ballistic movements that replaced effort spent on corrections and refinement. Collectively, these results leverage developments in immersive virtual reality technology to quantify and compare the kinematics of movement during early learning of full-body sensorimotor orienting. PMID:29467693
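
    As an illustration of the kind of spatiotemporal analysis described above, the sketch below splits a tracked hand trajectory into a fast ballistic phase and a slower refinement phase by thresholding the speed profile; the cutoff fraction, sampling rate, and toy trajectory are assumptions for illustration, not the authors' analysis pipeline.

```python
import numpy as np

# Hedged sketch (not the authors' method): find where the ballistic phase of
# a reach ends by locating the point, after peak speed, where speed first
# drops below a fraction of that peak.

def ballistic_phase_end(positions, dt=1.0 / 120, cutoff=0.10):
    """positions: (n_samples, 3) tracked hand positions.
    Returns the sample index where speed first falls below
    cutoff * peak speed after the peak (end of the ballistic phase)."""
    speed = np.linalg.norm(np.diff(positions, axis=0), axis=1) / dt
    peak = int(np.argmax(speed))
    below = np.where(speed[peak:] < cutoff * speed[peak])[0]
    return peak + int(below[0]) if below.size else len(speed)

# Toy trajectory: a fast reach followed by small corrective wiggles.
t = np.linspace(0.0, 1.0, 120)
trajectory = np.column_stack([np.minimum(2.0 * t, 1.0),
                              0.01 * np.sin(20.0 * t),
                              np.zeros_like(t)])
print("ballistic phase ends at sample", ballistic_phase_end(trajectory))
```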

  19. An investigation of the efficacy of collaborative virtual reality systems for moderated remote usability testing.

    PubMed

    Chalil Madathil, Kapil; Greenstein, Joel S

    2017-11-01

    Collaborative virtual reality-based systems have integrated high fidelity voice-based communication, immersive audio and screen-sharing tools into virtual environments. Such three-dimensional collaborative virtual environments can mirror the collaboration among usability test participants and facilitators when they are physically collocated, potentially enabling moderated usability tests to be conducted effectively when the facilitator and participant are located in different places. We developed a virtual collaborative three-dimensional remote moderated usability testing laboratory and employed it in a controlled study comparing the effectiveness of moderated usability testing in a collaborative virtual reality-based environment with two other moderated usability testing methods: the traditional lab approach and Cisco WebEx, a web-based conferencing and screen sharing approach. Using a mixed methods experimental design, 36 test participants and 12 test facilitators were asked to complete representative tasks on a simulated online shopping website. The dependent variables included the time taken to complete the tasks; the usability defects identified and their severity; and the subjective ratings on the workload index, presence and satisfaction questionnaires. The remote moderated usability testing methodology using a collaborative virtual reality system performed similarly to the other two methodologies in terms of the total number of defects identified, the number of high-severity defects identified and the time taken to complete the tasks. The overall workload experienced by the test participants and facilitators was the least with the traditional lab condition. No significant differences were identified for the workload experienced with the virtual reality and the WebEx conditions. However, test participants experienced greater involvement and a more immersive experience in the virtual environment than in the WebEx condition. The ratings for the virtual environment condition were not significantly different from those for the traditional lab condition. The results of this study suggest that participants were productive and enjoyed the virtual lab condition, indicating the potential of a virtual world based approach as an alternative to conventional approaches for synchronous usability testing. Copyright © 2017 Elsevier Ltd. All rights reserved.

  20. From Cognitive Capability to Social Reform? Shifting Perceptions of Learning in Immersive Virtual Worlds

    ERIC Educational Resources Information Center

    Savin-Baden, Maggi

    2008-01-01

    Learning in immersive virtual worlds (simulations and virtual worlds such as Second Life) could become a central learning approach in many curricula, but the socio-political impact of virtual world learning on higher education remains under-researched. Much of the recent research into learning in immersive virtual worlds centres around games and…

  1. Virtual hand: a 3D tactile interface to virtual environments

    NASA Astrophysics Data System (ADS)

    Rogowitz, Bernice E.; Borrel, Paul

    2008-02-01

    We introduce a novel system that allows users to experience the sensation of touch in a computer graphics environment. In this system, the user places his/her hand on an array of pins, which is moved about space on a 6 degree-of-freedom robot arm. The surface of the pins defines a surface in the virtual world. This "virtual hand" can move about the virtual world. When the virtual hand encounters an object in the virtual world, the heights of the pins are adjusted so that they represent the object's shape, surface, and texture. A control system integrates pin and robot arm motions to transmit information about objects in the computer graphics world to the user. It also allows the user to edit, change and move the virtual objects, shapes and textures. This system provides a general framework for touching, manipulating, and modifying objects in a 3-D computer graphics environment, which may be useful in a wide range of applications, including computer games, computer aided design systems, and immersive virtual worlds.
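
    The following minimal sketch illustrates the kind of mapping such a pin array requires: the virtual surface is sampled at each pin's (x, y) position and clamped to the pins' travel range. The grid size, pin spacing, travel range, and surface function are illustrative assumptions, not details from the paper.

```python
import numpy as np

# Illustrative sketch only: sample a virtual surface under the hand onto an
# N x N pin array and clamp to the pins' mechanical travel.

PIN_GRID = 8          # pins per side of the array (assumed)
PIN_TRAVEL_MM = 20.0  # maximum pin extension in mm (assumed)

def pin_heights(surface_fn, hand_origin, pin_spacing_mm=5.0):
    """Sample surface_fn(x, y) at each pin position (relative to the hand's
    origin in the virtual world) and clamp the result to the pin travel."""
    xs = hand_origin[0] + np.arange(PIN_GRID) * pin_spacing_mm
    ys = hand_origin[1] + np.arange(PIN_GRID) * pin_spacing_mm
    gx, gy = np.meshgrid(xs, ys)
    heights = surface_fn(gx, gy) - hand_origin[2]  # height above the palm plane
    return np.clip(heights, 0.0, PIN_TRAVEL_MM)

# Example: a smooth bump in the virtual world beneath the hand.
bump = lambda x, y: 15.0 * np.exp(-((x - 20.0) ** 2 + (y - 20.0) ** 2) / 200.0)
print(pin_heights(bump, hand_origin=(0.0, 0.0, 0.0)))
```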

  2. The perception of spatial layout in real and virtual worlds.

    PubMed

    Arthur, E J; Hancock, P A; Chrysler, S T

    1997-01-01

    As human-machine interfaces grow more immersive and graphically-oriented, virtual environment systems become more prominent as the medium for human-machine communication. Often, virtual environments (VE) are built to provide exact metrical representations of existing or proposed physical spaces. However, it is not known how individuals develop representational models of these spaces in which they are immersed and how those models may be distorted with respect to both the virtual and real-world equivalents. To evaluate the process of model development, the present experiment examined participants' ability to reproduce a complex spatial layout of objects after having experienced them under different viewing conditions. The layout consisted of nine common objects arranged on a flat plane. These objects could be viewed in a free binocular virtual condition, a free binocular real-world condition, and in a static monocular view of the real world. The first two allowed active exploration of the environment while the latter condition allowed the participant only a passive opportunity to observe from a single viewpoint. Viewing conditions were a between-subject variable with 10 participants randomly assigned to each condition. Performance was assessed using mapping accuracy and triadic comparisons of relative inter-object distances. Mapping results showed a significant effect of viewing condition where, interestingly, the static monocular condition was superior to both the active virtual and real binocular conditions. Results for the triadic comparisons showed a significant interaction of gender by viewing condition in which males were more accurate than females. These results suggest that the situation model resulting from interaction with a virtual environment was indistinguishable from that resulting from interaction with real objects, at least within the constraints of the present procedure.

  3. An Overview of Virtual Acoustic Simulation of Aircraft Flyover Noise

    NASA Technical Reports Server (NTRS)

    Rizzi, Stephen A.

    2013-01-01

    Methods for testing human subject response to aircraft flyover noise have greatly advanced in recent years as a result of advances in simulation technology. Capabilities have been developed which now allow subjects to be immersed both visually and aurally in a three-dimensional, virtual environment. While suitable for displaying recorded aircraft noise, the true potential is found when synthesizing aircraft flyover noise because it allows the flexibility and freedom to study sounds from aircraft not yet flown. A virtual acoustic simulation method is described which is built upon prediction-based source noise synthesis, engineering-based propagation modeling, and empirically-based receiver modeling. This source-path-receiver paradigm allows complete control over all aspects of flyover auralization. With this capability, it is now possible to assess human response to flyover noise by systematically evaluating source noise reductions within the context of a system level simulation. Examples of auralized flyover noise and movie clips representative of an immersive aircraft flyover environment are included in the presentation.
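
    To make the source-path-receiver paradigm concrete, the sketch below propagates a synthesized source signal to a fixed receiver using only spherical spreading (1/r) and propagation delay; it is a toy illustration under stated assumptions (sample rate, flight path, speed of sound), not the prediction-based synthesis or engineering propagation models the paper describes.

```python
import numpy as np

# Toy source-path-receiver illustration under stated assumptions: a moving
# point source, a fixed receiver, spherical spreading (1/r) and propagation
# delay only.

SPEED_OF_SOUND = 343.0  # m/s

def propagate(emit_times, source_signal, source_positions, receiver_pos):
    """Each emitted sample arrives r/c seconds later, attenuated by 1/r.
    Returns arrival times and received amplitudes (unevenly spaced in time,
    which is where the Doppler shift shows up)."""
    r = np.linalg.norm(source_positions - receiver_pos, axis=1)
    arrival_times = emit_times + r / SPEED_OF_SOUND
    received = source_signal / np.maximum(r, 1.0)
    return arrival_times, received

fs = 1000                                   # assumed sample rate, Hz
t = np.arange(0.0, 10.0, 1.0 / fs)
tone = np.sin(2.0 * np.pi * 100.0 * t)      # stand-in for a synthesized source
flight_path = np.column_stack([80.0 * t - 400.0,         # level flyover, 80 m/s
                               np.zeros_like(t),
                               np.full_like(t, 300.0)])   # 300 m altitude
arrivals, received = propagate(t, tone, flight_path, np.zeros(3))
```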

  4. CaveCAD: a tool for architectural design in immersive virtual environments

    NASA Astrophysics Data System (ADS)

    Schulze, Jürgen P.; Hughes, Cathleen E.; Zhang, Lelin; Edelstein, Eve; Macagno, Eduardo

    2014-02-01

    Existing 3D modeling tools were designed to run on desktop computers with monitor, keyboard and mouse. To make 3D modeling possible with mouse and keyboard, many 3D interactions, such as point placement or translations of geometry, had to be mapped to the 2D parameter space of the mouse, possibly supported by mouse buttons or keyboard keys. We hypothesize that, had the designers of these existing systems been able to assume immersive virtual reality systems as their target platforms, they would have designed 3D interactions much more intuitively. In collaboration with professional architects, we created a simple but complete 3D modeling tool for virtual environments from the ground up, using direct 3D interaction wherever possible and adequate. In this publication, we present our approaches to interactions for typical 3D modeling functions, such as geometry creation, modification of existing geometry, and assignment of surface materials. We also discuss preliminary user experiences with this system.

  5. Cognitive evaluation for the diagnosis of Alzheimer's disease based on Turing Test and Virtual Environments.

    PubMed

    Fernandez Montenegro, Juan Manuel; Argyriou, Vasileios

    2017-05-01

    Alzheimer's screening tests are commonly used by doctors to diagnose the patient's condition and stage as early as possible. Most of these tests are based on pen-and-paper interaction and do not embrace the advantages provided by new technologies. This paper proposes novel Alzheimer's screening tests based on virtual environments and game principles, using new immersive technologies combined with advanced Human Computer Interaction (HCI) systems. These new tests are focused on the immersion of the patient in a virtual room, in order to mislead and deceive the patient's mind. In addition, we propose two novel variations of the Turing Test, originally proposed by Alan Turing, as a method to detect dementia. As a result, four tests are introduced, demonstrating the wide range of screening mechanisms that could be designed using virtual environments and game concepts. The proposed tests are focused on the evaluation of memory loss related to common objects, recent conversations and events; the diagnosis of problems in expressing and understanding language; the ability to recognize abnormalities; and the ability to differentiate between virtual worlds and reality, or humans and machines. The proposed screening tests were evaluated and tested using both patients and healthy adults in a comparative study with state-of-the-art Alzheimer's screening tests. The results show the capacity of the new tests to distinguish healthy people from Alzheimer's patients. Copyright © 2017. Published by Elsevier Inc.

  6. Effects of sensory cueing in virtual motor rehabilitation. A review.

    PubMed

    Palacios-Navarro, Guillermo; Albiol-Pérez, Sergio; García-Magariño García, Iván

    2016-04-01

    To critically identify studies that evaluate the effects of cueing in virtual motor rehabilitation in patients having different neurological disorders and to make recommendations for future studies. Data from MEDLINE®, IEEE Xplore, Science Direct, the Cochrane Library and Web of Science were searched until February 2015. We included studies that investigate the effects of cueing in virtual motor rehabilitation related to interventions for the upper or lower extremities using auditory, visual, and tactile cues on motor performance in non-immersive, semi-immersive, or fully immersive virtual environments. These studies compared virtual cueing with an alternative or no intervention. Ten studies with a total of 153 patients were included in the review. All of them refer to the impact of cueing in virtual motor rehabilitation, regardless of the pathological condition. After selecting the articles, the following variables were extracted: year of publication, sample size, study design, type of cueing, intervention procedures, outcome measures, and main findings. The outcome evaluation was done at baseline and at the end of treatment in most of the studies. All of the studies except one showed improvements in some or all outcomes after intervention or, in some cases, in favor of the virtual rehabilitation group compared to the control group. Virtual cueing seems to be a promising approach to improve motor learning, providing a channel for non-pharmacological therapeutic intervention in different neurological disorders. However, further studies using larger and more homogeneous groups of patients are required to confirm these findings. Copyright © 2016 Elsevier Inc. All rights reserved.

  7. Development of Techniques for Visualization of Scalar and Vector Fields in the Immersive Environment

    NASA Technical Reports Server (NTRS)

    Bidasaria, Hari B.; Wilson, John W.; Nealy, John E.

    2005-01-01

    Visualization of scalar and vector fields in the immersive environment (CAVE - Cave Automated Virtual Environment) is important for its application to radiation shielding research at NASA Langley Research Center. A complete methodology and the underlying software for this purpose have been developed. The developed software has been put to use for the visualization of the Earth's magnetic field, and in particular for the study of the South Atlantic Anomaly. The methodology has also been put to use for the visualization of geomagnetically trapped protons and electrons within Earth's magnetosphere.

  8. Wireless physiological monitoring and ocular tracking: 3D calibration in a fully-immersive virtual health care environment.

    PubMed

    Zhang, Lelin; Chi, Yu Mike; Edelstein, Eve; Schulze, Jurgen; Gramann, Klaus; Velasquez, Alvaro; Cauwenberghs, Gert; Macagno, Eduardo

    2010-01-01

    Wireless physiological/neurological monitoring in virtual reality (VR) offers a unique opportunity for unobtrusively quantifying human responses to precisely controlled and readily modulated VR representations of health care environments. Here we present such a wireless, light-weight head-mounted system for measuring electrooculogram (EOG) and electroencephalogram (EEG) activity in human subjects interacting with and navigating in the Calit2 StarCAVE, a five-sided immersive 3-D visualization VR environment. The system can be easily expanded to include other measurements, such as cardiac activity and galvanic skin responses. We demonstrate the capacity of the system to track focus of gaze in 3-D and report a novel calibration procedure for estimating eye movements from responses to the presentation of a set of dynamic visual cues in the StarCAVE. We discuss cyber and clinical applications that include a 3-D cursor for visual navigation in VR interactive environments, and the monitoring of neurological and ocular dysfunction in vision/attention disorders.
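
    A calibration of this kind can be illustrated, under simplifying assumptions, as a least-squares fit from EOG channel amplitudes to known cue directions; the linear model, channel count, and cue layout below are hypothetical and are not the authors' procedure.

```python
import numpy as np

# Hedged illustration (not the authors' calibration method): fit a linear map
# from EOG channel amplitudes to known cue gaze angles by least squares, then
# use it to estimate gaze direction from new EOG samples.

def fit_gaze_calibration(eog_per_cue, cue_angles_deg):
    """eog_per_cue: (n_cues, n_channels) mean EOG amplitude per cue.
    cue_angles_deg: (n_cues, 2) known horizontal/vertical gaze angles.
    Returns a weight matrix mapping [channels, bias] -> angles."""
    X = np.hstack([eog_per_cue, np.ones((eog_per_cue.shape[0], 1))])  # add bias
    weights, *_ = np.linalg.lstsq(X, cue_angles_deg, rcond=None)
    return weights

def predict_gaze(weights, eog_sample):
    """Estimate gaze angles for a single EOG sample."""
    return np.append(eog_sample, 1.0) @ weights

# Toy calibration: 9 cue positions, 4 EOG channels (assumed numbers).
rng = np.random.default_rng(0)
eog = rng.normal(size=(9, 4))
cues = rng.uniform(-20.0, 20.0, size=(9, 2))
W = fit_gaze_calibration(eog, cues)
print(predict_gaze(W, eog[0]))
```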

  9. Low-cost telepresence for collaborative virtual environments.

    PubMed

    Rhee, Seon-Min; Ziegler, Remo; Park, Jiyoung; Naef, Martin; Gross, Markus; Kim, Myoung-Hee

    2007-01-01

    We present a novel low-cost method for visual communication and telepresence in a CAVE-like environment, relying on 2D stereo-based video avatars. The system combines a selection of proven efficient algorithms and approximations in a unique way, resulting in a convincing stereoscopic real-time representation of a remote user acquired in a spatially immersive display. The system was designed to extend existing projection systems with acquisition capabilities requiring minimal hardware modifications and cost. The system uses infrared-based image segmentation to enable concurrent acquisition and projection in an immersive environment without a static background. The system consists of two color cameras and two additional b/w cameras used for segmentation in the near-IR spectrum. There is no need for special optics as the mask and color image are merged using image-warping based on a depth estimation. The resulting stereo image stream is compressed, streamed across a network, and displayed as a frame-sequential stereo texture on a billboard in the remote virtual environment.
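
    The core idea of near-IR segmentation can be sketched as thresholding the b/w infrared frame (where the IR-illuminated user appears bright) and masking the registered color frame. The threshold value, frame sizes, and the perfect registration assumed below are illustrative only; the actual pipeline also warps the mask onto the color view using a depth estimate.

```python
import numpy as np

# Illustrative sketch only: build a foreground mask from the near-IR frame
# and apply it to a registered color frame.

def segment_user(ir_frame, color_frame, ir_threshold=120):
    """Return the color frame with non-user (background) pixels zeroed out."""
    mask = ir_frame > ir_threshold              # boolean foreground mask
    return color_frame * mask[..., np.newaxis]  # broadcast mask over RGB

# Toy frames standing in for registered IR and color camera images.
ir = np.random.randint(0, 256, (480, 640), dtype=np.uint8)
color = np.random.randint(0, 256, (480, 640, 3), dtype=np.uint8)
foreground = segment_user(ir, color)
```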

  10. Designers Workbench: Towards Real-Time Immersive Modeling

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kuester, F; Duchaineau, M A; Hamann, B

    2001-10-03

    This paper introduces the DesignersWorkbench, a semi-immersive virtual environment for two-handed modeling, sculpting and analysis tasks. The paper outlines the fundamental tools, design metaphors and hardware components required for an intuitive real-time modeling system. As companies focus on streamlining productivity to cope with global competition, the migration to computer-aided design (CAD), computer-aided manufacturing (CAM), and computer-aided engineering (CAE) systems has established a new backbone of modern industrial product development. However, traditionally a product design frequently originates from a clay model that, after digitization, forms the basis for the numerical description of CAD primitives. The DesignersWorkbench aims at closing this technology or "digital" gap experienced by design and CAD engineers by transforming the classical design paradigm into its fully integrated digital and virtual analog, allowing collaborative development in a semi-immersive virtual environment. This project emphasizes two key components from the classical product design cycle: freeform modeling and analysis. In the freeform modeling stage, content creation in the form of two-handed sculpting of arbitrary objects using polygonal, volumetric or mathematically defined primitives is emphasized, whereas the analysis component provides the tools required for pre- and post-processing steps for finite element analysis tasks applied to the created models.

  11. Venus Quadrangle Geological Mapping: Use of Geoscience Data Visualization Systems in Mapping and Training

    NASA Technical Reports Server (NTRS)

    Head, James W.; Huffman, J. N.; Forsberg, A. S.; Hurwitz, D. M.; Basilevsky, A. T.; Ivanov, M. A.; Dickson, J. L.; Kumar, P. Senthil

    2008-01-01

    We are currently investigating new technological developments in computer visualization and analysis in order to assess their importance and utility in planetary geological analysis and mapping [1,2]. Last year we reported on the range of technologies available and on our application of these to various problems in planetary mapping [3]. In this contribution we focus on the application of these techniques and tools to Venus geological mapping at the 1:5M quadrangle scale. In our current Venus mapping projects we have utilized and tested the various platforms to understand their capabilities and assess their usefulness in defining units, establishing stratigraphic relationships, mapping structures, reaching consensus on interpretations and producing map products. We are specifically assessing how computer visualization display qualities (e.g., level of immersion, stereoscopic vs. monoscopic viewing, field of view, large vs. small display size, etc.) influence performance on scientific analysis and geological mapping. We have been exploring four different environments: 1) conventional desktops (DT), 2) semi-immersive Fishtank VR (FT) (i.e., a conventional desktop with head-tracked stereo and 6DOF input), 3) tiled wall displays (TW), and 4) fully immersive virtual reality (IVR) (e.g., "Cave Automatic Virtual Environment," or Cave system). Formal studies demonstrate that fully immersive Cave environments are superior to desktop systems for many tasks [e.g., 4].

  12. Designing a Virtual Social Space for Language Acquisition

    ERIC Educational Resources Information Center

    Woolson, Maria Alessandra

    2012-01-01

    Middleverse de Español (MdE) is an evolving platform for foreign language (FL) study, aligned to the goals of ACTFL's National Standards and 2007 MLA report. The project simulates an immersive environment in a virtual 3-D space for the acquisition of translingual and transcultural competence in Spanish meant to support content-based and…

  13. Towards General Models of Effective Science Inquiry in Virtual Performance Assessments

    ERIC Educational Resources Information Center

    Baker, R. S.; Clarke-Midura, J.; Ocumpaugh, J.

    2016-01-01

    Recent interest in online assessment of scientific inquiry has led to several new online systems that attempt to assess these skills, but producing models that detect when students are successfully practising these skills can be challenging. In this paper, we study models that assess student inquiry in an immersive virtual environment, where a…

  14. Improving post-stroke cognitive and behavioral abnormalities by using virtual reality: A case report on a novel use of nirvana.

    PubMed

    De Luca, Rosaria; Torrisi, Michele; Piccolo, Adriana; Bonfiglio, Giovanni; Tomasello, Provvidenza; Naro, Antonino; Calabrò, Rocco Salvatore

    2017-10-11

    Cognitive impairment, as well as mood and anxiety disorders, occurs frequently in patients following stroke. The aim of this study was to evaluate the effects of a combined rehabilitative treatment using conventional relaxation and respiratory techniques in a specific rehabilitative virtual environment (by using Bts-Nirvana). A 58-year-old woman, affected by hemorrhagic stroke, underwent two different rehabilitation trainings, including either standard relaxation techniques alone in a common clinical setting or the same psychological approach in a semi-immersive virtual environment with augmented sensorial (audio-video) and motor feedback (sensory-motor interaction). We evaluated the patient's cognitive and psychological profile before and after the two different trainings, using a specific psychometric battery aimed at assessing cognitive status and attention processes and at estimating the presence of mood alterations, anxiety and coping strategies. Only at the end of the combined approach did we observe a significant improvement in attention and memory functions, with nearly complete relief of anxiety symptoms and an improvement in coping strategies. Relaxation and respiratory techniques in a semi-immersive virtual reality environment, using Bts-Nirvana, may be a promising tool for improving attention processes, coping strategies, and anxiety in individuals with neurological disorders, including stroke.

  15. Along the Virtuality Continuum - Two Showcases on how xR Technologies Transform Geoscience Research and Education

    NASA Astrophysics Data System (ADS)

    Klippel, A.; Zhao, J.; Masrur, A.; Wallgruen, J. O.; La Femina, P. C.

    2017-12-01

    We present work along the virtuality continuum, showcasing both AR and VR environments for geoscience applications and research. The AR/VR project focuses on one of the most prominent landmarks on the Penn State campus, which is at the same time a representation of the geology of Pennsylvania. The Penn State Obelisk is a 32-foot-high, 51-ton monument composed of 281 rocks collected from across Pennsylvania. While information about its origins and composition is scattered across articles and some web databases, we compiled all the available data from the web and archives and curated them as the basis for an immersive xR experience. Tabular data were amended with xR data such as 360° photos, videos, and 3D models (e.g., of the Obelisk). Our xR (both AR and VR) prototype provides an immersive analytical environment that supports interactive data visualization and virtual navigation in a natural environment (a campus model of today and of 1896, the year of the Obelisk's installation). This work-in-progress project can provide an interactive immersive learning platform (specifically for K-12 and introductory-level geosciences students) in which the learning process is enhanced through seamless navigation between 3D data space and physical space. The second, VR-focused application is creating and empirically evaluating virtual reality (VR) experiences for geosciences research, specifically an interactive volcano experience based on LiDAR and image data of Iceland's Thrihnukar volcano. The prototype addresses the lack of content and tools for immersive virtual reality (iVR) in geoscientific education and research and how to make it easier to integrate iVR into research and classroom experiences. It makes use of environmentally sensed data such that interaction and linked content can be integrated into a single experience. We discuss our workflows as well as methods and authoring tools for iVR analysis and the creation of virtual experiences. These methods and tools aim to enhance the utility of geospatial data from repositories such as OpenTopography.org by unlocking these treasure-troves of data for VR applications. Their enhanced accessibility in education and research, for the geosciences and beyond, will benefit geoscientists and educators who cannot be expected to be VR and 3D application experts.

  16. Psychometric Assessment of Stereoscopic Head-Mounted Displays

    DTIC Science & Technology

    2016-06-29

    Journal article (Jan 2015 - Dec 2015). ...to render an immersive three-dimensional constructive environment. The purpose of this effort was to quantify the impact of aircrew vision on an...simulated tasks requiring precise depth discrimination. This work will provide an example validation method for future stereoscopic virtual immersive

  17. Using voice input and audio feedback to enhance the reality of a virtual experience

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Miner, N.E.

    1994-04-01

    Virtual Reality (VR) is a rapidly emerging technology which allows participants to experience a virtual environment through stimulation of the participant's senses. Intuitive and natural interactions with the virtual world help to create a realistic experience. Typically, a participant is immersed in a virtual environment through the use of a 3-D viewer. Realistic, computer-generated environment models and accurate tracking of a participant's view are important factors for adding realism to a virtual experience. Stimulating a participant's sense of sound and providing a natural form of communication for interacting with the virtual world are equally important. This paper discusses the advantages and importance of incorporating voice recognition and audio feedback capabilities into a virtual world experience. Various approaches and levels of complexity are discussed. Examples of the use of voice and sound are presented through the description of a research application developed in the VR laboratory at Sandia National Laboratories.

  18. Educational Uses of Virtual Reality Technology.

    DTIC Science & Technology

    1998-01-01

    technology. It is affordable in that a basic level of technology can be achieved on most existing personal computers at either no cost or some minimal... actually present in a virtual environment is termed "presence" and is an artifact of being visually immersed in the computer-generated virtual world... Carolina University, VREL Teachers 1996 onward; VR in Education, University of Illinois, National Center for Supercomputing Applications

  19. Virtual reality and telerobotics applications of an Address Recalculation Pipeline

    NASA Technical Reports Server (NTRS)

    Regan, Matthew; Pose, Ronald

    1994-01-01

    The technology described in this paper was designed to reduce the latency of responses to user interactions in immersive virtual reality environments. It is also ideally suited to telerobotic applications such as interaction with remote robotic manipulators in space or in deep-sea operations. In such circumstances, the significant latency in the observed response to user stimulus, which is due to communication delays, and the disturbing jerkiness due to low and unpredictable frame rates of compressed video feedback or computationally limited virtual worlds, can be masked by our techniques. The user is provided with highly responsive visual feedback independent of communication or computational delays in providing physical video feedback or in rendering virtual world images. Virtual and physical environments can be combined seamlessly using these techniques.

  20. A model for flexible tools used in minimally invasive medical virtual environments.

    PubMed

    Soler, Francisco; Luzon, M Victoria; Pop, Serban R; Hughes, Chris J; John, Nigel W; Torres, Juan Carlos

    2011-01-01

    Within the limits of current technology, many applications of a virtual environment will trade off accuracy for speed. This is not an acceptable compromise in a medical training application, where both are essential. Efficient algorithms must therefore be developed. The purpose of this project is the development and validation of a novel physics-based, real-time tool manipulation model that is easy to integrate into any medical virtual environment requiring support for the insertion of long flexible tools into complex geometries. This encompasses medical specialities such as vascular interventional radiology, endoscopy, and laparoscopy, where training, prototyping of new instruments/tools and mission rehearsal can all be facilitated by using an immersive medical virtual environment. Our model recognises and accurately uses patient-specific data and adapts to the geometrical complexity of the vessel in real time.

  1. The Use of Immersive Virtual Reality (VR) to Predict the Occurrence 6 Months Later of Paranoid Thinking and Posttraumatic Stress Symptoms Assessed by Self-Report and Interviewer Methods: A Study of Individuals Who Have Been Physically Assaulted

    PubMed Central

    2014-01-01

    Presentation of social situations via immersive virtual reality (VR) has the potential to be an ecologically valid way of assessing psychiatric symptoms. In this study we assess the occurrence of paranoid thinking and of symptoms of posttraumatic stress disorder (PTSD) in response to a single neutral VR social environment as predictors of later psychiatric symptoms assessed by standard methods. One hundred six people entered an immersive VR social environment (a train ride), presented via a head-mounted display, 4 weeks after having attended hospital because of a physical assault. Paranoid thinking about the neutral computer-generated characters and the occurrence of PTSD symptoms in VR were assessed. Reactions in VR were then used to predict the occurrence 6 months later of symptoms of paranoia and PTSD, as assessed by standard interviewer and self-report methods. Responses to VR predicted the severity of paranoia and PTSD symptoms as assessed by standard measures 6 months later. The VR assessments also added predictive value to the baseline interviewer methods, especially for paranoia. Brief exposure to environments presented via virtual reality provides a symptom assessment with predictive ability over many months. VR assessment may be of particular benefit for difficult to assess problems, such as paranoia, that have no gold standard assessment method. In the future, VR environments may be used in the clinic to complement standard self-report and clinical interview methods. PMID:24708073

  2. The use of immersive virtual reality (VR) to predict the occurrence 6 months later of paranoid thinking and posttraumatic stress symptoms assessed by self-report and interviewer methods: a study of individuals who have been physically assaulted.

    PubMed

    Freeman, Daniel; Antley, Angus; Ehlers, Anke; Dunn, Graham; Thompson, Claire; Vorontsova, Natasha; Garety, Philippa; Kuipers, Elizabeth; Glucksman, Edward; Slater, Mel

    2014-09-01

    Presentation of social situations via immersive virtual reality (VR) has the potential to be an ecologically valid way of assessing psychiatric symptoms. In this study we assess the occurrence of paranoid thinking and of symptoms of posttraumatic stress disorder (PTSD) in response to a single neutral VR social environment as predictors of later psychiatric symptoms assessed by standard methods. One hundred six people entered an immersive VR social environment (a train ride), presented via a head-mounted display, 4 weeks after having attended hospital because of a physical assault. Paranoid thinking about the neutral computer-generated characters and the occurrence of PTSD symptoms in VR were assessed. Reactions in VR were then used to predict the occurrence 6 months later of symptoms of paranoia and PTSD, as assessed by standard interviewer and self-report methods. Responses to VR predicted the severity of paranoia and PTSD symptoms as assessed by standard measures 6 months later. The VR assessments also added predictive value to the baseline interviewer methods, especially for paranoia. Brief exposure to environments presented via virtual reality provides a symptom assessment with predictive ability over many months. VR assessment may be of particular benefit for difficult to assess problems, such as paranoia, that have no gold standard assessment method. In the future, VR environments may be used in the clinic to complement standard self-report and clinical interview methods. PsycINFO Database Record (c) 2014 APA, all rights reserved.

  3. Towards Gesture-Based Multi-User Interactions in Collaborative Virtual Environments

    NASA Astrophysics Data System (ADS)

    Pretto, N.; Poiesi, F.

    2017-11-01

    We present a virtual reality (VR) setup that enables multiple users to participate in collaborative virtual environments and interact via gestures. A collaborative VR session is established through a network of users that is composed of a server and a set of clients. The server manages the communication amongst clients and is created by one of the users. Each user's VR setup consists of a Head Mounted Display (HMD) for immersive visualisation, a hand tracking system to interact with virtual objects and a single-hand joypad to move in the virtual environment. We use Google Cardboard as an HMD for the VR experience and a Leap Motion for hand tracking, thus making our solution low cost. We evaluate our VR setup through a forensics use case, where real-world objects pertaining to a simulated crime scene are included in a VR environment, acquired using a smartphone-based 3D reconstruction pipeline. Users can interact using virtual gesture-based tools such as pointers and rulers.
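
    As a rough illustration of the session architecture described above (one user hosts a server that relays every client's updates to the others), the following is a minimal, hypothetical relay server; the port number and message format are assumptions, not part of the published system:

    ```python
    # Hypothetical relay server for a collaborative VR session: the hosting user's
    # machine forwards each client's updates (e.g., pose or gesture messages) to
    # all other clients. Port and framing are illustrative assumptions only.
    import socket
    import threading

    HOST, PORT = "0.0.0.0", 50007
    clients: list[socket.socket] = []
    lock = threading.Lock()

    def handle(conn: socket.socket) -> None:
        """Receive updates from one client and relay them to every other client."""
        try:
            while True:
                data = conn.recv(1024)          # e.g., a serialized pose or gesture update
                if not data:
                    break                       # client closed the connection
                with lock:
                    for other in clients:
                        if other is not conn:
                            try:
                                other.sendall(data)
                            except OSError:
                                pass            # ignore peers that have dropped out
        except OSError:
            pass
        finally:
            with lock:
                if conn in clients:
                    clients.remove(conn)
            conn.close()

    def serve() -> None:
        """Accept clients and spawn one relay handler per connection."""
        with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as srv:
            srv.bind((HOST, PORT))
            srv.listen()
            while True:
                conn, _addr = srv.accept()
                with lock:
                    clients.append(conn)
                threading.Thread(target=handle, args=(conn,), daemon=True).start()

    if __name__ == "__main__":
        serve()
    ```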

  4. The effects of substitute multisensory feedback on task performance and the sense of presence in a virtual reality environment

    PubMed Central

    Milella, Ferdinando; Pinto, Carlo; Cant, Iain; White, Mark; Meyer, Georg

    2018-01-01

    Objective and subjective measures of performance in virtual reality environments increase as more sensory cues are delivered and as simulation fidelity increases. Some cues (colour or sound) are easier to present than others (object weight, vestibular cues) so that substitute cues can be used to enhance informational content in a simulation at the expense of simulation fidelity. This study evaluates how substituting cues in one modality by alternative cues in another modality affects subjective and objective performance measures in a highly immersive virtual reality environment. Participants performed a wheel change in a virtual reality (VR) environment. Auditory, haptic and visual cues, signalling critical events in the simulation, were manipulated in a factorial design. Subjective ratings were recorded via questionnaires. The time taken to complete the task was used as an objective performance measure. The results show that participants performed best and felt an increased sense of immersion and involvement, collectively referred to as ‘presence’, when substitute multimodal sensory feedback was provided. Significant main effects of audio and tactile cues on task performance and on participants' subjective ratings were found. A significant negative relationship was found between the objective (overall completion times) and subjective (ratings of presence) performance measures. We conclude that increasing informational content, even if it disrupts fidelity, enhances performance and user’s overall experience. On this basis we advocate the use of substitute cues in VR environments as an efficient method to enhance performance and user experience. PMID:29390023

  5. The effects of substitute multisensory feedback on task performance and the sense of presence in a virtual reality environment.

    PubMed

    Cooper, Natalia; Milella, Ferdinando; Pinto, Carlo; Cant, Iain; White, Mark; Meyer, Georg

    2018-01-01

    Objective and subjective measures of performance in virtual reality environments increase as more sensory cues are delivered and as simulation fidelity increases. Some cues (colour or sound) are easier to present than others (object weight, vestibular cues) so that substitute cues can be used to enhance informational content in a simulation at the expense of simulation fidelity. This study evaluates how substituting cues in one modality by alternative cues in another modality affects subjective and objective performance measures in a highly immersive virtual reality environment. Participants performed a wheel change in a virtual reality (VR) environment. Auditory, haptic and visual cues, signalling critical events in the simulation, were manipulated in a factorial design. Subjective ratings were recorded via questionnaires. The time taken to complete the task was used as an objective performance measure. The results show that participants performed best and felt an increased sense of immersion and involvement, collectively referred to as 'presence', when substitute multimodal sensory feedback was provided. Significant main effects of audio and tactile cues on task performance and on participants' subjective ratings were found. A significant negative relationship was found between the objective (overall completion times) and subjective (ratings of presence) performance measures. We conclude that increasing informational content, even if it disrupts fidelity, enhances performance and user's overall experience. On this basis we advocate the use of substitute cues in VR environments as an efficient method to enhance performance and user experience.

  6. You Spin my Head Right Round: Threshold of Limited Immersion for Rotation Gains in Redirected Walking.

    PubMed

    Schmitz, Patric; Hildebrandt, Julian; Valdez, Andre Calero; Kobbelt, Leif; Ziefle, Martina

    2018-04-01

    In virtual environments, the space that can be explored by real walking is limited by the size of the tracked area. To enable unimpeded walking through large virtual spaces in small real-world surroundings, redirection techniques are used. These unnoticeably manipulate the user's virtual walking trajectory. It is important to know how strongly such techniques can be applied without the user noticing the manipulation, or getting cybersick. Previously, this was estimated by measuring a detection threshold (DT) in highly controlled psychophysical studies, which experimentally isolate the effect but do not aim for perceived immersion in the context of VR applications. While these studies suggest that only relatively low degrees of manipulation are tolerable, we claim that, besides establishing detection thresholds, it is important to know when the user's immersion breaks. We hypothesize that the degree of unnoticed manipulation is significantly different from the detection threshold when the user is immersed in a task. We conducted three studies: a) to devise an experimental paradigm to measure the threshold of limited immersion (TLI), b) to measure the TLI for slowly decreasing and increasing rotation gains, and c) to establish a baseline of cybersickness for our experimental setup. For rotation gains greater than 1.0, we found that immersion breaks quite late after the gain is detectable. However, for gains less than 1.0, some users reported a break of immersion even before established detection thresholds were reached. Apparently, the developed metric measures an additional quality of user experience. This article contributes to the development of effective spatial compression methods by utilizing the break of immersion as a benchmark for redirection techniques.
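
    For readers unfamiliar with rotation gains: a gain scales the user's real head rotation before it is applied to the virtual camera, so a gain of 1.2 turns the scene 12 degrees for every 10 degrees of real head turn. A minimal sketch of that mapping (illustrative only, not the study's implementation):

    ```python
    # Minimal sketch of applying a rotation gain in redirected walking.
    # A gain of 1.0 is a faithful mapping; gains above/below 1.0 amplify/attenuate
    # the user's real head yaw before it drives the virtual camera.

    def apply_rotation_gain(virtual_yaw_deg: float, real_yaw_delta_deg: float, gain: float) -> float:
        """Update the virtual camera yaw from one frame's real head-yaw change."""
        return (virtual_yaw_deg + gain * real_yaw_delta_deg) % 360.0

    if __name__ == "__main__":
        yaw = 0.0
        # The user turns their head 10 degrees per frame; with gain 1.2 the scene turns 12.
        for _ in range(9):
            yaw = apply_rotation_gain(yaw, real_yaw_delta_deg=10.0, gain=1.2)
        print(yaw)  # 108.0 virtual degrees for 90 real degrees
    ```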

  7. Immersive Collaboration Simulations: Multi-User Virtual Environments and Augmented Realities

    NASA Technical Reports Server (NTRS)

    Dede, Chris

    2008-01-01

    Emerging information technologies are driving shifts in the knowledge and skills society values, the development of new methods of teaching and learning, and changes in the characteristics of learning.

  8. Human Fear Conditioning Conducted in Full Immersion 3-Dimensional Virtual Reality

    PubMed Central

    Huff, Nicole C.; Zielinski, David J.; Fecteau, Matthew E.; Brady, Rachael; LaBar, Kevin S.

    2010-01-01

    Fear conditioning is a widely used paradigm in non-human animal research to investigate the neural mechanisms underlying fear and anxiety. A major challenge in conducting conditioning studies in humans is the ability to strongly manipulate or simulate the environmental contexts that are associated with conditioned emotional behaviors. In this regard, virtual reality (VR) technology is a promising tool. Yet, adapting this technology to meet experimental constraints requires special accommodations. Here we address the methodological issues involved when conducting fear conditioning in a fully immersive 6-sided VR environment and present fear conditioning data. In the real world, traumatic events occur in complex environments that are made up of many cues, engaging all of our sensory modalities. For example, cues that form the environmental configuration include not only visual elements, but also aural, olfactory, and even tactile ones. In rodent studies of fear conditioning, animals are fully immersed in a context that is rich with novel visual, tactile and olfactory cues. However, standard laboratory tests of fear conditioning in humans are typically conducted in a nondescript room in front of a flat or 2D computer screen and do not replicate the complexity of real world experiences. On the other hand, a major limitation of clinical studies aimed at reducing (extinguishing) fear and preventing relapse in anxiety disorders is that treatment occurs after participants have acquired a fear in an uncontrolled and largely unknown context. Thus the experimenters are left without information about the duration of exposure, the true nature of the stimulus, and associated background cues in the environment [1]. In the absence of this information it can be difficult to truly extinguish a fear that is both cue- and context-dependent. Virtual reality environments address these issues by providing the complexity of the real world, and at the same time allowing experimenters to constrain fear conditioning and extinction parameters to yield empirical data that can suggest better treatment options and/or analyze mechanistic hypotheses. In order to test the hypothesis that fear conditioning may be richly encoded and context specific when conducted in a fully immersive environment, we developed distinct virtual reality 3-D contexts in which participants experienced fear conditioning to virtual snakes or spiders. Auditory cues co-occurred with the CS in order to further evoke orienting responses and a feeling of "presence" in subjects [2]. Skin conductance response served as the dependent measure of fear acquisition, memory retention and extinction. PMID:20736913

  9. Learner Presence, Perception, and Learning Achievements in Augmented-Reality-Mediated Learning Environments

    ERIC Educational Resources Information Center

    Chen, Yu-Hsuan; Wang, Chang-Hwa

    2018-01-01

    Although research has indicated that augmented reality (AR)-facilitated instruction improves learning performance, further investigation of the usefulness of AR from a psychological perspective has been recommended. Researchers consider presence a major psychological effect when users are immersed in virtual reality environments. However, most…

  10. Learning in Transformational Computer Games: Exploring Design Principles for a Nanotechnology Game

    ERIC Educational Resources Information Center

    Masek, Martin; Murcia, Karen; Morrison, Jason; Newhouse, Paul; Hackling, Mark

    2012-01-01

    Transformational games are digital computer and video applications purposefully designed to create engaging and immersive learning environments for delivering specified learning goals, outcomes and experiences. The virtual world of a transformational game becomes the social environment within which learning occurs as an outcome of the complex…

  11. Making Learning Fun: Quest Atlantis, A Game Without Guns

    ERIC Educational Resources Information Center

    Barab, Sasha; Thomas, Michael; Dodge, Tyler; Carteaux, Robert; Tuzun, Hakan

    2005-01-01

    This article describes the Quest Atlantis (QA) project, a learning and teaching project that employs a multiuser, virtual environment to immerse children, ages 9-12, in educational tasks. QA combines strategies used in commercial gaming environments with lessons from educational research on learning and motivation. It allows users at participating…

  12. Mobile Learning: At the Tipping Point

    ERIC Educational Resources Information Center

    Franklin, Teresa

    2011-01-01

    Mobile technologies are interfacing with all aspects of our lives including Web 2.0 tools and applications, immersive virtual world environments, and online environments to present educational opportunities for 24/7 learning at the learner's discretion. Mobile devices are allowing educators to build new community learning ecosystems for and by…

  13. Problem-Based Learning Spanning Real and Virtual Words: A Case Study in Second Life

    ERIC Educational Resources Information Center

    Good, Judith; Howland, Katherine; Thackray, Liz

    2008-01-01

    There is a growing use of immersive virtual environments for educational purposes. However, much of this activity is not yet documented in the public domain, or is descriptive rather than analytical. This paper presents a case study in which university students were tasked with building an interactive learning experience using Second Life as a…

  14. High sensitivity to multisensory conflicts in agoraphobia exhibited by virtual reality.

    PubMed

    Viaud-Delmon, Isabelle; Warusfel, Olivier; Seguelas, Angeline; Rio, Emmanuel; Jouvent, Roland

    2006-10-01

    The primary aim of this study was to evaluate the effect of auditory feedback in a VR system planned for clinical use and to address the different factors that should be taken into account in building a bimodal virtual environment (VE). We conducted an experiment in which we assessed spatial performances in agoraphobic patients and normal subjects comparing two kinds of VEs, visual alone (Vis) and auditory-visual (AVis), during separate sessions. Subjects were equipped with a head-mounted display coupled with an electromagnetic sensor system and immersed in a virtual town. Their task was to locate different landmarks and become familiar with the town. In the AVis condition subjects were equipped with the head-mounted display and headphones, which delivered a soundscape updated in real-time according to their movement in the virtual town. While general performances remained comparable across the conditions, the reported feeling of immersion was more compelling in the AVis environment. However, patients exhibited more cybersickness symptoms in this condition. The result of this study points to the multisensory integration deficit of agoraphobic patients and underline the need for further research on multimodal VR systems for clinical use.

  15. A Case Study in User Support for Managing OpenSim Based Multi User Learning Environments

    ERIC Educational Resources Information Center

    Perera, Indika; Miller, Alan; Allison, Colin

    2017-01-01

    Immersive 3D Multi User Learning Environments (MULE) have shown sufficient success to warrant their consideration as a mainstream educational paradigm. These are based on 3D Multi User Virtual Environment platforms (MUVE), and although they have been used for various innovative educational projects their complex permission systems and large…

  16. ConfocalVR: Immersive Visualization Applied to Confocal Microscopy.

    PubMed

    Stefani, Caroline; Lacy-Hulbert, Adam; Skillman, Thomas

    2018-06-24

    ConfocalVR is a virtual reality (VR) application created to improve the ability of researchers to study the complexity of cell architecture. Confocal microscopes take pictures of fluorescently labeled proteins or molecules at different focal planes to create a stack of 2D images throughout the specimen. Current software applications reconstruct the 3D image and render it as a 2D projection onto a computer screen where users need to rotate the image to expose the full 3D structure. This process is mentally taxing, breaks down if you stop the rotation, and does not take advantage of the eye's full field of view. ConfocalVR exploits consumer-grade virtual reality (VR) systems to fully immerse the user in the 3D cellular image. In this virtual environment the user can: 1) adjust image viewing parameters without leaving the virtual space, 2) reach out and grab the image to quickly rotate and scale the image to focus on key features, and 3) interact with other users in a shared virtual space enabling real-time collaborative exploration and discussion. We found that immersive VR technology allows the user to rapidly understand cellular architecture and protein or molecule distribution. We note that it is impossible to understand the value of immersive visualization without experiencing it first hand, so we encourage readers to get access to a VR system, download this software, and evaluate it for yourself. The ConfocalVR software is available for download at http://www.confocalvr.com, and is free for nonprofits. Copyright © 2018. Published by Elsevier Ltd.
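
    ConfocalVR's input is the confocal z-stack reconstructed as a volume. As a hedged illustration of that first step only (file naming and normalization are assumptions, not ConfocalVR's actual API), a stack of 2D slices can be assembled into a 3D array like this:

    ```python
    # Illustrative assembly of a confocal z-stack into a 3-D volume suitable for
    # volume rendering. File naming and normalization are assumptions, not
    # ConfocalVR's API.
    import glob
    import numpy as np
    from PIL import Image  # pip install pillow

    def load_zstack(pattern: str) -> np.ndarray:
        """Load 2-D slices (one focal plane per file) and stack them along z."""
        slices = [np.asarray(Image.open(p).convert("L"), dtype=np.float32)
                  for p in sorted(glob.glob(pattern))]
        volume = np.stack(slices, axis=0)          # shape: (z, y, x)
        volume /= max(volume.max(), 1e-6)          # normalize intensities to [0, 1]
        return volume

    if __name__ == "__main__":
        files = glob.glob("specimen_plane_*.png")  # hypothetical file names
        if files:
            vol = load_zstack("specimen_plane_*.png")
            print(vol.shape)                       # e.g. (n_planes, height, width)
    ```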

  17. Immersive virtual reality simulations in nursing education.

    PubMed

    Kilmon, Carol A; Brown, Leonard; Ghosh, Sumit; Mikitiuk, Artur

    2010-01-01

    This article explores immersive virtual reality as a potential educational strategy for nursing education and describes an immersive learning experience now being developed for nurses. This pioneering project is a virtual reality application targeting speed and accuracy of nurse response in emergency situations requiring cardiopulmonary resuscitation. Other potential uses and implications for the development of virtual reality learning programs are discussed.

  18. Towards photorealistic and immersive virtual-reality environments for simulated prosthetic vision: integrating recent breakthroughs in consumer hardware and software.

    PubMed

    Zapf, Marc P; Matteucci, Paul B; Lovell, Nigel H; Zheng, Steven; Suaning, Gregg J

    2014-01-01

    Simulated prosthetic vision (SPV) in normally sighted subjects is an established way of investigating the prospective efficacy of visual prosthesis designs in visually guided tasks such as mobility. To perform meaningful SPV mobility studies in computer-based environments, a credible representation of both the virtual scene to navigate and the experienced artificial vision has to be established. It is therefore prudent to make optimal use of existing hardware and software solutions when establishing a testing framework. The authors aimed at improving the realism and immersion of SPV by integrating state-of-the-art yet low-cost consumer technology. The feasibility of body motion tracking to control movement in photo-realistic virtual environments was evaluated in a pilot study. Five subjects were recruited and performed an obstacle avoidance and wayfinding task using either keyboard and mouse, gamepad or Kinect motion tracking. Walking speed and collisions were analyzed as basic measures for task performance. Kinect motion tracking resulted in lower performance as compared to classical input methods, yet results were more uniform across vision conditions. The chosen framework was successfully applied in a basic virtual task and is suited to realistically simulate real-world scenes under SPV in mobility research. Classical input peripherals remain a feasible and effective way of controlling the virtual movement. Motion tracking, despite its limitations and early state of implementation, is intuitive and can eliminate between-subject differences due to familiarity to established input methods.
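
    The abstract does not specify how the artificial vision was rendered. A common simplification in SPV work is to reduce the camera image to a coarse grid of phosphene-like patches; the sketch below follows that assumption and is not the authors' renderer:

    ```python
    # Illustrative simulated-prosthetic-vision filter: reduce a camera frame to a
    # coarse "phosphene" grid. Grid size is an assumption for illustration only.
    import numpy as np

    def simulate_phosphenes(frame: np.ndarray, grid: tuple[int, int] = (32, 32)) -> np.ndarray:
        """Average-pool a grayscale frame (H, W) down to a phosphene grid, then
        expand it back so it can be displayed at the original resolution."""
        h, w = frame.shape
        gh, gw = grid
        # Crop so the frame divides evenly into grid cells.
        frame = frame[: h - h % gh, : w - w % gw]
        cell_h, cell_w = frame.shape[0] // gh, frame.shape[1] // gw
        pooled = frame.reshape(gh, cell_h, gw, cell_w).mean(axis=(1, 3))
        return np.kron(pooled, np.ones((cell_h, cell_w)))

    if __name__ == "__main__":
        scene = np.random.rand(480, 640)
        spv_view = simulate_phosphenes(scene)
        print(spv_view.shape)
    ```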

  19. Formalizing and Promoting Collaboration in 3D Virtual Environments - A Blueprint for the Creation of Group Interaction Patterns

    NASA Astrophysics Data System (ADS)

    Schmeil, Andreas; Eppler, Martin J.

    Despite the fact that virtual worlds and other types of multi-user 3D collaboration spaces have long been subjects of research and of application experiences, it still remains unclear how to best benefit from meeting with colleagues and peers in a virtual environment with the aim of working together. Making use of the potential of virtual embodiment, i.e. being immersed in a space as a personal avatar, allows for innovative new forms of collaboration. In this paper, we present a framework that serves as a systematic formalization of collaboration elements in virtual environments. The framework is based on the semiotic distinctions among pragmatic, semantic and syntactic perspectives. It serves as a blueprint to guide users in designing, implementing, and executing virtual collaboration patterns tailored to their needs. We present two team and two community collaboration pattern examples as a result of the application of the framework: Virtual Meeting, Virtual Design Studio, Spatial Group Configuration, and Virtual Knowledge Fair. In conclusion, we also point out future research directions for this emerging domain.

  20. BIM based virtual environment for fire emergency evacuation.

    PubMed

    Wang, Bin; Li, Haijiang; Rezgui, Yacine; Bradley, Alex; Ong, Hoang N

    2014-01-01

    Recent building emergency management research has highlighted the need for the effective utilization of dynamically changing building information. BIM (building information modelling) can play a significant role in this process due to its comprehensive and standardized data format and integrated process. This paper introduces a BIM based virtual environment supported by virtual reality (VR) and a serious game engine to address several key issues for building emergency management, for example, timely two-way information updating and better emergency awareness training. The focus of this paper lies in how to utilize BIM as a comprehensive building information provider to work with virtual reality technologies to build an adaptable immersive serious game environment to provide real-time fire evacuation guidance. The innovation lies in the seamless integration between BIM and a serious game based virtual reality (VR) environment aiming at practical problem solving by leveraging state-of-the-art computing technologies. The system has been tested for its robustness and functionality against the development requirements, and the results showed promising potential to support more effective emergency management.

  1. Real-time interactive virtual tour on the World Wide Web (WWW)

    NASA Astrophysics Data System (ADS)

    Yoon, Sanghyuk; Chen, Hai-jung; Hsu, Tom; Yoon, Ilmi

    2003-12-01

    The Web-based Virtual Tour has become a desirable and in-demand application, yet a challenging one due to the nature of a web application's running environment, such as limited bandwidth and no guarantee of high computation power on the client side. The image-based rendering approach has attractive advantages over the traditional 3D rendering approach in such Web applications. The traditional approach, such as VRML, requires a labor-intensive 3D modeling process and high bandwidth and computation power, especially for photo-realistic virtual scenes. QuickTime VR and IPIX, as examples of the image-based approach, use panoramic photos, and the virtual scenes can be generated directly from photos, skipping the modeling process. However, these image-based approaches may require special cameras or effort to take panoramic views, and they provide only a fixed-point look-around and zooming in and out rather than 'walk around', which is a very important feature for providing an immersive experience to virtual tourists. The Web-based Virtual Tour using Tour into the Picture employs pseudo-3D geometry with an image-based rendering approach to provide viewers with the immersive experience of walking around the virtual space from several snapshots of conventional photos.

  2. Designing Awe in Virtual Reality: An Experimental Study.

    PubMed

    Chirico, Alice; Ferrise, Francesco; Cordella, Lorenzo; Gaggioli, Andrea

    2017-01-01

    Awe is a little-studied emotion with a great transformative potential. Therefore, interest in the study of awe's underlying mechanisms has increased. Specifically, researchers have been interested in how to reproduce intense feelings of awe within laboratory conditions. It has been proposed that the use of virtual reality (VR) could be an effective way to induce awe in controlled experimental settings, thanks to its ability to provide participants with a sense of "presence," that is, the subjective feeling of being displaced in another physical or imaginary place. However, the potential of VR as an awe-inducing medium has not been fully tested yet. In the present study, we provided an evidence-based design and a validation of four immersive virtual environments (VEs) involving 36 participants in a within-subject design. Of these, three VEs were designed to induce awe, whereas the fourth VE was targeted as an emotionally neutral stimulus. Participants self-reported the extent to which they felt awe, general affect and sense of presence related to each environment. As expected, results showed that awe-VEs could induce significantly higher levels of awe and presence as compared to the neutral VE. Furthermore, these VEs induced significantly more positive than negative affect. These findings support the potential of immersive VR for inducing awe and provide useful indications for the design of awe-inspiring virtual environments.

  3. Saliency in VR: How Do People Explore Virtual Environments?

    PubMed

    Sitzmann, Vincent; Serrano, Ana; Pavel, Amy; Agrawala, Maneesh; Gutierrez, Diego; Masia, Belen; Wetzstein, Gordon

    2018-04-01

    Understanding how people explore immersive virtual environments is crucial for many applications, such as designing virtual reality (VR) content, developing new compression algorithms, or learning computational models of saliency or visual attention. Whereas a body of recent work has focused on modeling saliency in desktop viewing conditions, VR is very different from these conditions in that viewing behavior is governed by stereoscopic vision and by the complex interaction of head orientation, gaze, and other kinematic constraints. To further our understanding of viewing behavior and saliency in VR, we capture and analyze gaze and head orientation data of 169 users exploring stereoscopic, static omni-directional panoramas, for a total of 1980 head and gaze trajectories for three different viewing conditions. We provide a thorough analysis of our data, which leads to several important insights, such as the existence of a particular fixation bias, which we then use to adapt existing saliency predictors to immersive VR conditions. In addition, we explore other applications of our data and analysis, including automatic alignment of VR video cuts, panorama thumbnails, panorama video synopsis, and saliency-based compression.
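
    A first step toward such data-driven saliency is simply binning recorded gaze directions into an equirectangular map. The sketch below does only that; it ignores the solid-angle correction near the poles and is not the paper's full pipeline:

    ```python
    # Illustrative fixation map: bin gaze directions (longitude/latitude in degrees)
    # into an equirectangular histogram. This is a simplification, not the paper's
    # saliency model.
    import numpy as np

    def fixation_map(lon_deg: np.ndarray, lat_deg: np.ndarray, height: int = 90, width: int = 180) -> np.ndarray:
        """lon in [-180, 180), lat in [-90, 90); returns a (height, width) normalized count map."""
        cols = ((lon_deg + 180.0) / 360.0 * width).astype(int) % width
        rows = ((lat_deg + 90.0) / 180.0 * height).astype(int) % height
        hist = np.zeros((height, width))
        np.add.at(hist, (rows, cols), 1.0)
        return hist / max(hist.sum(), 1.0)

    if __name__ == "__main__":
        rng = np.random.default_rng(0)
        # Synthetic gaze samples clustered near the equator (the kind of fixation bias the paper reports).
        lat = rng.normal(0.0, 10.0, size=5000).clip(-89.9, 89.9)
        lon = rng.uniform(-180.0, 180.0, size=5000)
        print(fixation_map(lon, lat).max())
    ```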

  4. Virtual reality cerebral aneurysm clipping simulation with real-time haptic feedback.

    PubMed

    Alaraj, Ali; Luciano, Cristian J; Bailey, Daniel P; Elsenousi, Abdussalam; Roitberg, Ben Z; Bernardo, Antonio; Banerjee, P Pat; Charbel, Fady T

    2015-03-01

    With the decrease in the number of cerebral aneurysms treated surgically and the increase of complexity of those treated surgically, there is a need for simulation-based tools to teach future neurosurgeons the operative techniques of aneurysm clipping. To develop and evaluate the usefulness of a new haptic-based virtual reality simulator in the training of neurosurgical residents. A real-time sensory haptic feedback virtual reality aneurysm clipping simulator was developed using the ImmersiveTouch platform. A prototype middle cerebral artery aneurysm simulation was created from a computed tomographic angiogram. Aneurysm and vessel volume deformation and haptic feedback are provided in a 3-dimensional immersive virtual reality environment. Intraoperative aneurysm rupture was also simulated. Seventeen neurosurgery residents from 3 residency programs tested the simulator and provided feedback on its usefulness and resemblance to real aneurysm clipping surgery. Residents thought that the simulation would be useful in preparing for real-life surgery. About two-thirds of the residents thought that the 3-dimensional immersive anatomic details provided a close resemblance to real operative anatomy and accurate guidance for deciding surgical approaches. They thought the simulation was useful for preoperative surgical rehearsal and neurosurgical training. A third of the residents thought that the technology in its current form provided realistic haptic feedback for aneurysm surgery. Neurosurgical residents thought that the novel immersive VR simulator is helpful in their training, especially because they do not get a chance to perform aneurysm clippings until late in their residency programs.

  5. VirSchool: The Effect of Background Music and Immersive Display Systems on Memory for Facts Learned in an Educational Virtual Environment

    ERIC Educational Resources Information Center

    Fassbender, Eric; Richards, Deborah; Bilgin, Ayse; Thompson, William Forde; Heiden, Wolfgang

    2012-01-01

    Game technology has been widely used for educational applications, however, despite the common use of background music in games, its effect on learning has been largely unexplored. This paper discusses how music played in the background of a computer-animated history lesson affected participants' memory for facts. A virtual history lesson was…

  6. The Overlapping Worlds View: Analyzing Identity Transformation in Real and Virtual Worlds and the Effects on Learning

    ERIC Educational Resources Information Center

    Evans, Michael A.; Wang, Feihong

    2008-01-01

    Of late, digital game-based learning has attracted game designers, researchers and educators alike. Immersion in the virtual 3D environment of a game may have positive effects on K-12 students' cultivation of self (Dodge et al., 2006). Currently, two opposing views related to game-based identity formation are presented in the literature: the…

  7. The Design of Immersive English Learning Environment Using Augmented Reality

    ERIC Educational Resources Information Center

    Li, Kuo-Chen; Chen, Cheng-Ting; Cheng, Shein-Yung; Tsai, Chung-Wei

    2016-01-01

    The study uses augmented reality (AR) technology to integrate virtual objects into the real learning environment for language learning. The English AR classroom is constructed using the system prototyping method and evaluated by semi-structured in-depth interviews. According to the flow theory by Csikszenmihalyi in 1975 along with the immersive…

  8. The Virtual Employment Test Bed: An Immersive Synthetic Environment Allows Engineers to Test and Evaluate Material Solutions

    DTIC Science & Technology

    2014-04-03

    Synthetic environment allows engineers to test and evaluate material solutions. Authors: Robert DeMarco, MSBME; Gordon Cooke, MEME; John Riedener, MSSE... Robert DeMarco, MSBME, is a Project Lead Engineer and Certified LabVIEW Associate Developer. Gordon Cooke, MEME, is a Principal Investigator at the

  9. Development and Deployment of a Library of Industrially Focused Advanced Immersive VR Learning Environments

    ERIC Educational Resources Information Center

    Cameron, Ian; Crosthwaite, Caroline; Norton, Christine; Balliu, Nicoleta; Tadé, Moses; Hoadley, Andrew; Shallcross, David; Barton, Geoff

    2008-01-01

    This work presents a unique education resource for both process engineering students and the industry workforce. The learning environment is based around spherical imagery of real operating plants coupled with interactive embedded activities and content. This Virtual Reality (VR) learning tool has been developed by applying aspects of relevant…

  10. Blended Learning Environments: Using Social Networking Sites to Enhance the First Year Experience

    ERIC Educational Resources Information Center

    McCarthy, Joshua

    2010-01-01

    This study explores blending virtual and physical learning environments to enhance the experience of first year by immersing students into university culture through social and academic interaction between peers. It reports on the progress made from 2008 to 2009 using an existing academic platform, the first year design elective course…

  11. Scalable metadata environments (MDE): artistically impelled immersive environments for large-scale data exploration

    NASA Astrophysics Data System (ADS)

    West, Ruth G.; Margolis, Todd; Prudhomme, Andrew; Schulze, Jürgen P.; Mostafavi, Iman; Lewis, J. P.; Gossmann, Joachim; Singh, Rajvikram

    2014-02-01

    Scalable Metadata Environments (MDEs) are an artistic approach for designing immersive environments for large scale data exploration in which users interact with data by forming multiscale patterns that they alternately disrupt and reform. Developed and prototyped as part of an art-science research collaboration, we define an MDE as a 4D virtual environment structured by quantitative and qualitative metadata describing multidimensional data collections. Entire data sets (e.g., tens of millions of records) can be visualized and sonified at multiple scales and at different levels of detail so they can be explored interactively in real-time within MDEs. They are designed to reflect similarities and differences in the underlying data or metadata such that patterns can be visually/aurally sorted in an exploratory fashion by an observer who is not familiar with the details of the mapping from data to visual, auditory or dynamic attributes. While many approaches for visual and auditory data mining exist, MDEs are distinct in that they utilize qualitative and quantitative data and metadata to construct multiple interrelated conceptual coordinate systems. These "regions" function as conceptual lattices for scalable auditory and visual representations within virtual environments computationally driven by multi-GPU CUDA-enabled fluid dynamics systems.

  12. Machinima Interventions: Innovative Approaches to Immersive Virtual World Curriculum Integration

    ERIC Educational Resources Information Center

    Middleton, Andrew John; Mather, Richard

    2008-01-01

    The educational value of Immersive Virtual Worlds (IVWs) seems to be in their social immersive qualities and as an accessible simulation technology. In contrast to these synchronous applications this paper discusses the use of educational machinima developed in IVW virtual film sets. It also introduces the concept of media intervention, proposing…

  13. Design, Realization, and First Validation of an Immersive Web-Based Virtual Patient Simulator for Training Clinical Decisions in Surgery.

    PubMed

    Kleinert, Robert; Heiermann, Nadine; Wahba, Roger; Chang, De-Huan; Hölscher, Arnulf H; Stippel, Dirk L

    2015-01-01

    Immersive patient simulators (IPS) allow an illusionary immersion into a synthetic world where the user can freely navigate through a 3-dimensional environment similar to computer games. Playful learning with IPS allows internalization of medical workflows without harming real patients. Ideally, IPS show high student acceptance and can have a positive effect on knowledge gain. Development of IPS with high technical quality is resource intensive. Therefore most of the "high-fidelity" IPS are commercially driven. Usage of IPS in the daily curriculum is still rare. There is no academic-driven simulator that is freely accessible to every student and combines a high immersion grade with a profound amount of medical content. Therefore it was our aim to develop an academic-driven IPS prototype that is free to use and combines a high immersion grade with profound medical content. In addition, a first validation of the prototype was conducted. The conceptual design included definition of the following parameters: amount of curricular content, grade of technical quality, availability, and level of validation. A preliminary validation was done with 25 students. Students' opinion about acceptance was evaluated by a Likert-scale questionnaire. The effect on knowledge gain was determined by testing concordance and predictive validity. A custom-made simulator prototype (Artificial learning interface for clinical education [ALICE]) displays a virtual clinic environment that can be explored from a first-person view similar to a video game. By controlling an avatar, the user navigates through the environment, is able to treat virtual patients, and faces the consequences of different decisions. ALICE showed high student acceptance. There was a positive correlation for concordance validity and predictive validity. Simulator usage had a positive effect on the reproduction of trained content and declarative knowledge. We successfully developed a university-based IPS prototype (ALICE) with profound medical content. ALICE is a nonprofit simulator, easy to use, and showed high student acceptance; thus it potentially provides an additional tool for supporting student teaching in the daily clinical curriculum. Copyright © 2015 Association of Program Directors in Surgery. Published by Elsevier Inc. All rights reserved.

  14. Real-time recording and classification of eye movements in an immersive virtual environment.

    PubMed

    Diaz, Gabriel; Cooper, Joseph; Kit, Dmitry; Hayhoe, Mary

    2013-10-10

    Despite the growing popularity of virtual reality environments, few laboratories are equipped to investigate eye movements within these environments. This primer is intended to reduce the time and effort required to incorporate eye-tracking equipment into a virtual reality environment. We discuss issues related to the initial startup and provide algorithms necessary for basic analysis. Algorithms are provided for the calculation of gaze angle within a virtual world using a monocular eye-tracker in a three-dimensional environment. In addition, we provide algorithms for the calculation of the angular distance between the gaze and a relevant virtual object and for the identification of fixations, saccades, and pursuit eye movements. Finally, we provide tools that temporally synchronize gaze data and the visual stimulus and enable real-time assembly of a video-based record of the experiment using the Quicktime MOV format, available at http://sourceforge.net/p/utdvrlibraries/. This record contains the visual stimulus, the gaze cursor, and associated numerical data and can be used for data exportation, visual inspection, and validation of calculated gaze movements.
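
    The primer's classification step can be approximated with a simple velocity-threshold (I-VT-style) rule over successive gaze angles; the threshold and sampling rate below are assumptions for illustration, not values from the paper:

    ```python
    # Illustrative velocity-threshold classification of gaze samples into
    # fixations and saccades. Threshold and sampling rate are assumptions.
    import numpy as np

    def classify_ivt(gaze_deg: np.ndarray, sample_rate_hz: float, threshold_deg_per_s: float = 100.0):
        """gaze_deg: (N, 2) array of horizontal/vertical gaze angles in degrees.
        Returns one 'fixation'/'saccade' label per inter-sample interval."""
        deltas = np.diff(gaze_deg, axis=0)                        # angle change per sample
        speeds = np.linalg.norm(deltas, axis=1) * sample_rate_hz  # approx. angular speed in deg/s
        return ["saccade" if s > threshold_deg_per_s else "fixation" for s in speeds]

    if __name__ == "__main__":
        t = np.linspace(0, 1, 60)
        # One abrupt 15-degree horizontal gaze shift halfway through a 1-second recording.
        gaze = np.stack([np.where(t < 0.5, 0.0, 15.0), np.zeros_like(t)], axis=1)
        labels = classify_ivt(gaze, sample_rate_hz=60.0)
        print(labels.count("saccade"), "saccade interval(s)")
    ```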

  15. Real-time recording and classification of eye movements in an immersive virtual environment

    PubMed Central

    Diaz, Gabriel; Cooper, Joseph; Kit, Dmitry; Hayhoe, Mary

    2013-01-01

    Despite the growing popularity of virtual reality environments, few laboratories are equipped to investigate eye movements within these environments. This primer is intended to reduce the time and effort required to incorporate eye-tracking equipment into a virtual reality environment. We discuss issues related to the initial startup and provide algorithms necessary for basic analysis. Algorithms are provided for the calculation of gaze angle within a virtual world using a monocular eye-tracker in a three-dimensional environment. In addition, we provide algorithms for the calculation of the angular distance between the gaze and a relevant virtual object and for the identification of fixations, saccades, and pursuit eye movements. Finally, we provide tools that temporally synchronize gaze data and the visual stimulus and enable real-time assembly of a video-based record of the experiment using the Quicktime MOV format, available at http://sourceforge.net/p/utdvrlibraries/. This record contains the visual stimulus, the gaze cursor, and associated numerical data and can be used for data exportation, visual inspection, and validation of calculated gaze movements. PMID:24113087

  16. Virtual reality training improves students' knowledge structures of medical concepts.

    PubMed

    Stevens, Susan M; Goldsmith, Timothy E; Summers, Kenneth L; Sherstyuk, Andrei; Kihmm, Kathleen; Holten, James R; Davis, Christopher; Speitel, Daniel; Maris, Christina; Stewart, Randall; Wilks, David; Saland, Linda; Wax, Diane; Panaiotis; Saiki, Stanley; Alverson, Dale; Caudell, Thomas P

    2005-01-01

    Virtual environments can provide training that is difficult to achieve under normal circumstances. Medical students can work on high-risk cases in a realistic, time-critical environment, where students practice skills in a cognitively demanding and emotionally compelling situation. Research from cognitive science has shown that as students acquire domain expertise, their semantic organization of core domain concepts becomes more similar to that of an expert. In the current study, we hypothesized that students' knowledge structures would become more expert-like as a result of their diagnosing and treating a patient experiencing a hematoma within a virtual environment. Forty-eight medical students diagnosed and treated a hematoma case within a fully immersive virtual environment. Students' semantic organization of 25 case-related concepts was assessed prior to and after training. Students' knowledge structures became more integrated and similar to an expert knowledge structure of the concepts as a result of the learning experience. The methods used here for eliciting, representing, and evaluating knowledge structures offer a sensitive and objective means for evaluating student learning in virtual environments and medical simulations.
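
    One simple way to quantify how "expert-like" a knowledge structure is would be to correlate a student's pairwise concept-relatedness ratings with an expert's; the study's actual structural measure may differ (e.g., Pathfinder-style network similarity), so the sketch below is illustrative only:

    ```python
    # Illustrative similarity between a student's and an expert's knowledge structure:
    # correlate their pairwise concept-relatedness ratings. The study may use a
    # different structural measure; this is not the authors' method.
    import numpy as np

    def structure_similarity(student: np.ndarray, expert: np.ndarray) -> float:
        """student/expert: symmetric (C, C) relatedness matrices over the same concepts.
        Returns the Pearson correlation of their upper-triangular entries."""
        iu = np.triu_indices_from(student, k=1)
        return float(np.corrcoef(student[iu], expert[iu])[0, 1])

    if __name__ == "__main__":
        rng = np.random.default_rng(1)
        expert = rng.random((25, 25)); expert = (expert + expert.T) / 2
        noise = rng.random((25, 25)); noise = (noise + noise.T) / 2
        pre_training = 0.3 * expert + 0.7 * noise   # less expert-like organization
        post_training = 0.8 * expert + 0.2 * noise  # more expert-like organization
        print(structure_similarity(pre_training, expert), structure_similarity(post_training, expert))
    ```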

  17. The Virtual Pelvic Floor, a tele-immersive educational environment.

    PubMed Central

    Pearl, R. K.; Evenhouse, R.; Rasmussen, M.; Dech, F.; Silverstein, J. C.; Prokasy, S.; Panko, W. B.

    1999-01-01

    This paper describes the development of the Virtual Pelvic Floor, a new method of teaching the complex anatomy of the pelvic region utilizing virtual reality and advanced networking technology. Virtual reality technology allows improved visualization of three-dimensional structures over conventional media because it supports stereo vision, viewer-centered perspective, large angles of view, and interactivity. Two or more ImmersaDesk systems, drafting table format virtual reality displays, are networked together providing an environment where teacher and students share a high quality three-dimensional anatomical model, and are able to converse, see each other, and to point in three dimensions to indicate areas of interest. This project was realized by the teamwork of surgeons, medical artists and sculptors, computer scientists, and computer visualization experts. It demonstrates the future of virtual reality for surgical education and applications for the Next Generation Internet. PMID:10566378

  18. New tools for sculpting cranial implants in a shared haptic augmented reality environment.

    PubMed

    Ai, Zhuming; Evenhouse, Ray; Leigh, Jason; Charbel, Fady; Rasmussen, Mary

    2006-01-01

    New volumetric tools were developed for the design and fabrication of high quality cranial implants from patient CT data. These virtual tools replace time consuming physical sculpting, mold making and casting steps. The implant is designed by medical professionals in tele-immersive collaboration. Virtual clay is added in the virtual defect area on the CT data using the adding tool. With force feedback the modeler can feel the edge of the defect and fill only the space where no bone is present. A carving tool and a smoothing tool are then used to sculpt and refine the implant. To make a physical evaluation, the skull with simulated defect and the implant are fabricated via stereolithography to allow neurosurgeons to evaluate the quality of the implant. Initial tests demonstrate a very high quality fit. These new haptic volumetric sculpting tools are a critical component of a comprehensive tele-immersive system.

  19. Usability Comparisons of Head-Mounted vs. Stereoscopic Desktop Displays in a Virtual Reality Environment with Pain Patients.

    PubMed

    Tong, Xin; Gromala, Diane; Gupta, Dimple; Squire, Pam

    2016-01-01

    Researchers have shown that immersive Virtual Reality (VR) can serve as an unusually powerful pain control technique. However, research assessing the reported symptoms and negative effects of VR systems indicates that it is important to ascertain if these symptoms arise from the use of particular VR display devices, particularly for users who are deemed "at risk," such as chronic pain patients. Moreover, these patients have specific and often complex needs and requirements, and because basic issues such as 'comfort' may trigger anxiety or panic attacks, it is important to examine basic questions of the feasibility of using VR displays. Therefore, this repeated-measures experiment was conducted with two VR displays: the Oculus Rift's head-mounted display (HMD) and Firsthand Technologies' immersive desktop display, DeepStream3D. The characteristics of these immersive displays differ: one is worn, enabling patients to move their heads, while the other is peered into, allowing less head movement. To assess the severity of physical discomforts, 20 chronic pain patients tried both displays while watching a VR pain management demo in clinical settings. Results indicated that participants experienced higher levels of Simulator Sickness using the Oculus Rift HMD. However, results also indicated other preferences of the two VR displays among patients, including physical comfort levels and a sense of immersion. Few studies have been conducted that compare the usability of specific VR devices with chronic pain patients using a therapeutic virtual environment in pain clinics. Thus, the results may help clinicians and researchers to choose the most appropriate VR displays for chronic pain patients and guide VR designers to enhance the usability of VR displays for long-term pain management interventions.

  20. Immersive 3D Visualization of Astronomical Data

    NASA Astrophysics Data System (ADS)

    Schaaff, A.; Berthier, J.; Da Rocha, J.; Deparis, N.; Derriere, S.; Gaultier, P.; Houpin, R.; Normand, J.; Ocvirk, P.

    2015-09-01

    Immersive 3D visualization, or Virtual Reality in our study, was previously dedicated to specific uses (research, flight simulators, etc.). The required investment in infrastructure and its cost restricted it to large laboratories or companies. Lately we have seen the development of immersive 3D headsets intended for wide distribution, for example the Oculus Rift and the Sony Morpheus projects. The usual reaction is to say that these tools are primarily intended for games, since it is easy to imagine a player in a virtual environment and the added value over conventional 2D screens. Yet it is likely that there are many applications in the professional field as these tools become common. Introducing this technology into existing applications or new developments makes sense only if the interest is properly evaluated. The use in Astronomy for education is clear: it is easy to imagine mobile and lightweight planetariums or to reproduce poorly accessible environments (e.g., large instruments). In contrast, in the field of professional astronomy the use is probably less obvious, and it requires studies to determine the most appropriate applications and to assess the contributions compared to other display modes.

  1. Design and Implementation of a 3D Multi-User Virtual World for Language Learning

    ERIC Educational Resources Information Center

    Ibanez, Maria Blanca; Garcia, Jose Jesus; Galan, Sergio; Maroto, David; Morillo, Diego; Kloos, Carlos Delgado

    2011-01-01

    The best way to learn is by having a good teacher and the best language learning takes place when the learner is immersed in an environment where the language is natively spoken. 3D multi-user virtual worlds have been claimed to be useful for learning, and the field of exploiting them for education is becoming more and more active thanks to the…

  2. Virtualized Traffic: reconstructing traffic flows from discrete spatiotemporal data.

    PubMed

    Sewall, Jason; van den Berg, Jur; Lin, Ming C; Manocha, Dinesh

    2011-01-01

    We present a novel concept, Virtualized Traffic, to reconstruct and visualize continuous traffic flows from discrete spatiotemporal data provided by traffic sensors or generated artificially to enhance a sense of immersion in a dynamic virtual world. Given the positions of each car at two recorded locations on a highway and the corresponding time instances, our approach can reconstruct the traffic flows (i.e., the dynamic motions of multiple cars over time) between the two locations along the highway for immersive visualization of virtual cities or other environments. Our algorithm is applicable to high-density traffic on highways with an arbitrary number of lanes and takes into account the geometric, kinematic, and dynamic constraints on the cars. Our method reconstructs the car motion that automatically minimizes the number of lane changes, respects safety distance to other cars, and computes the acceleration necessary to obtain a smooth traffic flow subject to the given constraints. Furthermore, our framework can process a continuous stream of input data in real time, enabling the users to view virtualized traffic events in a virtual world as they occur. We demonstrate our reconstruction technique with both synthetic and real-world input. © 2011 IEEE Published by the IEEE Computer Society
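
    The kinematic core of such a reconstruction is making each car's motion consistent with its two recorded (position, time) samples. A toy version of that step, solving for a single constant acceleration and ignoring the paper's lane-change, spacing, and smoothness optimization:

    ```python
    # Toy kinematic step for trajectory reconstruction between two traffic sensors:
    # solve for the constant acceleration that moves a car from x0 at t0 (with known
    # entry speed v0) to x1 at t1. The paper's method additionally optimizes lane
    # changes, safety distances, and smoothness; none of that is modeled here.

    def constant_acceleration(x0: float, t0: float, v0: float, x1: float, t1: float) -> float:
        """From x1 = x0 + v0*T + 0.5*a*T^2 with T = t1 - t0, solve for a."""
        T = t1 - t0
        return 2.0 * (x1 - x0 - v0 * T) / (T * T)

    def position(x0: float, v0: float, a: float, t: float) -> float:
        return x0 + v0 * t + 0.5 * a * t * t

    if __name__ == "__main__":
        a = constant_acceleration(x0=0.0, t0=0.0, v0=20.0, x1=900.0, t1=30.0)
        print(a, position(0.0, 20.0, a, 30.0))  # acceleration, and a check that the car ends at 900 m
    ```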

  3. The impact of self-avatars on trust and collaboration in shared virtual environments.

    PubMed

    Pan, Ye; Steed, Anthony

    2017-01-01

    A self-avatar is known to have a potentially significant impact on the user's experience of the immersive content but it can also affect how users interact with each other in a shared virtual environment (SVE). We implemented an SVE for a consumer virtual reality system where each user's body could be represented by a jointed self-avatar that was dynamically controlled by head and hand controllers. We investigated the impact of a self-avatar on collaborative outcomes such as completion time and trust formation during competitive and cooperative tasks. We used two different embodiment levels: no self-avatar and self-avatar, and compared these to an in-person face to face version of the tasks. We found that participants could finish the task more quickly when they cooperated than when they competed, for both the self-avatar condition and the face to face condition, but not for the no self-avatar condition. In terms of trust formation, both the self-avatar condition and the face to face condition led to higher scores than the no self-avatar condition; however, collaboration style had no significant effect on trust built between partners. The results are further evidence of the importance of a self-avatar representation in immersive virtual reality.

  4. The impact of self-avatars on trust and collaboration in shared virtual environments

    PubMed Central

    Steed, Anthony

    2017-01-01

    A self-avatar is known to have a potentially significant impact on the user’s experience of the immersive content but it can also affect how users interact with each other in a shared virtual environment (SVE). We implemented an SVE for a consumer virtual reality system where each user’s body could be represented by a jointed self-avatar that was dynamically controlled by head and hand controllers. We investigated the impact of a self-avatar on collaborative outcomes such as completion time and trust formation during competitive and cooperative tasks. We used two different embodiment levels: no self-avatar and self-avatar, and compared these to an in-person face to face version of the tasks. We found that participants could finish the task more quickly when they cooperated than when they competed, for both the self-avatar condition and the face to face condition, but not for the no self-avatar condition. In terms of trust formation, both the self-avatar condition and the face to face condition led to higher scores than the no self-avatar condition; however, collaboration style had no significant effect on trust built between partners. The results are further evidence of the importance of a self-avatar representation in immersive virtual reality. PMID:29240837

  5. Perceiving interpersonally-mediated risk in virtual environments

    PubMed Central

    Portnoy, David B.; Smoak, Natalie D.; Marsh, Kerry L.

    2009-01-01

    Using virtual reality (VR) to examine risky behavior that is mediated by interpersonal contact, such as agreeing to have sex, drink, or smoke with someone, offers particular promise and challenges. Social contextual stimuli that might trigger impulsive responses can be carefully controlled in virtual environments (VE), and yet manipulations of risk might be implausible to participants if they do not feel sufficiently immersed in the environment. The current study examined whether individuals can display adequate evidence of presence in a VE that involved potential interpersonally-induced risk: meeting a potential dating partner. Results offered some evidence for the potential of VR for the study of such interpersonal risk situations. Participants’ reaction to the scenario and risk-associated responses to the situation suggested that the embodied nature of virtual reality overrode the reality of the risk’s impossibility, allowing participants to experience adequate situational embedding, or presence. PMID:20228871

  6. Perceiving interpersonally-mediated risk in virtual environments.

    PubMed

    Portnoy, David B; Smoak, Natalie D; Marsh, Kerry L

    2010-03-01

    Using virtual reality (VR) to examine risky behavior that is mediated by interpersonal contact, such as agreeing to have sex, drink, or smoke with someone, offers particular promise and challenges. Social contextual stimuli that might trigger impulsive responses can be carefully controlled in virtual environments (VE), and yet manipulations of risk might be implausible to participants if they do not feel sufficiently immersed in the environment. The current study examined whether individuals can display adequate evidence of presence in a VE that involved potential interpersonally-induced risk: meeting a potential dating partner. Results offered some evidence for the potential of VR for the study of such interpersonal risk situations. Participants' reaction to the scenario and risk-associated responses to the situation suggested that the embodied nature of virtual reality overrode the reality of the risk's impossibility, allowing participants to experience adequate situational embedding, or presence.

  7. Global Channels of Evidence for Learning and Assessment in Complex Game Environments

    ERIC Educational Resources Information Center

    Nelson, Brian C.; Erlandson, Benjamin; Denham, Andre

    2011-01-01

    In this paper, we take a designer's look at how the activities and data of learning and assessment can be structured in immersive virtual game environments called Massively Multi-Player Online Games (MMOG). In doing so, we examine the channels of evidence through which learning and assessment activities are derived in MMOGs, offering examples of…

  8. A Brave New World: Synchronous Environments in the Literature Classroom.

    ERIC Educational Resources Information Center

    Rozema, Robert

    The Internet may be the ultimate immersive and participatory medium, opening doors as it does to countless story worlds. As such, it has much to offer reading instruction in both elementary and secondary classrooms. This paper explores how a teacher used one web application--a text-based virtual environment called a MOO--to encourage his high…

  9. An Investigation of the Role of Background Music in IVWs for Learning

    ERIC Educational Resources Information Center

    Richards, Debbie; Fassbender, Eric; Bilgin, Ayse; Thompson, William Forde

    2008-01-01

    Empirical evidence is needed to corroborate the intuitions of gamers and game developers in understanding the benefits of Immersive Virtual Worlds (IVWs) as a learning environment and the role that music plays within these environments. We report an investigation to determine if background music of the genre typically found in computer-based…

  10. Force Feedback Joystick

    NASA Technical Reports Server (NTRS)

    1997-01-01

    I-FORCE, a computer peripheral from Immersion Corporation, was derived from virtual environment and human factors research at the Advanced Displays and Spatial Perception Laboratory at Ames Research Center in collaboration with Stanford University Center for Design Research. Entrepreneur Louis Rosenberg, a former Stanford researcher, now president of Immersion, collaborated with Dr. Bernard Adelstein at Ames on studies of perception in virtual reality. The result was an inexpensive way to incorporate motors and a sophisticated microprocessor into joysticks and other game controllers. These devices can emulate the feel of a car in a skid, a crashing plane, the bounce of a ball, compressed springs, or other physical phenomena. The first products incorporating I-FORCE technology include CH Products' line of FlightStick and CombatStick controllers.

  11. Virtually driving: are the driving environments "real enough" for exposure therapy with accident victims? An explorative study.

    PubMed

    Walshe, David; Lewis, Elizabeth; O'Sullivan, Kathleen; Kim, Sun I

    2005-12-01

    There is a small but growing body of research supporting the effectiveness of computer-generated environments in exposure therapy for driving phobia. However, research also suggests that difficulties can readily arise whereby patients do not become immersed in simulated driving scenes because the environments are not "real enough" to undertake exposure therapy. This limits the use of virtual reality (VR) exposure therapy as a treatment modality for driving phobia. The aim of this study was to investigate whether a clinically acceptable immersion/presence rate of >80% could be achieved for driving phobia subjects in computer-generated environments by modifying external factors in the driving environment. Eleven patients, referred from the Accident and Emergency Department of a general hospital or from their General Practitioner following a motor vehicle accident and meeting DSM-IV criteria for Specific Phobia (driving), were exposed to a computer-generated driving environment using computer driving games (London Racer/Midtown Madness). In an attempt to make the driving environments "real enough," external factors were modified by (a) projection of images onto a large screen, (b) viewing the scene through a windscreen, (c) using car seats for both driver and passenger, and (d) increasing vibration sense through use of more powerful subwoofers. Patients undertook a trial session involving driving through computer environments with graded risk of an accident. "Immersion/presence" was operationally defined as a subjective rating by the subject that the environment "feels real," together with an increase in subjective units of distress (SUD) ratings of >3 and/or an increase in heart rate of >15 beats per minute (BPM). Ten of 11 (91%) of the driving phobic subjects met the criteria for immersion/presence in the driving environment, enabling progression to VR exposure therapy. These provisional findings suggest that the paradigm adopted in this study might be an effective and relatively inexpensive means of developing driving environments "real enough" to make VR exposure therapy a viable treatment modality for driving phobia following a motor vehicle accident (MVA).
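
    The operational immersion/presence criterion above reduces to a simple decision rule. The following sketch is purely illustrative (the function name and inputs are hypothetical, not from the study); it assumes the "feels real" judgment is recorded as a boolean and that the SUD and heart-rate values are pre-computed increases over baseline.

      def meets_immersion_criterion(feels_real, sud_increase, hr_increase_bpm):
          """Hedged encoding of the study's operational definition: the scene must
          subjectively 'feel real' AND the participant must show a SUD increase of
          more than 3 and/or a heart-rate increase of more than 15 BPM."""
          return feels_real and (sud_increase > 3 or hr_increase_bpm > 15)

      # Example: rates the scene as real, SUD +4, heart rate +10 BPM -> criterion met
      print(meets_immersion_criterion(True, 4, 10))  # True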

  12. Validation of an immersive virtual reality system for training near and far space neglect in individuals with stroke: a pilot study.

    PubMed

    Yasuda, Kazuhiro; Muroi, Daisuke; Ohira, Masahiro; Iwata, Hiroyasu

    2017-10-01

    Unilateral spatial neglect (USN) is defined as an impaired ability to attend to and perceive one side of space, and, when present, it interferes seriously with daily life. These symptoms can exist for near and far spaces combined or independently, and it is important to provide effective intervention for both near and far space neglect. The purpose of this pilot study was to propose an immersive virtual reality (VR) rehabilitation program, using a head-mounted display, that is able to train both near and far space neglect, and to validate the immediate effect of the VR program on both. Ten USN patients underwent the VR program in a pre-post design with no control group. We developed visual search and reaching tasks in the virtual environment using an immersive VR system. Behavioral Inattention Test (BIT) scores obtained before and immediately after the VR program were compared and revealed that far space neglect, but not near space neglect, improved promptly after the VR program. This effect for far space neglect was observed in the cancellation task, but not in the line bisection task. Positive effects of the immersive VR program for far space neglect are suggested by the results of the present pilot study. However, further studies with rigorous designs are needed to validate its clinical effectiveness.

  13. Eyewitness Memory in Face-to-Face and Immersive Avatar-to-Avatar Contexts.

    PubMed

    Taylor, Donna A; Dando, Coral J

    2018-01-01

    Technological advances offer possibilities for innovation in the way eyewitness testimony is elicited. Typically, this occurs face-to-face. We investigated whether a virtual environment, where interviewer and eyewitness communicate as avatars, might confer advantages by attenuating the social and situational demands of a face-to-face interview, releasing more cognitive resources for invoking episodic retrieval mode. In conditions of intentional encoding, eyewitnesses were interviewed 48 h later, either face-to-face or in a virtual environment (N = 38). Participants in the virtual environment significantly outperformed those interviewed face-to-face on all episodic performance measures: improved correct reporting, reduced errors, and increased accuracy. Participants reported finding it easier to admit not remembering event information to the avatar, and finding the avatar easier to talk to. These novel findings and our pattern of retrieval results indicate the potential of avatar-to-avatar communication in virtual environments and provide impetus for further research investigating eyewitness cognition in contemporary retrieval contexts.

  14. BIM Based Virtual Environment for Fire Emergency Evacuation

    PubMed Central

    Rezgui, Yacine; Ong, Hoang N.

    2014-01-01

    Recent building emergency management research has highlighted the need for the effective utilization of dynamically changing building information. BIM (building information modelling) can play a significant role in this process due to its comprehensive and standardized data format and integrated process. This paper introduces a BIM based virtual environment supported by virtual reality (VR) and a serious game engine to address several key issues for building emergency management, for example, timely two-way information updating and better emergency awareness training. The focus of this paper lies in how to utilize BIM as a comprehensive building information provider, working with virtual reality technologies to build an adaptable immersive serious game environment that provides real-time fire evacuation guidance. The innovation lies in the seamless integration of BIM with a serious game based virtual reality (VR) environment, aimed at practical problem solving by leveraging state-of-the-art computing technologies. The system has been tested for its robustness and functionality against the development requirements, and the results showed promising potential to support more effective emergency management. PMID:25197704

  15. Comparison of path visualizations and cognitive measures relative to travel technique in a virtual environment.

    PubMed

    Zanbaka, Catherine A; Lok, Benjamin C; Babu, Sabarish V; Ulinski, Amy C; Hodges, Larry F

    2005-01-01

    We describe a between-subjects experiment that compared four different methods of travel and their effect on cognition and paths taken in an immersive virtual environment (IVE). Participants answered a set of questions based on Crook's condensation of Bloom's taxonomy that assessed their cognition of the IVE with respect to knowledge, understanding and application, and higher mental processes. Participants also drew a sketch map of the IVE and the objects within it. The users' sense of presence was measured using the Steed-Usoh-Slater Presence Questionnaire. The participants' position and head orientation were automatically logged during their exposure to the virtual environment. These logs were later used to create visualizations of the paths taken. Path analysis, such as exploring the overlaid path visualizations and dwell data, revealed further differences among the travel techniques. Our results suggest that, for applications where problem solving and evaluation of information are important or where the opportunity to train is minimal, having a large tracked space so that the participant can walk around the virtual environment provides benefits over common virtual travel techniques.
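
    The dwell data mentioned above can be derived from logged position samples by accumulating the time spent near objects of interest. The sketch below is a minimal illustration under assumptions (2D positions, a fixed proximity radius, hypothetical names); the study's actual dwell metric may be defined differently.

      import math

      def dwell_times(log, objects, radius=1.5):
          """Accumulate time spent within `radius` meters of each object of interest.
          `log` is a list of (timestamp_s, x, y) samples; `objects` maps a name to an
          (x, y) position. A minimal sketch, not the study's implementation."""
          totals = {name: 0.0 for name in objects}
          for (t0, x0, y0), (t1, _x1, _y1) in zip(log, log[1:]):
              dt = t1 - t0
              for name, (ox, oy) in objects.items():
                  if math.hypot(x0 - ox, y0 - oy) <= radius:
                      totals[name] += dt
          return totals

      # Example usage with a toy three-sample log (timestamps in seconds)
      log = [(0.0, 0.0, 0.0), (1.0, 0.5, 0.0), (2.0, 3.0, 0.0)]
      print(dwell_times(log, {"painting": (0.0, 0.5)}))  # {'painting': 2.0}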

  16. Manipulation of volumetric patient data in a distributed virtual reality environment.

    PubMed

    Dech, F; Ai, Z; Silverstein, J C

    2001-01-01

    Due to increases in network speed and bandwidth, distributed exploration of medical data in immersive Virtual Reality (VR) environments is becoming increasingly feasible. The volumetric display of radiological data in such environments presents a unique set of challenges. The sheer size and complexity of the datasets involved not only make them difficult to transmit to remote sites, but these datasets also require extensive user interaction in order to make them understandable to the investigator and manageable to the rendering hardware. A sophisticated VR user interface is required in order for the clinician to focus on the aspects of the data that will provide educational and/or diagnostic insight. We will describe a software system of data acquisition, data display, Tele-Immersion, and data manipulation that supports interactive, collaborative investigation of large radiological datasets. The hardware required in this strategy is still at the high end of the graphics workstation market. Future software ports to Linux and NT, along with the rapid development of PC graphics cards, open the possibility for later work with Linux or NT PCs and PC clusters.

  17. A comparison of older adults' subjective experiences with virtual and real environments during dynamic balance activities.

    PubMed

    Proffitt, Rachel; Lange, Belinda; Chen, Christina; Winstein, Carolee

    2015-01-01

    The purpose of this study was to explore the subjective experience of older adults interacting with both virtual and real environments. Thirty healthy older adults engaged with real and virtual tasks of similar motor demands: reaching to a target in standing and stepping stance. Immersive tendencies and absorption scales were administered before the session. Game engagement and experience questionnaires were completed after each task, followed by a semistructured interview at the end of the testing session. Data were analyzed respectively using paired t tests and grounded theory methodology. Participants preferred the virtual task over the real task. They also reported an increase in presence and absorption with the virtual task, describing an external focus of attention. Findings will be used to inform future development of appropriate game-based balance training applications that could be embedded in the home or community settings as part of evidence-based fall prevention programs.

  18. Image-guided surgery.

    PubMed

    Wagner, A; Ploder, O; Enislidis, G; Truppe, M; Ewers, R

    1996-04-01

    Interventional video tomography (IVT), a new imaging modality, achieves virtual visualization of anatomic structures in three dimensions for intraoperative stereotactic navigation. Partial immersion into a virtual data space, which is orthotopically coregistered to the surgical field, enhances, by means of a see-through head-mounted display (HMD), the surgeon's visual perception and technique by providing visual access to nonvisual data of anatomy, physiology, and function. The presented cases document the potential of augmented reality environments in maxillofacial surgery.

  19. The development of the virtual reality system for the treatment of the fears of public speaking.

    PubMed

    Jo, H J; Ku, J H; Jang, D P; Shin, M B; Ahn, H B; Lee, J M; Cho, B H; Kim, S I

    2001-01-01

    The fear of public speaking is a type of social phobia. Patients with a fear of public speaking show symptoms such as shame and timidity in daily personal relationships. They are afraid that the other person will be puzzled or feel insulted, and they also fear being underestimated for their mistakes. For the treatment of the fear of public speaking, cognitive-behavioral therapy has generally been used; it gradually exposes patients to situations that induce their fears until they can finally overcome them. Recently, virtual reality technology has been introduced as an alternative means of providing phobic situations. In this study, we developed a public speaking simulator and virtual environments for the treatment of the fear of public speaking. A head-mounted display, a head tracker, and a 3-dimensional sound system were used for the immersive virtual environment. The imagery of the virtual environment consists of a seminar room and eight virtual audience members. The patient speaks in front of this virtual audience, and the therapist can control the motions, facial expressions, sounds, and voice of each virtual audience member.

  20. Using Auditory Cues to Perceptually Extract Visual Data in Collaborative, Immersive Big-Data Display Systems

    NASA Astrophysics Data System (ADS)

    Lee, Wendy

    The advent of multisensory display systems, such as virtual and augmented reality, has fostered a new relationship between humans and space. Not only can these systems mimic real-world environments, but they can also create a new space typology made solely of data. In these spaces, two-dimensional information is displayed in three dimensions, requiring human senses to be used to understand virtual, attention-based elements. Studies in the field of big data have predominantly focused on visual representations and extractions of information, with little focus on sound. The goal of this research is to evaluate the most efficient methods of perceptually extracting visual data using auditory stimuli in immersive environments. Using Rensselaer's CRAIVE-Lab, a virtual reality space with 360-degree panorama visuals and an array of 128 loudspeakers, participants were asked questions based on complex visual displays using a variety of auditory cues ranging from sine tones to camera shutter sounds. Analysis of the speed and accuracy of participant responses revealed that auditory cues that were more favorable for localization and were positively perceived were best for data extraction and could help create more user-friendly systems in the future.

  1. Immersive 3D geovisualisation in higher education

    NASA Astrophysics Data System (ADS)

    Philips, Andrea; Walz, Ariane; Bergner, Andreas; Graeff, Thomas; Heistermann, Maik; Kienzler, Sarah; Korup, Oliver; Lipp, Torsten; Schwanghart, Wolfgang; Zeilinger, Gerold

    2014-05-01

    Through geovisualisation we explore spatial data, we analyse it with respect to specific questions, we synthesise results, and we present and communicate them to a specific audience (MacEachren & Kraak 1997). After centuries of paper maps, the means to represent and visualise our physical environment and its abstract qualities have changed dramatically since the 1990s - and so have the methods for using geovisualisation in teaching. Whereas some people might still consider the traditional classroom the ideal setting for teaching and learning geographic relationships and their mapping, we used a 3D CAVE (computer-animated virtual environment) as the environment for a problem-oriented learning project called "GEOSimulator". Focussing on this project, we empirically investigated whether a technological advance like the CAVE makes 3D visualisation, including 3D geovisualisation, an important tool not only for businesses (Abulrub et al. 2012) and the public (Wissen et al. 2008), but also for educational purposes, for which it has hardly been used so far. The 3D CAVE is a three-sided visualisation platform that allows for immersive and stereoscopic visualisation of observed and simulated spatial data. We examined the benefits of immersive 3D visualisation for geographic research and education and synthesized three fundamental technology-based visual aspects: First, the conception and comprehension of space and location do not need to be generated but are instantaneously and intuitively present through stereoscopy. Second, optical immersion into virtual reality strengthens this spatial perception, which is particularly important for complex 3D geometries. And third, a significant benefit is interactivity, which is enhanced through immersion and allows for multi-discursive and dynamic data exploration and knowledge transfer. Based on our problem-oriented learning project, which concentrates on a case study on flood risk management at the Wilde Weisseritz in Germany, a river that significantly contributed to the hundred-year flooding in Dresden in 2002, we empirically evaluated the usefulness of this immersive 3D technology for learning success. Results show that immersive 3D geovisualisation has educational and content-related advantages over 2D geovisualisations through the benefits mentioned above. This innovative way of geovisualisation is thus not only entertaining and motivating for students, but can also be constructive for research studies by, for instance, facilitating the study of complex environments or decision-making processes.

  2. Building a Virtual Environment for Diabetes Self-Management Education and Support

    PubMed Central

    Johnson, Constance; Feenan, Kevin; Setliff, Glenn; Pereira, Katherine; Hassell, Nancy; Beresford, Henry F.; Epps, Shelly; Nicollerat, Janet; Tatum, William; Feinglos, Mark; Vorderstrasse, Allison

    2015-01-01

    The authors developed an immersive diabetes community to provide diabetes self-management education and support for adults with type 2 diabetes. In this article the authors describe the procedures used to develop this virtual environment (VE). Second Life Impacts Diabetes Education & Self-Management (SLIDES), the VE for our diabetes community, was built in Second Life. Social Cognitive Theory, behavioral principles, and key aspects of virtual environments related to usability were applied in the development of this VE. Collaboration between researchers, clinicians and information technology (IT) specialists occurred throughout the development process. An interactive community was successfully built and utilized to provide diabetes self-management education and support. VEs for health applications may be innovative and enticing, yet substantial effort, expertise, and usability factors must be considered in the development of these environments for health care consumers. PMID:25699133

  3. Restorative effects of virtual nature settings.

    PubMed

    Valtchanov, Deltcho; Barton, Kevin R; Ellard, Colin

    2010-10-01

    Previous research regarding the potential benefits of exposing individuals to surrogate nature (photographs and videos) has found that such immersion results in restorative effects such as increased positive affect, decreased negative affect, and decreased stress. In the current experiment, we examined whether immersion in a virtual computer-generated nature setting could produce restorative effects. Twenty-two participants were equally divided between two conditions, while controlling for gender. In each condition, participants performed a stress-induction task, and were then immersed in virtual reality (VR) for 10 minutes. The control condition featured a slide show in VR, and the nature experimental condition featured an active exploration of a virtual forest. Participants in the nature condition were found to exhibit increased positive affect and decreased stress after immersion in VR when compared to those in the control condition. The results suggest that immersion in virtual nature settings has similar beneficial effects as exposure to surrogate nature. These results also suggest that VR can be used as a tool to study and understand restorative effects.

  4. Influence of moving visual environment on sit-to-stand kinematics in children and adults.

    PubMed

    Slaboda, Jill C; Barton, Joseph E; Keshner, Emily A

    2009-08-01

    The effect of visual field motion on the sit-to-stand kinematics of adults and children was investigated. Children (8 to 12 years of age) and adults (21 to 49 years of age) were seated in a virtual environment that rotated in the pitch and roll directions. Participants stood up either (1) concurrent with onset of visual motion or (2) after an immersion period in the moving visual environment, and (3) without visual input. Angular velocities of the head with respect to the trunk, and trunk with respect to the environment, were calculated, as was head and trunk center of mass. Both adults and children reduced head and trunk angular velocity after immersion in the moving visual environment. Unlike adults, children demonstrated significant differences in displacement of the head center of mass during the immersion and concurrent trials when compared to trials without visual input. Results suggest a time-dependent effect of vision on sit-to-stand kinematics in adults, whereas children are influenced by the immediate presence or absence of vision.
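
    The relative angular velocities described above can be approximated from sampled orientation angles by differencing the head and trunk time series. The sketch below is a simplified, single-axis illustration with hypothetical inputs; the study itself used full 3-D video motion analysis.

      import numpy as np

      def relative_angular_velocity(head_angles_deg, trunk_angles_deg, dt):
          """Finite-difference estimate of head-relative-to-trunk angular velocity
          (deg/s) from two 1-D time series of pitch (or roll) angles sampled every
          `dt` seconds. A single-axis sketch only."""
          relative = np.asarray(head_angles_deg, float) - np.asarray(trunk_angles_deg, float)
          return np.gradient(relative, dt)

      # Example: head pitches forward faster than the trunk, sampled at 10 Hz
      head = [0, 5, 12, 20, 30, 42]
      trunk = [0, 2, 5, 9, 14, 20]
      print(relative_angular_velocity(head, trunk, dt=0.1))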

  5. A Nationwide Experimental Multi-Gigabit Network

    DTIC Science & Technology

    2003-03-01

    television and cinema, and to real-time interactive teleconferencing. There is another variable which affects this happy growth in network bandwidth and ... render large scientific data sets with interactive frame rates on the desktop or in an immersive virtual reality (VR) environment. In our design, we

  6. Considerations for the future development of virtual technology as a rehabilitation tool

    PubMed Central

    Kenyon, Robert V; Leigh, Jason; Keshner, Emily A

    2004-01-01

    Background Virtual environments (VE) are a powerful tool for various forms of rehabilitation. Coupling VE with high-speed networking [Tele-Immersion] that approaches speeds of 100 Gb/sec can greatly expand its influence in rehabilitation. Accordingly, these new networks will permit various peripherals attached to computers on this network to be connected and to act as fast as if connected to a local PC. This innovation may soon allow the development of previously unheard-of networked rehabilitation systems. Rapid advances in this technology need to be coupled with an understanding of how human behavior is affected when immersed in the VE. Methods This paper will discuss various forms of VE that are currently available for rehabilitation. The characteristics of these new networks will be explained, and we will examine how such networks might be used to extend the rehabilitation clinic to remote areas. In addition, we will present data from an immersive dynamic virtual environment united with motion of a posture platform to record biomechanical and physiological responses to combined visual, vestibular, and proprioceptive inputs. A 6 degree-of-freedom force plate provides measurements of moments exerted on the base of support. Kinematic data from the head, trunk, and lower limb were collected using 3-D video motion analysis. Results Our data suggest that when there is a confluence of meaningful inputs, neither visual, vestibular, nor proprioceptive inputs are suppressed in healthy adults; the postural response is modulated by all existing sensory signals in a non-additive fashion. Individual perception of the sensory structure appears to be a significant component of the response to these protocols and underlies much of the observed response variability. Conclusion The ability to provide new technology for rehabilitation services is emerging as an important option for clinicians and patients. The use of data mining software would help analyze the incoming data to provide both the patient and the therapist with an evaluation of the current treatment and the modifications needed for future therapies. Quantification of individual perceptual styles in the VE will support development of individualized treatment programs. The virtual environment can be a valuable tool for therapeutic interventions that require adaptation to complex, multimodal environments. PMID:15679951

  7. Enabling scientific workflows in virtual reality

    USGS Publications Warehouse

    Kreylos, O.; Bawden, G.; Bernardin, T.; Billen, M.I.; Cowgill, E.S.; Gold, R.D.; Hamann, B.; Jadamec, M.; Kellogg, L.H.; Staadt, O.G.; Sumner, D.Y.

    2006-01-01

    To advance research and improve the scientific return on data collection and interpretation efforts in the geosciences, we have developed methods of interactive visualization, with a special focus on immersive virtual reality (VR) environments. Earth sciences employ a strongly visual approach to the measurement and analysis of geologic data due to the spatial and temporal scales over which such data range. As observations and simulations increase in size and complexity, the Earth sciences are challenged to manage and interpret increasing amounts of data. Reaping the full intellectual benefits of immersive VR requires us to tailor exploratory approaches to scientific problems. These applications build on the visualization method's strengths, using both 3D perception and interaction with data and models, to take advantage of the skills and training of the geological scientists exploring their data in the VR environment. This interactive approach has enabled us to develop a suite of tools that are adaptable to a range of problems in the geosciences and beyond. Copyright © 2008 by the Association for Computing Machinery, Inc.

  8. Playing in or out of character: user role differences in the experience of interactive storytelling.

    PubMed

    Roth, Christian; Vermeulen, Ivar; Vorderer, Peter; Klimmt, Christoph; Pizzi, David; Lugrin, Jean-Luc; Cavazza, Marc

    2012-11-01

    Interactive storytelling (IS) is a promising new entertainment technology synthesizing preauthored narrative with dynamic user interaction. Existing IS prototypes employ different modes to involve users in a story, ranging from individual avatar control to comprehensive control over the virtual environment. The current experiment tested whether different player modes (exerting local vs. global influence) yield different user experiences (e.g., senses of immersion vs. control). A within-subject design involved 34 participants playing the cinematic IS drama "Emo Emma" (1) in both the local (actor) and the global (ghost) mode. The latter mode allowed free movement in the virtual environment and hidden influence on characters, objects, and story development. As expected, control-related experiential qualities (effectance, autonomy, flow, and pride) were more intense for players in the global (ghost) mode. Immersion-related experiences did not differ across modes. Additionally, men preferred the sense of command facilitated by the ghost mode, whereas women preferred the sense of involvement facilitated by the actor mode.

  9. Behavioral compliance for dynamic versus static signs in an immersive virtual environment.

    PubMed

    Duarte, Emília; Rebelo, Francisco; Teles, Júlia; Wogalter, Michael S

    2014-09-01

    This study used an immersive virtual environment (IVE) to examine how dynamic features in signage affect behavioral compliance during a work-related task and an emergency egress. Ninety participants performed a work-related task followed by an emergency egress. Compliance with uncued and cued safety signs was assessed prior to an explosion/fire involving egress with exit signs. Although dynamic presentation produced the highest compliance, the difference between dynamic and static presentation was only statistically significant for uncued signs. Uncued signs, both static and dynamic, were effective in changing behavior compared to no/minimal signs. Findings are explained based on sign salience and on task differences. If signs must capture attention while individuals are attending to other tasks, salient (e.g., dynamic) signs are useful in benefiting compliance. This study demonstrates the potential for IVEs to serve as a useful tool in behavioral compliance research. Copyright © 2013 Elsevier Ltd and The Ergonomics Society. All rights reserved.

  10. Virtual reality: a reality for future military pilotage?

    NASA Astrophysics Data System (ADS)

    McIntire, John P.; Martinsen, Gary L.; Marasco, Peter L.; Havig, Paul R.

    2009-05-01

    Virtual reality (VR) systems provide exciting new ways to interact with information and with the world. The visual VR environment can be synthetic (computer generated) or be an indirect view of the real world using sensors and displays. With the potential opportunities of a VR system, the question arises about what benefits or detriments a military pilot might incur by operating in such an environment. Immersive and compelling VR displays could be accomplished with an HMD (e.g., imagery on the visor), large area collimated displays, or by putting the imagery on an opaque canopy. But what issues arise when, instead of viewing the world directly, a pilot views a "virtual" image of the world? Is 20/20 visual acuity in a VR system good enough? To deliver this acuity over the entire visual field would require over 43 megapixels (MP) of display surface for an HMD or about 150 MP for an immersive CAVE system, either of which presents a serious challenge with current technology. Additionally, the same number of sensor pixels would be required to drive the displays to this resolution (and formidable network architectures required to relay this information), or massive computer clusters are necessary to create an entirely computer-generated virtual reality with this resolution. Can we presently implement such a system? What other visual requirements or engineering issues should be considered? With the evolving technology, there are many technological issues and human factors considerations that need to be addressed before a pilot is placed within a virtual cockpit.
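
    The quoted pixel counts follow from assuming roughly one pixel per arcminute of visual angle (20/20 acuity), i.e. 60 pixels per degree. The field-of-view figures in the sketch below are assumptions chosen to reproduce numbers of the same order as those in the abstract; they are not values stated by the authors.

      # Back-of-the-envelope check of the display resolution figures quoted above.
      PX_PER_DEG = 60                                    # ~1 arcminute per pixel
      px_per_sq_deg = PX_PER_DEG ** 2                    # 3600 pixels per square degree

      hmd_fov_sq_deg = 120 * 100                         # assumed HMD field of view (deg x deg)
      print(f"HMD: {hmd_fov_sq_deg * px_per_sq_deg / 1e6:.1f} MP")        # ~43 MP

      full_sphere_sq_deg = 41_253                        # whole sphere in square degrees
      print(f"Full surround: {full_sphere_sq_deg * px_per_sq_deg / 1e6:.1f} MP")  # ~149 MP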

  11. STRIVE: Stress Resilience In Virtual Environments: a pre-deployment VR system for training emotional coping skills and assessing chronic and acute stress responses.

    PubMed

    Rizzo, Albert; Buckwalter, J Galen; John, Bruce; Newman, Brad; Parsons, Thomas; Kenny, Patrick; Williams, Josh

    2012-01-01

    The incidence of posttraumatic stress disorder (PTSD) in returning OEF/OIF military personnel is creating a significant healthcare challenge. This has served to motivate research on how to better develop and disseminate evidence-based treatments for PTSD. One emerging form of treatment for combat-related PTSD that has shown promise involves the delivery of exposure therapy using immersive Virtual Reality (VR). Initial outcomes from open clinical trials have been positive, and fully randomized controlled trials are currently in progress to further validate this approach. Based on our research group's initial positive outcomes using VR to emotionally engage and successfully treat persons undergoing exposure therapy for PTSD, we have begun development of a similar VR-based approach to deliver stress resilience training to military service members prior to their initial deployment. The Stress Resilience In Virtual Environments (STRIVE) project aims to create a set of combat simulations (derived from our existing Virtual Iraq/Afghanistan exposure therapy system) that are part of a multi-episode narrative experience. Users can be immersed within challenging combat contexts and interact with virtual characters within these episodes as part of an experiential learning approach for training a range of psychoeducational and cognitive-behavioral emotional coping strategies believed to enhance stress resilience. The STRIVE project aims to present this approach to service members prior to deployment as part of a program designed to better prepare military personnel for the types of emotional challenges that are inherent in the combat environment. During these virtual training experiences, users are monitored physiologically as part of a larger investigation into the biomarkers of the stress response. One such construct, Allostatic Load, is being directly investigated via physiological and neuro-hormonal analysis from specimen collections taken immediately before and after engagement in the STRIVE virtual experience.

  12. Workshop Report on Virtual Worlds and Immersive Environments

    NASA Technical Reports Server (NTRS)

    Langhoff, Stephanie R.; Cowan-Sharp, Jessy; Dodson, Karen E.; Damer, Bruce; Ketner, Bob

    2009-01-01

    The workshop revolved around three framing ideas or scenarios about the evolution of virtual environments: 1. Remote exploration: The ability to create high fidelity environments rendered from external data or models such that exploration, design and analysis that is truly interoperable with the physical world can take place within them. 2. We all get to go: The ability to engage anyone in being a part of or contributing to an experience (such as a space mission), no matter their training or location. It is the creation of a new paradigm for education, outreach, and the conduct of science in society that is truly participatory. 3. Become the data: A vision of a future where boundaries between the physical and the virtual have ceased to be meaningful. What would this future look like? Is this plausible? Is it desirable? Why and why not?

  13. Visualizing the process of interaction in a 3D environment

    NASA Astrophysics Data System (ADS)

    Vaidya, Vivek; Suryanarayanan, Srikanth; Krishnan, Kajoli; Mullick, Rakesh

    2007-03-01

    As the imaging modalities used in medicine transition to increasingly three-dimensional data, the question of how best to interact with and analyze this data becomes ever more pressing. Immersive virtual reality systems seem to hold promise in tackling this, but how individuals learn and interact in these environments is not fully understood. Here we will attempt to show some methods by which user interaction in a virtual reality environment can be visualized and how this can allow us to gain greater insight into the process of interaction/learning in these systems. Also explored is the possibility of using this method to improve understanding and management of ergonomic issues within an interface.

  14. Not Just a Game … When We Play Together, We Learn Together: Interactive Virtual Environments and Gaming Engines for Geospatial Visualization

    NASA Astrophysics Data System (ADS)

    Shipman, J. S.; Anderson, J. W.

    2017-12-01

    An ideal tool for ecologists and land managers to investigate the impacts of both projected environmental changes and policy alternatives is the creation of immersive, interactive, virtual landscapes. As a new frontier in visualizing and understanding geospatial data, virtual landscapes require a new toolbox for data visualization that includes traditional GIS tools and uncommon tools such as the Unity3d game engine. Game engines provide capabilities not only to explore data but also to build and interact with dynamic models collaboratively. These virtual worlds can be used to display and illustrate data in ways that are often more understandable and plausible to both stakeholders and policy makers than traditional maps. Within this context, we will present funded research that has been developed utilizing virtual landscapes for geographic visualization and decision support among varied stakeholders. We will highlight the challenges and lessons learned when developing interactive virtual environments that require large multidisciplinary team efforts with varied competences. The results will emphasize the importance of visualization and interactive virtual environments and the link with emerging research disciplines within Visual Analytics.

  15. Design of an immersive simulator for assisted power wheelchair driving.

    PubMed

    Devigne, Louise; Babel, Marie; Nouviale, Florian; Narayanan, Vishnu K; Pasteau, Francois; Gallien, Philippe

    2017-07-01

    Driving a power wheelchair is a difficult and complex visual-cognitive task. As a result, some people with visual and/or cognitive disabilities cannot access the benefits of a power wheelchair because their impairments prevent them from driving safely. In order to improve their access to mobility, we have previously designed a semi-autonomous assistive wheelchair system that progressively corrects the trajectory and smoothly avoids obstacles as the user manually drives the wheelchair. Developing and testing such systems for wheelchair driving assistance requires a significant amount of material resources and clinician time. With Virtual Reality technology, prototypes can be developed and tested in a risk-free and highly flexible Virtual Environment before equipping and testing a physical prototype. Additionally, users can "virtually" test and train more easily during the development process. In this paper, we introduce a power wheelchair driving simulator allowing the user to navigate with a standard wheelchair in an immersive 3D Virtual Environment. The simulation framework is designed to be flexible so that we can use different control inputs. In order to validate the framework, we first performed tests on the simulator with able-bodied participants, during which each user's Quality of Experience (QoE) was assessed through a set of questionnaires. Results show that the simulator is a promising tool for future work, as it generates a good sense of presence and requires rather low cognitive effort from users.

  16. Evaluating an immersive virtual environment prototyping and simulation system

    NASA Astrophysics Data System (ADS)

    Nemire, Kenneth

    1997-05-01

    An immersive virtual environment (IVE) modeling and simulation tool is being developed for designing advanced weapon and training systems. One unique feature of the tool is that the design itself, and not just visualization of the design, is accomplished with the IVE tool. Acceptance of IVE tools requires comparisons with current commercial applications. In this pilot study, expert users of a popular desktop 3D graphics application performed identical modeling and simulation tasks using both the desktop and IVE applications. The IVE tool consisted of a head-mounted display, 3D spatialized sound, spatial trackers on head and hands, instrumented gloves, and a simulated speech recognition system. The results are preliminary because performance from only four users has been examined. When using the IVE system, users completed the tasks to criteria in less time than when using the desktop application. Subjective ratings of the visual displays in each system were similar. Ratings for the desktop controls were higher than for the IVE controls. Ratings of immersion and user enjoyment were higher for the IVE than for the desktop application. These results are particularly remarkable because participants had used the desktop application regularly for three to five years and the prototype IVE tool for only three to six hours.

  17. Experiencing Soil Science from your office through virtual experiences

    NASA Astrophysics Data System (ADS)

    Beato, M. Carmen; González-Merino, Ramón; Campillo, M. Carmen; Fernández-Ahumada, Elvira; Ortiz, Leovigilda; Taguas, Encarnación V.; Guerrero, José Emilio

    2017-04-01

    Currently, numerous tools based on new information and communication technologies offer a wide range of possibilities for implementing interactive methodologies in education and science. In particular, virtual reality and immersive worlds - artificially generated computer environments where users interact through a figurative individual that represents them in that environment (their "avatar") - have been identified as the technology that will change the way we live, particularly in education, product development and entertainment (Schmorrow, 2009). Gisbert-Cervera et al. (2011) consider that 3D worlds in education, among other benefits, provide a unique environment for training and knowledge exchange that supports goal-oriented reflection on activities and the achievement of learning outcomes. In Soil Sciences, the experimental component is essential for acquiring the knowledge needed to understand the biogeochemical processes taking place and their interactions with time, climate, topography and the living organisms present. In this work, an immersive virtual environment reproducing a series of soil pits has been developed to evaluate and differentiate soil characteristics such as texture, structure, consistency, color and other physical-chemical and biological properties for educational purposes. Bibliographical material such as pictures, books and papers was collected in order to classify the information needed and to build the soil profiles in the virtual environment. The virtual recreation was built with Unreal Engine 4 (UE4; https://www.unrealengine.com/unreal-engine-4). This engine was chosen because it provides toolsets that can be used in tandem to accelerate development workflows. In addition, Unreal Engine 4 technology powers hundreds of games as well as real-time 3D films, training simulations and visualizations, and it creates very realistic graphics. To evaluate its impact and usefulness in teaching, a series of surveys will be presented to undergraduate students and teachers. REFERENCES: Gisbert-Cervera, M., Esteve-Gonzalez, V., Camacho-Marti, M.M. (2011). Delve into the Deep: Learning Potential in Metaverses and 3D Worlds. eLearning Papers, 25. ISSN: 1887-1542. Schmorrow, D.D. (2009). Why virtual? Theoretical Issues in Ergonomics Science 10(3): 279-282.

  18. Immersive and interactive virtual reality to improve learning and retention of neuroanatomy in medical students: a randomized controlled study.

    PubMed

    Ekstrand, Chelsea; Jamal, Ali; Nguyen, Ron; Kudryk, Annalise; Mann, Jennifer; Mendez, Ivar

    2018-02-23

    Spatial 3-dimensional understanding of the brain is essential to learning neuroanatomy, and 3-dimensional learning techniques have been proposed as tools to enhance neuroanatomy training. The aim of this study was to examine the impact of immersive virtual-reality neuroanatomy training and compare it to traditional paper-based methods. In this randomized controlled study, participants consisted of first- or second-year medical students from the University of Saskatchewan recruited via email and posters displayed throughout the medical school. Participants were randomly assigned to the virtual-reality group or the paper-based group and studied the spatial relations between neural structures for 12 minutes after performing a neuroanatomy baseline test, with both test and control questions. A postintervention test was administered immediately after the study period and 5-9 days later. Satisfaction measures were obtained. Of the 66 participants randomly assigned to the study groups, 64 were included in the final analysis, 31 in the virtual-reality group and 33 in the paper-based group. The 2 groups performed comparably on the baseline questions and showed significant performance improvement on the test questions following study. There were no significant differences between groups for the control questions, the postintervention test questions or the 7-day postintervention test questions. Satisfaction survey results indicated that neurophobia was decreased. Results from this study provide evidence that training in neuroanatomy in an immersive and interactive virtual-reality environment may be an effective neuroanatomy learning tool that warrants further study. They also suggest that integration of virtual-reality into neuroanatomy training may improve knowledge retention, increase study motivation and decrease neurophobia. Copyright 2018, Joule Inc. or its licensors.

  19. Immersive and interactive virtual reality to improve learning and retention of neuroanatomy in medical students: a randomized controlled study

    PubMed Central

    Ekstrand, Chelsea; Jamal, Ali; Nguyen, Ron; Kudryk, Annalise; Mann, Jennifer; Mendez, Ivar

    2018-01-01

    Background: Spatial 3-dimensional understanding of the brain is essential to learning neuroanatomy, and 3-dimensional learning techniques have been proposed as tools to enhance neuroanatomy training. The aim of this study was to examine the impact of immersive virtual-reality neuroanatomy training and compare it to traditional paper-based methods. Methods: In this randomized controlled study, participants consisted of first- or second-year medical students from the University of Saskatchewan recruited via email and posters displayed throughout the medical school. Participants were randomly assigned to the virtual-reality group or the paper-based group and studied the spatial relations between neural structures for 12 minutes after performing a neuroanatomy baseline test, with both test and control questions. A postintervention test was administered immediately after the study period and 5-9 days later. Satisfaction measures were obtained. Results: Of the 66 participants randomly assigned to the study groups, 64 were included in the final analysis, 31 in the virtual-reality group and 33 in the paper-based group. The 2 groups performed comparably on the baseline questions and showed significant performance improvement on the test questions following study. There were no significant differences between groups for the control questions, the postintervention test questions or the 7-day postintervention test questions. Satisfaction survey results indicated that neurophobia was decreased. Interpretation: Results from this study provide evidence that training in neuroanatomy in an immersive and interactive virtual-reality environment may be an effective neuroanatomy learning tool that warrants further study. They also suggest that integration of virtual-reality into neuroanatomy training may improve knowledge retention, increase study motivation and decrease neurophobia. PMID:29510979

  20. Immersive participation: Smartphone-Apps and Virtual Reality - tools for knowledge transfer, citizen science and interactive collaboration

    NASA Astrophysics Data System (ADS)

    Dotterweich, Markus

    2017-04-01

    In the last few years, the use of smartphone-apps has become a daily routine in our lives. However, only a few approaches have been undertaken to use apps for transferring scientific knowledge to a public audience. The development of learning apps or serious games requires a large effort and several levels of simplification, which differs from traditional textbooks or learning webpages. Current approaches often lack a connection to real life and/or innovative gamification concepts. Another almost untapped potential is the use of Virtual Reality, a fast-growing technology which replicates a virtual environment in order to simulate physical experiences in artificial or real worlds. Hence, smartphone-apps and VR provide new opportunities for capacity building, knowledge transfer, citizen science or interactive engagement in the realm of environmental sciences. This presentation will show some examples and discuss the advantages of these immersive approaches for improving knowledge transfer between scientists and citizens and for stimulating action in the real world.

  1. Coupled auralization and virtual video for immersive multimedia displays

    NASA Astrophysics Data System (ADS)

    Henderson, Paul D.; Torres, Rendell R.; Shimizu, Yasushi; Radke, Richard; Lonsway, Brian

    2003-04-01

    The implementation of maximally-immersive interactive multimedia in exhibit spaces requires not only the presentation of realistic visual imagery but also the creation of a perceptually accurate aural experience. While conventional implementations treat audio and video problems as essentially independent, this research seeks to couple the visual sensory information with dynamic auralization in order to enhance perceptual accuracy. A system has been implemented that integrates accurate auralizations with virtual video techniques for both interactive presentation and multi-way communication. The current system utilizes a multi-channel loudspeaker array and real-time signal processing techniques for synthesizing the direct sound, early reflections, and reverberant field excited by a moving sound source whose path may be interactively defined in real-time or derived from coupled video tracking data. In this implementation, any virtual acoustic environment may be synthesized and presented in a perceptually-accurate fashion to many participants over a large listening and viewing area. Subject tests support the hypothesis that the cross-modal coupling of aural and visual displays significantly affects perceptual localization accuracy.
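
    As a concrete illustration of one stage of such an auralization chain, the sketch below applies only the direct-path propagation delay and 1/r attenuation for a static source. The early reflections, reverberant field, moving-source handling, and loudspeaker-array rendering described above are omitted, and all names are hypothetical rather than taken from the implemented system.

      import numpy as np

      SPEED_OF_SOUND = 343.0  # m/s

      def render_direct_path(source_signal, distance_m, fs=48000):
          """Apply the propagation delay and 1/r amplitude attenuation of the direct
          sound path for a source at `distance_m` meters, at sample rate `fs`."""
          delay_samples = int(round(distance_m / SPEED_OF_SOUND * fs))
          gain = 1.0 / max(distance_m, 1.0)  # clamp to avoid blow-up very close to the source
          out = np.zeros(len(source_signal) + delay_samples)
          out[delay_samples:] = gain * np.asarray(source_signal, float)
          return out

      # Example: a 10 ms noise burst heard from 10 m away (about 29 ms of delay)
      burst = np.random.randn(480)
      print(len(render_direct_path(burst, distance_m=10.0)))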

  2. Planning, Implementation and Optimization of Future Space Missions Using an Immersive Visualization Environment (IVE) Machine

    NASA Astrophysics Data System (ADS)

    Harris, E.

    E. N. Harris, Lockheed Martin Space Systems, Denver, CO, and George W. Morgenthaler, University of Colorado at Boulder. History: A team of 3-D engineering visualization experts at the Lockheed Martin Space Systems Company has developed innovative virtual prototyping simulation solutions for ground processing and real-time visualization of design and planning of aerospace missions over the past 6 years. At the University of Colorado, a team of 3-D visualization experts is developing the science of 3-D visualization and immersive visualization at the newly founded BP Center for Visualization, which began operations in October 2001. (See IAF/IAA-01-13.2.09, "The Use of 3-D Immersive Visualization Environments (IVEs) to Plan Space Missions," G. A. Dorn and G. W. Morgenthaler.) Progressing from Today's 3-D Engineering Simulations to Tomorrow's 3-D IVE Mission Planning, Simulation and Optimization Techniques: 3-D IVEs and visualization simulation tools can be combined for efficient planning and design engineering of future aerospace exploration and commercial missions. This technology is currently being developed and will be demonstrated by Lockheed Martin in the IVE at the BP Center, using virtual simulation for clearance checks, collision detection, ergonomics and reachability analyses to develop fabrication and processing flows for spacecraft and launch vehicle ground support operations and to optimize mission architecture and vehicle design subject to realistic constraints. Demonstrations: Immediate aerospace applications to be demonstrated include developing streamlined processing flows for Reusable Space Transportation Systems and Atlas Launch Vehicle operations and Mars Polar Lander visual work instructions. Long-range goals include future international human and robotic space exploration missions, such as the development of a Mars Reconnaissance Orbiter and Lunar Base construction scenarios. Innovative solutions utilizing Immersive Visualization provide the key to streamlining the mission planning and optimizing the engineering design phases of future aerospace missions.

  3. Science and Technology for Communication and Persuasion Abroad: Gap Analysis and Survey

    DTIC Science & Technology

    2012-03-01

    technology are heavily influenced by studying technology use in the West, which, some argue, biases the field toward individualist rather than collectivist ... serious games, particularly for non-Western cultures, and to develop new technologies to that end. Such investment should include immersive virtual ... environments, which favor different strategies of influence than text-based environments. [27] Rilla Khaled, "Culturally-Relevant Persuasive Technology"

  4. The Martian: Examining Human Physical Judgments across Virtual Gravity Fields.

    PubMed

    Ye, Tian; Qi, Siyuan; Kubricht, James; Zhu, Yixin; Lu, Hongjing; Zhu, Song-Chun

    2017-04-01

    This paper examines how humans adapt to novel physical situations with unknown gravitational acceleration in immersive virtual environments. We designed four virtual reality experiments with different tasks for participants to complete: strike a ball to hit a target, trigger a ball to hit a target, predict the landing location of a projectile, and estimate the flight duration of a projectile. The first two experiments compared human behavior in the virtual environment with real-world performance reported in the literature. The last two experiments aimed to test the human ability to adapt to novel gravity fields by measuring their performance in trajectory prediction and time estimation tasks. The experiment results show that: 1) based on brief observation of a projectile's initial trajectory, humans are accurate at predicting the landing location even under novel gravity fields, and 2) humans' time estimation in a familiar earth environment fluctuates around the ground truth flight duration, although the time estimation in unknown gravity fields indicates a bias toward earth's gravity.
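
    The landing-location task above has a closed-form ground truth under ideal (drag-free) ballistics, which is what makes prediction accuracy comparable across gravity fields. The sketch below is a kinematics illustration under that no-drag assumption, not the authors' implementation.

      import math

      def landing_point(x0, y0, z0, vx, vy, vz, g):
          """Ideal landing location of a projectile launched from (x0, y0, z0) with
          velocity (vx, vy, vz) under downward gravitational acceleration g (m/s^2),
          landing on the plane z = 0. Solves z0 + vz*t - 0.5*g*t**2 = 0 for t > 0."""
          t = (vz + math.sqrt(vz ** 2 + 2.0 * g * z0)) / g
          return x0 + vx * t, y0 + vy * t, t

      # Example: the same throw under Earth gravity (9.81 m/s^2) vs. Mars gravity (3.71 m/s^2)
      print(landing_point(0, 0, 1.5, 6, 0, 3, 9.81))
      print(landing_point(0, 0, 1.5, 6, 0, 3, 3.71))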

  5. Moving virtuality into reality: A comparison study of the effectiveness of traditional and alternative assessments of learning in a multisensory, fully immersive physics program

    NASA Astrophysics Data System (ADS)

    Gamor, Keysha Ingram

    This paper reports a research study that investigated the relative efficacy of a traditional paper-and-pencil assessment instrument and an alternative, virtual reality (VR) assessment instrument for assisting educators and instructional designers in measuring learning in a virtual reality learning environment. To this end, the study analyzed which aspects of student learning in VR can feasibly be accessed or captured by traditional assessments and by alternative assessments, and examined what additional types of learning alternative assessments may reveal. More specifically, this study compared the effectiveness of a traditional assessment method with an alternative (performance-based) method, examining the ability of each tool to accurately evidence students' levels of understanding and learning. The domain area was electrostatics, a complex, abstract, multidimensional concept with which students often experience difficulty. Outcomes of the study suggest that, in the evaluation of learning in an immersive VR learning environment, assessments would most accurately manifest student learning if the assessment measure matched the learning environment itself. In this study, learning and assessing in the VR environment yielded higher final test scores than learning in VR and testing with traditional paper-and-pencil. Being able to transfer knowledge from a VR environment to other situations is critical in demonstrating the overall level of understanding of a concept. For this reason, the researcher recommends a combination of testing measures to enhance understanding of complex, abstract concepts.

  6. Locomotive Recalibration and Prism Adaptation of Children and Teens in Immersive Virtual Environments.

    PubMed

    Adams, Haley; Narasimham, Gayathri; Rieser, John; Creem-Regehr, Sarah; Stefanucci, Jeanine; Bodenheimer, Bobby

    2018-04-01

    As virtual reality expands in popularity, an increasingly diverse audience is gaining exposure to immersive virtual environments (IVEs). A significant body of research has demonstrated how perception and action work in such environments, but most of this work has been done studying adults. Less is known about how physical and cognitive development affect perception and action in IVEs, particularly as applied to preteen and teenage children. Accordingly, in the current study we assess how preteens (children aged 8-12 years) and teenagers (children aged 15-18 years) respond to mismatches between their motor behavior and the visual information presented by an IVE. Over two experiments, we evaluate how these individuals recalibrate their actions across functionally distinct systems of movement. The first experiment analyzed forward walking recalibration after exposure to an IVE with either increased or decreased visual flow. Visual flow during normal bipedal locomotion was manipulated to be either twice or half as fast as the physical gait. The second experiment leveraged a prism throwing adaptation paradigm to test the effect of recalibration on throwing movement. In the first experiment, our results show no differences across age groups, although subjects generally experienced a post-exposure effect of shortened distance estimation after experiencing visually faster flow and longer distance estimation after experiencing visually slower flow. In the second experiment, subjects generally showed the typical prism adaptation behavior of a throwing after-effect error. The error lasted longer for preteens than for older children. Our results have implications for the design of virtual systems with children as a target audience.
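
    A minimal sketch of the visual-flow manipulation described above. The doubled and halved gains match the conditions reported in the abstract, but the update function and its parameters are illustrative assumptions, not the authors' implementation.

    ```python
    def update_virtual_position(virtual_pos, physical_step, visual_gain):
        """Advance the virtual viewpoint by a scaled copy of the physical step.

        visual_gain = 2.0 doubles optic-flow speed relative to the physical gait,
        visual_gain = 0.5 halves it, and 1.0 reproduces normal walking.
        """
        return [v + visual_gain * s for v, s in zip(virtual_pos, physical_step)]

    # Example: a 0.7 m physical step forward rendered as 1.4 m of virtual travel.
    pos = [0.0, 0.0, 0.0]
    pos = update_virtual_position(pos, [0.0, 0.0, 0.7], visual_gain=2.0)
    print(pos)  # [0.0, 0.0, 1.4]
    ```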

  7. Impossible spaces: maximizing natural walking in virtual environments with self-overlapping architecture.

    PubMed

    Suma, Evan A; Lipps, Zachary; Finkelstein, Samantha; Krum, David M; Bolas, Mark

    2012-04-01

    Walking is only possible within immersive virtual environments that fit inside the boundaries of the user's physical workspace. To reduce the severity of the restrictions imposed by limited physical area, we introduce "impossible spaces," a new design mechanic for virtual environments that wish to maximize the size of the virtual environment that can be explored with natural locomotion. Such environments make use of self-overlapping architectural layouts, effectively compressing comparatively large interior environments into smaller physical areas. We conducted two formal user studies to explore the perception and experience of impossible spaces. In the first experiment, we showed that reasonably small virtual rooms may overlap by as much as 56% before users begin to detect that they are in an impossible space, and that the larger virtual rooms that expanded to maximally fill our available 9.14 m x 9.14 m workspace may overlap by up to 31%. Our results also demonstrate that users perceive distances to objects in adjacent overlapping rooms as if the overall space was uncompressed, even at overlap levels that were overtly noticeable. In our second experiment, we combined several well-known redirection techniques to string together a chain of impossible spaces in an expansive outdoor scene. We then conducted an exploratory analysis of users' verbal feedback during exploration, which indicated that impossible spaces provide an even more powerful illusion when users are naive to the manipulation.
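
    To make the overlap percentages concrete, here is a hedged sketch of how the shared floor area of two axis-aligned rectangular rooms can be expressed as a fraction of one room's area. The room coordinates and helper name are illustrative and are not taken from the study.

    ```python
    def overlap_fraction(room_a, room_b):
        """Fraction of room_a's floor area that is shared with room_b.

        Rooms are axis-aligned rectangles (x_min, z_min, x_max, z_max) in metres.
        """
        ax0, az0, ax1, az1 = room_a
        bx0, bz0, bx1, bz1 = room_b
        dx = max(0.0, min(ax1, bx1) - max(ax0, bx0))
        dz = max(0.0, min(az1, bz1) - max(az0, bz0))
        return (dx * dz) / ((ax1 - ax0) * (az1 - az0))

    # Two 4 m x 4 m rooms whose layouts share a 2 m deep strip overlap by 50%.
    print(overlap_fraction((0, 0, 4, 4), (0, 2, 4, 6)))  # 0.5
    ```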

  8. Development of a low-cost virtual reality workstation for training and education

    NASA Technical Reports Server (NTRS)

    Phillips, James A.

    1996-01-01

    Virtual Reality (VR) is a set of breakthrough technologies that allow a human being to enter and fully experience a 3-dimensional, computer simulated environment. A true virtual reality experience meets three criteria: (1) it involves 3-dimensional computer graphics; (2) it includes real-time feedback and response to user actions; and (3) it must provide a sense of immersion. Good examples of a virtual reality simulator are the flight simulators used by all branches of the military to train pilots for combat in high performance jet fighters. The fidelity of such simulators is extremely high -- but so is the price tag, typically millions of dollars. Virtual reality teaching and training methods are manifestly effective, but the high cost of VR technology has limited its practical application to fields with big budgets, such as military combat simulation, commercial pilot training, and certain projects within the space program. However, in the last year there has been a revolution in the cost of VR technology. The speed of inexpensive personal computers has increased dramatically, especially with the introduction of the Pentium processor and the PCI bus for IBM-compatibles, and the cost of high-quality virtual reality peripherals has plummeted. The result is that many public schools, colleges, and universities can afford a PC-based workstation capable of running immersive virtual reality applications. My goal this summer was to assemble and evaluate such a system.

  9. Immersive Education, an Annotated Webliography

    ERIC Educational Resources Information Center

    Pricer, Wayne F.

    2011-01-01

    In this second installment of a two-part feature on immersive education a webliography will provide resources discussing the use of various types of computer simulations including: (a) augmented reality, (b) virtual reality programs, (c) gaming resources for teaching with technology, (d) virtual reality lab resources, (e) virtual reality standards…

  10. Immersive Earth Science: Data Visualization in Virtual Reality

    NASA Astrophysics Data System (ADS)

    Skolnik, S.; Ramirez-Linan, R.

    2017-12-01

    Utilizing next generation technology, Navteca's exploration of 3D and volumetric temporal data in Virtual Reality (VR) takes advantage of immersive user experiences where stakeholders are literally inside the data. No longer restricted by the edges of a screen, VR provides an innovative way of viewing spatially distributed 2D and 3D data that leverages a 360° field of view and positional-tracking input, allowing users to see and experience data differently. These concepts are relevant to many sectors, industries, and fields of study, as real-time collaboration in VR can enhance understanding and mission with VR visualizations that display temporally-aware 3D, meteorological, and other volumetric datasets. The ability to view data that is traditionally "difficult" to visualize, such as subsurface features or air columns, is a particularly compelling use of the technology. Various development iterations have resulted in Navteca's proof of concept that imports and renders volumetric point-cloud data in the virtual reality environment by interfacing PC-based VR hardware to a back-end server and popular GIS software. The integration of the geo-located data in VR and subsequent display of changeable basemaps, overlaid datasets, and the ability to zoom, navigate, and select specific areas show the potential for immersive VR to revolutionize the way Earth data is viewed, analyzed, and communicated.
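
    One step any such pipeline needs is mapping geo-located samples into a local Cartesian frame before sending them to the renderer. The sketch below is an illustrative assumption (equirectangular approximation, made-up coordinates and names), not Navteca's implementation.

    ```python
    import math

    EARTH_RADIUS_M = 6371000.0

    def geo_to_local(lat, lon, alt, origin_lat, origin_lon, vertical_exaggeration=1.0):
        """Approximate a lat/lon/alt sample as metres east, north, and up of a local origin.

        Uses an equirectangular approximation, which is adequate for region-scale scenes.
        """
        east = math.radians(lon - origin_lon) * EARTH_RADIUS_M * math.cos(math.radians(origin_lat))
        north = math.radians(lat - origin_lat) * EARTH_RADIUS_M
        return east, north, alt * vertical_exaggeration

    # A point roughly 1 km north-east of the origin, 500 m up, exaggerated 10x vertically.
    print(geo_to_local(38.909, -77.036, 500.0, 38.900, -77.045, vertical_exaggeration=10.0))
    ```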

  11. Virtual Viewing Time: The Relationship between Presence and Sexual Interest in Androphilic and Gynephilic Men

    PubMed Central

    Fromberger, Peter; Meyer, Sabrina; Kempf, Christina; Jordan, Kirsten; Müller, Jürgen L.

    2015-01-01

    Virtual Reality (VR) has successfully been used in the research of human behavior for more than twenty years. The main advantage of VR is its capability to induce a high sense of presence. This results in emotions and behavior which are very close to those shown in real situations. In the context of sex research, only a few studies have used high-immersive VR so far. The ones that did can be found mostly in the field of forensic psychology. Nevertheless, the relationship between presence and sexual interest still remains unclear. The present study is the first to examine the advantages of high-immersive VR in comparison to a conventional standard desktop system regarding their capability to measure sexual interest. 25 gynephilic and 20 androphilic healthy men underwent three experimental conditions, which differed in their ability to induce a sense of presence. In each condition, participants were asked to rate ten male and ten female virtual human characters regarding their sexual attractiveness. Without their knowledge, the subjects’ viewing time was assessed throughout the rating. Subjects were then asked to rate the sense of presence they had experienced as well as their perceived realism of the characters. Results suggested that stereoscopic viewing can significantly enhance the subjective sexual attractiveness of sexually relevant characters. Furthermore, in all three conditions participants looked significantly longer at sexually relevant virtual characters than at sexually non-relevant ones. The high immersion condition provided the best discriminant validity. From a statistical point of view, however, the sense of presence had no significant influence on the discriminant validity of the viewing time task. The study showed that high-immersive virtual environments enhance realism ratings as well as ratings of sexual attractiveness of three-dimensional human stimuli in comparison to standard desktop systems. Results also show that viewing time seems to be influenced neither by sexual attractiveness nor by realism of stimuli. This indicates how important task specific mechanisms of the viewing time effect are. PMID:25992790

  12. Evaluation of knowledge transfer in an immersive virtual learning environment for the transportation community : research project capsule.

    DOT National Transportation Integrated Search

    2010-01-01

    In 2008 alone, 720 individuals were killed in a : construction or maintenance work zone in the : United States. However, since 2003, the total : number of individuals killed in a construction or : maintenance work zone in the US reached a : staggerin...

  13. Studying the Effectiveness of Multi-User Immersive Environments for Collaborative Evaluation Tasks

    ERIC Educational Resources Information Center

    Lorenzo, Carlos-Miguel; Sicilia, Miguel Angel; Sanchez, Salvador

    2012-01-01

    Massively Multiuser On-line Learning (MMOL) Platforms, often called "virtual learning worlds", constitute a still unexplored context for communication-enhanced learning, where synchronous communication skills in an explicit social setting enhance the potential of effective collaboration. In this paper, we report on an experimental study of…

  14. Growth and Decline of Second Life as an Educational Platform

    ERIC Educational Resources Information Center

    Mark, Christine Libby

    2014-01-01

    "Second Life," a 3D online immersive virtual environment, emerged in 2003 and was predicted to become the predominant online course delivery platform by 2013. Educational institutions initially rushed to create a presence in the "Second Life;" however, after 2009 those same institutions were disappointed by their experiences…

  15. Seeing an Embodied Virtual Hand is Analgesic Contingent on Colocation.

    PubMed

    Nierula, Birgit; Martini, Matteo; Matamala-Gomez, Marta; Slater, Mel; Sanchez-Vives, Maria V

    2017-06-01

    Seeing one's own body has been reported to have analgesic properties. Analgesia has also been described when seeing an embodied virtual body colocated with the real one. However, there is controversy regarding whether this effect holds true when seeing an illusory-owned body part, such as during the rubber-hand illusion. A critical difference between these paradigms is the distance between the real and surrogate body part. Colocation of the real and surrogate arm is possible in an immersive virtual environment, but not during illusory ownership of a rubber arm. The present study aimed at testing whether the distance between a real and a virtual arm can explain such differences in terms of pain modulation. Using a paradigm of embodiment of a virtual body allowed us to evaluate heat pain thresholds at colocation and at a 30-cm distance between the real and the virtual arm. We observed a significantly higher heat pain threshold at colocation than at a 30-cm distance. The analgesic effects of seeing a virtual colocated arm were eliminated by increasing the distance between the real and the virtual arm, which explains why seeing an illusorily owned rubber arm does not consistently result in analgesia. These findings are relevant for the use of virtual reality in pain management. Looking at a virtual body has analgesic properties similar to looking at one's real body. We identify the importance of colocation between a real and a surrogate body for this to occur and thereby resolve a scientific controversy. This information is useful for exploiting immersive virtual reality in pain management. Copyright © 2017. Published by Elsevier Inc.

  16. A serious gaming/immersion environment to teach clinical cancer genetics.

    PubMed

    Nosek, Thomas M; Cohen, Mark; Matthews, Anne; Papp, Klara; Wolf, Nancy; Wrenn, Gregg; Sher, Andrew; Coulter, Kenneth; Martin, Jessica; Wiesner, Georgia L

    2007-01-01

    We are creating an interactive, simulated "Cancer Genetics Tower" for the self-paced learning of Clinical Cancer Genetics by medical students (go to: http://casemed.case.edu/cancergenetics). The environment uses gaming theory to engage students in achieving specific learning objectives. The first few levels contain virtual laboratories where students acquire the basic underpinnings of Cancer Genetics. The next levels apply these principles to clinical practice. A virtual attending physician and four virtual patients, available for questioning through virtual video conferencing, enrich each floor. The pinnacle clinical simulation challenges the learner to integrate all information and demonstrate mastery, thus "winning" the game. A pilot test of the program by 17 medical students yielded very favorable feedback; the students found the Tower a "great way to teach", it held their attention, and it made learning fun. A majority of the students preferred the Tower over other resources to learn Cancer Genetics.

  17. A Virtual Reality Simulator Prototype for Learning and Assessing Phaco-sculpting Skills

    NASA Astrophysics Data System (ADS)

    Choi, Kup-Sze

    This paper presents a virtual reality-based simulator prototype for learning phacoemulsification in cataract surgery, with a focus on the skills required to make a cross-shaped trench in a cataractous lens with an ultrasound probe during the phaco-sculpting procedure. An immersive virtual environment is created with 3D models of the lens and surgical tools. A haptic device is also used as a 3D user interface. Phaco-sculpting is simulated by interactively deleting the tetrahedrons that constitute the lens model. Collisions between the virtual probe and the lens are effectively identified by partitioning the space containing the lens hierarchically with an octree. The simulator can be programmed to collect real-time quantitative user data for reviewing and assessing a trainee's performance in an objective manner. A game-based learning environment can be created on top of the simulator by incorporating gaming elements based on the quantifiable performance metrics.
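
    A much-simplified sketch of the octree idea mentioned above: tetrahedron centroids are indexed once, and each frame only the leaf cell under the probe tip is inspected for candidate collisions. The node layout, depth limit, and names are illustrative; the simulator's actual data structure is not described in the abstract.

    ```python
    class Octree:
        """Minimal point octree; leaves at max depth hold item ids."""

        def __init__(self, center, half, depth=0, max_depth=3):
            self.center, self.half = center, half
            self.depth, self.max_depth = depth, max_depth
            self.items, self.children = [], None

        def _child_index(self, p):
            cx, cy, cz = self.center
            return (p[0] > cx) | ((p[1] > cy) << 1) | ((p[2] > cz) << 2)

        def insert(self, item_id, p):
            if self.depth == self.max_depth:
                self.items.append((item_id, p))
                return
            if self.children is None:
                h = self.half / 2.0
                cx, cy, cz = self.center
                self.children = [Octree((cx + (h if i & 1 else -h),
                                         cy + (h if i & 2 else -h),
                                         cz + (h if i & 4 else -h)),
                                        h, self.depth + 1, self.max_depth)
                                 for i in range(8)]
            self.children[self._child_index(p)].insert(item_id, p)

        def query(self, p):
            """Candidate items near p: only the cells containing p are visited."""
            if self.children is None:
                return [item for item, _ in self.items]
            return self.children[self._child_index(p)].query(p)

    # Index tetrahedron centroids once, then test the probe tip every frame.
    tree = Octree(center=(0.0, 0.0, 0.0), half=5.0)
    tree.insert("tet_42", (1.2, -0.3, 0.8))
    print(tree.query((1.1, -0.2, 0.9)))  # ['tet_42']; an exact narrow-phase test follows
    ```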

  18. Designing for Virtual Windows in a Deep Space Habitat

    NASA Technical Reports Server (NTRS)

    Howe, A. Scott; Howard, Robert L.; Moore, Nathan; Amoroso, Michael

    2013-01-01

    This paper discusses configurations and test analogs toward the design of a virtual window capability in a Deep Space Habitat. Long-duration space missions will require crews to remain in the confines of a spacecraft for extended periods of time, with possible harmful effects if a crewmember cannot cope with the small habitable volume. Virtual windows expand perceived volume using a minimal amount of image projection equipment and computing resources, and allow a limited immersion in remote environments. Uses for the virtual window include: live or augmented reality views of the external environment; flight deck, piloting, observation, or other participation in remote missions through live transmission of cameras mounted to remote vehicles; pre-recorded background views of nature areas, seasonal occurrences, or cultural events; and pre-recorded events such as birthdays, anniversaries, and other meaningful events prepared by ground support and families of the crewmembers.

  19. Virtually the ultimate research lab.

    PubMed

    Kulik, Alexander

    2018-04-26

    Virtual reality (VR) can serve as a viable platform for psychological research. The real world with many uncontrolled variables can be masked to immerse participants in complex interactive environments that are under full experimental control. However, as any other laboratory setting, these simulations are not perceived equally to reality and they also afford different behaviour. We need a better understanding of these differences, which are often related to parameters of the technical setup, to support valid interpretations of experimental results. © 2018 The British Psychological Society.

  20. Initial Assessment of Human Performance Using the Gaiter Interaction Technique to Control Locomotion in Fully Immersive Virtual Environments

    DTIC Science & Technology

    2004-06-30

    virtual space, as well as to match specific attributes of natural locomotion, such as perceived velocity and caloric expenditure. Moreover, the...wide range of postural motions (e.g., crouching, jumping, and bending to look around objects) with gestural stepping motions. The attributes of in...approximately 8 × 8 × 8 ft. The harness itself was an initial design made out of PVC pipe at the waist and above the head, with rope connecting the

  1. Manifold compositions, music visualization, and scientific sonification in an immersive virtual-reality environment.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kaper, H. G.

    1998-01-05

    An interdisciplinary project encompassing sound synthesis, music composition, sonification, and visualization of music is facilitated by the high-performance computing capabilities and the virtual-reality environments available at Argonne National Laboratory. The paper describes the main features of the project's centerpiece, DIASS (Digital Instrument for Additive Sound Synthesis); "A.N.L.-folds", an equivalence class of compositions produced with DIASS; and application of DIASS in two experiments in the sonification of complex scientific data. Some of the larger issues connected with this project, such as the changing ways in which both scientists and composers perform their tasks, are briefly discussed.
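
    To make the sonification idea concrete, the toy sketch below maps a short data series onto the partial amplitudes of an additively synthesised tone. It is not DIASS itself; the mapping, base frequency, and names are assumptions for illustration only.

    ```python
    import math

    def sonify_additive(data, base_freq=220.0, duration=2.0, sample_rate=44100):
        """Additive synthesis: partial k has frequency (k+1)*base_freq and an
        amplitude proportional to the k-th (normalised) data value."""
        peak = max(abs(v) for v in data) or 1.0
        amps = [abs(v) / peak for v in data]
        samples = []
        for i in range(int(duration * sample_rate)):
            t = i / sample_rate
            s = sum(a * math.sin(2 * math.pi * (k + 1) * base_freq * t)
                    for k, a in enumerate(amps))
            samples.append(s / len(amps))  # keep the mix within [-1, 1]
        return samples

    # Five data values become the relative strengths of five harmonics of 220 Hz.
    wave = sonify_additive([0.2, 0.9, 0.4, 0.1, 0.6])
    print(len(wave), min(wave), max(wave))
    ```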

  2. eduCRATE--a Virtual Hospital architecture.

    PubMed

    Stoicu-Tivadar, Lăcrimioara; Stoicu-Tivadar, Vasile; Berian, Dorin; Drăgan, Simona; Serban, Alexandru; Serban, Corina

    2014-01-01

    eduCRATE is a complex project proposal which aims to develop a virtual learning environment offering interactive digital content through original and integrated solutions using cloud computing, complex multimedia systems in virtual space, and personalized design with avatars. Compared to existing similar products, the project brings the novelty of using languages for medical guides in order to ensure maximum flexibility. The Virtual Hospital simulations will create interactive clinical scenarios for which students will find solutions for positive diagnosis and therapeutic management. The solution based on cloud computing and immersive multimedia is an attractive option in education because it is economical and matches the current working style of the young generation it addresses.

  3. Sexual self-regulation and cognitive absorption as factors of sexual response toward virtual characters.

    PubMed

    Renaud, Patrice; Trottier, Dominique; Nolet, Kevin; Rouleau, Joanne L; Goyette, Mathieu; Bouchard, Stéphane

    2014-04-01

    The eye movements and penile responses of 20 male participants were recorded while they were immersed with virtual sexual stimuli. These participants were divided into two groups according to their capacity to focus their attention in immersion (high and low focus). In order to understand sexual self-regulation better, we subjected participants to three experimental conditions: (a) immersion with a preferred sexual stimulus, without sexual inhibition; (b) immersion with a preferred sexual stimulus, with sexual inhibition; and (c) immersion with a neutral stimulus. A significant difference was observed between the effects of each condition on erectile response and scanpath. The groups differed on self-regulation of their erectile responses and on their scanpath patterns. High focus participants had more difficulties than low focus participants with inhibiting their sexual responses and displayed less scattered eye movement trajectories over the critical areas of the virtual sexual stimuli. Results are interpreted in terms of sexual self-regulation and cognitive absorption in virtual immersion. In addition, the use of validated virtual sexual stimuli is presented as a methodological improvement over static and moving pictures, since it paves the way for the study of the role of social interaction in an ecologically valid and well-controlled way.

  4. Local and Remote Cooperation With Virtual and Robotic Agents: A P300 BCI Study in Healthy and People Living With Spinal Cord Injury.

    PubMed

    Tidoni, Emmanuele; Abu-Alqumsan, Mohammad; Leonardis, Daniele; Kapeller, Christoph; Fusco, Gabriele; Guger, Cristoph; Hintermuller, Cristoph; Peer, Angelika; Frisoli, Antonio; Tecchia, Franco; Bergamasco, Massimo; Aglioti, Salvatore Maria

    2017-09-01

    The development of technological applications that allow people to control and embody external devices within social interaction settings represents a major goal for current and future brain-computer interface (BCI) systems. Prior research has suggested that embodied systems may ameliorate BCI end-users' experience and accuracy in controlling external devices. Along these lines, we developed an immersive P300-based BCI application with a head-mounted display for virtual-local and robotic-remote social interactions and explored in a group of healthy participants the role of proprioceptive feedback in the control of a virtual surrogate (Study 1). Moreover, we compared the performance of a small group of people with spinal cord injury (SCI) to a control group of healthy subjects during virtual and robotic social interactions (Study 2), where both groups received a proprioceptive stimulation. Our attempt to combine immersive environments, BCI technologies and neuroscience of body ownership suggests that providing realistic multisensory feedback still represents a challenge. Results have shown that both healthy participants and people living with SCI used the BCI within the immersive scenarios with good levels of performance (as indexed by task accuracy, optimization calls, and Information Transfer Rate) and perceived control of the surrogates. Proprioceptive feedback did not alter performance measures or body ownership sensations. Further studies are necessary to test whether sensorimotor experience represents an opportunity to improve the use of future embodied BCI applications.
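
    For readers unfamiliar with P300-based selection, the core signal step is averaging EEG epochs time-locked to stimulus onsets so that the attended stimulus's positive deflection around 300 ms emerges from the noise. The sketch below is a generic illustration with assumed array shapes and names, not the study's processing pipeline.

    ```python
    import numpy as np

    def average_epochs(eeg, onsets, sample_rate=256, window_s=0.8):
        """Average EEG epochs time-locked to stimulus onsets.

        eeg: (n_channels, n_samples) array; onsets: sample indices of stimuli.
        Returns an (n_channels, window) averaged response; for attended stimuli
        the P300 appears as a positive deflection roughly 300 ms after onset.
        """
        win = int(window_s * sample_rate)
        epochs = [eeg[:, s:s + win] for s in onsets if s + win <= eeg.shape[1]]
        return np.mean(epochs, axis=0)

    # Synthetic example: 8 channels, 10 s of noise, 20 stimulus onsets.
    rng = np.random.default_rng(0)
    eeg = rng.standard_normal((8, 2560))
    onsets = rng.integers(0, 2560 - 205, size=20)
    print(average_epochs(eeg, onsets).shape)  # (8, 204)
    ```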

  5. Anxiety provocation and measurement using virtual reality in patients with obsessive-compulsive disorder.

    PubMed

    Kim, Kwanguk; Kim, Chan-Hyung; Cha, Kyung Ryeol; Park, Junyoung; Han, Kiwan; Kim, Yun Ki; Kim, Jae-Jin; Kim, In Young; Kim, Sun I

    2008-12-01

    The current study is a preliminary test of a virtual reality (VR) anxiety-provoking tool using a sample of participants with obsessive-compulsive disorder (OCD). The tasks were administered to 33 participants with OCD and 30 healthy control participants. In the VR task, participants navigated through a virtual environment using a joystick and head-mounted display. The virtual environment consisted of three phases: training, distraction, and the main task. After the training and distraction phases, participants were allowed to check (a common OCD behavior) freely, as they would in the real world, and a visual analogue scale of anxiety was recorded during VR. Participants' anxiety in the virtual environment was measured with a validated measure of psychiatric symptoms and functions and analyzed with a VR questionnaire. Results revealed that those with OCD had significantly higher anxiety in the virtual environment than did healthy controls, and the decreased ratio of anxiety in participants with OCD was also higher than that of healthy controls. Moreover, the degree of anxiety of an individual with OCD was positively correlated with his or her symptom score and immersive tendency score. These results suggest the possibility that VR technology has value as an anxiety-provoking or treatment tool for OCD.

  6. Implementation of 3d Tools and Immersive Experience Interaction for Supporting Learning in a Library-Archive Environment. Visions and Challenges

    NASA Astrophysics Data System (ADS)

    Angeletaki, A.; Carrozzino, M.; Johansen, S.

    2013-07-01

    In this paper we present an experimental environment of 3D books combined with a game application that has been developed by a collaboration project between the Norwegian University of Science and Technology in Trondheim, Norway, the NTNU University Library, and the Percro laboratory of Santa Anna University in Pisa, Italy. MUBIL is an international research project involving museums, libraries and ICT academy partners aiming to develop a consistent methodology enabling the use of Virtual Environments as a metaphor to present manuscript content through the paradigms of interaction and immersion, evaluating different possible alternatives. This paper presents the results of the application of two prototypes of books augmented with the use of XVR and IL technology. We explore immersive-reality design strategies in archive and library contexts for attracting new users. Our newly established Mubil-lab has invited school classes to test the books augmented with 3D models and other multimedia content in order to investigate whether the immersion in such environments can create wider engagement and support learning. The combination of the 3D-book metaphor and game design allows the digital books to be handled through a tactile experience and substitutes for physical browsing. In this paper we present some preliminary results about the enrichment of the user experience in such environments.

  7. A comparison of older adults' subjective experience with virtual and real environments during dynamic balance activities

    PubMed Central

    Proffitt, Rachel; Lange, Belinda; Chen, Christina; Winstein, Carolee

    2014-01-01

    The purpose of this study was to explore the subjective experience of older adults interacting with both virtual and real environments. Thirty healthy older adults engaged with real and virtual tasks of similar motor demands: reaching to a target in standing and stepping stance. Immersive tendencies and absorption scales were administered before the session. Game engagement and experience questionnaires were completed after each task, followed by a semi-structured interview at the end of the testing session. Data were analyzed respectively using paired t-tests and grounded theory methodology. Participants preferred the virtual task over the real task. They also reported an increase in presence and absorption with the virtual task, describing an external focus of attention. Findings will be used to inform future development of appropriate game-based balance training applications that could be embedded in the home or community settings as part of evidence-based fall prevention programs. PMID:24334299

  8. Applied virtual reality at the Research Triangle Institute

    NASA Technical Reports Server (NTRS)

    Montoya, R. Jorge

    1994-01-01

    Virtual Reality (VR) is a way for humans to use computers in visualizing, manipulating and interacting with large geometric data bases. This paper describes a VR infrastructure and its application to marketing, modeling, architectural walk through, and training problems. VR integration techniques used in these applications are based on a uniform approach which promotes portability and reusability of developed modules. For each problem, a 3D object data base is created using data captured by hand or electronically. The object's realism is enhanced through either procedural or photo textures. The virtual environment is created and populated with the data base using software tools which also support interactions with and immersivity in the environment. These capabilities are augmented by other sensory channels such as voice recognition, 3D sound, and tracking. Four applications are presented: a virtual furniture showroom, virtual reality models of the North Carolina Global TransPark, a walk through the Dresden Frauenkirche, and the maintenance training simulator for the National Guard.

  9. Virtual reality simulation in neurosurgery: technologies and evolution.

    PubMed

    Chan, Sonny; Conti, François; Salisbury, Kenneth; Blevins, Nikolas H

    2013-01-01

    Neurosurgeons are faced with the challenge of learning, planning, and performing increasingly complex surgical procedures in which there is little room for error. With improvements in computational power and advances in visual and haptic display technologies, virtual surgical environments can now offer potential benefits for surgical training, planning, and rehearsal in a safe, simulated setting. This article introduces the various classes of surgical simulators and their respective purposes through a brief survey of representative simulation systems in the context of neurosurgery. Many technical challenges currently limit the application of virtual surgical environments. Although we cannot yet expect a digital patient to be indistinguishable from reality, new developments in computational methods and related technology bring us closer every day. We recognize that the design and implementation of an immersive virtual reality surgical simulator require expert knowledge from many disciplines. This article highlights a selection of recent developments in research areas related to virtual reality simulation, including anatomic modeling, computer graphics and visualization, haptics, and physics simulation, and discusses their implication for the simulation of neurosurgery.

  10. A Proposed Treatment for Visual Field Loss caused by Traumatic Brain Injury using Interactive Visuotactile Virtual Environment

    NASA Astrophysics Data System (ADS)

    Farkas, Attila J.; Hajnal, Alen; Shiratuddin, Mohd F.; Szatmary, Gabriella

    In this paper, we propose a novel approach that uses interactive virtual environment technology in Vision Restoration Therapy for visual field loss caused by Traumatic Brain Injury. We call the new system the Interactive Visuotactile Virtual Environment, and it holds promise for expanding the scope of already existing rehabilitation techniques. Traditional vision rehabilitation methods are based on passive psychophysical training procedures, and can last up to six months before any modest improvements can be seen in patients. A highly immersive and interactive virtual environment will allow the patient to practice everyday activities such as object identification and object manipulation through the use of 3D motion-sensing handheld devices such as a data glove or the Nintendo Wiimote. Employing both perceptual and action components in the training procedures holds the promise of more efficient sensorimotor rehabilitation. Increased stimulation of visual and sensorimotor areas of the brain should facilitate a comprehensive recovery of visuomotor function by exploiting the plasticity of the central nervous system. Integrated with a motion tracking system and an eye tracking device, the interactive virtual environment allows for the creation and manipulation of a wide variety of stimuli, as well as real-time recording of hand-, eye- and body movements and coordination. The goal of the project is to design a cost-effective and efficient vision restoration system.

  11. The Integrated Virtual Environment Rehabilitation Treadmill System

    PubMed Central

    Feasel, Jeff; Whitton, Mary C.; Kassler, Laura; Brooks, Frederick P.; Lewek, Michael D.

    2015-01-01

    Slow gait speed and interlimb asymmetry are prevalent in a variety of disorders. Current approaches to locomotor retraining emphasize the need for appropriate feedback during intensive, task-specific practice. This paper describes the design and feasibility testing of the integrated virtual environment rehabilitation treadmill (IVERT) system intended to provide real-time, intuitive feedback regarding gait speed and asymmetry during training. The IVERT system integrates an instrumented, split-belt treadmill with a front-projection, immersive virtual environment. The novel adaptive control system uses only ground reaction force data from the treadmill to continuously update the speeds of the two treadmill belts independently, as well as to control the speed and heading in the virtual environment in real time. Feedback regarding gait asymmetry is presented 1) visually as walking a curved trajectory through the virtual environment and 2) proprioceptively in the form of different belt speeds on the split-belt treadmill. A feasibility study involving five individuals with asymmetric gait found that these individuals could effectively control the speed of locomotion and perceive gait asymmetry during the training session. Although minimal changes in overground gait symmetry were observed immediately following a single training session, further studies should be done to determine the IVERT’s potential as a tool for rehabilitation of asymmetric gait by providing patients with congruent visual and proprioceptive feedback. PMID:21652279
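
    A heavily simplified sketch of the kind of force-driven belt adaptation described above. The gains, limits, and names are illustrative assumptions; the actual IVERT controller is not specified in this abstract beyond its use of ground reaction force data.

    ```python
    def update_belt_speeds(left_speed, right_speed, left_impulse, right_impulse,
                           gain=0.05, min_speed=0.2, max_speed=2.0):
        """Nudge each belt speed according to the propulsive impulse its limb
        produced during the last step (integral of ground reaction force, N*s).

        A stronger push on one side speeds that belt up, so the belt-speed
        asymmetry mirrors the user's gait asymmetry and can also drive the
        curved-heading cue in the virtual environment.
        """
        total = (left_impulse + right_impulse) or 1.0
        left_speed += gain * (2.0 * left_impulse / total - 1.0)
        right_speed += gain * (2.0 * right_impulse / total - 1.0)
        clamp = lambda v: max(min_speed, min(max_speed, v))
        return clamp(left_speed), clamp(right_speed)

    # A stronger right-side push gradually speeds up the right belt.
    print(update_belt_speeds(1.0, 1.0, left_impulse=40.0, right_impulse=60.0))
    ```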

  12. Virtual environment display for a 3D audio room simulation

    NASA Technical Reports Server (NTRS)

    Chapin, William L.; Foster, Scott H.

    1992-01-01

    The development of a virtual environment simulation system integrating a 3D acoustic audio model with an immersive 3D visual scene is discussed. The system complements the acoustic model and is specified to: allow the listener to freely move about the space, a room of manipulable size, shape, and audio character, while interactively relocating the sound sources; reinforce the listener's feeling of telepresence in the acoustical environment with visual and proprioceptive sensations; enhance the audio with the graphic and interactive components, rather than overwhelm or reduce it; and serve as a research testbed and technology transfer demonstration. The hardware/software design of two demonstration systems, one installed and one portable, are discussed through the development of four iterative configurations.

  13. Perception of Graphical Virtual Environments by Blind Users via Sensory Substitution

    PubMed Central

    Maidenbaum, Shachar; Buchs, Galit; Abboud, Sami; Lavi-Rotbain, Ori; Amedi, Amir

    2016-01-01

    Graphical virtual environments are currently far from accessible to blind users as their content is mostly visual. This is especially unfortunate as these environments hold great potential for this population for purposes such as safe orientation, education, and entertainment. Previous tools have increased accessibility but there is still a long way to go. Visual-to-audio Sensory-Substitution-Devices (SSDs) can increase accessibility generically by sonifying on-screen content regardless of the specific environment and offer increased accessibility without the use of expensive dedicated peripherals like electrode/vibrator arrays. Using SSDs virtually utilizes similar skills as when using them in the real world, enabling both training on the device and training on environments virtually before real-world visits. This could enable more complex, standardized and autonomous SSD training and new insights into multisensory interaction and the visually-deprived brain. However, whether congenitally blind users, who have never experienced virtual environments, will be able to use this information for successful perception and interaction within them is currently unclear. We tested this using the EyeMusic SSD, which conveys whole-scene visual information, to perform virtual tasks otherwise impossible without vision. Congenitally blind users had to navigate virtual environments and find doors, differentiate between them based on their features (Experiment1:task1) and surroundings (Experiment1:task2) and walk through them; these tasks were accomplished with a 95% and 97% success rate, respectively. We further explored the reactions of congenitally blind users during their first interaction with a more complex virtual environment than in the previous tasks–walking down a virtual street, recognizing different features of houses and trees, navigating to cross-walks, etc. Users reacted enthusiastically and reported feeling immersed within the environment. They highlighted the potential usefulness of such environments for understanding what visual scenes are supposed to look like and their potential for complex training and suggested many future environments they wished to experience. PMID:26882473

  14. Perception of Graphical Virtual Environments by Blind Users via Sensory Substitution.

    PubMed

    Maidenbaum, Shachar; Buchs, Galit; Abboud, Sami; Lavi-Rotbain, Ori; Amedi, Amir

    2016-01-01

    Graphical virtual environments are currently far from accessible to blind users as their content is mostly visual. This is especially unfortunate as these environments hold great potential for this population for purposes such as safe orientation, education, and entertainment. Previous tools have increased accessibility but there is still a long way to go. Visual-to-audio Sensory-Substitution-Devices (SSDs) can increase accessibility generically by sonifying on-screen content regardless of the specific environment and offer increased accessibility without the use of expensive dedicated peripherals like electrode/vibrator arrays. Using SSDs virtually utilizes similar skills as when using them in the real world, enabling both training on the device and training on environments virtually before real-world visits. This could enable more complex, standardized and autonomous SSD training and new insights into multisensory interaction and the visually-deprived brain. However, whether congenitally blind users, who have never experienced virtual environments, will be able to use this information for successful perception and interaction within them is currently unclear. We tested this using the EyeMusic SSD, which conveys whole-scene visual information, to perform virtual tasks otherwise impossible without vision. Congenitally blind users had to navigate virtual environments and find doors, differentiate between them based on their features (Experiment1:task1) and surroundings (Experiment1:task2) and walk through them; these tasks were accomplished with a 95% and 97% success rate, respectively. We further explored the reactions of congenitally blind users during their first interaction with a more complex virtual environment than in the previous tasks-walking down a virtual street, recognizing different features of houses and trees, navigating to cross-walks, etc. Users reacted enthusiastically and reported feeling immersed within the environment. They highlighted the potential usefulness of such environments for understanding what visual scenes are supposed to look like and their potential for complex training and suggested many future environments they wished to experience.

  15. Interactive Learning Environment: Web-based Virtual Hydrological Simulation System using Augmented and Immersive Reality

    NASA Astrophysics Data System (ADS)

    Demir, I.

    2014-12-01

    Recent developments in internet technologies make it possible to manage and visualize large datasets on the web. Novel visualization techniques and interactive user interfaces allow users to create realistic environments, and interact with data to gain insight from simulations and environmental observations. The hydrological simulation system is a web-based 3D interactive learning environment for teaching hydrological processes and concepts. The simulation system provides a visually striking platform with realistic terrain information and water simulation. Students can create or load predefined scenarios, control environmental parameters, and evaluate environmental mitigation alternatives. The web-based simulation system provides an environment for students to learn about the hydrological processes (e.g. flooding and flood damage), and effects of development and human activity in the floodplain. The system utilizes the latest web technologies and the graphics processing unit (GPU) for water simulation and object collisions on the terrain. Users can access the system in three visualization modes including virtual reality, augmented reality, and immersive reality using a heads-up display. The system provides various scenarios customized to fit the age and education level of various users. This presentation provides an overview of the web-based flood simulation system, and demonstrates the capabilities of the system for various visualization and interaction modes.

  16. Immersion and the illusion of presence in virtual reality.

    PubMed

    Slater, Mel

    2018-05-21

    This commentary briefly reviews the history of virtual reality and its use for psychology research, and clarifies the concepts of immersion and the illusion of presence. © 2018 The British Psychological Society.

  17. Use of immersive virtual reality to assess episodic memory: A validation study in older adults.

    PubMed

    Corriveau Lecavalier, Nick; Ouellet, Émilie; Boller, Benjamin; Belleville, Sylvie

    2018-05-29

    Virtual reality (VR) allows for the creation of ecological environments that could be used for cognitive assessment and intervention. This study comprises two parts that describe and assess an immersive VR task, the Virtual Shop, which can be used to measure episodic memory. Part 1 addresses its applicability in healthy older adults by measuring presence, motivation, and cybersickness symptoms. Part 2 addresses its construct validity by investigating correlations between performance in the VR task and on a traditional experimental memory task, and by measuring whether the VR task is sensitive to age-related memory differences. Fifty-seven older and 20 younger adults were assessed in the Virtual Shop, in which they memorised and fetched 12 familiar items. Part 1 showed high levels of presence, higher levels of motivation for the VR than for the traditional task, and negligible cybersickness symptoms. Part 2 indicates that memory performance in the VR task is positively correlated with performance on a traditional memory task for both age groups, and age-related differences were found on the VR and traditional memory tasks. Thus, the use of VR is feasible in older adults and the Virtual Shop is a valid task to assess and train episodic memory in this population.

  18. Place illusion and plausibility can lead to realistic behaviour in immersive virtual environments

    PubMed Central

    Slater, Mel

    2009-01-01

    In this paper, I address the question as to why participants tend to respond realistically to situations and events portrayed within an immersive virtual reality system. The idea is put forward, based on the experience of a large number of experimental studies, that there are two orthogonal components that contribute to this realistic response. The first is ‘being there’, often called ‘presence’, the qualia of having a sensation of being in a real place. We call this place illusion (PI). Second, plausibility illusion (Psi) refers to the illusion that the scenario being depicted is actually occurring. In the case of both PI and Psi the participant knows for sure that they are not ‘there’ and that the events are not occurring. PI is constrained by the sensorimotor contingencies afforded by the virtual reality system. Psi is determined by the extent to which the system can produce events that directly relate to the participant, the overall credibility of the scenario being depicted in comparison with expectations. We argue that when both PI and Psi occur, participants will respond realistically to the virtual reality. PMID:19884149

  19. Artificial Versus Video-Based Immersive Virtual Surroundings: Analysis of Performance and User's Preference.

    PubMed

    Huber, Tobias; Paschold, Markus; Hansen, Christian; Lang, Hauke; Kneist, Werner

    2018-06-01

    Immersive virtual reality (VR) laparoscopy simulation connects VR simulation with head-mounted displays to increase presence during VR training. The goal of the present study was the comparison of 2 different surroundings according to performance and users' preference. With a custom immersive virtual reality laparoscopy simulator, an artificially created VR operating room (AVR) and a highly immersive VR operating room (IVR) were compared. Participants (n = 30) performed 3 tasks (peg transfer, fine dissection, and cholecystectomy) in AVR and IVR in a crossover study design. No overall difference in virtual laparoscopic performance was obtained when comparing results from AVR with IVR. Most participants preferred the IVR surrounding (n = 24). Experienced participants (n = 10) performed significantly better than novices (n = 10) in all tasks regardless of the surrounding ( P < .05). Participants with limited experience (n = 10) showed differing results. Presence, immersion, and exhilaration were significantly higher in IVR. Two thirds assumed that IVR would have a positive influence on their laparoscopic simulator use. This first study comparing AVR and IVR did not reveal differences in virtual laparoscopic performance. IVR is considered the more realistic surrounding and is therefore preferred by the participants.

  20. Altering User Movement Behaviour in Virtual Environments.

    PubMed

    Simeone, Adalberto L; Mavridou, Ifigeneia; Powell, Wendy

    2017-04-01

    In immersive Virtual Reality systems, users tend to move in a Virtual Environment as they would in an analogous physical environment. In this work, we investigated how user behaviour is affected when the Virtual Environment differs from the physical space. We created two sets of four environments each, plus a virtual replica of the physical environment as a baseline. The first focused on aesthetic discrepancies, such as a water surface in place of solid ground. The second focused on mixing immaterial objects together with those paired to tangible objects. For example, barring an area with walls or obstacles. We designed a study where participants had to reach three waypoints laid out in such a way as to prompt a decision on which path to follow based on the conflict between the mismatching visual stimuli and their awareness of the real layout of the room. We analysed their performances to determine whether their trajectories were altered significantly from the shortest route. Our results indicate that participants altered their trajectories in the presence of surfaces representing higher walking difficulty (for example, water instead of grass). However, when the graphical appearance was found to be ambiguous, there was no significant trajectory alteration. The environments mixing immaterial with physical objects had the most impact on trajectories with a mean deviation from the shortest route of 60 cm against the 37 cm of environments with aesthetic alterations. The co-existence of paired and unpaired virtual objects was reported to support the idea that all objects participants saw were backed by physical props. From these results and our observations, we derive guidelines on how to alter user movement behaviour in Virtual Environments.
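
    One simple way to quantify the reported trajectory alteration is the mean perpendicular deviation of a walked path from the straight segment between two waypoints, sketched below. The metric and names are assumptions for illustration; the paper's exact analysis may differ.

    ```python
    import math

    def mean_deviation(path, start, end):
        """Mean perpendicular distance (m) of 2D path samples from the segment start->end."""
        sx, sy = start
        ex, ey = end
        seg_len = math.hypot(ex - sx, ey - sy)
        devs = [abs((ex - sx) * (sy - py) - (sx - px) * (ey - sy)) / seg_len
                for px, py in path]
        return sum(devs) / len(devs)

    # A walker who bulges 0.6 m sideways mid-route shows a clear mean deviation.
    walked = [(0.0, 0.0), (1.0, 0.3), (2.0, 0.6), (3.0, 0.3), (4.0, 0.0)]
    print(round(mean_deviation(walked, (0.0, 0.0), (4.0, 0.0)), 2))  # 0.24
    ```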

  1. HTC Vive MeVisLab integration via OpenVR for medical applications

    PubMed Central

    Egger, Jan; Gall, Markus; Wallner, Jürgen; Boechat, Pedro; Hann, Alexander; Li, Xing; Chen, Xiaojun; Schmalstieg, Dieter

    2017-01-01

    Virtual Reality, an immersive technology that replicates an environment via computer-simulated reality, gets a lot of attention in the entertainment industry. However, VR also has great potential in other areas, such as the medical domain; examples are intervention planning, training and simulation. This is especially useful for medical operations where an aesthetic outcome is important, such as facial surgeries. Alas, importing medical data into Virtual Reality devices is not necessarily trivial, in particular when a direct connection to a proprietary application is desired. Moreover, most researchers do not build their medical applications from scratch, but rather leverage platforms like MeVisLab, MITK, OsiriX or 3D Slicer. These platforms have in common that they use libraries like ITK and VTK, and provide a convenient graphical interface. However, ITK and VTK do not support Virtual Reality directly. In this study, the usage of a Virtual Reality device for medical data under the MeVisLab platform is presented. The OpenVR library is integrated into the MeVisLab platform, allowing a direct and uncomplicated usage of the head-mounted display HTC Vive inside the MeVisLab platform. Medical data coming from other MeVisLab modules can be connected directly via drag-and-drop to the Virtual Reality module, rendering the data inside the HTC Vive for immersive virtual reality inspection. PMID:28323840

  2. HTC Vive MeVisLab integration via OpenVR for medical applications.

    PubMed

    Egger, Jan; Gall, Markus; Wallner, Jürgen; Boechat, Pedro; Hann, Alexander; Li, Xing; Chen, Xiaojun; Schmalstieg, Dieter

    2017-01-01

    Virtual Reality, an immersive technology that replicates an environment via computer-simulated reality, gets a lot of attention in the entertainment industry. However, VR also has great potential in other areas, such as the medical domain; examples are intervention planning, training and simulation. This is especially useful for medical operations where an aesthetic outcome is important, such as facial surgeries. Alas, importing medical data into Virtual Reality devices is not necessarily trivial, in particular when a direct connection to a proprietary application is desired. Moreover, most researchers do not build their medical applications from scratch, but rather leverage platforms like MeVisLab, MITK, OsiriX or 3D Slicer. These platforms have in common that they use libraries like ITK and VTK, and provide a convenient graphical interface. However, ITK and VTK do not support Virtual Reality directly. In this study, the usage of a Virtual Reality device for medical data under the MeVisLab platform is presented. The OpenVR library is integrated into the MeVisLab platform, allowing a direct and uncomplicated usage of the head-mounted display HTC Vive inside the MeVisLab platform. Medical data coming from other MeVisLab modules can be connected directly via drag-and-drop to the Virtual Reality module, rendering the data inside the HTC Vive for immersive virtual reality inspection.

  3. WeaVR: a self-contained and wearable immersive virtual environment simulation system.

    PubMed

    Hodgson, Eric; Bachmann, Eric R; Vincent, David; Zmuda, Michael; Waller, David; Calusdian, James

    2015-03-01

    We describe WeaVR, a computer simulation system that takes virtual reality technology beyond specialized laboratories and research sites and makes it available in any open space, such as a gymnasium or a public park. Novel hardware and software systems enable HMD-based immersive virtual reality simulations to be conducted in any arbitrary location, with no external infrastructure and little-to-no setup or site preparation. The ability of the WeaVR system to provide realistic motion-tracked navigation for users, to improve the study of large-scale navigation, and to generate usable behavioral data is shown in three demonstrations. First, participants navigated through a full-scale virtual grocery store while physically situated in an open grass field. Trajectory data are presented for both normal tracking and for tracking during the use of redirected walking that constrained users to a predefined area. Second, users followed a straight path within a virtual world for distances of up to 2 km while walking naturally and being redirected to stay within the field, demonstrating the ability of the system to study large-scale navigation by simulating virtual worlds that are potentially unlimited in extent. Finally, the portability and pedagogical implications of this system were demonstrated by taking it to a regional high school for live use by a computer science class on their own school campus.
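
    A minimal sketch of the redirection idea used to keep walkers inside the tracked field: real head turns are slightly amplified and a slow rotation is injected while walking, steering the user along a gentle physical curve. Gain values and names are illustrative assumptions; WeaVR combines several such techniques and its exact parameters are not given here.

    ```python
    import math

    def redirect_heading(virtual_heading, physical_turn, walk_dist,
                         rotation_gain=1.2, curvature_radius=20.0):
        """Update the virtual heading (radians) applied to the rendered scene.

        physical_turn: the user's real head rotation this frame (radians).
        walk_dist: distance walked this frame (metres).
        rotation_gain > 1 amplifies real turns; the curvature term adds a slow
        virtual rotation while walking, which makes the user physically follow
        a circle of roughly curvature_radius metres.
        """
        virtual_heading += rotation_gain * physical_turn
        virtual_heading += walk_dist / curvature_radius
        return virtual_heading % (2 * math.pi)

    # Walking 0.7 m straight ahead still rotates the virtual world by about 2 degrees.
    print(math.degrees(redirect_heading(0.0, physical_turn=0.0, walk_dist=0.7)))
    ```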

  4. Perceiving and Acting on Complex Affordances: How Children and Adults Bicycle across Two Lanes of Opposing Traffic

    ERIC Educational Resources Information Center

    Grechkin, Timofey Y.; Chihak, Benjamin J.; Cremer, James F.; Kearney, Joseph K.; Plumert, Jodie M.

    2013-01-01

    This investigation examined how children and adults negotiate a challenging perceptual-motor problem with significant real-world implications--bicycling across two lanes of opposing traffic. Twelve- and 14-year-olds and adults rode a bicycling simulator through an immersive virtual environment. Participants crossed intersections with continuous…

  5. From MMORPG to a Classroom Multiplayer Presential Role Playing Game

    ERIC Educational Resources Information Center

    Susaeta, Heinz; Jimenez, Felipe; Nussbaum, Miguel; Gajardo, Ignacio; Andreu, Juan Jose; Villalta, Marco

    2010-01-01

    The popularity of massively multiplayer online role-playing games (MMORPGs) has grown enormously, with communities of players reaching into the millions. Their fantasy narratives present multiple challenges created by the virtual environment and/or other players. The games' potential for education stems from the fact that players are immersed in a…

  6. Cyber entertainment system using an immersive networked virtual environment

    NASA Astrophysics Data System (ADS)

    Ihara, Masayuki; Honda, Shinkuro; Kobayashi, Minoru; Ishibashi, Satoshi

    2002-05-01

    The authors are examining a cyber entertainment system that applies IPT (Immersive Projection Technology) displays to the entertainment field. This system enables users who are in remote locations to communicate with each other so that they feel as if they are together. Moreover, the system enables those users to experience a high degree of presence, owing to the provision of stereoscopic vision as well as a haptic interface and stereo sound. This paper introduces this system from the viewpoint of space sharing across the network and elucidates its operation using the theme of golf. The system is developed by integrating avatar control, an I/O device, communication links, virtual interaction, mixed reality, and physical simulations. Pairs of these environments are connected across the network. This allows the two players to experience competition. An avatar of each player is displayed on the other player's IPT display in the remote location and is driven by only two magnetic sensors. That is, in the proposed system, users do not need to wear a data suit with many sensors and are able to play golf without any encumbrance.

  7. A microbased shared virtual world prototype

    NASA Technical Reports Server (NTRS)

    Pitts, Gerald; Robinson, Mark; Strange, Steve

    1993-01-01

    Virtual reality (VR) allows sensory immersion and interaction with a computer-generated environment. The user adopts a physical interface with the computer, through Input/Output devices such as a head-mounted display, data glove, mouse, keyboard, or monitor, to experience an alternate universe. What this means is that the computer generates an environment which, in its ultimate extension, becomes indistinguishable from the real world. 'Imagine a wraparound television with three-dimensional programs, including three-dimensional sound, and solid objects that you can pick up and manipulate, even feel with your fingers and hands.... 'Imagine that you are the creator as well as the consumer of your artificial experience, with the power to use a gesture or word to remold the world you see and hear and feel. That part is not fiction... three-dimensional computer graphics, input/output devices, computer models that constitute a VR system make it possible, today, to immerse yourself in an artificial world and to reach in and reshape it.' Our research's goal was to propose a feasibility experiment in the construction of a networked virtual reality system, making use of current personal computer (PC) technology. The prototype was built using Borland C compiler, running on an IBM 486 33 MHz and a 386 33 MHz. Each game currently is represented as an IPX client on a non-dedicated Novell server. We initially posed the two questions: (1) Is there a need for networked virtual reality? (2) In what ways can the technology be made available to the most people possible?

  8. Virtual hydrology observatory: an immersive visualization of hydrology modeling

    NASA Astrophysics Data System (ADS)

    Su, Simon; Cruz-Neira, Carolina; Habib, Emad; Gerndt, Andreas

    2009-02-01

    The Virtual Hydrology Observatory will provide students with the ability to observe an integrated hydrology simulation with an instructional interface by using a desktop-based or immersive virtual reality setup. It is the goal of the virtual hydrology observatory application to facilitate the introduction of field experience and observational skills into hydrology courses through innovative virtual techniques that mimic activities during actual field visits. The simulation part of the application is developed from the integrated atmospheric forecast model, Weather Research and Forecasting (WRF), and the hydrology model, Gridded Surface/Subsurface Hydrologic Analysis (GSSHA). The output from both the WRF and GSSHA models is then used to generate the final visualization components of the Virtual Hydrology Observatory. The visualization data processing techniques provided by VTK include 2D Delaunay triangulation and data optimization. Once all the visualization components are generated, they are integrated with the simulation data using the VRFlowVis and VR Juggler software toolkits. VR Juggler is used primarily to provide the Virtual Hydrology Observatory application with a fully immersive, real-time 3D interaction experience, while VRFlowVis provides the integration framework for the hydrologic simulation data, graphical objects, and user interaction. A six-sided CAVE™-like system is used to run the Virtual Hydrology Observatory to provide the students with a fully immersive experience.
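
    As a rough illustration of the 2D Delaunay triangulation step mentioned in this record, the short sketch below builds a terrain surface from scattered (x, y, elevation) samples using VTK's Python bindings. The sample points are invented, and the actual Virtual Hydrology Observatory pipeline is not reproduced here.

```python
# Minimal sketch: triangulate scattered (x, y, elevation) samples with VTK,
# roughly analogous to the Delaunay step used to build terrain geometry.
# The sample points below are made up for illustration.
import vtk

points = vtk.vtkPoints()
for x, y, z in [(0.0, 0.0, 1.2), (1.0, 0.0, 1.5), (0.0, 1.0, 0.9),
                (1.0, 1.0, 1.1), (0.5, 0.5, 1.8)]:
    points.InsertNextPoint(x, y, z)   # z is treated as elevation

poly = vtk.vtkPolyData()
poly.SetPoints(points)

delaunay = vtk.vtkDelaunay2D()        # triangulates in the x-y plane
delaunay.SetInputData(poly)
delaunay.Update()

surface = delaunay.GetOutput()        # vtkPolyData with triangle cells
print(surface.GetNumberOfCells(), "triangles generated")
```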

  9. Virtual Reality Cerebral Aneurysm Clipping Simulation With Real-time Haptic Feedback

    PubMed Central

    Alaraj, Ali; Luciano, Cristian J.; Bailey, Daniel P.; Elsenousi, Abdussalam; Roitberg, Ben Z.; Bernardo, Antonio; Banerjee, P. Pat; Charbel, Fady T.

    2014-01-01

    Background: With the decrease in the number of cerebral aneurysms treated surgically and the increase of complexity of those treated surgically, there is a need for simulation-based tools to teach future neurosurgeons the operative techniques of aneurysm clipping. Objective: To develop and evaluate the usefulness of a new haptic-based virtual reality (VR) simulator in the training of neurosurgical residents. Methods: A real-time sensory haptic feedback virtual reality aneurysm clipping simulator was developed using the Immersive Touch platform. A prototype middle cerebral artery aneurysm simulation was created from a computed tomography angiogram. Aneurysm and vessel volume deformation and haptic feedback are provided in a 3-D immersive VR environment. Intraoperative aneurysm rupture was also simulated. Seventeen neurosurgery residents from three residency programs tested the simulator and provided feedback on its usefulness and resemblance to real aneurysm clipping surgery. Results: Residents felt that the simulation would be useful in preparing for real-life surgery. About two thirds of the residents felt that the 3-D immersive anatomical details provided a very close resemblance to real operative anatomy and accurate guidance for deciding surgical approaches. They believed the simulation is useful for preoperative surgical rehearsal and neurosurgical training. One third of the residents felt that the technology in its current form provided very realistic haptic feedback for aneurysm surgery. Conclusion: Neurosurgical residents felt that the novel immersive VR simulator is helpful in their training, especially since they do not get a chance to perform aneurysm clippings until very late in their residency programs. PMID:25599200

  10. The Virtual Shop: A new immersive virtual reality environment and scenario for the assessment of everyday memory.

    PubMed

    Ouellet, Émilie; Boller, Benjamin; Corriveau-Lecavalier, Nick; Cloutier, Simon; Belleville, Sylvie

    2018-06-01

    Assessing and predicting memory performance in everyday life is a common assignment for neuropsychologists. However, most traditional neuropsychological tasks are not conceived to capture everyday memory performance. The Virtual Shop is a fully immersive task developed to assess memory in a more ecological way than traditional neuropsychological assessments. Two studies were undertaken to assess the feasibility of the Virtual Shop and to appraise its ecological and construct validity. In study 1, 20 younger and 19 older adults completed the Virtual Shop task to evaluate its level of difficulty and the way the participants interacted with the VR material. The construct validity was examined with the contrasted-group method, by comparing the performance of younger and older adults. In study 2, 35 individuals with subjective cognitive decline completed the Virtual Shop task. Performance was correlated with an existing questionnaire evaluating everyday memory in order to appraise its ecological validity. To add further support to its construct validity, performance was correlated with traditional episodic memory and executive tasks. All participants successfully completed the Virtual Shop. The task had an appropriate level of difficulty that helped differentiate younger and older adults, supporting the feasibility and construct validity of the task. The performance on the Virtual Shop was significantly and moderately correlated with the performance on the questionnaire and on the traditional memory and executive tasks. Results support the feasibility and both the ecological and construct validity of the Virtual Shop. Copyright © 2018 The Authors. Published by Elsevier B.V. All rights reserved.

  11. Virtual exertions: evoking the sense of exerting forces in virtual reality using gestures and muscle activity.

    PubMed

    Chen, Karen B; Ponto, Kevin; Tredinnick, Ross D; Radwin, Robert G

    2015-06-01

    This study was a proof of concept for virtual exertions, a novel method that involves the use of body tracking and electromyography for grasping and moving projections of objects in virtual reality (VR). The user views objects in his or her hands during rehearsed co-contractions of the same agonist-antagonist muscles normally used for the desired activities to suggest exerting forces. Unlike physical objects, virtual objects are images and lack mass. There is currently no practical physically demanding way to interact with virtual objects to simulate strenuous activities. Eleven participants grasped and lifted similar physical and virtual objects of various weights in an immersive 3-D Cave Automatic Virtual Environment. Muscle activity, localized muscle fatigue, ratings of perceived exertions, and NASA Task Load Index were measured. Additionally, the relationship between levels of immersion (2-D vs. 3-D) was studied. Although the overall magnitude of biceps activity and workload were greater in VR, muscle activity trends and fatigue patterns for varying weights within VR and physical conditions were the same. Perceived exertions for varying weights were not significantly different between VR and physical conditions. Perceived exertion levels and muscle activity patterns corresponded to the assigned virtual loads, which supported the hypothesis that the method evoked the perception of physical exertions and showed that the method was promising. Ultimately this approach may offer opportunities for research and training individuals to perform strenuous activities under potentially safer conditions that mimic situations while seeing their own body and hands relative to the scene. © 2014, Human Factors and Ergonomics Society.
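
    To make the "virtual exertions" idea concrete, here is a minimal sketch of the kind of gating logic such a system might use: a grasp on a virtual object is engaged only while hand tracking reports contact and rectified, smoothed EMG from an agonist-antagonist pair exceeds a calibration threshold. The threshold, window length, and channel names are illustrative assumptions, not the authors' implementation.

```python
# Illustrative sketch (not the published system): gate a virtual grasp on
# co-contraction of an agonist-antagonist muscle pair measured by EMG.
import numpy as np

def smoothed_activation(emg_window):
    """Mean of the rectified EMG over a short window (arbitrary units)."""
    return float(np.mean(np.abs(emg_window)))

def grasp_engaged(biceps_window, triceps_window, hand_on_object,
                  threshold=0.3):
    """Grasp only when the hand overlaps the object and BOTH muscles
    are active above a (hypothetical) calibration threshold."""
    biceps = smoothed_activation(biceps_window)
    triceps = smoothed_activation(triceps_window)
    return hand_on_object and biceps > threshold and triceps > threshold

# Fake 100-sample EMG windows standing in for streamed sensor data.
rng = np.random.default_rng(0)
biceps = 0.5 * rng.standard_normal(100)
triceps = 0.4 * rng.standard_normal(100)
print(grasp_engaged(biceps, triceps, hand_on_object=True))
```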

  12. A novel semi-immersive virtual reality visuo-motor task activates ventrolateral prefrontal cortex: a functional near-infrared spectroscopy study

    NASA Astrophysics Data System (ADS)

    Basso Moro, Sara; Carrieri, Marika; Avola, Danilo; Brigadoi, Sabrina; Lancia, Stefania; Petracca, Andrea; Spezialetti, Matteo; Ferrari, Marco; Placidi, Giuseppe; Quaresima, Valentina

    2016-06-01

    Objective. In the last few years, the interest in applying virtual reality systems for neurorehabilitation is increasing. Their compatibility with neuroimaging techniques, such as functional near-infrared spectroscopy (fNIRS), allows for the investigation of brain reorganization with multimodal stimulation and real-time control of the changes occurring in brain activity. The present study was aimed at testing a novel semi-immersive visuo-motor task (VMT), which has the features of being adopted in the field of neurorehabilitation of the upper limb motor function. Approach. A virtual environment was simulated through a three-dimensional hand-sensing device (the LEAP Motion Controller), and the concomitant VMT-related prefrontal cortex (PFC) response was monitored non-invasively by fNIRS. Upon the VMT, performed at three different levels of difficulty, it was hypothesized that the PFC would be activated with an expected greater level of activation in the ventrolateral PFC (VLPFC), given its involvement in the motor action planning and in the allocation of the attentional resources to generate goals from current contexts. Twenty-one subjects were asked to move their right hand/forearm with the purpose of guiding a virtual sphere over a virtual path. A twenty-channel fNIRS system was employed for measuring changes in PFC oxygenated-deoxygenated hemoglobin (O2Hb/HHb, respectively). Main results. A VLPFC O2Hb increase and a concomitant HHb decrease were observed during the VMT performance, without any difference in relation to the task difficulty. Significance. The present study has revealed a particular involvement of the VLPFC in the execution of the novel proposed semi-immersive VMT adoptable in the neurorehabilitation field.

  13. A novel semi-immersive virtual reality visuo-motor task activates ventrolateral prefrontal cortex: a functional near-infrared spectroscopy study.

    PubMed

    Moro, Sara Basso; Carrieri, Marika; Avola, Danilo; Brigadoi, Sabrina; Lancia, Stefania; Petracca, Andrea; Spezialetti, Matteo; Ferrari, Marco; Placidi, Giuseppe; Quaresima, Valentina

    2016-06-01

    In the last few years, the interest in applying virtual reality systems for neurorehabilitation is increasing. Their compatibility with neuroimaging techniques, such as functional near-infrared spectroscopy (fNIRS), allows for the investigation of brain reorganization with multimodal stimulation and real-time control of the changes occurring in brain activity. The present study was aimed at testing a novel semi-immersive visuo-motor task (VMT), which has the features of being adopted in the field of neurorehabilitation of the upper limb motor function. A virtual environment was simulated through a three-dimensional hand-sensing device (the LEAP Motion Controller), and the concomitant VMT-related prefrontal cortex (PFC) response was monitored non-invasively by fNIRS. Upon the VMT, performed at three different levels of difficulty, it was hypothesized that the PFC would be activated with an expected greater level of activation in the ventrolateral PFC (VLPFC), given its involvement in the motor action planning and in the allocation of the attentional resources to generate goals from current contexts. Twenty-one subjects were asked to move their right hand/forearm with the purpose of guiding a virtual sphere over a virtual path. A twenty-channel fNIRS system was employed for measuring changes in PFC oxygenated-deoxygenated hemoglobin (O2Hb/HHb, respectively). A VLPFC O2Hb increase and a concomitant HHb decrease were observed during the VMT performance, without any difference in relation to the task difficulty. The present study has revealed a particular involvement of the VLPFC in the execution of the novel proposed semi-immersive VMT adoptable in the neurorehabilitation field.
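
    For readers unfamiliar with how fNIRS yields the O2Hb/HHb traces discussed in these two records, the sketch below shows the usual modified Beer-Lambert conversion: optical-density changes at two wavelengths are mapped to concentration changes by inverting a 2x2 extinction-coefficient matrix. The coefficient values, source-detector distance, and differential pathlength factor are placeholders, not values from this study.

```python
# Sketch of the modified Beer-Lambert law step behind fNIRS O2Hb/HHb traces.
# Extinction coefficients, source-detector distance and DPF are placeholders.
import numpy as np

# delta_OD at two wavelengths (e.g., ~760 nm and ~850 nm), made-up values.
delta_od = np.array([0.012, 0.018])

# Placeholder extinction coefficients: rows = wavelengths, columns = (O2Hb, HHb).
epsilon = np.array([[1.4866, 3.8437],
                    [2.5264, 1.7986]])

distance = 3.0   # source-detector separation in cm (assumed)
dpf = 6.0        # differential pathlength factor (assumed)

# delta_OD = epsilon @ delta_c * distance * dpf  ->  solve for delta_c
delta_c = np.linalg.solve(epsilon * distance * dpf, delta_od)
print("Delta O2Hb = %.4f, Delta HHb = %.4f (arbitrary units)"
      % (delta_c[0], delta_c[1]))
```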

  14. OnSight: Multi-platform Visualization of the Surface of Mars

    NASA Astrophysics Data System (ADS)

    Abercrombie, S. P.; Menzies, A.; Winter, A.; Clausen, M.; Duran, B.; Jorritsma, M.; Goddard, C.; Lidawer, A.

    2017-12-01

    A key challenge of planetary geology is to develop an understanding of an environment that humans cannot (yet) visit. Instead, scientists rely on visualizations created from images sent back by robotic explorers, such as the Curiosity Mars rover. OnSight is a multi-platform visualization tool that helps scientists and engineers to visualize the surface of Mars. Terrain visualization allows scientists to understand the scale and geometric relationships of the environment around the Curiosity rover, both for scientific understanding and for tactical consideration in safely operating the rover. OnSight includes a web-based 2D/3D visualization tool, as well as an immersive mixed reality visualization. In addition, OnSight offers a novel feature for communication among the science team. Using the multiuser feature of OnSight, scientists can meet virtually on Mars to discuss geology in a shared spatial context. Combining web-based visualization with immersive visualization allows OnSight to leverage the strengths of both platforms. This project demonstrates how 3D visualization can be adapted to either an immersive environment or a computer screen, and we discuss the advantages and disadvantages of both platforms.

  15. Enhancements to VTK enabling Scientific Visualization in Immersive Environments

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    O'Leary, Patrick; Jhaveri, Sankhesh; Chaudhary, Aashish

    Modern scientific, engineering and medical computational simulations, as well as experimental and observational data sensing/measuring devices, produce enormous amounts of data. While statistical analysis provides insight into this data, scientific visualization is tactically important for scientific discovery, product design and data analysis. These benefits are impeded, however, when scientific visualization algorithms are implemented from scratch—a time-consuming and redundant process in immersive application development. This process can greatly benefit from leveraging the state-of-the-art open-source Visualization Toolkit (VTK) and its community. Over the past two (almost three) decades, integrating VTK with a virtual reality (VR) environment has only been attempted to varying degrees of success. In this paper, we demonstrate two new approaches to simplify this amalgamation of an immersive interface with visualization rendering from VTK. In addition, we cover several enhancements to VTK that provide near real-time updates and efficient interaction. Finally, we demonstrate the combination of VTK with both Vrui and OpenVR immersive environments in example applications.
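
    As a hint of what the VTK/OpenVR combination described here looks like in practice, below is a minimal rendering sketch, assuming a VTK build (8.1 or later) with the RenderingOpenVR module enabled and an OpenVR runtime and headset available; it is not the authors' code.

```python
# Minimal sketch: render a simple VTK scene in an OpenVR headset.
# Assumes VTK was built with the RenderingOpenVR module and that an
# OpenVR runtime (e.g., SteamVR) and headset are attached.
import vtk

cone = vtk.vtkConeSource()
mapper = vtk.vtkPolyDataMapper()
mapper.SetInputConnection(cone.GetOutputPort())

actor = vtk.vtkActor()
actor.SetMapper(mapper)

renderer = vtk.vtkOpenVRRenderer()
renderer.SetActiveCamera(vtk.vtkOpenVRCamera())
renderer.AddActor(actor)

window = vtk.vtkOpenVRRenderWindow()
window.AddRenderer(renderer)

interactor = vtk.vtkOpenVRRenderWindowInteractor()
interactor.SetRenderWindow(window)

window.Render()          # starts head-tracked stereo rendering
interactor.Start()       # hand controllers drive interaction
```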

  16. NASA's Hybrid Reality Lab: One Giant Leap for Full Dive

    NASA Technical Reports Server (NTRS)

    Delgado, Francisco J.; Noyes, Matthew

    2017-01-01

    This presentation demonstrates how NASA is using consumer VR headsets, game engine technology, and NVIDIA's GPUs to create highly immersive future training systems augmented with extremely realistic haptic feedback, sound, and additional sensory information, and how these can be used to improve the engineering workflow. Included in this presentation are an environment simulation of the ISS, where users can interact with virtual objects, handrails, and tracked physical objects while inside VR; integration of consumer VR headsets with the Active Response Gravity Offload System; and a space habitat architectural evaluation tool. Attendees will learn how the best elements of real and virtual worlds can be combined into a hybrid reality environment with tangible engineering and scientific applications.

  17. Use of 3D techniques for virtual production

    NASA Astrophysics Data System (ADS)

    Grau, Oliver; Price, Marc C.; Thomas, Graham A.

    2000-12-01

    Virtual production for broadcast is currently mainly used in the form of virtual studios, where the resulting media is a sequence of 2D images. With the steady increase of 3D computing power in home PCs and the technical progress in 3D display technology, the content industry is looking for new kinds of program material which make use of 3D technology. The applications range from the analysis of sport scenes and 3DTV up to the creation of fully immersive content. In a virtual studio a camera films one or more actors in a controlled environment. The pictures of the actors can be segmented very accurately in real time using chroma keying techniques. The isolated silhouette can be integrated into a new synthetic virtual environment using a studio mixer. The resulting shape description of the actors is 2D so far. For the realization of more sophisticated optical interactions of the actors with the virtual environment, such as occlusions and shadows, an object-based 3D description of scenes is needed. However, the requirements of shape accuracy, and the kind of representation, differ in accordance with the application. This contribution gives an overview of requirements and approaches for the generation of an object-based 3D description in various applications studied by the BBC R and D department. An enhanced Virtual Studio for 3D programs is proposed that covers a range of applications for virtual production.
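
    The chroma-keying step described in this record can be sketched in a few lines: pixels close to the key colour (in a simple RGB-distance sense) are replaced by the synthetic background so the actor can be composited into the virtual set. Real studio keyers work in other colour spaces with soft mattes; this is only a toy illustration with invented values.

```python
# Toy chroma-key: pixels near the key colour are replaced by the background.
# Real virtual-studio keyers use better colour spaces and soft mattes.
import numpy as np

def chroma_key(frame, background, key_rgb=(0, 255, 0), threshold=90.0):
    """frame, background: HxWx3 uint8 arrays; returns the composited frame."""
    diff = frame.astype(np.float32) - np.array(key_rgb, dtype=np.float32)
    distance = np.linalg.norm(diff, axis=-1)          # distance to key colour
    foreground_mask = (distance > threshold)[..., None]
    return np.where(foreground_mask, frame, background).astype(np.uint8)

# Tiny synthetic example: a green frame with one "actor" pixel.
frame = np.zeros((2, 2, 3), dtype=np.uint8)
frame[..., 1] = 255                # all green (key colour)
frame[0, 0] = (200, 50, 50)        # one non-green pixel, the "actor"
background = np.full((2, 2, 3), 10, dtype=np.uint8)
print(chroma_key(frame, background)[:, :, 0])
```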

  18. Development of microgravity, full body functional reach envelope using 3-D computer graphic models and virtual reality technology

    NASA Technical Reports Server (NTRS)

    Lindsey, Patricia F.

    1994-01-01

    In microgravity conditions mobility is greatly enhanced and body stability is difficult to achieve. Because of these difficulties, optimum placement and accessibility of objects and controls can be critical to required tasks on board shuttle flights or on the proposed space station. Anthropometric measurement of the maximum reach of occupants of a microgravity environment provides knowledge about maximum functional placement for tasking situations. Calculations for a full-body functional reach envelope for microgravity environments are imperative. To this end, three-dimensional computer-modeled human figures, providing a method of anthropometric measurement, were used to locate the data points that define the full-body functional reach envelope. Virtual reality technology was utilized to enable an occupant of the microgravity environment to experience movement within the reach envelope while immersed in a simulated microgravity environment.

  19. Defense applications of the CAVE (CAVE automatic virtual environment)

    NASA Astrophysics Data System (ADS)

    Isabelle, Scott K.; Gilkey, Robert H.; Kenyon, Robert V.; Valentino, George; Flach, John M.; Spenny, Curtis H.; Anderson, Timothy R.

    1997-07-01

    The CAVE is a multi-person, room-sized, high-resolution, 3D video and auditory environment, which can be used to present very immersive virtual environment experiences. This paper describes the CAVE technology and the capability of the CAVE system as originally developed at the Electronics Visualization Laboratory of the University of Illinois- Chicago and as more recently implemented by Wright State University (WSU) in the Armstrong Laboratory at Wright- Patterson Air Force Base (WPAFB). One planned use of the WSU/WPAFB CAVE is research addressing the appropriate design of display and control interfaces for controlling uninhabited aerial vehicles. The WSU/WPAFB CAVE has a number of features that make it well-suited to this work: (1) 360 degrees surround, plus floor, high resolution visual displays, (2) virtual spatialized audio, (3) the ability to integrate real and virtual objects, and (4) rapid and flexible reconfiguration. However, even though the CAVE is likely to have broad utility for military applications, it does have certain limitations that may make it less well- suited to applications that require 'natural' haptic feedback, vestibular stimulation, or an ability to interact with close detailed objects.

  20. Studying and Treating Schizophrenia Using Virtual Reality: A New Paradigm

    PubMed Central

    Freeman, Daniel

    2008-01-01

    Understanding schizophrenia requires consideration of patients’ interactions in the social world. Misinterpretation of other peoples’ behavior is a key feature of persecutory ideation. The occurrence and intensity of hallucinations is affected by the social context. Negative symptoms such as anhedonia, asociality, and blunted affect reflect difficulties in social interactions. Withdrawal and avoidance of other people is frequent in schizophrenia, leading to isolation and rumination. The use of virtual reality (VR)—interactive immersive computer environments—allows one of the key variables in understanding psychosis, social environments, to be controlled, providing exciting applications to research and treatment. Seven applications of virtual social environments to schizophrenia are set out: symptom assessment, identification of symptom markers, establishment of predictive factors, tests of putative causal factors, investigation of the differential prediction of symptoms, determination of toxic elements in the environment, and development of treatment. The initial VR studies of persecutory ideation, which illustrate the ascription of personalities and mental states to virtual people, are highlighted. VR, suitably applied, holds great promise in furthering the understanding and treatment of psychosis. PMID:18375568

  1. Task performance in virtual environments used for cognitive rehabilitation after traumatic brain injury.

    PubMed

    Christiansen, C; Abreu, B; Ottenbacher, K; Huffman, K; Masel, B; Culpepper, R

    1998-08-01

    This report describes a reliability study using a prototype computer-simulated virtual environment to assess basic daily living skills in a sample of persons with traumatic brain injury (TBI). The benefits of using virtual reality in training for situations where safety is a factor have been established in defense and industry, but have not been demonstrated in rehabilitation. Participants were thirty subjects with TBI receiving comprehensive rehabilitation services at a residential facility. An immersive virtual kitchen was developed in which a meal preparation task involving multiple steps could be performed. The prototype was tested by having subjects complete the task twice within 7 days. The stability of performance was estimated using intraclass correlation coefficients (ICCs). The ICC value for total performance based on all steps involved in the meal preparation task was .73. When three items with low variance were removed, the ICC improved to .81. Little evidence of vestibular optical side-effects was noted in the subjects tested. Adequate initial reliability exists to continue development of the environment as an assessment and training prototype for persons with brain injury.
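
    For readers who want to see how test-retest ICC values like those above are obtained, the sketch below computes a two-way random-effects, absolute-agreement ICC (Shrout and Fleiss ICC(2,1)) from an n-subjects by k-sessions score matrix. The scores are invented and the exact ICC variant used in the study is not stated here, so treat this as a generic illustration.

```python
# Generic sketch: ICC(2,1) (two-way random effects, absolute agreement,
# single measurement) from an n-subjects x k-sessions matrix of scores.
# The scores below are invented; the study's exact ICC variant may differ.
import numpy as np

def icc_2_1(scores):
    scores = np.asarray(scores, dtype=float)
    n, k = scores.shape
    grand = scores.mean()
    row_means = scores.mean(axis=1)
    col_means = scores.mean(axis=0)

    ss_rows = k * np.sum((row_means - grand) ** 2)     # between subjects
    ss_cols = n * np.sum((col_means - grand) ** 2)     # between sessions
    ss_total = np.sum((scores - grand) ** 2)
    ss_error = ss_total - ss_rows - ss_cols

    ms_rows = ss_rows / (n - 1)
    ms_cols = ss_cols / (k - 1)
    ms_error = ss_error / ((n - 1) * (k - 1))

    # ICC(2,1) = (MSR - MSE) / (MSR + (k-1)*MSE + k*(MSC - MSE)/n)
    return (ms_rows - ms_error) / (
        ms_rows + (k - 1) * ms_error + k * (ms_cols - ms_error) / n)

# 6 hypothetical subjects, 2 sessions of a task-performance score.
scores = [[12, 13], [15, 16], [9, 10], [20, 18], [14, 14], [11, 13]]
print(round(icc_2_1(scores), 3))
```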

  2. Ergonomic approaches to designing educational materials for immersive multi-projection system

    NASA Astrophysics Data System (ADS)

    Shibata, Takashi; Lee, JaeLin; Inoue, Tetsuri

    2014-02-01

    Rapid advances in computer and display technologies have made it possible to present high-quality virtual reality (VR) environments. To use such virtual environments effectively, research should be performed into how users perceive and react to virtual environments in view of particular human factors. We created a VR simulation of sea fish for science education, and we conducted an experiment to examine how observers perceive the size and depth of an object within their reach and evaluated their visual fatigue. We chose a multi-projection system for presenting the educational VR simulation, because this system can provide actual-size objects and produce stereo images located close to the observer. The results of the experiment show that estimation of size and depth was relatively accurate when subjects used physical actions to assess them. Presenting images within the observer's reach is suggested to be useful for education in VR environments. Evaluation of visual fatigue shows that the level of symptoms from viewing stereo images with a large disparity in the VR environment remained low over a short viewing time.

  3. Scripting human animations in a virtual environment

    NASA Technical Reports Server (NTRS)

    Goldsby, Michael E.; Pandya, Abhilash K.; Maida, James C.

    1994-01-01

    The current deficiencies of virtual environments (VEs) are well known: annoying lag time in drawing the current view, drastically simplified environments to reduce that time lag, low resolution, and a narrow field of view. Animation scripting is an application of VE technology which can be carried out successfully despite these deficiencies. The final product is a smoothly moving, high-resolution animation displaying detailed models. In this system, the user is represented by a human computer model with the same body proportions. Using magnetic tracking, the motions of the model's upper torso, head and arms are controlled by the user's movements (18 degrees of freedom). The model's lower torso and global position and orientation are controlled by a spaceball and keypad (12 degrees of freedom). Using this system, human motion scripts can be extracted from the user's movements while immersed in a simplified virtual environment. Recorded data is used to define key frames; motion is interpolated between them and post-processing adds a more detailed environment. The result is a considerable savings in time and a much more natural-looking movement of a human figure in a smooth and seamless animation.
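
    The key-frame step described here (record poses from the tracked user, then interpolate between them for the final animation) can be illustrated with a minimal linear-interpolation sketch over joint angles. The joint names, times, and values are hypothetical, and production pipelines would typically use quaternion interpolation for orientations.

```python
# Minimal sketch: linearly interpolate recorded key-frame joint angles to
# produce smooth in-between frames. Joint names/values are hypothetical;
# real pipelines would slerp orientations rather than lerp Euler angles.
import numpy as np

key_times = np.array([0.0, 1.0, 2.0])            # seconds
key_poses = {                                    # degrees per joint
    "shoulder": np.array([0.0, 45.0, 30.0]),
    "elbow":    np.array([10.0, 90.0, 60.0]),
}

def pose_at(t):
    """Interpolated pose (dict of joint angles) at time t."""
    return {joint: float(np.interp(t, key_times, angles))
            for joint, angles in key_poses.items()}

# Sample the first few frames at 30 fps between the recorded key frames.
for t in np.arange(0.0, 2.0, 1.0 / 30.0)[:3]:
    print(round(t, 3), pose_at(t))
```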

  4. Isolated core vs. superficial cooling effects on virtual maze navigation.

    PubMed

    Payne, Jennifer; Cheung, Stephen S

    2007-07-01

    Cold impairs cognitive performance and is a common occurrence in many survival situations. Altered behavior patterns due to impaired navigation abilities in cold environments are potential problems in lost-person situations. We investigated the separate effects of low core temperature and superficial cooling on a spatially demanding virtual navigation task. There were 12 healthy men who were passively cooled via 15 degrees C water immersion to a core temperature of 36.0 degrees C, then transferred to a warm (40 degrees C) water bath to eliminate superficial shivering while completing a series of 20 virtual computer mazes. In a control condition, subjects rested in a thermoneutral (approximately 35 degrees C) bath for a time-matched period before being transferred to a warm bath for testing. Superficial cooling and distraction were achieved by whole-body immersion in 35 degrees C water for a time-matched period, followed by lower leg immersion in 10 degrees C water for the duration of the navigational tests. Mean completion time and mean error scores for the mazes were not significantly different (p > 0.05) across the core cooling (16.59 +/- 11.54 s, 0.91 +/- 1.86 errors), control (15.40 +/- 8.85 s, 0.82 +/- 1.76 errors), and superficial cooling (15.19 +/- 7.80 s, 0.77 +/- 1.40 errors) conditions. Separately reducing core temperature or increasing cold sensation in the lower extremities did not influence performance on virtual computer mazes, suggesting that navigation is more resistant to cooling than other, simpler cognitive tasks. Further research is warranted to explore navigational ability at progressively lower core and skin temperatures, and in different populations.

  5. Sensor Spatial Distortion, Visual Latency, and Update Rate Effects on 3D Tracking in Virtual Environments

    NASA Technical Reports Server (NTRS)

    Ellis, S. R.; Adelstein, B. D.; Baumeler, S.; Jense, G. J.; Jacoby, R. H.; Trejo, Leonard (Technical Monitor)

    1998-01-01

    Several common defects that we have sought to minimize in immersive virtual environments are: static sensor spatial distortion, visual latency, and low update rates. Human performance within our environments during large-amplitude 3D tracking was assessed by objective and subjective methods in the presence and absence of these defects. Results show that 1) removal of our relatively small spatial sensor distortion had minor effects on the tracking activity, 2) an Adapted Cooper-Harper controllability scale proved the most sensitive subjective indicator of the degradation of dynamic fidelity caused by increasing latency and decreasing frame rates, and 3) performance, as measured by normalized RMS tracking error or subjective impressions, was more markedly influenced by changing visual latency than by update rate.
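
    One plausible form of the "normalized RMS tracking error" measure mentioned above is sketched below: the RMS distance between the cursor and target trajectories, divided by the RMS excursion of the target about its mean. The exact normalization used in the study is not given here, so this is an assumption for illustration only.

```python
# Sketch of a normalized RMS tracking-error metric for a 3D tracking task.
# The normalization (by target excursion) is an assumption for illustration.
import numpy as np

def normalized_rms_error(cursor, target):
    """cursor, target: (T, 3) arrays of positions over time."""
    cursor = np.asarray(cursor, dtype=float)
    target = np.asarray(target, dtype=float)
    error_rms = np.sqrt(np.mean(np.sum((cursor - target) ** 2, axis=1)))
    excursion = target - target.mean(axis=0)
    target_rms = np.sqrt(np.mean(np.sum(excursion ** 2, axis=1)))
    return error_rms / target_rms

# Synthetic example: a circular target path tracked with small noise.
t = np.linspace(0, 2 * np.pi, 200)
target = np.c_[np.sin(t), np.cos(t), np.zeros_like(t)]
cursor = target + 0.05 * np.random.default_rng(1).standard_normal(target.shape)
print(round(normalized_rms_error(cursor, target), 3))
```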

  6. Development of an audio-based virtual gaming environment to assist with navigation skills in the blind.

    PubMed

    Connors, Erin C; Yazzolino, Lindsay A; Sánchez, Jaime; Merabet, Lotfi B

    2013-03-27

    Audio-based Environment Simulator (AbES) is virtual environment software designed to improve real world navigation skills in the blind. Using only audio based cues and set within the context of a video game metaphor, users gather relevant spatial information regarding a building's layout. This allows the user to develop an accurate spatial cognitive map of a large-scale three-dimensional space that can be manipulated for the purposes of a real indoor navigation task. After game play, participants are then assessed on their ability to navigate within the target physical building represented in the game. Preliminary results suggest that early blind users were able to acquire relevant information regarding the spatial layout of a previously unfamiliar building as indexed by their performance on a series of navigation tasks. These tasks included path finding through the virtual and physical building, as well as a series of drop off tasks. We find that the immersive and highly interactive nature of the AbES software appears to greatly engage the blind user to actively explore the virtual environment. Applications of this approach may extend to larger populations of visually impaired individuals.

  7. Comparing two types of navigational interfaces for Virtual Reality.

    PubMed

    Teixeira, Luís; Vilar, Elisângela; Duarte, Emília; Rebelo, Francisco; da Silva, Fernando Moreira

    2012-01-01

    Previous studies suggest significant differences between navigating virtual environments in a life-like walking manner (i.e., using treadmills or walk-in-place techniques) and virtual navigation (i.e., flying while actually standing still). The latter option, which usually involves hand-centric devices (e.g., joysticks), is the most common in Virtual Reality-based studies, mostly due to lower costs and smaller space and technology demands. However, new interaction devices originally conceived for videogames have recently become available, offering interesting potential for research. This study aimed to explore the potential of the Nintendo Wii Balance Board as a navigation interface in a Virtual Environment presented in an immersive Virtual Reality system. Comparing participants' performance while engaged in a simulated emergency egress allows determining the adequacy of such an alternative navigation interface on the basis of empirical results. Forty university students participated in this study. Results show that participants were more efficient when performing navigation tasks using the Joystick than with the Balance Board. However, there were no significant differences in behavioral compliance with exit signs. Therefore, this study suggests that, at least for tasks similar to the one studied, the Balance Board has good potential to be used as a navigation interface for Virtual Reality systems.
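
    A common way to turn balance-board readings into a navigation command, sketched below, is to compute the centre of pressure from the four corner load cells and map forward/backward and left/right lean to walking speed and turn rate. The sensor-spacing constants and gains are illustrative assumptions, not the interface evaluated in this study.

```python
# Illustrative sketch: map Wii Balance Board corner loads to a navigation
# command via the centre of pressure (COP). Board dimensions and gains are
# assumptions; this is not the interface evaluated in the study.

BOARD_W, BOARD_L = 43.0, 23.0      # sensor spacing in cm (approximate)

def centre_of_pressure(top_left, top_right, bottom_left, bottom_right):
    total = top_left + top_right + bottom_left + bottom_right
    if total <= 0:
        return 0.0, 0.0
    cop_x = (BOARD_W / 2.0) * ((top_right + bottom_right)
                               - (top_left + bottom_left)) / total
    cop_y = (BOARD_L / 2.0) * ((top_right + top_left)
                               - (bottom_right + bottom_left)) / total
    return cop_x, cop_y

def navigation_command(loads, speed_gain=0.1, turn_gain=2.0):
    """Lean forward/back -> walk speed; lean left/right -> turn rate."""
    cop_x, cop_y = centre_of_pressure(*loads)
    return {"forward_speed": speed_gain * cop_y,
            "turn_rate_deg_s": turn_gain * cop_x}

print(navigation_command((20.0, 22.0, 15.0, 14.0)))   # slight forward lean
```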

  8. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Markidis, S.; Rizwan, U.

    The use of a virtual nuclear control room can be an effective and powerful tool for training personnel working in nuclear power plants. Operators could experience and simulate the functioning of the plant, even in critical situations, without being in a real power plant or running any risk. 3D models can be exported to virtual reality formats and then displayed in the virtual reality environment, providing an immersive 3D experience. However, two major limitations of this approach are that 3D models exhibit static textures, and they are not fully interactive and therefore cannot be used effectively in training personnel. In this paper we first describe a possible solution for embedding the output of a computer application in a 3D virtual scene, coupling real-world applications and VR systems. The VR system reported here grabs the output of an application running on an X server, creates a texture with the output, and then displays it on a screen or a wall in the virtual reality environment. We then propose a simple model for providing interaction between the user in the VR system and the running simulator. This approach is based on the use of an internet-based application that can be commanded from a laptop or tablet PC added to the virtual environment. (authors)

  9. Second Life®: A 3D Virtual Immersive Environment for Teacher Preparation Courses in a Distance Education Program

    ERIC Educational Resources Information Center

    Hartley, Melissa D.; Ludlow, Barbara L.; Duff, Michael C.

    2015-01-01

    Many colleges and universities rely upon online programs to support distance delivery of personnel preparation programs in special education and related services. These distance education programs enable individuals who live or work in rural communities to access training programs to earn teaching certification and assist rural schools in…

  10. Emboldened by Embodiment: Six Precepts for Research on Embodied Learning and Mixed Reality

    ERIC Educational Resources Information Center

    Lindgren, Robb; Johnson-Glenberg, Mina

    2013-01-01

    The authors describe an emerging paradigm of educational research that pairs theories of embodied learning with a class of immersive technologies referred to as "mixed reality" (MR). MR environments merge the digital with the physical, where, for example, students can use their bodies to simulate an orbit around a virtual planet. Recent…

  11. Postural Hypo-Reactivity in Autism Is Contingent on Development and Visual Environment: A Fully Immersive Virtual Reality Study

    ERIC Educational Resources Information Center

    Greffou, Selma; Bertone, Armando; Hahler, Eva-Maria; Hanssens, Jean-Marie; Mottron, Laurent; Faubert, Jocelyn

    2012-01-01

    Although atypical motor behaviors have been associated with autism, investigations regarding their possible origins are scarce. This study assessed the visual and vestibular components involved in atypical postural reactivity in autism. Postural reactivity and stability were measured for younger (12-15 years) and older (16-33 years) autistic…

  12. Virtual worlds: a new frontier for nurse education?

    PubMed

    Green, Janet; Wyllie, Aileen; Jackson, Debra

    2014-01-01

    Virtual worlds have the potential to offer nursing students social networking and learning opportunities through the use of collaborative and immersive learning. If nursing educators are to stay abreast of contemporary learning opportunities, an exploration of the potential benefits of virtual worlds and their possibilities is needed. Literature was sourced that explored virtual worlds and their use in education, and in nursing education specifically. It is clear that immersive learning has positive benefits for nursing; however, the best way to approach virtual reality in nursing education has yet to be ascertained.

  13. Interfacing modeling suite Physics Of Eclipsing Binaries 2.0 with a Virtual Reality Platform

    NASA Astrophysics Data System (ADS)

    Harriett, Edward; Conroy, Kyle; Prša, Andrej; Klassner, Frank

    2018-01-01

    To explore alternate methods for modeling eclipsing binary stars, we build upon PHOEBE's (PHysics Of Eclipsing BinariEs) capabilities in a virtual reality (VR) environment to create an immersive and interactive experience for users. The application used is Vizard, a python-scripted VR development platform for environments such as the Cave Automatic Virtual Environment (CAVE) and off-the-shelf VR headsets. Vizard allows all modeling to be precompiled without compromising functionality or usage. The system requires five arguments to be precomputed using PHOEBE's python front-end: the effective temperature, flux, relative intensity, vertex coordinates, and orbits; the user can opt to implement other features from PHOEBE to be accessed within the simulation as well. Here we present the method for making the data observables accessible in real time. An Oculus Rift will be available for a live showcase of various cases of VR rendering of PHOEBE binary systems, including detached and contact binary stars.
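
    As an indication of the kind of PHOEBE 2 front-end call that could precompute observables for a visualization layer like the one described, the sketch below builds a default binary, computes a synthetic light curve, and saves the fluxes for a downstream application. The dataset and file names are hypothetical, this is not the authors' pipeline, and the mesh and orbit exports mentioned in the abstract are not shown.

```python
# Sketch (not the authors' pipeline): precompute binary-star observables
# with PHOEBE 2's python front-end for hand-off to a VR visualization layer.
# Dataset/model/file names are arbitrary; mesh and orbit exports would be
# added in the same way with additional datasets.
import numpy as np
import phoebe

b = phoebe.default_binary()                       # detached binary bundle
times = np.linspace(0.0, 1.0, 101)                # one orbital period (days)
b.add_dataset('lc', times=times, dataset='lc01')  # synthetic light curve
b.run_compute(model='vr_export')

fluxes = b.get_value('fluxes', dataset='lc01', model='vr_export')
np.save('phoebe_fluxes.npy', fluxes)              # consumed by the VR app (hypothetical)
```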

  14. Effect of an Immersive Preoperative Virtual Reality Experience on Patient Reported Outcomes: A Randomized Controlled Trial.

    PubMed

    Bekelis, Kimon; Calnan, Daniel; Simmons, Nathan; MacKenzie, Todd A; Kakoulides, George

    2017-06-01

    To investigate the effect of exposure to a virtual reality (VR) environment preoperatively on patient-reported outcomes for surgical operations. There is a scarcity of well-developed quality improvement initiatives targeting patient satisfaction. We performed a randomized controlled trial of patients undergoing cranial and spinal operations in a tertiary referral center. Patients underwent a 1:1 randomization to an immersive preoperative VR experience or standard preoperative experience stratified on type of operation. The primary outcome measures were the Evaluation du Vecu de l'Anesthesie Generale (EVAN-G) score and the Amsterdam Preoperative Anxiety and Information (APAIS) score, as markers of the patient's experience during the surgical encounter. During the study period, a total of 127 patients (mean age 55.3 years, 41.9% females) underwent randomization. The average EVAN-G score was 84.3 (standard deviation, SD, 6.4) after VR, and 64.3 (SD, 11.7) after standard preoperative experience (difference, 20.0; 95% confidence interval, CI, 16.6-23.3). Exposure to an immersive VR experience also led to higher APAIS score (difference, 29.9; 95% CI, 24.5-35.2). In addition, VR led to lower preoperative VAS stress score (difference, -41.7; 95% CI, -33.1 to -50.2), and higher preoperative VAS preparedness (difference, 32.4; 95% CI, 24.9-39.8), and VAS satisfaction (difference, 33.2; 95% CI, 25.4-41.0) scores. No association was identified with VAS stress score (difference, -1.6; 95% CI, -13.4 to 10.2). In a randomized controlled trial, we demonstrated that patients exposed to preoperative VR had increased satisfaction during the surgical encounter. Harnessing the power of this technology, hospitals can create an immersive environment that minimizes stress, and enhances the perioperative experience.

  15. A hardware and software architecture to deal with multimodal and collaborative interactions in multiuser virtual reality environments

    NASA Astrophysics Data System (ADS)

    Martin, P.; Tseu, A.; Férey, N.; Touraine, D.; Bourdot, P.

    2014-02-01

    Most advanced immersive devices provide a collaborative environment in which several users have their own distinct head-tracked stereoscopic point of view. Combined with commonly used interactive features such as voice and gesture recognition, 3D mice, haptic feedback, and spatialized audio rendering, these environments should faithfully reproduce a real context. However, even if many studies have been carried out on multimodal systems, we are far from definitively solving the issue of multimodal fusion, which consists in merging multimodal events coming from users and devices into interpretable commands performed by the application. Multimodality and collaboration have often been studied separately, despite the fact that these two aspects share interesting similarities. We discuss how we address this problem through the design and implementation of a supervisor that is able to deal with both multimodal fusion and collaborative aspects. The aim of this supervisor is to ensure the merging of user inputs from virtual reality devices in order to control immersive multi-user applications. We approach this problem from a practical point of view, because the main requirements of the supervisor were defined according to an industrial task proposed by our automotive partner that has to be performed with multimodal and collaborative interactions in a co-located multi-user environment. In this task, two co-located workers on a virtual assembly chain have to cooperate to insert a seat into the bodywork of a car, using haptic devices to feel collisions and to manipulate objects, and combining speech recognition and two-handed gesture recognition as multimodal instructions. Besides the architectural aspects of this supervisor, we describe how we ensure the modularity of our solution so that it can be applied to different virtual reality platforms, interactive contexts, and virtual contents. A virtual context observer included in this supervisor was specifically designed to be independent of the content of the virtual scene of the targeted application, and is used to report high-level interactive and collaborative events. This context observer allows the supervisor to merge these interactive and collaborative events, but it is also used to deal with new issues arising from our observation of two co-located users performing this assembly task in an immersive device. We highlight the fact that when speech recognition features are provided to the two users, it is necessary to detect automatically, according to the interactive context, whether vocal instructions must be translated into commands to be performed by the machine, or whether they are part of the natural communication necessary for collaboration. Information from the context observer indicating that a user is looking at his or her collaborator is important for detecting whether the user is talking to the partner. Moreover, as the users are physically co-located and head tracking is used to provide high-fidelity stereoscopic rendering and natural walking navigation in the virtual scene, we have to deal with collisions and screen occlusions between the co-located users in the physical workspace. The working area and focus of each user, computed and reported by the context observer, are necessary to prevent or avoid these situations.
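
    The decision described in this record (whether a recognized utterance is a command for the system or natural speech addressed to the co-located partner) can be sketched as a small rule in a supervisor: if the context observer reports that the speaker is looking at the collaborator, the utterance is treated as conversation; otherwise it is fused with the current interaction context into a command. The event fields and rules below are invented for illustration and do not reproduce the authors' supervisor.

```python
# Toy sketch of the command-vs-conversation decision a multimodal and
# collaborative supervisor might make. Event fields and rules are invented.
from dataclasses import dataclass
from typing import Optional

@dataclass
class ContextEvent:
    speaker: str
    utterance: str
    looking_at_partner: bool        # reported by the virtual context observer
    selected_object: Optional[str]  # object currently grabbed or pointed at

def fuse(event):
    """Return a machine command, or None if the speech is conversation."""
    if event.looking_at_partner:
        return None                               # speech addressed to the partner
    words = event.utterance.lower().split()
    if "insert" in words and event.selected_object:
        return {"action": "insert", "object": event.selected_object,
                "user": event.speaker}
    return None                                   # unrecognized -> ignore

print(fuse(ContextEvent("worker1", "insert it here", False, "seat")))
print(fuse(ContextEvent("worker1", "insert it here", True, "seat")))
```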

  16. Innovating Training through Immersive Environments: Generation Y, Exploratory Learning, and Serious Games

    NASA Technical Reports Server (NTRS)

    Gendron, Gerald

    2012-01-01

    Over the next decade, those entering Service and Joint Staff positions within the military will come from a different generation than the current leadership. They will come from Generation Y and have differing preferences for learning. Immersive learning environments like serious games and virtual world initiatives can complement traditional training methods to provide a better overall training program for staffs. Generation Y members desire learning methods which are relevant and interactive, regardless of whether they are delivered over the internet or in person. This paper focuses on a project undertaken to assess alternative training methods to teach special operations staffs. It provides a summary of the needs analysis used to consider alternatives and to better posture the Department of Defense for future training development.

  17. Examining Work Performance in Immersive Virtual Environments versus Face-to-Face Physical Environments through Laboratory Experimentation

    DTIC Science & Technology

    2011-01-01


  18. Game engines and immersive displays

    NASA Astrophysics Data System (ADS)

    Chang, Benjamin; Destefano, Marc

    2014-02-01

    While virtual reality and digital games share many core technologies, the programming environments, toolkits, and workflows for developing games and VR environments are often distinct. VR toolkits designed for applications in visualization and simulation often have a different feature set or design philosophy than game engines, while popular game engines often lack support for VR hardware. Extending a game engine to support systems such as the CAVE gives developers a unified development environment and the ability to easily port projects, but involves challenges beyond just adding stereo 3D visuals. In this paper we outline the issues involved in adapting a game engine for use with an immersive display system including stereoscopy, tracking, and clustering, and present example implementation details using Unity3D. We discuss application development and workflow approaches including camera management, rendering synchronization, GUI design, and issues specific to Unity3D, and present examples of projects created for a multi-wall, clustered, stereoscopic display.
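
    A core piece of adapting a game-engine camera to a CAVE wall, alluded to in this record, is the off-axis (generalized) perspective projection computed from the tracked eye position and the wall's corner positions. The numpy sketch below follows the widely used formulation often credited to Kooima; the corner and eye coordinates are example values, and an engine such as Unity3D would apply the equivalent matrix to its per-wall cameras.

```python
# Off-axis projection for a tracked viewer and a fixed screen (CAVE wall),
# following the commonly used generalized perspective projection recipe.
# Corner/eye coordinates below are example values in metres.
import numpy as np

def off_axis_projection(pa, pb, pc, pe, near=0.1, far=100.0):
    """pa, pb, pc: lower-left, lower-right, upper-left screen corners;
    pe: eye position. Returns a 4x4 OpenGL-style projection*view matrix."""
    pa, pb, pc, pe = (np.asarray(p, float) for p in (pa, pb, pc, pe))
    vr = (pb - pa) / np.linalg.norm(pb - pa)         # screen right axis
    vu = (pc - pa) / np.linalg.norm(pc - pa)         # screen up axis
    vn = np.cross(vr, vu); vn /= np.linalg.norm(vn)  # screen normal

    va, vb, vc = pa - pe, pb - pe, pc - pe
    d = -np.dot(va, vn)                              # eye-to-screen distance
    l = np.dot(vr, va) * near / d
    r = np.dot(vr, vb) * near / d
    b = np.dot(vu, va) * near / d
    t = np.dot(vu, vc) * near / d

    frustum = np.array([
        [2 * near / (r - l), 0, (r + l) / (r - l), 0],
        [0, 2 * near / (t - b), (t + b) / (t - b), 0],
        [0, 0, -(far + near) / (far - near), -2 * far * near / (far - near)],
        [0, 0, -1, 0]])
    rotate = np.eye(4)
    rotate[0, :3], rotate[1, :3], rotate[2, :3] = vr, vu, vn
    translate = np.eye(4)
    translate[:3, 3] = -pe
    return frustum @ rotate @ translate

# Example: a 3 m x 2.25 m front wall, eye slightly right of centre.
M = off_axis_projection(pa=(-1.5, 0.0, -1.5), pb=(1.5, 0.0, -1.5),
                        pc=(-1.5, 2.25, -1.5), pe=(0.2, 1.7, 0.0))
print(np.round(M, 3))
```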

  19. Creating virtual humans for simulation-based training and planning

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Stansfield, S.; Sobel, A.

    1998-05-12

    Sandia National Laboratories has developed a distributed, high-fidelity simulation system for training and planning small-team operations. The system provides an immersive environment populated by virtual objects and humans capable of displaying complex behaviors. The work has focused on developing the behaviors required to carry out complex tasks and decision making under stress. Central to this work are techniques for creating behaviors for virtual humans and for dynamically assigning behaviors to computer-generated forces (CGF) to allow scenarios without fixed outcomes. Two prototype systems have been developed that illustrate these capabilities: MediSim, a trainer for battlefield medics, and VRaptor, a system for planning, rehearsing and training assault operations.

  20. Virtual Reality Hysteroscopy

    PubMed

    Levy

    1996-08-01

    New interactive computer technologies are having a significant influence on medical education, training, and practice. The newest innovation in computer technology, virtual reality, allows an individual to be immersed in a dynamic computer-generated, three-dimensional environment and can provide realistic simulations of surgical procedures. A new virtual reality hysteroscope passes through a sensing device that synchronizes movements with a three-dimensional model of a uterus. Force feedback is incorporated into this model, so the user actually experiences the collision of an instrument against the uterine wall or the sensation of the resistance or drag of a resectoscope as it cuts through a myoma in a virtual environment. A variety of intrauterine pathologies and procedures are simulated, including hyperplasia, cancer, resection of a uterine septum, polyp, or myoma, and endometrial ablation. This technology will be incorporated into comprehensive training programs that will objectively assess hand-eye coordination and procedural skills. It is possible that by incorporating virtual reality into hysteroscopic training programs, a decrease in the learning curve and the number of complications presently associated with the procedures may be realized. Prospective studies are required to assess these potential benefits.

  1. Distributed interactive virtual environments for collaborative experiential learning and training independent of distance over Internet2.

    PubMed

    Alverson, Dale C; Saiki, Stanley M; Jacobs, Joshua; Saland, Linda; Keep, Marcus F; Norenberg, Jeffrey; Baker, Rex; Nakatsu, Curtis; Kalishman, Summers; Lindberg, Marlene; Wax, Diane; Mowafi, Moad; Summers, Kenneth L; Holten, James R; Greenfield, John A; Aalseth, Edward; Nickles, David; Sherstyuk, Andrei; Haines, Karen; Caudell, Thomas P

    2004-01-01

    Medical knowledge and skills essential for tomorrow's healthcare professionals continue to change faster than ever before creating new demands in medical education. Project TOUCH (Telehealth Outreach for Unified Community Health) has been developing methods to enhance learning by coupling innovations in medical education with advanced technology in high performance computing and next generation Internet2 embedded in virtual reality environments (VRE), artificial intelligence and experiential active learning. Simulations have been used in education and training to allow learners to make mistakes safely in lieu of real-life situations, learn from those mistakes and ultimately improve performance by subsequent avoidance of those mistakes. Distributed virtual interactive environments are used over distance to enable learning and participation in dynamic, problem-based, clinical, artificial intelligence rules-based, virtual simulations. The virtual reality patient is programmed to dynamically change over time and respond to the manipulations by the learner. Participants are fully immersed within the VRE platform using a head-mounted display and tracker system. Navigation, locomotion and handling of objects are accomplished using a joy-wand. Distribution is managed via the Internet2 Access Grid using point-to-point or multi-casting connectivity through which the participants can interact. Medical students in Hawaii and New Mexico (NM) participated collaboratively in problem solving and managing of a simulated patient with a closed head injury in VRE; dividing tasks, handing off objects, and functioning as a team. Students stated that opportunities to make mistakes and repeat actions in the VRE were extremely helpful in learning specific principles. VRE created higher performance expectations and some anxiety among VRE users. VRE orientation was adequate but students needed time to adapt and practice in order to improve efficiency. This was also demonstrated successfully between Western Australia and UNM. We successfully demonstrated the ability to fully immerse participants in a distributed virtual environment independent of distance for collaborative team interaction in medical simulation designed for education and training. The ability to make mistakes in a safe environment is well received by students and has a positive impact on their understanding, as well as memory of the principles involved in correcting those mistakes. Bringing people together as virtual teams for interactive experiential learning and collaborative training, independent of distance, provides a platform for distributed "just-in-time" training, performance assessment and credentialing. Further validation is necessary to determine the potential value of the distributed VRE in knowledge transfer, improved future performance and should entail training participants to competence in using these tools.

  2. Comparison of grasping movements made by healthy subjects in a 3-dimensional immersive virtual versus physical environment.

    PubMed

    Magdalon, Eliane C; Michaelsen, Stella M; Quevedo, Antonio A; Levin, Mindy F

    2011-09-01

    Virtual reality (VR) technology is being used with increasing frequency as a training medium for motor rehabilitation. However, before addressing training effectiveness in virtual environments (VEs), it is necessary to identify if movements made in such environments are kinematically similar to those made in physical environments (PEs) and the effect of provision of haptic feedback on these movement patterns. These questions are important since reach-to-grasp movements may be inaccurate when visual or haptic feedback is altered or absent. Our goal was to compare kinematics of reaching and grasping movements to three objects performed in an immersive three-dimensional (3D) VE with haptic feedback (cyberglove/grasp system) viewed through a head-mounted display to those made in an equivalent physical environment (PE). We also compared movements in PE made with and without wearing the cyberglove/grasp haptic feedback system. Ten healthy subjects (8 women, 62.1±8.8years) reached and grasped objects requiring 3 different grasp types (can, diameter 65.6mm, cylindrical grasp; screwdriver, diameter 31.6mm, power grasp; pen, diameter 7.5mm, precision grasp) in PE and visually similar virtual objects in VE. Temporal and spatial arm and trunk kinematics were analyzed. Movements were slower and grip apertures were wider when wearing the glove in both the PE and the VE compared to movements made in the PE without the glove. When wearing the glove, subjects used similar reaching trajectories in both environments, preserved the coordination between reaching and grasping and scaled grip aperture to object size for the larger object (cylindrical grasp). However, in VE compared to PE, movements were slower and had longer deceleration times, elbow extension was greater when reaching to the smallest object and apertures were wider for the power and precision grip tasks. Overall, the differences in spatial and temporal kinematics of movements between environments were greater than those due only to wearing the cyberglove/grasp system. Differences in movement kinematics due to the viewing environment were likely due to a lack of prior experience with the virtual environment, an uncertainty of object location and the restricted field-of-view when wearing the head-mounted display. The results can be used to inform the design and disposition of objects within 3D VEs for the study of the control of prehension and for upper limb rehabilitation. Copyright © 2011 Elsevier B.V. All rights reserved.

  3. Virtually There.

    ERIC Educational Resources Information Center

    Lanier, Jaron

    2001-01-01

    Describes tele-immersion, a new medium for human interaction enabled by digital technologies. It combines the display and interaction techniques of virtual reality with new vision technologies that transcend the traditional limitations of a camera. Tele-immersion stations observe people as moving sculptures without favoring a single point of view.…

  4. Versatile, Immersive, Creative and Dynamic Virtual 3-D Healthcare Learning Environments: A Review of the Literature

    PubMed Central

    2008-01-01

    The author provides a critical overview of three-dimensional (3-D) virtual worlds and “serious gaming” that are currently being developed and used in healthcare professional education and medicine. The relevance of this e-learning innovation for teaching students and professionals is debatable and variables influencing adoption, such as increased knowledge, self-directed learning, and peer collaboration, by academics, healthcare professionals, and business executives are examined while looking at various Web 2.0/3.0 applications. There is a need for more empirical research in order to unearth the pedagogical outcomes and advantages associated with this e-learning technology. A brief description of Roger’s Diffusion of Innovations Theory and Siemens’ Connectivism Theory for today’s learners is presented as potential underlying pedagogical tenets to support the use of virtual 3-D learning environments in higher education and healthcare. PMID:18762473

  5. Versatile, immersive, creative and dynamic virtual 3-D healthcare learning environments: a review of the literature.

    PubMed

    Hansen, Margaret M

    2008-09-01

    The author provides a critical overview of three-dimensional (3-D) virtual worlds and "serious gaming" that are currently being developed and used in healthcare professional education and medicine. The relevance of this e-learning innovation for teaching students and professionals is debatable and variables influencing adoption, such as increased knowledge, self-directed learning, and peer collaboration, by academics, healthcare professionals, and business executives are examined while looking at various Web 2.0/3.0 applications. There is a need for more empirical research in order to unearth the pedagogical outcomes and advantages associated with this e-learning technology. A brief description of Roger's Diffusion of Innovations Theory and Siemens' Connectivism Theory for today's learners is presented as potential underlying pedagogical tenets to support the use of virtual 3-D learning environments in higher education and healthcare.

  6. Modeling of luminance distribution in CAVE-type virtual reality systems

    NASA Astrophysics Data System (ADS)

    Meironke, Michał; Mazikowski, Adam

    2017-08-01

    At present, CAVE-type (Cave Automatic Virtual Environment) installations are among the most advanced virtual reality systems. Such systems usually consist of four, five or six projection screens; in the case of six screens, they are arranged in the form of a cube. Providing the user with a high level of immersion in such systems depends largely on the optical properties of the system. Modeling of physical phenomena now plays a major role in most fields of science and technology, since it allows the operation of a device to be simulated without any changes to its physical construction. In this paper, the luminance distribution in CAVE-type virtual reality systems was modeled. Calculations were performed for a model of a 6-walled CAVE-type installation, based on the Immersive 3D Visualization Laboratory, situated at the Faculty of Electronics, Telecommunications and Informatics at the Gdańsk University of Technology. Tests were carried out for two different scattering distributions of the screen material in order to check how these characteristics influence the luminance distribution of the whole CAVE. The basic assumptions and simplifications of the modeled CAVE-type installation are presented together with the results. A brief discussion of the results and of the usefulness of the developed model is also provided.
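
    The record above does not reproduce the authors' model, but the core idea of predicting screen luminance from projector geometry and a scattering profile of the screen material can be sketched in a few lines. The cosine-power scattering term, the parameter names and all numeric defaults below are assumptions made for illustration only.

        import numpy as np

        def screen_luminance(point, projector, observer, normal,
                             intensity=1000.0, gain_exponent=1.0):
            """Illustrative luminance estimate at one point of a projection screen.

            point, projector, observer: 3D positions (metres) as numpy arrays;
            normal: unit screen normal. intensity: assumed projector luminous
            intensity toward the point (cd). gain_exponent: crude scattering model --
            0 approximates a diffuse screen, larger values concentrate light around
            the mirror direction.
            """
            to_proj = projector - point
            to_obs = observer - point
            d2 = float(np.dot(to_proj, to_proj))
            cos_in = max(np.dot(to_proj, normal) / np.sqrt(d2), 0.0)
            illuminance = intensity * cos_in / d2          # lux, inverse-square falloff
            in_dir = -to_proj / np.sqrt(d2)
            mirror = in_dir - 2.0 * np.dot(in_dir, normal) * normal
            cos_sc = max(np.dot(mirror, to_obs / np.linalg.norm(to_obs)), 0.0)
            gain = (gain_exponent + 1.0) * cos_sc ** gain_exponent / (2.0 * np.pi)
            return illuminance * gain                      # cd/m^2 toward the observer

    Evaluating such a function over a grid of screen points and observer positions is enough to compare how different scattering profiles redistribute luminance across the walls of a CAVE.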

  7. Tuning self-motion perception in virtual reality with visual illusions.

    PubMed

    Bruder, Gerd; Steinicke, Frank; Wieland, Phil; Lappe, Markus

    2012-07-01

    Motion perception in immersive virtual environments significantly differs from the real world. For example, previous work has shown that users tend to underestimate travel distances in virtual environments (VEs). As a solution to this problem, researchers proposed to scale the mapped virtual camera motion relative to the tracked real-world movement of a user until real and virtual motion are perceived as equal, i.e., real-world movements could be mapped with a larger gain to the VE in order to compensate for the underestimation. However, introducing discrepancies between real and virtual motion can become a problem, in particular, due to misalignments of both worlds and distorted space cognition. In this paper, we describe a different approach that introduces apparent self-motion illusions by manipulating optic flow fields during movements in VEs. These manipulations can affect self-motion perception in VEs, but omit a quantitative discrepancy between real and virtual motions. In particular, we consider to which regions of the virtual view these apparent self-motion illusions can be applied, i.e., the ground plane or peripheral vision. Therefore, we introduce four illusions and show in experiments that optic flow manipulation can significantly affect users' self-motion judgments. Furthermore, we show that with such manipulations of optic flow fields the underestimation of travel distances can be compensated.
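
    The gain-based mapping that the authors contrast with their optic-flow illusions can be stated in one line of code: the tracked real-world displacement is multiplied by a translation gain before being applied to the virtual camera. The sketch below is a generic illustration of that idea, not code from the study; the gain value and all names are assumptions.

        import numpy as np

        def apply_translation_gain(prev_real_pos, curr_real_pos, virtual_pos, gain=1.4):
            """Map a tracked real-world step to virtual camera motion with a gain.

            A gain > 1 makes the virtual camera travel farther than the user walked,
            compensating for the commonly reported underestimation of virtual
            distances. Positions are 3-vectors in metres; 1.4 is purely illustrative.
            """
            real_step = np.asarray(curr_real_pos) - np.asarray(prev_real_pos)
            return np.asarray(virtual_pos) + gain * real_step

        # Example: a 0.5 m real step moves the virtual viewpoint 0.7 m.
        print(apply_translation_gain([0.0, 0.0, 0.0], [0.5, 0.0, 0.0], [10.0, 0.0, 0.0]))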

  8. Virtual reality in anxiety disorders: the past and the future.

    PubMed

    Gorini, Alessandra; Riva, Giuseppe

    2008-02-01

    One of the most effective treatments of anxiety is exposure therapy: a person is exposed to specific feared situations or objects that trigger anxiety. This exposure process may be done through actual exposure, with visualization, by imagination or using virtual reality (VR), that provides users with computer simulated environments with and within which they can interact. VR is made possible by the capability of computers to synthesize a 3D graphical environment from numerical data. Furthermore, because input devices sense the subject's reactions and motions, the computer can modify the synthetic environment accordingly, creating the illusion of interacting with, and thus being immersed within the environment. Starting from 1995, different experimental studies have been conducted in order to investigate the effect of VR exposure in the treatment of subclinical fears and anxiety disorders. This review will discuss their outcome and provide guidelines for the use of VR exposure for the treatment of anxious patients.

  9. The Use of Virtual Reality in Psychology: A Case Study in Visual Perception

    PubMed Central

    Wilson, Christopher J.; Soranzo, Alessandro

    2015-01-01

    Recent proliferation of available virtual reality (VR) tools has seen increased use in psychological research. This is due to a number of advantages afforded over traditional experimental apparatus such as tighter control of the environment and the possibility of creating more ecologically valid stimulus presentation and response protocols. At the same time, higher levels of immersion and visual fidelity afforded by VR do not necessarily evoke presence or elicit a “realistic” psychological response. The current paper reviews some current uses for VR environments in psychological research and discusses some ongoing questions for researchers. Finally, we focus on the area of visual perception, where both the advantages and challenges of VR are particularly salient. PMID:26339281

  10. High-immersion three-dimensional display of the numerical computer model

    NASA Astrophysics Data System (ADS)

    Xing, Shujun; Yu, Xunbo; Zhao, Tianqi; Cai, Yuanfa; Chen, Duo; Chen, Zhidong; Sang, Xinzhu

    2013-08-01

    High-immersion three-dimensional (3D) displays are valuable tools for many applications, such as designing and constructing buildings, industrial architecture design, aeronautics, scientific research, entertainment, media advertisement, military areas and so on. However, most technologies provide the 3D display in front of screens that are parallel with the walls, which decreases the sense of immersion. To obtain a correct multi-view stereo ground image, the cameras' photosensitive surfaces should be parallel to the common focus plane and the cameras' optical axes should be offset toward the center of the common focus plane in both the vertical and horizontal directions. It is common to use virtual cameras, which are ideal pinhole cameras, to display 3D models in a computer system. We can use virtual cameras to simulate the shooting method for multi-view ground-based stereo images. Here, two virtual shooting methods for ground-based high-immersion 3D display are presented. The position of each virtual camera is determined by the viewer's eye position in the real world. When the observer stands within the circumcircle of the 3D ground display, offset perspective projection virtual cameras are used. If the observer stands outside the circumcircle of the 3D ground display, both offset perspective projection virtual cameras and orthogonal projection virtual cameras are adopted. In this paper, we mainly discuss the parameter settings of the virtual cameras: the near clip plane setting is the main point in the first method, while the rotation angle of the virtual cameras is the main point in the second method. In order to validate the results, we use D3D and OpenGL to render scenes from different viewpoints and generate a stereoscopic image. A realistic visualization system for 3D models is constructed and demonstrated for horizontal viewing, which provides high-immersion 3D visualization. The displayed 3D scenes are compared with real objects in the real world.
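
    One standard way to realize the offset (off-axis) perspective projection described above is to build an asymmetric view frustum from the viewer's eye position and the corners of the ground screen. The sketch below follows that well-known construction; it is an illustration of the general technique, not the authors' implementation, and all names and default values are assumptions.

        import numpy as np

        def off_axis_projection(pa, pb, pc, eye, near=0.1, far=100.0):
            """Asymmetric-frustum projection for a planar screen and a tracked eye.

            pa, pb, pc: lower-left, lower-right and upper-left screen corners in world
            coordinates; eye: viewer eye position. Returns a 4x4 OpenGL-style matrix.
            """
            pa, pb, pc, eye = (np.asarray(v, float) for v in (pa, pb, pc, eye))
            vr = pb - pa; vr /= np.linalg.norm(vr)          # screen right axis
            vu = pc - pa; vu /= np.linalg.norm(vu)          # screen up axis
            vn = np.cross(vr, vu); vn /= np.linalg.norm(vn) # screen normal (toward eye)
            va, vb, vc = pa - eye, pb - eye, pc - eye       # eye-to-corner vectors
            d = -np.dot(va, vn)                             # eye-to-screen distance
            l = np.dot(vr, va) * near / d
            r = np.dot(vr, vb) * near / d
            b = np.dot(vu, va) * near / d
            t = np.dot(vu, vc) * near / d
            # Standard frustum matrix with the asymmetric bounds computed above.
            return np.array([
                [2 * near / (r - l), 0, (r + l) / (r - l), 0],
                [0, 2 * near / (t - b), (t + b) / (t - b), 0],
                [0, 0, -(far + near) / (far - near), -2 * far * near / (far - near)],
                [0, 0, -1, 0],
            ])

    For a floor screen, the eye position supplied to such a function comes from head tracking (or from an assumed viewer location), which is what keeps the perspective of the ground image correct as the observer moves.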

  11. The Student Experience With Varying Immersion Levels of Virtual Reality Simulation.

    PubMed

    Farra, Sharon L; Smith, Sherrill J; Ulrich, Deborah L

    With increasing use of virtual reality simulation (VRS) in nursing education and given the vast array of technologies available, a variety of levels of immersion and experiences can be provided to students. This study explored two different levels of immersive VRS capability. Study participants included baccalaureate nursing students from three universities across four campuses. Students were trained in the skill of decontamination using traditional methods or with VRS options of mouse and keyboard or head-mounted display technology. Results of focus group interviews reflect the student experience and satisfaction with two different immersive levels of VRS.

  12. Effects of a Haptic Augmented Simulation on K-12 Students' Achievement and Their Attitudes Towards Physics

    ERIC Educational Resources Information Center

    Civelek, Turhan; Ucar, Erdem; Ustunel, Hakan; Aydin, Mehmet Kemal

    2014-01-01

    The current research aims to explore the effects of a haptic augmented simulation on students' achievement and their attitudes towards Physics in an immersive virtual reality environment (VRE). A quasi-experimental post-test design was employed utilizing experiment and control groups. The participants were 215 students from a K-12 school in…

  13. Exploring Ecosystems from the Inside: How Immersive Multi-User Virtual Environments Can Support Development of Epistemologically Grounded Modeling Practices in Ecosystem Science Instruction

    ERIC Educational Resources Information Center

    Kamarainen, Amy M.; Metcalf, Shari; Grotzer, Tina; Dede, Chris

    2015-01-01

    Recent reform efforts and the next generation science standards emphasize the importance of incorporating authentic scientific practices into science instruction. Modeling can be a particularly challenging practice to address because modeling occurs within a socially structured system of representation that is specific to a domain. Further, in the…

  14. The Effect of Immersive Virtual Environments on Student Perception and Interest in a University Graduate Program

    ERIC Educational Resources Information Center

    Serviss, Jennifer

    2016-01-01

    This study was conducted to examine the perceptions of potential college students after participating in a recruitment presentation of a university. The focus was to conduct user research to establish some causal relationship between the design of a university marketing tool and a behavior such as the interest of potential students in the…

  15. Real-time, interactive, visually updated simulator system for telepresence

    NASA Technical Reports Server (NTRS)

    Schebor, Frederick S.; Turney, Jerry L.; Marzwell, Neville I.

    1991-01-01

    Time delays and limited sensory feedback of remote telerobotic systems tend to disorient teleoperators and dramatically decrease the operator's performance. To remove the effects of time delays, key components of a prototype forward simulation subsystem, the Global-Local Environment Telerobotic Simulator (GLETS), which buffers the operator from the remote task, were designed and developed. GLETS totally immerses an operator in a real-time, interactive, simulated, visually updated artificial environment of the remote telerobotic site. Using GLETS, the operator will, in effect, enter into a telerobotic virtual reality and can easily form a gestalt of the virtual 'local site' that matches the operator's normal interactions with the remote site. In addition to use in space-based telerobotics, GLETS, due to its extendable architecture, can also be used in other teleoperational environments such as toxic material handling, construction, and undersea exploration.

  16. Situating Pedagogies, Positions and Practices in Immersive Virtual Worlds

    ERIC Educational Resources Information Center

    Savin-Baden, Maggi; Gourlay, Lesley; Tombs, Cathy; Steils, Nicole; Tombs, Gemma; Mawer, Matt

    2010-01-01

    Background: The literature on immersive virtual worlds and e-learning to date largely indicates that technology has led the pedagogy. Although rationales for implementing e-learning have included flexibility of provision and supporting diversity, none of these recommendations has helped to provide strong pedagogical location. Furthermore, there is…

  17. Building a Collaborative Online Literary Experience

    ERIC Educational Resources Information Center

    Essid, Joe; Wilde, Fran

    2011-01-01

    Effective virtual simulations can embed participants in imaginary worlds. Researchers working in virtual worlds and gaming often refer to "immersion," a state in which a participant or player loses track of time and becomes one with the simulation. Immersive settings have been shown to deepen learning. Ken Hudson's work with students…

  18. Virtual Reality as a Story Telling Platform for Geoscience Communication

    NASA Astrophysics Data System (ADS)

    Lazar, K.; Moysey, S. M.

    2017-12-01

    Capturing the attention of students and the public is a critical step for increasing societal interest and literacy in earth science issues. Virtual reality (VR) provides a means for geoscience engagement that is well suited to place-based learning through exciting and immersive experiences. One approach is to create fully-immersive virtual gaming environments where players interact with physical objects, such as rock samples and outcrops, to pursue geoscience learning goals. Developing an experience like this, however, can require substantial programming expertise and resources. At the other end of the development spectrum, it is possible for anyone to create immersive virtual experiences with 360-degree imagery, which can be made interactive using easy to use VR editing software to embed videos, audio, images, and other content within the 360-degree image. Accessible editing tools like these make the creation of VR experiences something that anyone can tackle. Using the VR editor ThingLink and imagery from Google Maps, for example, we were able to create an interactive tour of the Grand Canyon, complete with embedded assessments, in a matter of hours. The true power of such platforms, however, comes from the potential to engage students as content authors to create and share stories of place that explore geoscience issues from their personal perspective. For example, we have used combinations of 360-degree images with interactive mapping and web platforms to enable students with no programming experience to create complex web apps as highly engaging story telling platforms. We highlight here examples of how we have implemented such story telling approaches with students to assess learning in courses, to share geoscience research outcomes, and to communicate issues of societal importance.

  19. Virtual reality and paranoid ideations in people with an 'at-risk mental state' for psychosis.

    PubMed

    Valmaggia, Lucia R; Freeman, Daniel; Green, Catherine; Garety, Philippa; Swapp, David; Antley, Angus; Prescott, Corinne; Fowler, David; Kuipers, Elizabeth; Bebbington, Paul; Slater, Mel; Broome, Matthew; McGuire, Philip K

    2007-12-01

    Virtual reality provides a means of studying paranoid thinking in controlled laboratory conditions. However, this method has not been used with a clinical group. To establish the feasibility and safety of using virtual reality methodology in people with an at-risk mental state and to investigate the applicability of a cognitive model of paranoia to this group. Twenty-one participants with an at-risk mental state were assessed before and after entering a virtual reality environment depicting the inside of an underground train. Virtual reality did not raise levels of distress at the time of testing or cause adverse experiences over the subsequent week. Individuals attributed mental states to virtual reality characters including hostile intent. Persecutory ideation in virtual reality was predicted by higher levels of trait paranoia, anxiety, stress, immersion in virtual reality, perseveration and interpersonal sensitivity. Virtual reality is an acceptable experimental technique for use with individuals with at-risk mental states. Paranoia in virtual reality was understandable in terms of the cognitive model of persecutory delusions.

  20. Virtual reality simulation: basic concepts and use in endoscopic neurosurgery training.

    PubMed

    Cohen, Alan R; Lohani, Subash; Manjila, Sunil; Natsupakpong, Suriya; Brown, Nathan; Cavusoglu, M Cenk

    2013-08-01

    Virtual reality simulation is a promising alternative to training surgical residents outside the operating room. It is also a useful aid to anatomic study, residency training, surgical rehearsal, credentialing, and recertification. Surgical simulation is based on a virtual reality with varying degrees of immersion and realism. Simulators provide a no-risk environment for harmless and repeatable practice. Virtual reality has three main components of simulation: graphics/volume rendering, model behavior/tissue deformation, and haptic feedback. The challenge of accurately simulating the forces and tactile sensations experienced in neurosurgery limits the sophistication of a virtual simulator. The limited haptic feedback available in minimally invasive neurosurgery makes it a favorable subject for simulation. Virtual simulators with realistic graphics and force feedback have been developed for ventriculostomy, intraventricular surgery, and transsphenoidal pituitary surgery, thus allowing preoperative study of the individual anatomy and increasing the safety of the procedure. The authors also present experiences with their own virtual simulation of endoscopic third ventriculostomy.

  1. A haptic interface for virtual simulation of endoscopic surgery.

    PubMed

    Rosenberg, L B; Stredney, D

    1996-01-01

    Virtual reality can be described as a convincingly realistic and naturally interactive simulation in which the user is given a first-person illusion of being immersed within a computer-generated environment. While virtual reality systems offer great potential to reduce the cost and increase the quality of medical training, many technical challenges must be overcome before such simulation platforms offer effective alternatives to more traditional training means. A primary challenge in developing effective virtual reality systems is designing the human interface hardware which allows rich sensory information to be presented to users in natural ways. When simulating a given manual procedure, task-specific human interface requirements dictate task-specific human interface hardware. The following paper explores the design of human interface hardware that satisfies the task-specific requirements of virtual reality simulation of endoscopic surgical procedures. Design parameters were derived through direct cadaver studies and interviews with surgeons. The final hardware design is presented.

  2. The Components of Effective Teacher Training in the Use of Three-Dimensional Immersive Virtual Worlds for Learning and Instruction Purposes: A Literature Review

    ERIC Educational Resources Information Center

    Nussli, Natalie; Oh, Kevin

    2014-01-01

    The overarching question that guides this review is to identify the key components of effective teacher training in virtual schooling, with a focus on three-dimensional (3D) immersive virtual worlds (IVWs). The process of identifying the essential components of effective teacher training in the use of 3D IVWs will be described step-by-step. First,…

  3. Sonic intelligence as a virtual therapeutic environment.

    PubMed

    Tarnanas, Ioannis; Adam, Dimitrios

    2003-06-01

    This paper reports on the results of a research project, on comparing one virtual collaborative environment with a first-person visual immersion (first-perspective interaction) and a second one where the user interacts through a sound-kinetic virtual representation of himself (avatar), as a stress-coping environment in real-life situations. Recent developments in coping research are proposing a shift from a trait-oriented approach of coping to a more situation-specific treatment. We defined as real-life situation a target-oriented situation that demands a complex coping skills inventory of high self-efficacy and internal or external "locus of control" strategies. The participants were 90 normal adults with healthy or impaired coping skills, 25-40 years of age, randomly spread across two groups. There was the same number of participants across groups and gender balance within groups. All two groups went through two phases. In Phase I, Solo, one participant was assessed using a three-stage assessment inspired by the transactional stress theory of Lazarus and the stress inoculation theory of Meichenbaum. In Phase I, each participant was given a coping skills measurement within the time course of various hypothetical stressful encounters performed in two different conditions and a control group. In Condition A, the participant was given a virtual stress assessment scenario relative to a first-person perspective (VRFP). In Condition B, the participant was given a virtual stress assessment scenario relative to a behaviorally realistic motion controlled avatar with sonic feedback (VRSA). In Condition C, the No Treatment Condition (NTC), the participant received just an interview. In Phase II, all three groups were mixed and exercised the same tasks but with two participants in pairs. The results showed that the VRSA group performed notably better in terms of cognitive appraisals, emotions and attributions than the other two groups in Phase I (VRSA, 92%; VRFP, 85%; NTC, 34%). In Phase II, the difference again favored the VRSA group against the other two. These results indicate that a virtual collaborative environment seems to be a consistent coping environment, tapping two classes of stress: (a) aversive or ambiguous situations, and (b) loss or failure situations in relation to the stress inoculation theory. In terms of coping behaviors, a distinction is made between self-directed and environment-directed strategies. A great advantage of the virtual collaborative environment with the behaviorally enhanced sound-kinetic avatar is the consideration of team coping intentions in different stages. Even if the aim is to tap transactional processes in real-life situations, it might be better to conduct research using a sound-kinetic avatar based collaborative environment than a virtual first-person perspective scenario alone. The VE consisted of two dual-processor PC systems, a video splitter, a digital camera and two stereoscopic CRT displays. The system was programmed in C++ and VRScape Immersive Cluster from VRCO, which created an artificial environment that encodes the user's motion from a video camera, targeted at the face of the users and physiological sensors attached to the body.

  4. Immersive volume rendering of blood vessels

    NASA Astrophysics Data System (ADS)

    Long, Gregory; Kim, Han Suk; Marsden, Alison; Bazilevs, Yuri; Schulze, Jürgen P.

    2012-03-01

    In this paper, we present a novel method of visualizing flow in blood vessels. Our approach reads unstructured tetrahedral data, resamples it, and uses slice based 3D texture volume rendering. Due to the sparse structure of blood vessels, we utilize an octree to efficiently store the resampled data by discarding empty regions of the volume. We use animation to convey time series data, wireframe surface to give structure, and utilize the StarCAVE, a 3D virtual reality environment, to add a fully immersive element to the visualization. Our tool has great value in interdisciplinary work, helping scientists collaborate with clinicians, by improving the understanding of blood flow simulations. Full immersion in the flow field allows for a more intuitive understanding of the flow phenomena, and can be a great help to medical experts for treatment planning.
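
    The empty-region skipping described above can be illustrated compactly: octree cells whose voxels all fall below an opacity threshold are pruned, so only occupied bricks are kept and uploaded as 3D textures. The following sketch is illustrative only, with invented names, and does not attempt to reproduce the authors' data structures.

        import numpy as np

        class OctreeNode:
            """Octree over a cubic voxel block; fully empty subtrees are discarded."""
            def __init__(self, data, origin=(0, 0, 0), min_size=16, threshold=0.0):
                self.origin, self.size = origin, data.shape[0]
                self.children, self.brick = [], None
                if not np.any(data > threshold):
                    return                                   # empty: keep nothing
                if self.size <= min_size:
                    self.brick = data.copy()                 # leaf brick for rendering
                    return
                h = self.size // 2
                for dz in (0, h):
                    for dy in (0, h):
                        for dx in (0, h):
                            child = OctreeNode(
                                data[dz:dz + h, dy:dy + h, dx:dx + h],
                                (origin[0] + dz, origin[1] + dy, origin[2] + dx),
                                min_size, threshold)
                            if child.brick is not None or child.children:
                                self.children.append(child)

            def occupied_bricks(self):
                """Yield (origin, brick) pairs for the non-empty leaf blocks."""
                if self.brick is not None:
                    yield self.origin, self.brick
                for c in self.children:
                    yield from c.occupied_bricks()

        # A 64^3 volume that is mostly empty keeps only a handful of bricks.
        vol = np.zeros((64, 64, 64), np.float32)
        vol[10:20, 10:20, 10:20] = 1.0                       # a small "vessel" region
        print(len(list(OctreeNode(vol).occupied_bricks())), "occupied bricks")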

  5. Desktop-VR system for preflight 3D navigation training

    NASA Astrophysics Data System (ADS)

    Aoki, Hirofumi; Oman, Charles M.; Buckland, Daniel A.; Natapoff, Alan

    Crews who inhabit spacecraft with complex 3D architecture frequently report inflight disorientation and navigation problems. Preflight virtual reality (VR) training may reduce those risks. Although immersive VR techniques may better support spatial orientation training in a local environment, a non-immersive desktop (DT) system may be more convenient for navigation training in "building scale" spaces, especially if the two methods achieve comparable results. In this study trainees' orientation and navigation performance during simulated space station emergency egress tasks was compared while using immersive head-mounted display (HMD) and DT-VR systems. Analyses showed no differences in pointing angular-error or egress time among the groups. The HMD group was significantly faster than DT group when pointing from destination to start location and from start toward different destination. However, this may be attributed to differences in the input device used (a head-tracker for HMD group vs. a keyboard touchpad or a gamepad in the DT group). All other 3D navigation performance measures were similar using the immersive and non-immersive VR systems, suggesting that the simpler desktop VR system may be useful for astronaut 3D navigation training.

  6. Human Factors Virtual Analysis Techniques for NASA's Space Launch System Ground Support using MSFC's Virtual Environments Lab (VEL)

    NASA Technical Reports Server (NTRS)

    Searcy, Brittani

    2017-01-01

    Using virtual environments to assess complex large scale human tasks provides timely and cost effective results to evaluate designs and to reduce operational risks during assembly and integration of the Space Launch System (SLS). NASA's Marshall Space Flight Center (MSFC) uses a suite of tools to conduct integrated virtual analysis during the design phase of the SLS Program. Siemens Jack is a simulation tool that allows engineers to analyze human interaction with CAD designs by placing a digital human model into the environment to test different scenarios and assess the design's compliance to human factors requirements. Engineers at MSFC are using Jack in conjunction with motion capture and virtual reality systems in MSFC's Virtual Environments Lab (VEL). The VEL provides additional capability beyond standalone Jack to record and analyze a person performing a planned task to assemble the SLS at Kennedy Space Center (KSC). The VEL integrates the Vicon Blade motion capture system, Siemens Jack, Oculus Rift, and other virtual tools to perform human factors assessments. By using motion capture and virtual reality, a more accurate breakdown and understanding of how an operator will perform a task can be gained. Through virtual analysis, engineers are able to determine whether a specific task can be safely performed by both a 5th percentile (approx. 5 ft) female and a 95th percentile (approx. 6 ft 1 in) male. In addition, the analysis helps identify any tools or other accommodations that may be needed to complete the task. These assessments are critical for the safety of ground support engineers and for keeping launch operations on schedule. Motion capture allows engineers to save and examine human movements on a frame-by-frame basis, while virtual reality gives the actor (person performing a task in the VEL) an immersive view of the task environment. This presentation will discuss the need for human factors analysis for SLS and the benefits of analyzing tasks in NASA MSFC's VEL.

  7. Postural and Spatial Orientation Driven by Virtual Reality

    PubMed Central

    Keshner, Emily A.; Kenyon, Robert V.

    2009-01-01

    Orientation in space is a perceptual variable intimately related to postural orientation that relies on visual and vestibular signals to correctly identify our position relative to vertical. We have combined a virtual environment with motion of a posture platform to produce visual-vestibular conditions that allow us to explore how motion of the visual environment may affect perception of vertical and, consequently, affect postural stabilizing responses. In order to involve a higher level perceptual process, we needed to create a visual environment that was immersive. We did this by developing visual scenes that possess contextual information using color, texture, and 3-dimensional structures. Update latency of the visual scene was close to physiological latencies of the vestibulo-ocular reflex. Using this system we found that even when healthy young adults stand and walk on a stable support surface, they are unable to ignore wide field of view visual motion and they adapt their postural orientation to the parameters of the visual motion. Balance training within our environment elicited measurable rehabilitation outcomes. Thus we believe that virtual environments can serve as a clinical tool for evaluation and training of movement in situations that closely reflect conditions found in the physical world. PMID:19592796

  8. Enhancement of a virtual reality wheelchair simulator to include qualitative and quantitative performance metrics.

    PubMed

    Harrison, C S; Grant, P M; Conway, B A

    2010-01-01

    The increasing importance of inclusive design and in particular accessibility guidelines established in the U.K. 1996 Disability Discrimination Act (DDA) has been a prime motivation for the work on wheelchair access, a subset of the DDA guidelines, described in this article. The development of these guidelines mirrors the long-standing provisions developed in the U.S. In order to raise awareness of these guidelines and in particular to give architects, building designers, and users a physical sensation of how a planned development could be experienced, a wheelchair virtual reality system was developed. This compares with conventional methods of measuring against drawings and comparing dimensions against building regulations, established in the U.K. under British standards. Features of this approach include the marriage of an electromechanical force-feedback system with high-quality immersive graphics as well as the potential ability to generate a physiological rating of buildings that do not yet exist. The provision of this sense of "feel" augments immersion within the virtual reality environment and also provides the basis from which both qualitative and quantitative measures of a building's access performance can be gained.

  9. Software for math and science education for the deaf.

    PubMed

    Adamo-Villani, Nicoletta; Wilbur, Ronnie

    2010-01-01

    In this article, we describe the development of two novel approaches to teaching math and science concepts to deaf children using 3D animated interactive software. One approach, Mathsigner, is non-immersive and the other, SMILE, is a virtual reality immersive environment. The content is curriculum-based, and the animated signing characters are constructed with state-of-the art technology and design. We report preliminary development findings of usability and appeal based on programme features (e.g. 2D/3D, immersiveness, interaction type, avatar and interface design) and subject features (hearing status, gender and age). Programme features of 2D/3D, immersiveness and interaction type were very much affected by subject features. Among subject features, we find significant effects of hearing status (deaf children take longer time and make more mistakes than hearing children) and gender (girls take longer than boys; girls prefer immersive environments rather than desktop presentation; girls are more interested in content than technology compared to boys). For avatar type, we found a preference for seamless, deformable characters over segmented ones. For interface comparisons, there were no subject effects, but an animated interface resulted in reduced time to task completion compared to static interfaces with and without sound and highlighting. These findings identify numerous features that affect software design and appeal and suggest that designers must be careful in their assumptions during programme development.

  10. Modeling and performance analysis using extended fuzzy-timing Petri nets for networked virtual environments.

    PubMed

    Zhou, Y; Murata, T; Defanti, T A

    2000-01-01

    Despite their attractive properties, networked virtual environments (net-VEs) are notoriously difficult to design, implement, and test due to the concurrency, real-time and networking features in these systems. Net-VEs demand high quality-of-service (QoS) requirements on the network to maintain natural and real-time interactions among users. The current practice for net-VE design is basically trial and error, empirical, and totally lacks formal methods. This paper proposes to apply a Petri net formal modeling technique to a net-VE, NICE (narrative immersive constructionist/collaborative environment), predict the net-VE performance based on simulation, and improve the net-VE performance. NICE is essentially a network of collaborative virtual reality systems called the CAVE (CAVE Automatic Virtual Environment). First, we introduce extended fuzzy-timing Petri net (EFTN) modeling and analysis techniques. Then, we present EFTN models of the CAVE, NICE, and the transport layer protocol used in NICE: the transmission control protocol (TCP). We show the possibility analysis based on the EFTN model for the CAVE. Then, by using these models and Design/CPN as the simulation tool, we conducted various simulations to study the real-time behavior, network effects and performance (latencies and jitters) of NICE. Our simulation results are consistent with experimental data.
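
    The record above names extended fuzzy-timing Petri nets (EFTNs); the sketch below is deliberately much simpler, an ordinary timed Petri net with fixed transition delays, intended only to show how places, tokens and timed transitions can model message latency in a networked VE. The toy net, the delay values and all names are illustrative assumptions, not the EFTN models from the paper.

        import heapq

        def simulate_timed_petri_net(marking, transitions, horizon=1.0):
            """Tiny timed Petri net simulator (fixed delays, eager firing).

            marking: dict place -> token count.
            transitions: list of (name, input_places, output_places, delay_seconds).
            Returns a list of (completion_time, transition_name) events.
            """
            clock, log, pending = 0.0, [], []
            while clock <= horizon:
                for name, inputs, outputs, delay in transitions:
                    if all(marking.get(p, 0) >= 1 for p in inputs):
                        for p in inputs:                      # consume input tokens
                            marking[p] -= 1
                        heapq.heappush(pending, (clock + delay, name, outputs))
                if not pending:
                    break
                clock, name, outputs = heapq.heappop(pending) # next completion
                for p in outputs:                             # deposit output tokens
                    marking[p] = marking.get(p, 0) + 1
                log.append((clock, name))
            return log

        # Toy model of one tracker update travelling over the network to a renderer.
        net = [("send",    ["tracker_ready"], ["in_flight"],   0.002),
               ("deliver", ["in_flight"],     ["at_renderer"], 0.040),
               ("render",  ["at_renderer"],   ["displayed"],   0.016)]
        print(simulate_timed_petri_net({"tracker_ready": 1}, net, horizon=0.2))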

  11. Exploring 4D Flow Data in an Immersive Virtual Environment

    NASA Astrophysics Data System (ADS)

    Stevens, A. H.; Butkiewicz, T.

    2017-12-01

    Ocean models help us to understand and predict a wide range of intricate physical processes which comprise the atmospheric and oceanic systems of the Earth. Because these models output an abundance of complex time-varying three-dimensional (i.e., 4D) data, effectively conveying the myriad information from a given model poses a significant visualization challenge. The majority of the research effort into this problem has concentrated around synthesizing and examining methods for representing the data itself; by comparison, relatively few studies have looked into the potential merits of various viewing conditions and virtual environments. We seek to improve our understanding of the benefits offered by current consumer-grade virtual reality (VR) systems through an immersive, interactive 4D flow visualization system. Our dataset is a Regional Ocean Modeling System (ROMS) model representing a 12-hour tidal cycle of the currents within New Hampshire's Great Bay estuary. The model data was loaded into a custom VR particle system application using the OpenVR software library and the HTC Vive hardware, which tracks a headset and two six-degree-of-freedom (6DOF) controllers within a 5m-by-5m area. The resulting visualization system allows the user to coexist in the same virtual space as the data, enabling rapid and intuitive analysis of the flow model through natural interactions with the dataset and within the virtual environment. Whereas a traditional computer screen typically requires the user to reposition a virtual camera in the scene to obtain the desired view of the data, in virtual reality the user can simply move their head to the desired viewpoint, completely eliminating the mental context switches from data exploration/analysis to view adjustment and back. The tracked controllers become tools to quickly manipulate (reposition, reorient, and rescale) the dataset and to interrogate it by, e.g., releasing dye particles into the flow field, probing scalar velocities, placing a cutting plane through a region of interest, etc. It is hypothesized that the advantages afforded by head-tracked viewing and 6DOF interaction devices will lead to faster and more efficient examination of 4D flow data. A human factors study is currently being prepared to empirically evaluate this method of visualization and interaction.
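
    A core piece of such a system is advecting the released dye particles through the time-varying velocity field. The sketch below shows a single explicit Euler advection step over a gridded flow; it is a generic illustration with assumed array shapes and names, not code from the application described above.

        import numpy as np

        def advect_particles(positions, velocity_grid, grid_origin, grid_spacing, dt):
            """One explicit Euler advection step for dye particles in a gridded flow.

            positions: (n, 3) particle positions in metres (x, y, z).
            velocity_grid: (nz, ny, nx, 3) velocities in m/s for one model time step.
            Nearest-grid-point sampling keeps the sketch short; a real system would
            interpolate trilinearly in space and linearly between model time steps.
            """
            idx = np.round((positions - grid_origin) / grid_spacing).astype(int)
            nz, ny, nx, _ = velocity_grid.shape
            ix = np.clip(idx[:, 0], 0, nx - 1)
            iy = np.clip(idx[:, 1], 0, ny - 1)
            iz = np.clip(idx[:, 2], 0, nz - 1)
            return positions + velocity_grid[iz, iy, ix] * dt

        # Release a blob of dye and step it through a uniform test flow.
        rng = np.random.default_rng(0)
        particles = rng.normal([50.0, 50.0, 1.0], 0.5, size=(1000, 3))
        flow = np.tile([0.3, 0.1, 0.0], (4, 64, 64, 1))      # (nz, ny, nx, 3) field
        for _ in range(10):
            particles = advect_particles(particles, flow, np.zeros(3),
                                         np.array([2.0, 2.0, 1.0]), dt=60.0)
        print(particles.mean(axis=0))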

  12. Applying Virtual Reality to commercial Edutainment

    NASA Technical Reports Server (NTRS)

    Grissom, F.; Goza, Sharon P.; Goza, S. Michael

    1994-01-01

    Virtual reality (VR), when defined as a computer-generated, immersive, three-dimensional graphics environment which provides varying degrees of interactivity, remains an expensive, highly specialized application that has yet to find its way into the school, home, or business. As a novel approach to a theme park-type attraction, though, its use can be justified. This paper describes how a virtual reality 'tour of the human digestive system' was created for the Omniplex Science Museum of Oklahoma City, Oklahoma. The customer's main objectives were: (1) to educate; (2) to entertain; (3) to draw visitors; and (4) to generate revenue. The 'Edutainment' system ultimately delivered met these goals. As more such systems come into existence, the resulting library of licensable programs will greatly reduce development costs for individual institutions.

  13. The acceptance of virtual reality devices for cognitive rehabilitation: a report of positive results with schizophrenia.

    PubMed

    da Costa, Rosa Maria Esteves Moreira; de Carvalho, Luís Alfredo Vidal

    2004-03-01

    This study presents a process of virtual environment development supported by a cognitive model that is specific to the cognitive deficits of diverse disorders or traumatic brain injury, and evaluates the acceptance of computer devices by a group of schizophrenic patients. The subjects that participated in this experiment agreed to work with computers and immersive glasses and demonstrated a high level of interest in the proposed tasks. No problems of illness were observed. This experiment indicated that further research projects must be carried out to verify the value of virtual reality technology for cognitive rehabilitation of psychiatric patients. The results of the current study represent a small but necessary step in the realization of that potential.

  14. Analysis of brain activity and response during monoscopic and stereoscopic visualization

    NASA Astrophysics Data System (ADS)

    Calore, Enrico; Folgieri, Raffaella; Gadia, Davide; Marini, Daniele

    2012-03-01

    Stereoscopic visualization in cinematography and Virtual Reality (VR) creates an illusion of depth by means of two bidimensional images corresponding to different views of a scene. This perceptual trick is used to enhance the emotional response and the sense of presence and immersivity of the observers. An interesting question is if and how it is possible to measure and analyze the level of emotional involvement and attention of the observers during a stereoscopic visualization of a movie or of a virtual environment. The research aims represent a challenge, due to the large number of sensorial, physiological and cognitive stimuli involved. In this paper we begin this research by analyzing possible differences in the brain activity of subjects during the viewing of monoscopic or stereoscopic contents. To this aim, we have performed some preliminary experiments collecting electroencephalographic (EEG) data from a group of users using a Brain-Computer Interface (BCI) during the viewing of stereoscopic and monoscopic short movies in a VR immersive installation.
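
    One common way to quantify such differences is band-limited EEG power (for example in the alpha band, often used as an index of attention and engagement). The sketch below computes per-channel band power with a plain periodogram; it is an illustrative analysis under assumed parameters and synthetic data, not the procedure used in the experiments described above.

        import numpy as np

        def band_power(eeg, fs, band=(8.0, 13.0)):
            """Mean spectral power per channel inside a frequency band.

            eeg: array of shape (n_channels, n_samples); fs: sampling rate in Hz;
            band: (low, high) in Hz, alpha by default. A plain windowed periodogram
            is used for brevity; Welch averaging would be more robust in practice.
            """
            n = eeg.shape[1]
            freqs = np.fft.rfftfreq(n, d=1.0 / fs)
            spectrum = np.abs(np.fft.rfft(eeg * np.hanning(n), axis=1)) ** 2 / n
            mask = (freqs >= band[0]) & (freqs <= band[1])
            return spectrum[:, mask].mean(axis=1)

        # Compare alpha power of two synthetic recordings (e.g. two viewing conditions).
        fs = 256
        t = np.arange(0, 10, 1.0 / fs)
        cond_a = 10 * np.sin(2 * np.pi * 10 * t) + np.random.randn(t.size)
        cond_b = 6 * np.sin(2 * np.pi * 10 * t) + np.random.randn(t.size)
        print(band_power(np.vstack([cond_a, cond_b]), fs))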

  15. Immersive Virtual Worlds in University-Level Human Geography Courses

    ERIC Educational Resources Information Center

    Dittmer, Jason

    2010-01-01

    This paper addresses the potential for increased deployment of immersive virtual worlds in higher geographic education. An account of current practice regarding popular culture in the geography classroom is offered, focusing on the objectification of popular culture rather than its constitutive role vis-a-vis place. Current e-learning practice is…

  16. Highly immersive virtual reality laparoscopy simulation: development and future aspects.

    PubMed

    Huber, Tobias; Wunderling, Tom; Paschold, Markus; Lang, Hauke; Kneist, Werner; Hansen, Christian

    2018-02-01

    Virtual reality (VR) applications with head-mounted displays (HMDs) have had an impact on information and multimedia technologies. The current work aimed to describe the process of developing a highly immersive VR simulation for laparoscopic surgery. We combined a VR laparoscopy simulator (LapSim) and a VR-HMD to create a user-friendly VR simulation scenario. Continuous clinical feedback was an essential aspect of the development process. We created an artificial VR (AVR) scenario by integrating the simulator video output with VR game components of figures and equipment in an operating room. We also created a highly immersive VR surrounding (IVR) by integrating the simulator video output with a 360° video of a standard laparoscopy scenario in the department's operating room. Clinical feedback led to optimization of the visualization, synchronization, and resolution of the virtual operating rooms (in both the IVR and the AVR). Preliminary testing results revealed that individuals experienced a high degree of exhilaration and presence, with rare events of motion sickness. The technical performance showed no significant difference compared to that achieved with the standard LapSim. Our results provide a proof of concept for the technical feasibility of a custom highly immersive VR-HMD setup. Future technical research is needed to improve the visualization, immersion, and capability of interacting within the virtual scenario.

  17. Foreign Language Vocabulary Development through Activities in an Online 3D Environment

    ERIC Educational Resources Information Center

    Milton, James; Jonsen, Sunniva; Hirst, Steven; Lindenburn, Sharn

    2012-01-01

    On-line virtual 3D worlds offer the opportunity for users to interact in real time with native speakers of the language they are learning. In principle, this ought to be of great benefit to learners, mimicking the opportunity for immersion that real-life travel to a foreign country offers. We have very little research to show whether this is…

  18. LLCySA: Making Sense of Cyberspace

    DTIC Science & Technology

    2014-01-01

    data center. His other activities include the development of immersive 3D environments leveraging video-game technology to provide a multiplayer ...exploring data-driven approaches to network protection. Imagine a cyber analyst navigating a three-dimensional (3D) game, walking down virtual office...because of information overload. One approach to this challenge leverages technology utilized in the 3D gaming industry. The video-game medium

  19. Using Interactive Visualization to Analyze Solid Earth Data and Geodynamics Models

    NASA Astrophysics Data System (ADS)

    Kellogg, L. H.; Kreylos, O.; Billen, M. I.; Hamann, B.; Jadamec, M. A.; Rundle, J. B.; van Aalsburg, J.; Yikilmaz, M. B.

    2008-12-01

    The geological sciences are challenged to manage and interpret increasing volumes of data as observations and simulations increase in size and complexity. Major projects such as EarthScope and GeoEarthScope are producing the data needed to characterize the structure and kinematics of Earth's surface and interior at unprecedented resolution. At the same time, high-performance computing enables high-precision and fine-detail simulation of geodynamics processes, complementing the observational data. To facilitate interpretation and analysis of these datasets, to evaluate models, and to drive future calculations, we have developed methods of interactive visualization with a special focus on using immersive virtual reality (VR) environments to interact with models of Earth's surface and interior. VR has traditionally been used primarily as a presentation tool allowing active navigation through data. Reaping the full intellectual benefits of immersive VR as a tool for accelerated scientific analysis requires building on the method's strengths, that is, using both 3D perception and interaction with observed or simulated data. Our approach to VR takes advantage of the specialized skills of geoscientists who are trained to interpret geological and geophysical data generated from field observations. Interactive tools allow the scientist to explore and interpret geodynamic models, tomographic models, and topographic observations, while feature extraction tools support quantitative measurement of structures that emerge from numerical simulations or field observations. The use of VR technology enables us to improve our interpretation of crust and mantle structure and of geodynamical processes. Mapping tools based on computer visualization allow virtual "field studies" in inaccessible regions, and an interactive tool allows us to construct digital fault models for use in numerical models. Using the interactive tools on a high-end platform, such as an immersive virtual reality room known as a Cave Automatic Virtual Environment (CAVE), enables the scientist to stand inside a three-dimensional dataset while taking measurements. The CAVE involves three or more projection surfaces arranged as walls in a room. Stereo projectors combined with a motion tracking system recreate the immersive experience of carrying out research in the field. This high-end system provides significant advantages for scientists working with complex volumetric data. The interactive tools also work on low-cost platforms that provide stereo views and the potential for interactivity, such as a Geowall or a 3D-enabled TV. The Geowall is also a well-established tool for education, and in combination with the tools we have developed, enables the rapid transfer of research data and new knowledge to the classroom. The interactive visualization tools can also be used on a desktop or laptop with or without stereo capability. Further information about the Virtual Reality User Interface (VRUI), the 3DVisualizer, the Virtual mapping tools, and the LIDAR viewer can be found on the KeckCAVES website, www.keckcaves.org.

  20. Resting-State fMRI Activity Predicts Unsupervised Learning and Memory in an Immersive Virtual Reality Environment

    PubMed Central

    Wong, Chi Wah; Olafsson, Valur; Plank, Markus; Snider, Joseph; Halgren, Eric; Poizner, Howard; Liu, Thomas T.

    2014-01-01

    In the real world, learning often proceeds in an unsupervised manner without explicit instructions or feedback. In this study, we employed an experimental paradigm in which subjects explored an immersive virtual reality environment on each of two days. On day 1, subjects implicitly learned the location of 39 objects in an unsupervised fashion. On day 2, the locations of some of the objects were changed, and object location recall performance was assessed and found to vary across subjects. As prior work had shown that functional magnetic resonance imaging (fMRI) measures of resting-state brain activity can predict various measures of brain performance across individuals, we examined whether resting-state fMRI measures could be used to predict object location recall performance. We found a significant correlation between performance and the variability of the resting-state fMRI signal in the basal ganglia, hippocampus, amygdala, thalamus, insula, and regions in the frontal and temporal lobes, regions important for spatial exploration, learning, memory, and decision making. In addition, performance was significantly correlated with resting-state fMRI connectivity between the left caudate and the right fusiform gyrus, lateral occipital complex, and superior temporal gyrus. Given the basal ganglia's role in exploration, these findings suggest that tighter integration of the brain systems responsible for exploration and visuospatial processing may be critical for learning in a complex environment. PMID:25286145

  1. Rats in Virtual Space: The development and implementation of a multimodal virtual reality system for small animals

    NASA Astrophysics Data System (ADS)

    Aharoni, Daniel Benjamin

    The integration of multimodal sensory information into a common neural code is a critical function of all complex nervous systems. This process is required for adaptive responding to incoming stimuli as well as the formation of a cognitive map of the external sensory environment. The underlying neural mechanisms of multimodal integration are poorly understood due, in part, to the technical difficulties of manipulating multimodal sensory information in combination with simultaneous in-vivo electrophysiological recording in awake behaving animals. We therefore developed a non-invasive multimodal virtual reality system that is conducive to wired electrophysiological recording techniques. This system allows for the dynamic presentation of highly immersive audiovisual virtual environments to rats maintained in a body fixed position on top of a quiet spherical treadmill. Notably, this allows the rats to remain at the same spatial location in the real world without the need for head fixation. This method opens the door for a wide array of future studies aimed at elucidating the underlying neural mechanisms of multimodal integration.

  2. Building Virtual Mars

    NASA Astrophysics Data System (ADS)

    Abercrombie, S. P.; Menzies, A.; Goddard, C.

    2017-12-01

    Virtual and augmented reality enable scientists to visualize environments that are very difficult, or even impossible to visit, such as the surface of Mars. A useful immersive visualization begins with a high quality reconstruction of the environment under study. This presentation will discuss a photogrammetry pipeline developed at the Jet Propulsion Laboratory to reconstruct 3D models of the surface of Mars using stereo images sent back to Earth by the Curiosity Mars rover. The resulting models are used to support a virtual reality tool (OnSight) that allows scientists and engineers to visualize the surface of Mars as if they were standing on the red planet. Images of Mars present challenges to existing scene reconstruction solutions. Surface images of Mars are sparse with minimal overlap, and are often taken from extremely different viewpoints. In addition, the specialized cameras used by Mars rovers are significantly different than consumer cameras, and GPS localization data is not available on Mars. This presentation will discuss scene reconstruction with an emphasis on coping with limited input data, and on creating models suitable for rendering in virtual reality at high frame rate.
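
    At the heart of any stereo-image reconstruction step is the conversion of pixel disparities into depths and 3D points using the focal length and the stereo baseline. The sketch below shows that back-projection in its simplest form for a rectified pair; the calibration numbers are placeholders, and the code illustrates the general principle rather than the pipeline described above.

        import numpy as np

        def disparity_to_points(disparity, fx, cx, cy, baseline):
            """Back-project a disparity map from a rectified stereo pair to 3D points.

            disparity: (h, w) array of disparities in pixels (0 marks no match).
            fx: focal length in pixels (fy assumed equal); (cx, cy): principal point;
            baseline: camera separation in metres. Returns (n, 3) points in the left
            camera frame.
            """
            h, w = disparity.shape
            v, u = np.mgrid[0:h, 0:w]
            valid = disparity > 0
            z = fx * baseline / disparity[valid]      # depth from the stereo equation
            x = (u[valid] - cx) * z / fx
            y = (v[valid] - cy) * z / fx
            return np.column_stack([x, y, z])

        # Placeholder calibration and a synthetic disparity patch.
        disp = np.zeros((480, 640))
        disp[200:280, 300:340] = 12.0
        cloud = disparity_to_points(disp, fx=600.0, cx=320.0, cy=240.0, baseline=0.3)
        print(cloud.shape, float(cloud[:, 2].mean()))  # mean depth of the patch (m)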

  3. Emergent Capabilities Converging into M and S 2.0

    NASA Technical Reports Server (NTRS)

    Reitz, Emilie; Reist, Jay

    2012-01-01

    The continued operational environment complexity faced by the Department of Defense, despite a restricted resource environment, is a mandate for greater adaptability and availability in joint training. To address these constraints, this paper proposes a model for the potential integration of adaptability training, virtual world capabilities and immersive training into the wider Joint Live Virtual and Constructive (JLVC) Federation, supported by human, social, cultural and behavior modeling, and measurement and assessment. By fusing those capabilities and modeling and simulation enhancements into the JLVC federation, it will create a force that is more apt to arrive at and implement correct decisions, and more able to appropriately seize initiative in the field. The model would allow for the testing and training of capabilities and TTPs that cannot be reasonably explored to their logical conclusions in a 'live' environment, as well as enhance training fidelity for all echelons and tasks.

  4. The Use of Virtual Reality in Patients with Eating Disorders: Systematic Review

    PubMed Central

    Clus, Damien; Larsen, Mark Erik; Lemey, Christophe

    2018-01-01

    Background Patients with eating disorders are characterized by pathological eating habits and a tendency to overestimate their weight and body shape. Virtual reality shows promise for the evaluation and management of patients with eating disorders. This technology, when accepted by this population, allows immersion in virtual environments, assessment, and therapeutic approaches, by exposing users to high-calorie foods or changes in body shape. Objective To better understand the value of virtual reality, we conducted a review of the literature, including clinical studies proposing the use of virtual reality for the evaluation and management of patients with eating disorders. Methods We searched PubMed, PsycINFO, ScienceDirect, the Cochrane Library, Scopus, and Web of Science up to April 2017. We created the list of keywords based on two domains: virtual reality and eating disorders. We used the Preferred Reporting Items for Systematic Reviews and Meta-Analyses to identify, select, and critically appraise relevant research while minimizing bias. Results The initial database searches identified 311 articles, 149 of which we removed as duplicates. We analyzed the resulting set of 26 unique studies that met the inclusion criteria. Of these, 8 studies were randomized controlled trials, 13 were nonrandomized studies, and 5 were clinical trials with only 1 participant. Most articles focused on clinical populations (19/26, 73%), with the remainder reporting case-control studies (7/26, 27%). Most of the studies used visual immersive equipment (16/26, 62%) with a head-mounted display (15/16, 94%). Two main areas of interest emerged from these studies: virtual work on patients’ body image (7/26, 27%) and exposure to virtual food stimuli (10/26, 38%). Conclusions We conducted a broad analysis of studies on the use of virtual reality in patients with eating disorders. This review of the literature showed that virtual reality is an acceptable and promising therapeutic tool for patients with eating disorders. PMID:29703715

  5. Interactive Immersive Virtual Museum: Digital Documentation for Virtual Interaction

    NASA Astrophysics Data System (ADS)

    Clini, P.; Ruggeri, L.; Angeloni, R.; Sasso, M.

    2018-05-01

    Thanks to their playful and educational approach, Virtual Museum systems are very effective for the communication of Cultural Heritage. Among the latest technologies, Immersive Virtual Reality is probably the most appealing and potentially effective to serve this purpose; nevertheless, due to poor user-system interaction, caused by an incomplete maturity of a specific technology for museum applications, it is still quite uncommon to find immersive installations in museums. This paper explores the possibilities offered by this technology and presents a workflow that, starting from digital documentation, makes possible an interaction with archaeological finds or any other cultural heritage inside different kinds of immersive virtual reality spaces. Two different case studies are presented: the National Archaeological Museum of Marche in Ancona and the 3D reconstruction of the Roman Forum of Fanum Fortunae. The two approaches differ not only conceptually but also in content; while the Archaeological Museum is represented in the application simply using spherical panoramas to give the perception of the third dimension, the Roman Forum is a 3D model that allows visitors to move in the virtual space as in the real one. In both cases, the acquisition phase of the artefacts is central; artefacts are digitized with the photogrammetric technique Structure from Motion, then they are integrated inside the immersive virtual space using a PC with an HTC Vive system that allows the user to interact with the 3D models, turning the manipulation of objects into a fun and exciting experience. The challenge, taking advantage of the latest opportunities made available by photogrammetry and ICT, is to enrich visitors' experience in the real museum, making possible the interaction with perishable, damaged or lost objects and the public access to inaccessible or no longer existing places, promoting in this way the preservation of fragile sites.

  6. Collaborative virtual environments art exhibition

    NASA Astrophysics Data System (ADS)

    Dolinsky, Margaret; Anstey, Josephine; Pape, Dave E.; Aguilera, Julieta C.; Kostis, Helen-Nicole; Tsoupikova, Daria

    2005-03-01

    This panel presentation will exhibit artwork developed in CAVEs and discuss how art methodologies enhance the science of VR through collaboration, interaction and aesthetics. Artists and scientists work alongside one another to expand scientific research and artistic expression and are motivated by exhibiting collaborative virtual environments. Looking towards the arts, such as painting and sculpture, computer graphics captures a visual tradition. Virtual reality expands this tradition to not only what we face, but to what surrounds us and even what responds to our body and its gestures. Art making that once was isolated to the static frame and an optimal point of view is now out and about, in fully immersive mode within CAVEs. Art knowledge is a guide to how the aesthetics of 2D and 3D worlds affect, transform, and influence the social, intellectual and physical condition of the human body through attention to psychology, spiritual thinking, education, and cognition. The psychological interacts with the physical in the virtual in such a way that each facilitates, enhances and extends the other, culminating in a "go together" world. Attention to sharing art experience across high-speed networks introduces a dimension of liveliness and aliveness when we "become virtual" in real time with others.

  7. Can hazard risk be communicated through a virtual experience?

    PubMed

    Mitchell, J T

    1997-09-01

    Cyberspace, defined by William Gibson as a consensual hallucination, now refers to all computer-generated interactive environments. Virtual reality, one of a class of interactive cyberspaces, allows us to create and interact directly with objects not available in the everyday world. Despite successes in the entertainment and aviation industries, this technology has been called a 'solution in search of a problem'. The purpose of this commentary is to suggest such a problem: the inability to acquire experience with a hazard to motivate mitigation. Direct experience with a hazard has been demonstrated to be a powerful incentive to adopt mitigation measures. While we lack the ability to summon hazard events at will in order to gain access to that experience, a virtual environment can provide an arena where potential victims are exposed to a hazard's effects. Immersion as an active participant within the hazard event through virtual reality may stimulate users to undertake mitigation steps that might otherwise remain undone. This paper details the possible direction in which virtual reality may be applied to hazards mitigation through a discussion of the technology, the role of hazard experience, the creation of a hazard simulation and the issues constraining implementation.

  8. How virtual reality works: illusions of vision in "real" and virtual environments

    NASA Astrophysics Data System (ADS)

    Stark, Lawrence W.

    1995-04-01

    Visual illusions abound in normal vision--illusions of clarity and completeness, of continuity in time and space, of presence and vivacity--and are part and parcel of the visual world in which we live. These illusions are discussed in terms of the human visual system, with its high-resolution fovea, moved from point to point in the visual scene by rapid saccadic eye movements (EMs). This sampling of visual information is supplemented by a low-resolution, wide peripheral field of view, especially sensitive to motion. Cognitive-spatial models controlling perception, imagery, and 'seeing' also control the EMs that shift the fovea in the Scanpath mode. These illusions provide for presence, the sense of being within an environment. They equally well lead to 'Telepresence,' the sense of being within a virtual display, especially if the operator is intensely interacting within an eye-hand and head-eye human-machine interface that provides congruent visual and motor frames of reference. Interaction, immersion, and interest compel telepresence; intuitive functioning and engineered information flows can optimize human adaptation to the artificial new world of virtual reality, as virtual reality expands into entertainment, simulation, telerobotics, scientific visualization and other professional work.

  9. Can walking motions improve visually induced rotational self-motion illusions in virtual reality?

    PubMed

    Riecke, Bernhard E; Freiberg, Jacob B; Grechkin, Timofey Y

    2015-02-04

    Illusions of self-motion (vection) can provide compelling sensations of moving through virtual environments without the need for complex motion simulators or large tracked physical walking spaces. Here we explore the interaction between biomechanical cues (stepping along a rotating circular treadmill) and visual cues (viewing simulated self-rotation) for providing stationary users a compelling sensation of rotational self-motion (circular vection). When tested individually, biomechanical and visual cues were similarly effective in eliciting self-motion illusions. However, in combination they yielded significantly more intense self-motion illusions. These findings provide the first compelling evidence that walking motions can be used to significantly enhance visually induced rotational self-motion perception in virtual environments (and vice versa) without having to provide for physical self-motion or motion platforms. This is noteworthy, as linear treadmills have been found to actually impair visually induced translational self-motion perception (Ash, Palmisano, Apthorp, & Allison, 2013). Given the predominant focus on linear walking interfaces for virtual-reality locomotion, our findings suggest that investigating circular and curvilinear walking interfaces offers a promising direction for future research and development and can help to enhance self-motion illusions, presence and immersion in virtual-reality systems. © 2015 ARVO.

  10. 3DUI assisted lower and upper member therapy.

    PubMed

    Uribe-Quevedo, Alvaro; Perez-Gutierrez, Byron

    2012-01-01

    3DUIs are becoming very popular among researchers, developers and users, as they allow more immersive and interactive experiences by taking advantage of human dexterity. The features offered by these interfaces outside the gaming environment have allowed the development of applications in the medical area, enhancing the user experience and aiding the therapy process in controlled and monitored environments. Using mainstream videogame 3DUIs based on inertial and image sensors available on the market, this work presents the development of a virtual environment and its navigation through gestures captured from the lower limbs to assist motion during therapy.

  11. Fieldwork Skills in Virtual Worlds

    NASA Astrophysics Data System (ADS)

    Craven, Benjamin; Lloyd, Geoffrey; Gordon, Clare; Houghton, Jacqueline; Morgan, Daniel

    2017-04-01

    Virtual reality has an increasingly significant role to play in teaching and research, but for geological applications realistic landscapes are required that contain sufficient detail to prove viable for investigation by both inquisitive students and critical researchers. To create such virtual landscapes, we combine DTM data with digitally modelled outcrops in the game engine Unity. Our current landscapes are fictional worlds, invented to focus on generation techniques and the strategic and spatial immersion within a digital environment. These have proved very successful in undergraduate teaching; however, we are now moving on to recreating real landscapes for more advanced teaching and research. The first of these is focussed on Rhoscolyn, situated within the Ynys Mon Geopark on Anglesey, UK. It is a popular area for both teaching and research in structural geology so has a wide usage demographic. The base of the model is created from DTM data, both 1 m LiDAR and 5 m GPS point data, and manipulated with QGIS before import to Unity. Substance is added to the world via models of architectural elements (e.g. walls and buildings) and appropriate flora and fauna, including sounds. Texturing of these models is performed using 25 cm aerial imagery and field photographs. Whilst such elements enhance immersion, it is the use of digital outcrop models that fully completes the experience. From fieldwork, we have a library of photogrammetric outcrops that can be modelled into 3D features using free (VisualSFM and MeshLab) and non-free (AgiSoft Photoscan) tools. These models are then refined and converted in Maya to create models for better insertion into the Unity environment. The finished product is a virtual landscape; a Rhoscolyn `world' that is sufficiently detailed to provide a base not only for geological teaching and training but also for geological research. Additionally, the `Rhoscolyn World' represents a significant tool for those students who are unable to attend conventional field classes and really enhances their learning experience. This project is part of the larger Virtual Landscapes project, which is a collaboration between The University of Leeds and Leeds College of Art, UK. All our current virtual landscapes are freely available online at www.see.leeds.ac.uk/virtual-landscapes/.
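
    One step this abstract describes, turning DTM rasters into terrain a game engine can ingest, is sketched below under stated assumptions: a GeoTIFF DTM, the rasterio and numpy libraries, and a 16-bit RAW heightmap as the interchange format (one common route into Unity terrains, not necessarily the authors' QGIS-based workflow).

        # Minimal sketch: convert a LiDAR DTM (GeoTIFF) into a 16-bit RAW heightmap
        # that game engines such as Unity can import as terrain. File names and the
        # rasterio dependency are assumptions; the original workflow used QGIS.
        import numpy as np
        import rasterio

        def dtm_to_heightmap(dtm_path, out_path):
            with rasterio.open(dtm_path) as src:
                elevation = src.read(1).astype(np.float64)     # first band: elevation in metres
            lo, hi = np.nanmin(elevation), np.nanmax(elevation)
            norm = np.nan_to_num((elevation - lo) / (hi - lo)) # normalise to 0..1, zero out nodata
            heightmap = (norm * 65535).astype(np.uint16)       # full 16-bit range for the engine
            heightmap.tofile(out_path)                         # raw 16-bit samples, native byte order
            return heightmap.shape, (lo, hi)                   # keep shape/range for import settings

        if __name__ == "__main__":
            shape, elev_range = dtm_to_heightmap("rhoscolyn_dtm_1m.tif", "rhoscolyn_heightmap.raw")
            print(shape, elev_range)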

  12. Immersive virtual reality used as a platform for perioperative training for surgical residents.

    PubMed

    Witzke, D B; Hoskins, J D; Mastrangelo, M J; Witzke, W O; Chu, U B; Pande, S; Park, A E

    2001-01-01

    Perioperative preparations such as operating room setup, patient and equipment positioning, and operating port placement are essential to operative success in minimally invasive surgery. We developed an immersive virtual reality-based training system (REMIS) to provide residents (and other health professionals) with training and evaluation in these perioperative skills. Our program uses the qualities of immersive VR that are available today for inclusion in an ongoing training curriculum for surgical residents. The current application consists of a primary platform for patient positioning for a laparoscopic cholecystectomy. Having completed this module, we can create many different simulated problems for other procedures. As part of the simulation, we have devised a computer-driven real-time data collection system to help us evaluate trainees and provide feedback during the simulation. The REMIS program trains and evaluates surgical residents and obviates the need to use expensive operating room and surgeon time. It also allows residents to train based on their schedule and does not put patients at increased risk. The method is standardized, allows for repetition if needed, evaluates individual performance, presents the possible complications of incorrect choices, provides training in a 3-D environment, and has the capability of being used for various scenarios and professions.

  13. A Learning Evaluation for an Immersive Virtual Laboratory for Technical Training Applied into a Welding Workshop

    ERIC Educational Resources Information Center

    Torres, Francisco; Neira Tovar, Leticia A.; del Rio, Marta Sylvia

    2017-01-01

    This study aims to explore the results of welding virtual training performance, designed using a learning model based on cognitive and usability techniques and applying an immersive concept focused on the person's attention. Moreover, it was also intended to demonstrate that there exists a moderating effect of performance improvement when the user experience is taken…

  14. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Drouhard, Margaret MEG G; Steed, Chad A; Hahn, Steven E

    In this paper, we propose strategies and objectives for immersive data visualization with applications in materials science using the Oculus Rift virtual reality headset. We provide background on currently available analysis tools for neutron scattering data and other large-scale materials science projects. In the context of the current challenges facing scientists, we discuss immersive virtual reality visualization as a potentially powerful solution. We introduce a prototype immersive visualization system, developed in conjunction with materials scientists at the Spallation Neutron Source, which we have used to explore large crystal structures and neutron scattering data. Finally, we offer our perspective on the greatest challenges that must be addressed to build effective and intuitive virtual reality analysis tools that will be useful for scientists in a wide range of fields.

  15. 3D Modelling and Mapping for Virtual Exploration of Underwater Archaeology Assets

    NASA Astrophysics Data System (ADS)

    Liarokapis, F.; Kouřil, P.; Agrafiotis, P.; Demesticha, S.; Chmelík, J.; Skarlatos, D.

    2017-02-01

    This paper investigates immersive technologies to increase exploration time at an underwater archaeological site, both for the public and for researchers and scholars. The focus is on the Mazotos shipwreck site in Cyprus, which is located 44 meters underwater. The aim of this work is two-fold: (a) realistic modelling and mapping of the site and (b) an immersive virtual reality visit. Optical data were used for the 3D modelling and mapping. The underwater exploration environment is composed of a variety of sea elements, including plants, fish, stones, and artefacts, which are randomly positioned. Users can experience an immersive virtual underwater visit to the Mazotos shipwreck site and obtain information about the shipwreck and its contents, raising their archaeological knowledge and cultural awareness.
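
    As a small illustration of the random positioning of sea elements mentioned above, the sketch below scatters props uniformly inside a bounding volume around the wreck. The extents, counts and uniform distribution are illustrative assumptions, not the project's actual placement logic.

        # Minimal sketch: scatter sea-floor props (plants, stones, fish) at random
        # positions inside a bounding volume around a wreck site. Extents, counts
        # and the uniform distribution are illustrative assumptions only.
        import numpy as np

        rng = np.random.default_rng(seed=42)

        def scatter_props(n, x_range, y_range, depth_range):
            """Return an (n, 3) array of positions within the given site extents."""
            xs = rng.uniform(*x_range, size=n)
            ys = rng.uniform(*y_range, size=n)
            zs = rng.uniform(*depth_range, size=n)             # metres below the surface
            return np.column_stack([xs, ys, zs])

        # Example: 200 plants and 50 stones around a 40 m x 40 m site at ~44 m depth.
        plants = scatter_props(200, (-20, 20), (-20, 20), (-45.0, -43.5))
        stones = scatter_props(50, (-20, 20), (-20, 20), (-44.2, -44.0))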

  16. Wayfinding and Glaucoma: A Virtual Reality Experiment.

    PubMed

    Daga, Fábio B; Macagno, Eduardo; Stevenson, Cory; Elhosseiny, Ahmed; Diniz-Filho, Alberto; Boer, Erwin R; Schulze, Jürgen; Medeiros, Felipe A

    2017-07-01

    Wayfinding, the process of determining and following a route between an origin and a destination, is an integral part of everyday tasks. The purpose of this study was to investigate the impact of glaucomatous visual field loss on wayfinding behavior using an immersive virtual reality (VR) environment. This cross-sectional study included 31 glaucomatous patients and 20 healthy subjects without evidence of overall cognitive impairment. Wayfinding experiments were modeled after the Morris water maze navigation task and conducted in an immersive VR environment. Two rooms were built, varying only in the complexity of the visual scene, in order to promote allocentric-based (room A, with multiple visual cues) versus egocentric-based (room B, with a single visual cue) spatial representations of the environment. Wayfinding tasks in each room consisted of revisiting previously visible targets that subsequently became invisible. For room A, glaucoma patients spent on average 35.0 seconds to perform the wayfinding task, whereas healthy subjects spent an average of 24.4 seconds (P = 0.001). For room B, no statistically significant difference was seen in average time to complete the task (26.2 seconds versus 23.4 seconds, respectively; P = 0.514). For room A, each 1-dB worse binocular mean sensitivity was associated with a 3.4% (P = 0.001) increase in time to complete the task. Glaucoma patients performed significantly worse on allocentric-based wayfinding tasks conducted in a VR environment, suggesting visual field loss may affect the construction of spatial cognitive maps relevant to successful wayfinding. VR environments may represent a useful approach for assessing functional vision endpoints for clinical trials of emerging therapies in ophthalmology.
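
    The 3.4% figure above reads most naturally as the percent-change interpretation of a regression on log-transformed completion time. The sketch below shows that reading under this assumption only; the abstract does not state the model, and the coefficient here is back-calculated purely for illustration.

        # Minimal sketch of the percent-change reading of a log-time regression.
        # Assumption: completion time was log-transformed; the coefficient below is
        # back-calculated from the reported 3.4% and is illustrative only.
        import numpy as np

        beta_per_db_lost = np.log(1.034)   # coefficient per 1 dB of binocular sensitivity lost
        pct_per_db = (np.exp(beta_per_db_lost) - 1) * 100
        print(f"{pct_per_db:.1f}% longer per 1-dB loss")                                  # 3.4%

        # On this reading a larger loss compounds multiplicatively, not additively:
        print(f"{(np.exp(5 * beta_per_db_lost) - 1) * 100:.1f}% longer for a 5-dB loss")  # ~18.2%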

  17. Immersive Interaction, Manipulation and Analysis of Large 3D Datasets for Planetary and Earth Sciences

    NASA Astrophysics Data System (ADS)

    Pariser, O.; Calef, F.; Manning, E. M.; Ardulov, V.

    2017-12-01

    We will present the implementation and study of several use cases of utilizing Virtual Reality (VR) for the immersive display, interaction and analysis of large and complex 3D datasets. These datasets have been acquired by instruments across several Earth, Planetary and Solar Space Robotics Missions. First, we will describe the architecture of the common application framework that was developed to input data, interface with VR display devices, and program input controllers in various computing environments. Tethered and portable VR technologies will be contrasted and the advantages of each highlighted. We will proceed to present experimental immersive analytics visual constructs that enable augmentation of 3D datasets with 2D ones such as images and statistical and abstract data. We will conclude by presenting a comparative analysis with traditional visualization applications and sharing the feedback provided by our users: scientists and engineers.

  18. Visualization of reservoir simulation data with an immersive virtual reality system

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Williams, B.K.

    1996-10-01

    This paper discusses an investigation into the use of an immersive virtual reality (VR) system to visualize reservoir simulation output data. The hardware and software configurations of the test-immersive VR system are described and compared to a nonimmersive VR system and to an existing workstation screen-based visualization system. The structure of 3D reservoir simulation data and the actions to be performed on the data within the VR system are discussed. The subjective results of the investigation are then presented, followed by a discussion of possible future work.

  19. Science Education Using a Computer Model-Virtual Puget Sound

    NASA Astrophysics Data System (ADS)

    Fruland, R.; Winn, W.; Oppenheimer, P.; Stahr, F.; Sarason, C.

    2002-12-01

    We created an interactive learning environment, Virtual Puget Sound (VPS), based on an oceanographic computer model of Puget Sound, as an alternative to traditional teaching methods. Students immersed in this navigable 3-D virtual environment observed tidal movements and salinity changes, and performed tracer and buoyancy experiments. Scientific concepts were embedded in a goal-based scenario to locate a new sewage outfall in Puget Sound. Traditional science teaching methods focus on distilled representations of agreed-upon knowledge removed from real-world context and scientific debate. Our strategy leverages students' natural interest in their environment, provides meaningful context and engages students in scientific debate and knowledge creation. Results show that VPS provides a powerful learning environment but highlight the need for research on how to most effectively represent concepts and organize interactions to support scientific inquiry and understanding. Research is also needed to ensure that new technologies and visualizations do not foster misconceptions, including the impression that the model represents reality rather than being a useful tool. In this presentation we review results from prior work with VPS and outline new work for a modeling partnership recently formed with funding from the National Ocean Partnership Program (NOPP).

  20. [Memory assessment by means of virtual reality: its present and future].

    PubMed

    Diaz-Orueta, Unai; Climent, Gema; Cardas-Ibanez, Jaione; Alonso, Laura; Olmo-Osa, Juan; Tirapu-Ustarroz, Javier

    2016-01-16

    Human memory is a complex cognitive system whose close relationship with executive functions implies that, on many occasions, a mnemonic deficit comprises difficulties operating with correctly stored contents. Traditional memory tests, more focused on information storage than on its processing, may be poorly sensitive both to subjects' daily-life functioning and to changes brought about by rehabilitation programs. In memory assessment, there is ample evidence of the need for tests that offer higher ecological validity, with information that can be presented in various sensory modalities and in a simultaneous way. Virtual reality reproduces three-dimensional environments with which the patient interacts in a dynamic way, with a sense of immersion similar to the presence and exposure experienced in a real environment, and in which the presentation of stimuli, distractors and other variables can be systematically controlled. The current review examines the trajectory of virtual reality-based neuropsychological assessment of memory, surveying existing tests designed to assess learning, prospective, episodic and spatial memory, as well as the most recent attempts to perform a comprehensive evaluation of all memory components.

  1. Developing a Novel Measure of Body Satisfaction Using Virtual Reality

    PubMed Central

    Purvis, Clare K.; Jones, Megan; Bailey, Jakki O.; Bailenson, Jeremy; Taylor, C. Barr

    2015-01-01

    Body image disturbance (BID), considered a key feature in eating disorders, is a pervasive issue among young women. Accurate assessment of BID is critical, but the field is currently limited to self-report assessment methods. In the present study, we build upon existing research, and explore the utility of virtual reality (VR) to elicit and detect changes in BID across various immersive virtual environments. College-aged women with elevated weight and shape concerns (n = 38) and a non-weight and shape concerned control group (n = 40) were randomly exposed to four distinct virtual environments with high or low levels of body salience and social presence (i.e., presence of virtual others). Participants interacted with avatars of thin, normal weight, and overweight body size (BMI of approximately 18, 22, and 27 respectively) in virtual social settings (i.e., beach, party). We measured state-level body satisfaction (state BD) immediately after exposure to each environment. In addition, we measured participants’ minimum interpersonal distance, visual attention, and approach preference toward avatars of each size. Women with higher baseline BID reported significantly higher state BD in all settings compared to controls. Both groups reported significantly higher state BD in a beach with avatars as compared to other environments. In addition, women with elevated BID approached closer to normal weight avatars and looked longer at thin avatars compared to women in the control group. Our findings indicate that VR may serve as a novel tool for measuring state-level BID, with applications for measuring treatment outcomes. Implications for future research and clinical interventions are discussed. PMID:26469860
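
    Of the behavioural measures listed above, minimum interpersonal distance is the most mechanical to derive; the sketch below computes it from logged head positions, assuming (x, y, z) samples in metres recorded at matching timestamps (the logging format is an assumption, not the study's).

        # Minimal sketch: minimum interpersonal distance between a tracked participant
        # and an avatar, from logged (x, y, z) positions in metres. The logging format
        # and the sample values are assumptions; only the measure comes from the study.
        import numpy as np

        def minimum_interpersonal_distance(participant_xyz, avatar_xyz):
            """Both inputs are (n_samples, 3) arrays sampled at the same timestamps."""
            deltas = np.asarray(participant_xyz) - np.asarray(avatar_xyz)
            distances = np.linalg.norm(deltas, axis=1)         # Euclidean distance per sample
            return distances.min()                             # closest approach over the trial

        # Hypothetical trace of four samples while walking toward a stationary avatar:
        participant = [[0.0, 1.6, 0.0], [0.3, 1.6, 0.4], [0.6, 1.6, 0.9], [0.7, 1.6, 1.2]]
        avatar      = [[1.5, 1.6, 2.0], [1.5, 1.6, 2.0], [1.5, 1.6, 2.0], [1.5, 1.6, 2.0]]
        print(minimum_interpersonal_distance(participant, avatar))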

  2. Biological Visualization, Imaging and Simulation(Bio-VIS) at NASA Ames Research Center: Developing New Software and Technology for Astronaut Training and Biology Research in Space

    NASA Technical Reports Server (NTRS)

    Smith, Jeffrey

    2003-01-01

    The Bio-Visualization, Imaging and Simulation (BioVIS) Technology Center at NASA's Ames Research Center is dedicated to developing and applying advanced visualization, computation and simulation technologies to support NASA Space Life Sciences research and the objectives of the Fundamental Biology Program. Research ranges from high resolution 3D cell imaging and structure analysis, virtual environment simulation of fine sensory-motor tasks, computational neuroscience and biophysics to biomedical/clinical applications. Computer simulation research focuses on the development of advanced computational tools for astronaut training and education. Virtual Reality (VR) and Virtual Environment (VE) simulation systems have become important training tools in many fields from flight simulation to, more recently, surgical simulation. The type and quality of training provided by these computer-based tools ranges widely, but the value of real-time VE computer simulation as a method of preparing individuals for real-world tasks is well established. Astronauts routinely use VE systems for various training tasks, including Space Shuttle landings, robot arm manipulations and extravehicular activities (space walks). Currently, there are no VE systems to train astronauts for basic and applied research experiments, which are an important part of many missions. The Virtual Glovebox (VGX) is a prototype VE system for real-time physically-based simulation of the Life Sciences Glovebox where astronauts will perform many complex tasks supporting research experiments aboard the International Space Station. The VGX consists of a physical display system utilizing dual LCD projectors and circular polarization to produce a desktop-sized 3D virtual workspace. Physically-based modeling tools (Arachi Inc.) provide real-time collision detection, rigid body dynamics, physical properties and force-based controls for objects. The human-computer interface consists of two magnetic tracking devices (Ascension Inc.) attached to instrumented gloves (Immersion Inc.) which co-locate the user's hands with hand/forearm representations in the virtual workspace. Force-feedback is possible in a work volume defined by a Phantom Desktop device (SensAble Inc.). Graphics are written in OpenGL. The system runs on a 2.2 GHz Pentium 4 PC. The prototype VGX provides astronauts and support personnel with a real-time physically-based VE system to simulate basic research tasks both on Earth and in the microgravity of Space. The immersive virtual environment of the VGX also makes it a useful tool for virtual engineering applications including CAD development, procedure design and simulation of human-system systems in a desktop-sized work volume.

  3. NASA Virtual Glovebox (VBX): Emerging Simulation Technology for Space Station Experiment Design, Development, Training and Troubleshooting

    NASA Technical Reports Server (NTRS)

    Smith, Jeffrey D.; Twombly, I. Alexander; Maese, A. Christopher; Cagle, Yvonne; Boyle, Richard

    2003-01-01

    The International Space Station demonstrates the greatest capabilities of human ingenuity, international cooperation and technology development. The complexity of this space structure is unprecedented; and training astronaut crews to maintain all its systems, as well as perform a multitude of research experiments, requires the most advanced training tools and techniques. Computer simulation and virtual environments are currently used by astronauts to train for robotic arm manipulations and extravehicular activities; but now, with the latest computer technologies and recent successes in areas of medical simulation, the capability exists to train astronauts for more hands-on research tasks using immersive virtual environments. We have developed a new technology, the Virtual Glovebox (VGX), for simulation of experimental tasks that astronauts will perform aboard the Space Station. The VGX may also be used by crew support teams for design of experiments, testing equipment integration capability and optimizing the procedures astronauts will use. This is done through the 3D, desktop-sized, reach-in virtual environment that can simulate the microgravity environment in space. Additional features of the VGX allow for networking multiple users over the internet and operation of tele-robotic devices through an intuitive user interface. Although the system was developed for astronaut training and assisting support crews, Earth-bound applications, many emphasizing homeland security, have also been identified. Examples include training experts to handle hazardous biological and/or chemical agents in a safe simulation, operation of tele-robotic systems for assessing and defusing threats such as bombs, and providing remote medical assistance to field personnel through a collaborative virtual environment. Thus, the emerging VGX simulation technology, while developed for space-based applications, can serve a dual use facilitating homeland security here on Earth.

  4. Perceptual Fidelity vs. Engineering Compromises In Virtual Acoustic Displays

    NASA Technical Reports Server (NTRS)

    Wenzel, Elizabeth M.; Ahumada, Albert (Technical Monitor)

    1997-01-01

    Immersive, three-dimensional displays are increasingly becoming a goal of advanced human-machine interfaces. While the technology for achieving truly useful multisensory environments is still being developed, techniques for generating three-dimensional sound are now both sophisticated and practical enough to be applied to acoustic displays. The ultimate goal of virtual acoustics is to simulate the complex acoustic field experienced by a listener freely moving around within an environment. Of course, such complexity, freedom of movement and interactivity is not always possible in a "true" virtual environment, much less in lower-fidelity multimedia systems. However, many of the perceptual and engineering constraints (and frustrations) that researchers, engineers and listeners have experienced in virtual audio are relevant to multimedia. In fact, some of the problems that have been studied will be even more of an issue for lower fidelity systems that are attempting to address the requirements of a huge, diverse and ultimately unknown audience. Examples include individual differences in head-related transfer functions, a lack of real interactivity (head-tracking) in many multimedia displays, and perceptual degradation due to low sampling rates and/or low-bit compression. This paper discusses some of the engineering constraints faced during implementation of virtual acoustic environments and the perceptual consequences of these constraints. Specific examples are given for NASA applications such as telerobotic control, aeronautical displays, and shuttle launch communications. An attempt will also be made to relate these issues to low-fidelity implementations such as the internet.

  5. Perceptual Fidelity Versus Engineering Compromises in Virtual Acoustic Displays

    NASA Technical Reports Server (NTRS)

    Wenzel, Elizabeth M.; Ellis, Stephen R. (Technical Monitor); Frey, Mary Anne (Technical Monitor); Schneider, Victor S. (Technical Monitor)

    1997-01-01

    Immersive, three-dimensional displays are increasingly becoming a goal of advanced human-machine interfaces. While the technology for achieving truly useful multisensory environments is still being developed, techniques for generating three-dimensional sound are now both sophisticated and practical enough to be applied to acoustic displays. The ultimate goal of virtual acoustics is to simulate the complex acoustic field experienced by a listener freely moving around within an environment. Of course, such complexity, freedom of movement and interactivity is not always possible in a 'true' virtual environment, much less in lower-fidelity multimedia systems. However, many of the perceptual and engineering constraints (and frustrations) that researchers, engineers and listeners have experienced in virtual audio are relevant to multimedia. In fact, some of the problems that have been studied will be even more of an issue for lower fidelity systems that are attempting to address the requirements of a huge, diverse and ultimately unknown audience. Examples include individual differences in head-related transfer functions, a lack of real interactivity (head-tracking) in many multimedia displays, and perceptual degradation due to low sampling rates and/or low-bit compression. This paper discusses some of the engineering constraints faced during implementation of virtual acoustic environments and the perceptual consequences of these constraints. Specific examples are given for NASA applications such as telerobotic control, aeronautical displays, and shuttle launch communications. An attempt will also be made to relate these issues to low-fidelity implementations such as the internet.
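
    As background to the rendering techniques both versions of this abstract refer to, the sketch below shows the core of static binaural synthesis: convolving a mono source with a left and a right head-related impulse response (HRIR). The HRIRs here are crude placeholders; real virtual acoustic displays use measured, individualized and head-tracked HRTFs, which is exactly where the constraints discussed above arise.

        # Minimal sketch of static binaural synthesis: convolve a mono signal with a
        # left/right head-related impulse response (HRIR) pair. The HRIRs below are
        # placeholders, not measured responses.
        import numpy as np
        from scipy.signal import fftconvolve

        fs = 44100
        t = np.arange(fs) / fs
        mono = 0.5 * np.sin(2 * np.pi * 440 * t)           # 1 s, 440 Hz test tone

        # Placeholder HRIRs: a delayed, attenuated right ear mimics a source off to
        # the left (illustrative only; ~0.45 ms interaural time difference).
        hrir_left = np.zeros(128);  hrir_left[0] = 1.0
        hrir_right = np.zeros(128); hrir_right[20] = 0.6

        left = fftconvolve(mono, hrir_left)[:len(mono)]
        right = fftconvolve(mono, hrir_right)[:len(mono)]
        binaural = np.stack([left, right], axis=1)         # (samples, 2) stereo buffer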

  6. Using Virtual Reality in the Inference-Based Treatment of Compulsive Hoarding

    PubMed Central

    St-Pierre-Delorme, Marie-Eve; O’Connor, Kieron

    2016-01-01

    The present study evaluated the efficacy of adding a virtual reality (VR) component to the treatment of compulsive hoarding (CH), following inference-based therapy (IBT). Participants were randomly assigned to either an experimental or a control condition; seven participants received the experimental condition and seven received the control condition. Five 1-hour sessions were administered weekly. A difference at trend level, F(2,24) = 2.28, p = 0.10, indicated that the level of clutter in the bedroom tended to diminish more in the experimental group than in the control group. In addition, the results demonstrated that both groups were immersed and present in the environment. The results on posttreatment measures of CH (Saving Inventory-Revised, Saving Cognition Inventory and Clutter Image Rating scale) demonstrate the efficacy of IBT in terms of symptom reduction. Overall, these results suggest that the creation of a virtual environment may be effective in the treatment of CH by helping compulsive hoarders take action over their clutter. PMID:27486574

  7. Language-driven anticipatory eye movements in virtual reality.

    PubMed

    Eichert, Nicole; Peeters, David; Hagoort, Peter

    2018-06-01

    Predictive language processing is often studied by measuring eye movements as participants look at objects on a computer screen while they listen to spoken sentences. This variant of the visual-world paradigm has revealed that information encountered by a listener at a spoken verb can give rise to anticipatory eye movements to a target object, which is taken to indicate that people predict upcoming words. The ecological validity of such findings remains questionable, however, because these computer experiments used two-dimensional stimuli that were mere abstractions of real-world objects. Here we present a visual-world paradigm study in a three-dimensional (3-D) immersive virtual reality environment. Despite significant changes in the stimulus materials and the different mode of stimulus presentation, language-mediated anticipatory eye movements were still observed. These findings thus indicate that people do predict upcoming words during language comprehension in a more naturalistic setting where natural depth cues are preserved. Moreover, the results confirm the feasibility of using eyetracking in rich and multimodal 3-D virtual environments.

  8. Fish in the matrix: motor learning in a virtual world.

    PubMed

    Engert, Florian

    2012-01-01

    One of the large remaining challenges in the field of zebrafish neuroscience is the establishment of techniques and preparations that permit the recording and perturbation of neural activity in animals that can interact meaningfully with the environment. Since it is very difficult to do this in freely behaving zebrafish, I describe here two alternative approaches that meet this goal via tethered preparations. The first uses head-fixation in agarose in combination with online imaging and analysis of tail motion. In the second method, paralyzed fish are suspended with suction pipettes in mid-water and nerve root recordings serve as indicators for intended locomotion. In both cases, fish can be immersed into a virtual environment and allowed to interact with this virtual world via real or fictive tail motions. The specific examples given in this review focus primarily on the role of visual feedback, but the general principles certainly extend to other modalities, including proprioception, hearing, balance, and somatosensation.

  9. Fish in the matrix: motor learning in a virtual world

    PubMed Central

    Engert, Florian

    2013-01-01

    One of the large remaining challenges in the field of zebrafish neuroscience is the establishment of techniques and preparations that permit the recording and perturbation of neural activity in animals that can interact meaningfully with the environment. Since it is very difficult to do this in freely behaving zebrafish, I describe here two alternative approaches that meet this goal via tethered preparations. The first uses head-fixation in agarose in combination with online imaging and analysis of tail motion. In the second method, paralyzed fish are suspended with suction pipettes in mid-water and nerve root recordings serve as indicators for intended locomotion. In both cases, fish can be immersed into a virtual environment and allowed to interact with this virtual world via real or fictive tail motions. The specific examples given in this review focus primarily on the role of visual feedback, but the general principles certainly extend to other modalities, including proprioception, hearing, balance, and somatosensation. PMID:23355810

  10. Realistic realtime illumination of complex environment for immersive systems. A case study: the Parthenon

    NASA Astrophysics Data System (ADS)

    Callieri, M.; Debevec, P.; Pair, J.; Scopigno, R.

    2005-06-01

    Offline rendering techniques have nowadays reached an astonishing level of realism, but at the cost of long computation times. The new generation of programmable graphics hardware, on the other hand, makes it possible to implement in realtime some of the visual effects previously available only for cinematographic production. In a collaboration between the Visual Computing Lab (ISTI-CNR) and the Institute for Creative Technologies of the University of Southern California, a realtime demo has been developed that replicates a sequence from the short movie "The Parthenon" presented at Siggraph 2004. The application is designed to run on an immersive reality system, making it possible for a user to perceive the virtual environment with a cinematographic visual quality. In this paper we present the principal ideas of the project, discussing design issues and the technical solutions used for the realtime demo.

  11. Transduction between worlds: using virtual and mixed reality for earth and planetary science

    NASA Astrophysics Data System (ADS)

    Hedley, N.; Lochhead, I.; Aagesen, S.; Lonergan, C. D.; Benoy, N.

    2017-12-01

    Virtual reality (VR) and augmented reality (AR) have the potential to transform the way we visualize multidimensional geospatial datasets in support of geoscience research, exploration and analysis. The beauty of virtual environments is that they can be built at any scale, users can view them at many levels of abstraction, move through them in unconventional ways, and experience spatial phenomena as if they had superpowers. Similarly, augmented reality allows you to bring the power of virtual 3D data visualizations into everyday spaces. Spliced together, these interface technologies hold incredible potential to support 21st-century geoscience. In my ongoing research, my team and I have made significant advances to connect data and virtual simulations with real geographic spaces, using virtual environments, geospatial augmented reality and mixed reality. These research efforts have yielded new capabilities to connect users with spatial data and phenomena. These innovations include: geospatial x-ray vision; flexible mixed reality; augmented 3D GIS; situated augmented reality 3D simulations of tsunamis and other phenomena interacting with real geomorphology; augmented visual analytics; and immersive GIS. These new modalities redefine the ways in which we can connect digital spaces of spatial analysis, simulation and geovisualization, with geographic spaces of data collection, fieldwork, interpretation and communication. In a way, we are talking about transduction between real and virtual worlds. Taking a mixed reality approach to this, we can link real and virtual worlds. This paper presents a selection of our 3D geovisual interface projects in terrestrial, coastal, underwater and other environments. Using rigorous applied geoscience data, analyses and simulations, our research aims to transform the novelty of virtual and augmented reality interface technologies into game-changing mixed reality geoscience.

  12. The Effect of Realistic Appearance of Virtual Characters in Immersive Environments - Does the Character's Personality Play a Role?

    PubMed

    Zibrek, Katja; Kokkinara, Elena; Mcdonnell, Rachel

    2018-04-01

    Virtual characters that appear almost photo-realistic have been shown to induce negative responses from viewers in traditional media, such as film and video games. This effect, described as the uncanny valley, is the reason why realism is often avoided when the aim is to create an appealing virtual character. In Virtual Reality, there have been few attempts to investigate this phenomenon and the implications of rendering virtual characters with high levels of realism on user enjoyment. In this paper, we conducted a large-scale experiment on over one thousand members of the public in order to gather information on how virtual characters are perceived in interactive virtual reality games. We were particularly interested in whether different render styles (realistic, cartoon, etc.) would directly influence appeal, or if a character's personality was the most important indicator of appeal. We used a number of perceptual metrics such as subjective ratings, proximity, and attribution bias in order to test our hypothesis. Our main result shows that affinity towards virtual characters is a complex interaction between the character's appearance and personality, and that realism is in fact a positive choice for virtual characters in virtual reality.

  13. What to expect from immersive virtual environment exposure: influences of gender, body mass index, and past experience.

    PubMed

    Stanney, Kay M; Hale, Kelly S; Nahmens, Isabelina; Kennedy, Robert S

    2003-01-01

    For those interested in using head-coupled PC-based immersive virtual environment (VE) technology to train, entertain, or inform, it is essential to understand the effects this technology has on its users. This study investigated potential adverse effects, including the sickness associated with exposure and extreme responses (emesis, flashbacks). Participants were exposed to a VE for 15 to 60 min, with either complete or streamlined navigational control and simple or complex scenes, after which time measures of sickness were obtained. More than 80% of participants experienced nausea, oculomotor disturbances, and/or disorientation, with disorientation potentially lasting > 24 hr. Of the participants, 12.9% prematurely ended their exposure because of adverse effects; of these, 9.2% experienced an emetic response, whereas only 1.2% of all participants experienced emesis. The results indicate that designers may be able to reduce these rates by limiting exposure duration and reducing the degrees of freedom of the user's navigational control. Results from gender, body mass, and past experience comparisons indicated it may be possible to identify those who will experience adverse effects attributable to exposure and warn such individuals. Applications for this research include military, entertainment, and any other interactive systems for which designers seek to avoid adverse effects associated with exposure.

  14. A Storm's Approach; Hurricane Shelter Training in a Digital Age

    NASA Technical Reports Server (NTRS)

    Boyarsky, Andrew; Burden, David; Gronstedt, Anders; Jinman, Andrew

    2012-01-01

    New York City's Office of Emergency Management (OEM) originally ran hundreds of classroom-based courses, bringing together civil servants to learn how to run a Hurricane Shelter (HS). This approach was found to be costly and time consuming, and it lacked any sense of an impending disaster and the need for emergency response. In partnership with the City University of New York School of Professional Studies, the Gronstedt Group and Daden Limited, the OEM wanted to create a simulation that overcame these issues, providing users with a more immersive and realistic approach at a lower cost. The HS simulation was built in the virtual world Second Life (SL). Virtual worlds are a genre of online communities that often take the form of computer-based simulated environments, through which users can interact with one another and use or create objects. Using this technology allowed managers to apply their knowledge in both classroom and remote learning environments. The shelter simulation is operational 24/7, guiding users through a 4 1/2 hour narrative from start to finish. This paper describes the rationale for the project, the technical approach taken, particularly the use of a web-based authoring tool to create and manage the immersive simulation, and the results from operational use.

  15. Incorporating immersive virtual environments in health promotion campaigns: a construal level theory approach.

    PubMed

    Ahn, Sun Joo Grace

    2015-01-01

    In immersive virtual environments (IVEs), users may observe negative consequences of a risky health behavior in a personally involving way via digital simulations. In the context of an ongoing health promotion campaign, IVEs coupled with pamphlets are proposed as a novel messaging strategy to heighten personal relevance and involvement with the issue of soft-drink consumption and obesity, as well as perceptions that the risk is proximal and imminent. The framework of construal level theory guided the design of a 2 (tailoring: other vs. self) × 2 (medium: pamphlet only vs. pamphlet with IVEs) between-subjects experiment to test the efficacy in reducing the consumption of soft drinks over 1 week. Immediately following exposure, tailoring the message to the self (vs. other) seemed to be effective in reducing intentions to consume soft drinks. The effect of tailoring dissipated after 1 week, and measures of actual soft-drink consumption 1 week following experimental treatments demonstrated that coupling IVEs with the pamphlet was more effective. Behavioral intention was a significant predictor of actual behavior, but underlying mechanisms driving intentions and actual behavior were distinct. Results prescribed a messaging strategy that incorporates both tailoring and coupling IVEs with traditional media to increase behavioral changes over time.

  16. Workstations for people with disabilities: an example of a virtual reality approach

    PubMed Central

    Budziszewski, Paweł; Grabowski, Andrzej; Milanowicz, Marcin; Jankowski, Jarosław

    2016-01-01

    This article describes a method of adapting workstations for workers with motion disability using computer simulation and virtual reality (VR) techniques. A workstation for grinding spring faces was used as an example. It was adjusted for two people with a disabled right upper extremity. The study had two stages. In the first, a computer human model with a visualization of maximal arm reach and preferred workspace was used to develop a preliminary modification of a virtual workstation. In the second stage, an immersive VR environment was used to assess the virtual workstation and to add further modifications. All modifications were assessed by measuring the efficiency of work and the number of movements involved. The results of the study showed that a computer simulation could be used to determine whether a worker with a disability could access all important areas of a workstation and to propose necessary modifications. PMID:26651540

  17. Community-based pedestrian safety training in virtual reality: A pragmatic trial.

    PubMed

    Schwebel, David C; Combs, Tabitha; Rodriguez, Daniel; Severson, Joan; Sisiopiku, Virginia

    2016-01-01

    Child pedestrian injuries are a leading cause of mortality and morbidity across the United States and the world. Repeated practice at the cognitive-perceptual task of crossing a street may lead to safer pedestrian behavior. Virtual reality offers a unique opportunity for repeated practice without the risk of actual injury. This study conducted a pre-post within-subjects trial of training children in pedestrian safety using a semi-mobile, semi-immersive virtual pedestrian environment placed at schools and community centers. Pedestrian safety skills among a group of 44 seven- and eight-year-old children were assessed in a laboratory, and then children completed six 15-minute training sessions in the virtual pedestrian environment at their school or community center following pragmatic trial strategies over the course of three weeks. Following training, pedestrian safety skills were re-assessed. Results indicate improvement in delay entering traffic following training. Safe crossings did not demonstrate change. Attention to traffic and time to contact with oncoming vehicles both decreased somewhat, perhaps an indication that training was incomplete and children were in the process of actively learning to be safer pedestrians. The findings suggest virtual reality environments placed in community centers hold promise for teaching children to be safer pedestrians, but future research is needed to determine the optimal training dosage. Copyright © 2015 Elsevier Ltd. All rights reserved.

  18. A Framework for Web-Based Interprofessional Education for Midwifery and Medical Students.

    PubMed

    Reis, Pamela J; Faser, Karl; Davis, Marquietta

    2015-01-01

    Scheduling interprofessional team-based activities for health sciences students who are geographically dispersed, with divergent and often competing schedules, can be challenging. The use of Web-based technologies such as 3-dimensional (3D) virtual learning environments in interprofessional education is a relatively new phenomenon, which offers promise in helping students come together in online teams when face-to-face encounters are not possible. The purpose of this article is to present the experience of a nurse-midwifery education program in a Southeastern US university in delivering Web-based interprofessional education for nurse-midwifery and third-year medical students utilizing the Virtual Community Clinic Learning Environment (VCCLE). The VCCLE is a 3D, Web-based, asynchronous, immersive clinic environment into which students enter to meet and interact with instructor-controlled virtual patient and virtual preceptor avatars and then move through a classic diagnostic sequence in arriving at a plan of care for women throughout the lifespan. By participating in the problem-based management of virtual patients within the VCCLE, students learn both clinical competencies and competencies for interprofessional collaborative practice, as described by the Interprofessional Education Collaborative Core Competencies for Interprofessional Collaborative Practice. This article is part of a special series of articles that address midwifery innovations in clinical practice, education, interprofessional collaboration, health policy, and global health. © 2015 by the American College of Nurse-Midwives.

  19. The time course of location-avoidance learning in fear of spiders.

    PubMed

    Rinck, Mike; Koene, Marieke; Telli, Sibel; Moerman-van den Brink, Wiltine; Verhoeven, Barbara; Becker, Eni S

    2016-01-01

    Two experiments were designed to study the time course of avoidance learning in spider fearfuls (SFs) under controlled experimental conditions. To achieve this, we employed an immersive virtual environment (IVE): While walking freely through a virtual art museum to search for specific paintings, the participants were exposed to virtual spiders. Unbeknown to the participants, only two of four museum rooms contained spiders, allowing for avoidance learning. Indeed, the more spider-fearful the participants were, the faster they learned to avoid the rooms that contained spiders (Experiment 1), and within the first six trials, high fearfuls already developed a preference for starting their search task in rooms without spiders (Experiment 2). These results illustrate the time course of avoidance learning in SFs, and they speak to the usefulness of IVEs in fundamental anxiety research.

  20. Emerging technologies in education and training: applications for the laboratory animal science community.

    PubMed

    Ketelhut, Diane Jass; Niemi, Steven M

    2007-01-01

    This article examines several new and exciting communication technologies. Many of the technologies were developed by the entertainment industry; however, other industries are adopting and modifying them for their own needs. These new technologies allow people to collaborate across distance and time and to learn in simulated work contexts. The article explores the potential utility of these technologies for advancing laboratory animal care and use through better education and training. Descriptions include emerging technologies such as augmented reality and multi-user virtual environments, which offer new approaches with different capabilities. Augmented reality interfaces, characterized by the use of handheld computers to infuse the virtual world into the real one, result in deeply immersive simulations. In these simulations, users can access virtual resources and communicate with real and virtual participants. Multi-user virtual environments enable multiple participants to simultaneously access computer-based three-dimensional virtual spaces, called "worlds," and to interact with digital tools. They allow for authentic experiences that promote collaboration, mentoring, and communication. Because individuals may learn or train differently, it is advantageous to combine the capabilities of these technologies and applications with more traditional methods to increase the number of students who are served by using current methods alone. The use of these technologies in animal care and use programs can create detailed training and education environments that allow students to learn the procedures more effectively, teachers to assess their progress more objectively, and researchers to gain insights into animal care.

  1. Training Effectiveness of a Wide Area Virtual Environment in Medical Simulation.

    PubMed

    Wier, Grady S; Tree, Rebekah; Nusr, Rasha

    2017-02-01

    The success of war fighters and medical personnel handling traumatic injuries largely depends on the quality of training they receive before deployment. The purpose of this study was to gauge the utility of a Wide Area Virtual Environment (WAVE) as a training adjunct by comparing and evaluating student performance, measuring sense of realism, and assessing the impact on student satisfaction with their training exposure in an immersive versus a field environment. This comparative prospective cohort study examined the utility of a three-screen WAVE where subjects were immersed in the training environment with medical simulators. Standard field training commenced for the control group subjects. Medical skills, time to completion, and Team Strategies and Tools to Enhance Performance and Patient Safety objective metrics were assessed for each team (n = 94). In addition, self-efficacy questionnaires were collected for each subject (N = 470). Medical teams received poorer overall team scores (F(1,186) = 0.756, P = 0.001), took longer to complete the scenario (F(1,186) = 25.15, P = 0.001), and scored lower on The National Registry of Emergency Medical Technicians trauma assessment checklist (F(1,186) = 1.13, P = 0.000) in the WAVE versus the field environment. Critical thinking and realism factors within the self-efficacy questionnaires scored higher in the WAVE versus the field (F(1,466) = 8.04, P = 0.005; F(1,465) = 18.57, P = 0.000; and F(1,466) = 53.24, P = 0.000, respectively). Environmental and emotional stressors may negatively affect critical thinking and clinical skill performance of medical teams. However, by introducing more advanced simulation trainings with added stressors, students may be able to adapt and overcome barriers to performance found in high-stress environments.

  2. ARENA - A Collaborative Immersive Environment for Virtual Fieldwork

    NASA Astrophysics Data System (ADS)

    Kwasnitschka, T.

    2012-12-01

    Whenever a geoscientific study area is not readily accessible, as is the case on the deep seafloor, it is difficult to apply traditional but effective methods of fieldwork, which often require physical presence of the observer. The Artificial Research Environment for Networked Analysis (ARENA), developed at GEOMAR | Helmholtz Centre for Ocean Research Kiel within the Cluster of Excellence "The Future Ocean", provides a backend solution to robotic research on the seafloor by means of an immersive simulation environment for marine research: A hemispherical screen of 6m diameter covering the entire lower hemisphere surrounds a group of up to four researchers at once. A variety of open source (e.g. Microsoft Research World Wide Telescope) and commercial software platforms allow the interaction with e.g. in-situ recorded video, vector maps, terrain, textured geometry, point cloud and volumetric data in four dimensions. Data can be put into a holistic, georeferenced context and viewed on scales stretching from centimeters to global. Several input devices from joysticks to gestures and vocalized commands allow interaction with the simulation, depending on individual preference. Annotations added to the dataset during the simulation session catalyze the following quantitative evaluation. Both the special simulator design, making data perception a group experience, and the ability to connect remote instances or scaled down versions of ARENA over the Internet are significant advantages over established immersive simulation environments.

  3. The Effects of Actual Human Size Display and Stereoscopic Presentation on Users' Sense of Being Together with and of Psychological Immersion in a Virtual Character

    PubMed Central

    Ahn, Dohyun; Seo, Youngnam; Kim, Minkyung; Kwon, Joung Huem; Jung, Younbo; Ahn, Jungsun

    2014-01-01

    This study examined the role of display size and mode in increasing users' sense of being together with and of their psychological immersion in a virtual character. Using a high-resolution three-dimensional virtual character, this study employed a 2 × 2 (stereoscopic mode vs. monoscopic mode × actual human size vs. small size display) factorial design in an experiment with 144 participants randomly assigned to each condition. Findings showed that stereoscopic mode had a significant effect on both users' sense of being together and psychological immersion. However, display size affected only the sense of being together. Furthermore, display size was not found to moderate the effect of stereoscopic mode. PMID:24606057

  4. [Neuropsychological evaluation of the executive functions by means of virtual reality].

    PubMed

    Climent-Martínez, Gema; Luna-Lario, Pilar; Bombín-González, Igor; Cifuentes-Rodríguez, Alicia; Tirapu-Ustárroz, Javier; Díaz-Orueta, Unai

    2014-05-16

    Executive functions include a wide range of self-regulatory functions that allow the control, organization and coordination of other cognitive functions, emotional responses and behaviours. The traditional approach to evaluating these functions, by means of paper-and-pencil neuropsychological tests, shows greater-than-expected performance within the normal range for patients whose daily-life difficulties would predict inferior performance. These discrepancies suggest that classical neuropsychological tests may not adequately reproduce the complexity and dynamic nature of real-life situations. The latest developments in the field of virtual reality offer interesting options for the neuropsychological assessment of many cognitive processes. Virtual reality reproduces three-dimensional environments with which the patient interacts in a dynamic way, with a sense of immersion in the environment similar to the presence and exposure experienced in a real environment. Furthermore, the presentation of these stimuli, as well as of distractors and other variables, may be controlled in a systematic way. Moreover, more consistent and precise responses may be obtained and analysed in depth. The present review describes current problems in the neuropsychological evaluation of executive functions and the latest advances toward greater precision and validity of assessment by means of new technologies and virtual reality, with special mention of developments carried out in Spain.

  5. Determining sensitivity/specificity of virtual reality-based neuropsychological tool for detecting residual abnormalities following sport-related concussion.

    PubMed

    Teel, Elizabeth; Gay, Michael; Johnson, Brian; Slobounov, Semyon

    2016-05-01

    Computer-based neuropsychological (NP) evaluation is an effective clinical tool used to assess cognitive function and complement the clinical diagnosis of concussion. However, some researchers and clinicians argue that its lack of ecological validity limits the generalization of results to a sensory-rich athletic environment. Virtual reality-based NP assessment offers clinical advantages by using an immersive environment and evaluating domains not typically covered by traditional NP assessments. The sensitivity and specificity with which components of a virtual reality-based NP assessment battery detect lingering cognitive abnormalities were examined with respect to cohort affiliation (concussed vs. controls). Data were retrospectively gathered on 128 controls (no concussion) and 24 concussed college-age athletes on measures of spatial navigation, whole-body reaction, attention, and balance in a virtual environment. Concussed athletes were tested within 10 days (M = 8.33, SD = 1.06) of concussion and were clinically asymptomatic at the time of testing. The a priori alpha level was set at 0.05 for all tests. Spatial navigation (sensitivity 95.8%/specificity 91.4%, d = 1.89), whole-body reaction time (sensitivity 95.2%/specificity 89.1%, d = 1.50) and the combined virtual reality modules (sensitivity 95.8%/specificity 96.1%, d = 3.59) produced high sensitivity/specificity values when determining performance-based variability between groups. Use of a virtual reality-based NP platform can detect lingering cognitive abnormalities resulting from concussion in clinically asymptomatic participants. Virtual reality NP platforms may complement the traditional concussion assessment battery by providing novel information. (PsycINFO Database Record © 2016 APA, all rights reserved).
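    For readers less familiar with the metrics quoted above, the sketch below shows how sensitivity, specificity, and a Cohen's d effect size are computed in general; the counts and scores are invented placeholders, not values from this study:

    ```python
    # Generic computation of sensitivity, specificity, and Cohen's d.
    # All numbers are invented placeholders, not data from the concussion study.
    import math
    import statistics

    # Hypothetical confusion matrix for one screening module
    tp, fn = 23, 1     # concussed athletes flagged / missed
    tn, fp = 123, 5    # controls correctly cleared / incorrectly flagged

    sensitivity = tp / (tp + fn)   # true-positive rate
    specificity = tn / (tn + fp)   # true-negative rate

    # Cohen's d from two hypothetical score samples, using a pooled standard deviation
    concussed = [41.0, 38.5, 44.2, 40.1]
    controls = [55.3, 57.8, 52.9, 56.4]
    n1, n2 = len(concussed), len(controls)
    s1, s2 = statistics.stdev(concussed), statistics.stdev(controls)
    s_pooled = math.sqrt(((n1 - 1) * s1**2 + (n2 - 1) * s2**2) / (n1 + n2 - 2))
    cohens_d = abs(statistics.mean(concussed) - statistics.mean(controls)) / s_pooled

    print(f"sensitivity = {sensitivity:.1%}, specificity = {specificity:.1%}, d = {cohens_d:.2f}")
    ```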

  6. Usability Testing of an Interactive Virtual Reality Distraction Intervention to Reduce Procedural Pain in Children and Adolescents With Cancer.

    PubMed

    Birnie, Kathryn A; Kulandaivelu, Yalinie; Jibb, Lindsay; Hroch, Petra; Positano, Karyn; Robertson, Simon; Campbell, Fiona; Abla, Oussama; Stinson, Jennifer

    2018-06-01

    Needle procedures are among the most distressing aspects of pediatric cancer-related treatment. Virtual reality (VR) distraction offers promise for needle-related pain and distress given its highly immersive and interactive virtual environment. This study assessed the usability (ease of use and understanding, acceptability) of a custom VR intervention for children with cancer undergoing implantable venous access device (IVAD) needle insertion. Three iterative cycles of mixed-method usability testing with semistructured interviews were undertaken to refine the VR. Participants included 17 children and adolescents (8-18 years old) with cancer who used the VR intervention prior to or during IVAD access. Most participants reported the VR as easy to use (82%) and understand (94%), and would like to use it during subsequent needle procedures (94%). Based on usability testing, refinements were made to VR hardware, software, and clinical implementation. Refinements focused on increasing responsiveness, interaction, and immersion of the VR program, reducing head movement for VR interaction, and enabling participant alerts to steps of the procedure by clinical staff. No adverse events of nausea or dizziness were reported. The VR intervention was deemed acceptable and safe. Next steps include assessing feasibility and effectiveness of the VR intervention for pain and distress.

  7. Interactive visualization to advance earthquake simulation

    USGS Publications Warehouse

    Kellogg, L.H.; Bawden, G.W.; Bernardin, T.; Billen, M.; Cowgill, E.; Hamann, B.; Jadamec, M.; Kreylos, O.; Staadt, O.; Sumner, D.

    2008-01-01

    The geological sciences are challenged to manage and interpret increasing volumes of data as observations and simulations grow in size and complexity. For example, simulations of earthquake-related processes typically generate complex, time-varying data sets in two or more dimensions. To facilitate interpretation and analysis of these data sets, to evaluate the underlying models, and to drive future calculations, we have developed methods of interactive visualization with a special focus on using immersive virtual reality (VR) environments to interact with models of Earth's surface and interior. Virtual mapping tools allow virtual "field studies" in inaccessible regions. Interactive tools allow us to manipulate shapes in order to construct models of geological features for geodynamic models, while feature extraction tools support quantitative measurement of structures that emerge from numerical simulation or field observations, thereby enabling us to improve our interpretation of the dynamical processes that drive earthquakes. VR has traditionally been used primarily as a presentation tool, albeit with active navigation through data. Reaping the full intellectual benefits of immersive VR as a tool for scientific analysis requires building on the method's strengths, that is, using both 3D perception and interaction with observed or simulated data. This approach also takes advantage of the specialized skills of geological scientists, who are trained to interpret the often limited geological and geophysical data available from field observations. © Birkhäuser 2008.

  8. A randomized, controlled trial of immersive virtual reality analgesia, during physical therapy for pediatric burns.

    PubMed

    Schmitt, Yuko S; Hoffman, Hunter G; Blough, David K; Patterson, David R; Jensen, Mark P; Soltani, Maryam; Carrougher, Gretchen J; Nakamura, Dana; Sharar, Sam R

    2011-02-01

    This randomized, controlled, within-subjects (crossover design) study examined the effects of immersive virtual reality as an adjunctive analgesic technique for hospitalized pediatric burn inpatients undergoing painful physical therapy. Fifty-four subjects (6-19 years old) performed range-of-motion exercises under a therapist's direction for 1-5 days. During each session, subjects spent equivalent time in both the virtual reality and the control conditions (treatment order randomized and counterbalanced). Graphic rating scale scores assessing the sensory, affective, and cognitive components of pain were obtained for each treatment condition. Secondary outcomes assessed subjects' perception of the virtual reality experience and maximum range-of-motion. Results showed that on study day one, subjects reported significant decreases (27-44%) in pain ratings during virtual reality. They also reported improved affect ("fun") during virtual reality. The analgesia and affect improvements were maintained with repeated virtual reality use over multiple therapy sessions. Maximum range-of-motion was not different between treatment conditions, but was significantly greater after the second treatment condition (regardless of treatment order). These results suggest that immersive virtual reality is an effective nonpharmacologic, adjunctive pain reduction technique in the pediatric burn population undergoing painful rehabilitation therapy. The magnitude of the analgesic effect is clinically meaningful and is maintained with repeated use. Copyright © 2010 Elsevier Ltd and ISBI. All rights reserved.

  9. A Randomized, Controlled Trial of Immersive Virtual Reality Analgesia during Physical Therapy for Pediatric Burn Injuries

    PubMed Central

    Schmitt, Yuko S.; Hoffman, Hunter G.; Blough, David K.; Patterson, David R.; Jensen, Mark P.; Soltani, Maryam; Carrougher, Gretchen J.; Nakamura, Dana; Sharar, Sam R.

    2010-01-01

    This randomized, controlled, within-subjects (crossover design) study examined the effects of immersive virtual reality as an adjunctive analgesic technique for hospitalized pediatric burn inpatients undergoing painful physical therapy. Fifty-four subjects (6–19 years old) performed range-of-motion exercises under a therapist’s direction for one to five days. During each session, subjects spent equivalent time in both the virtual reality and the control conditions (treatment order randomized and counterbalanced). Graphic rating scale scores assessing the sensory, affective, and cognitive components of pain were obtained for each treatment condition. Secondary outcomes assessed subjects’ perception of the virtual reality experience and maximum range-of-motion. Results showed that on study day one, subjects reported significant decreases (27–44%) in pain ratings during virtual reality. They also reported improved affect (“fun”) during virtual reality. The analgesia and affect improvements were maintained with repeated virtual reality use over multiple therapy sessions. Maximum range-of-motion was not different between treatment conditions, but was significantly greater after the second treatment condition (regardless of treatment order). These results suggest that immersive virtual reality is an effective nonpharmacologic, adjunctive pain reduction technique in the pediatric burn population undergoing painful rehabilitation therapy. The magnitude of the analgesic effect is clinically meaningful and is maintained with repeated use. PMID:20692769

  10. An Australian and New Zealand Scoping Study on the Use of 3D Immersive Virtual Worlds in Higher Education

    ERIC Educational Resources Information Center

    Dalgarno, Barney; Lee, Mark J. W.; Carlson, Lauren; Gregory, Sue; Tynan, Belinda

    2011-01-01

    This article describes the research design of, and reports selected findings from, a scoping study aimed at examining current and planned applications of 3D immersive virtual worlds at higher education institutions across Australia and New Zealand. The scoping study is the first of its kind in the region, intended to parallel and complement a…

  11. Immersive Technologies and Language Learning

    ERIC Educational Resources Information Center

    Blyth, Carl

    2018-01-01

    This article briefly traces the historical conceptualization of linguistic and cultural immersion through technological applications, from the early days of locally networked computers to the cutting-edge technologies known as virtual reality and augmented reality. Next, the article explores the challenges of immersive technologies for the field…

  12. Armagh Observatory - Historic Building Information Modelling for Virtual Learning in Building Conservation

    NASA Astrophysics Data System (ADS)

    Murphy, M.; Chenaux, A.; Keenaghan, G.; Gibson, V.; Butler, J.; Pybus, C.

    2017-08-01

    In this paper, the recording and design of a Virtual Reality Immersive Model of Armagh Observatory are presented. The model will replicate the historic buildings and landscape, together with the distant meridian markers and the positions of the principal historic instruments, within a model of the night sky showing the positions of bright stars. The virtual reality model can be used for educational purposes, allowing the instruments within the historic building model to be manipulated in 3D space to demonstrate how positional measurements of stars were made in the 18th century. A description is given of current student and researcher activities concerning on-site recording, surveying, and the virtual modelling of the buildings and landscape. This is followed by a design for the Virtual Reality Immersive Model of Armagh Observatory using game engines and virtual learning platforms and concepts.

  13. The Use of Information Operations (IO) in Immersive Virtual Environments (IVE)

    DTIC Science & Technology

    2010-06-01

    [Only indexed text fragments are available for this record; no full abstract was recovered.] The fragments reference B. J. Fogg of Stanford, his work on captology and persuasive technology (Persuasive Technology: Using Computers…), and research on the effect of self-representation on behavior (Human Communication Research, no. 33, pp. 271–290, 2007), in the context of applying information operations (IO) within immersive virtual environments.

  14. Simultaneous neural and movement recording in large-scale immersive virtual environments.

    PubMed

    Snider, Joseph; Plank, Markus; Lee, Dongpyo; Poizner, Howard

    2013-10-01

    Virtual reality (VR) allows precise control and manipulation of rich, dynamic stimuli that, when coupled with on-line motion capture and neural monitoring, can provide a powerful means both of understanding brain-behavioral relations in the high-dimensional world and of assessing and treating a variety of neural disorders. Here we present a system that combines state-of-the-art, fully immersive, 3D, multi-modal VR with temporally aligned electroencephalographic (EEG) recordings. The VR system is dynamic and interactive across visual, auditory, and haptic interactions, providing sight, sound, touch, and force. Crucially, it does so with simultaneous EEG recordings while subjects actively move about a 20 ft × 20 ft space. The overall end-to-end latency between a real movement and its simulated counterpart in the VR is approximately 40 ms. Spatial precision of the various devices is on the order of millimeters. The temporal alignment with the neural recordings is accurate to within approximately 1 ms. This powerful combination of systems opens up a new window into brain-behavioral relations and a new means of assessment and rehabilitation of individuals with motor and other disorders.
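    The millisecond-level temporal alignment described above ultimately amounts to mapping event timestamps from the motion-capture clock onto EEG sample indices on a shared time base. The sketch below illustrates only that idea; the sampling rate, clock offset, and function name are assumptions, not the authors' implementation:

    ```python
    # Illustrative sketch of aligning motion-capture event times to EEG samples.
    # The sampling rate, clock offset, and event time are invented assumptions,
    # not parameters of the system described in the record above.

    EEG_FS_HZ = 2048.0        # hypothetical EEG sampling rate (samples per second)
    CLOCK_OFFSET_S = 0.0123   # hypothetical measured offset between tracker and EEG clocks

    def event_to_eeg_sample(event_time_s: float) -> int:
        """Map an event time on the tracker clock to the nearest EEG sample index."""
        eeg_time_s = event_time_s - CLOCK_OFFSET_S   # shift onto the EEG clock
        return round(eeg_time_s * EEG_FS_HZ)

    # A movement-onset event detected at t = 12.3456 s on the tracker clock:
    print(event_to_eeg_sample(12.3456))
    ```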

  15. Using virtual reality to characterize episodic memory profiles in amnestic mild cognitive impairment and Alzheimer's disease: influence of active and passive encoding.

    PubMed

    Plancher, G; Tirard, A; Gyselinck, V; Nicolas, S; Piolino, P

    2012-04-01

    Most neuropsychological assessments of episodic memory bear little similarity to the events that patients actually experience as memories in daily life. The first aim of this study was to use a virtual environment to characterize episodic memory profiles in an ecological fashion, including memory for central and perceptual details, spatiotemporal contextual elements, and binding. This study included subjects from three different populations: healthy older adults, patients with amnestic mild cognitive impairment (aMCI) and patients with early to moderate Alzheimer's disease (AD). Second, we sought to determine whether environmental factors that can affect encoding (active vs. passive exploration) influence memory performance in pathological aging. Third, we benchmarked the results of our virtual reality episodic memory test against a classical memory test and a subjective daily memory complaint scale. The participants were successively immersed in two virtual environments: the first as the driver of a virtual car (active exploration) and the second as the passenger of that car (passive exploration). Subjects were instructed to encode all elements of the environment as well as the associated spatiotemporal contexts. Following each immersion, we assessed the participants' recall and recognition of central information (i.e., the elements of the environment), contextual information (i.e., temporal, egocentric and allocentric spatial information) and, lastly, the quality of binding. We found that the AD patients' performance was inferior to that of the aMCI patients, and even more so to that of the healthy aged group, in line with the progression of hippocampal atrophy reported in the literature. Spatial allocentric memory assessments were found to be particularly useful for distinguishing aMCI patients from healthy older adults. Active exploration yielded enhanced recall of central and allocentric spatial information, as well as binding, in all groups, and led aMCI patients to achieve better performance scores on immediate temporal memory tasks. Finally, the patients' daily memory complaints were more highly correlated with performance on the virtual test than with performance on the classical memory test. Taken together, these results highlight specific cognitive differences between these three populations that may provide additional insight into the early diagnosis and rehabilitation of pathological aging. In particular, neuropsychological studies would benefit from using virtual tests and a multi-component approach to assess episodic memory, and from encouraging active encoding of information in patients suffering from mild or severe age-related memory impairment. The beneficial effect of active encoding on episodic memory in aMCI and early to moderate AD is discussed in the context of relatively preserved frontal and motor brain functions implicated in self-referential effects and procedural abilities. Copyright © 2011 Elsevier Ltd. All rights reserved.

  16. Journey to the centre of the cell: Virtual reality immersion into scientific data.

    PubMed

    Johnston, Angus P R; Rae, James; Ariotti, Nicholas; Bailey, Benjamin; Lilja, Andrew; Webb, Robyn; Ferguson, Charles; Maher, Sheryl; Davis, Thomas P; Webb, Richard I; McGhee, John; Parton, Robert G

    2018-02-01

    Visualization of scientific data is crucial not only for scientific discovery but also for communicating science and medicine to both experts and a general audience. Until recently, we have been limited to visualizing the three-dimensional (3D) world of biology in two dimensions. Renderings of 3D cells are still traditionally displayed using two-dimensional (2D) media, such as a computer screen or paper. However, the advent of consumer-grade virtual reality (VR) headsets such as the Oculus Rift and HTC Vive means it is now possible to visualize and interact with scientific data in a 3D virtual world. In addition, new microscopic methods provide an unprecedented opportunity to obtain new 3D data sets. In this perspective article, we highlight how we have used cutting-edge imaging techniques to build a 3D virtual model of a cell from serial block-face scanning electron microscope (SBEM) imaging data. This model allows scientists, students and members of the public to explore and interact with a "real" cell. Early testing of this immersive environment indicates a significant improvement in students' understanding of cellular processes and points to a new future of learning and public engagement. In addition, we speculate that VR can become a new tool for researchers studying cellular architecture and processes by populating VR models with molecular data. © 2017 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.

  17. Investigation of tracking systems properties in CAVE-type virtual reality systems

    NASA Astrophysics Data System (ADS)

    Szymaniak, Magda; Mazikowski, Adam; Meironke, Michał

    2017-08-01

    In recent years, many scientific and industrial centers around the world have developed virtual reality systems or laboratories. One of the most advanced solutions is the Immersive 3D Visualization Lab (I3DVL), a CAVE-type (Cave Automatic Virtual Environment) laboratory. It contains two CAVE-type installations: a six-screen installation arranged in the form of a cube, and a four-screen installation, a simplified version of the former. The user's feeling of "immersion" and interaction with the virtual world depends on many factors, in particular on the accuracy of the user tracking system. In this paper, properties of the tracking systems applied in I3DVL were investigated. Two parameters were selected for analysis: the accuracy of the tracking system and the range over which markers are detected by the tracking system within the space of the CAVE. Accuracy measurements were performed for the six-screen installation, equipped with four tracking cameras, along the X, Y and Z axes; rotation around the Y axis was also analyzed. The measured tracking system shows good linear and rotational accuracy. The biggest issue was the range over which markers are monitored inside the CAVE: the tracking system loses sight of the markers in the corners of the installation. For comparison, in the simplified version of the CAVE (the four-screen installation), equipped with eight tracking cameras, this problem did not occur. The obtained results will allow the quality of the CAVE to be improved.
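    The accuracy evaluation described above boils down to comparing positions reported by the tracking system against known reference positions. A minimal sketch of that comparison is shown below; all coordinates are invented placeholders, not measurements from I3DVL:

    ```python
    # Minimal sketch of tracking-accuracy evaluation: compare tracked marker
    # positions against known reference positions. All coordinates are invented.
    import math

    reference = [(0.0, 1.5, 0.0), (1.0, 1.5, 0.0), (1.0, 1.5, 1.0)]                   # metres
    tracked = [(0.002, 1.498, -0.001), (1.004, 1.503, 0.002), (0.997, 1.499, 1.005)]  # metres

    errors = [math.dist(r, t) for r, t in zip(reference, tracked)]  # Euclidean error per point
    rms = math.sqrt(sum(e ** 2 for e in errors) / len(errors))

    print("per-point errors (mm):", [round(e * 1000, 1) for e in errors])
    print(f"RMS error: {rms * 1000:.1f} mm")
    ```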

  18. Data Visualization Using Immersive Virtual Reality Tools

    NASA Astrophysics Data System (ADS)

    Cioc, Alexandru; Djorgovski, S. G.; Donalek, C.; Lawler, E.; Sauer, F.; Longo, G.

    2013-01-01

    The growing complexity of scientific data poses serious challenges for effective visualization. Data sets, e.g., catalogs of objects detected in sky surveys, can have a very high dimensionality, ~ 100 - 1000. Visualizing such hyper-dimensional parameter spaces is essentially impossible, but there are ways of visualizing up to ~ 10 dimensions in a pseudo-3D display. We have been experimenting with the emerging technologies of immersive virtual reality (VR) as a platform for scientific, interactive, collaborative data visualization. Our initial experiments used the virtual world of Second Life and, more recently, VR worlds based on its open source code, OpenSimulator. There we can visualize up to ~ 100,000 data points in ~ 7 - 8 dimensions (3 spatial and others encoded as shapes, colors, sizes, etc.) in an immersive virtual space where scientists can interact with their data and with each other. We are now developing a more scalable visualization environment using the popular (practically an emerging standard) Unity 3D Game Engine, coded using C#, JavaScript, and the Unity Scripting Language. This visualization tool can be used through a standard web browser or a standalone browser of its own. Rather than merely plotting data points, the application creates interactive three-dimensional objects whose shapes, colors, sizes, and XYZ positions encode various dimensions of the parameter space and can be assigned interactively. Multiple users can navigate through this data space simultaneously, either with their own independent vantage points or with a shared view. At this stage ~ 100,000 data points can be easily visualized within seconds on a simple laptop. The displayed data points can contain linked information; e.g., upon clicking on a data point, a webpage with additional information can be rendered within the 3D world. A range of functionalities has already been deployed, and more are being added. We expect to make this visualization tool freely available to the academic community within a few months, on an experimental (beta testing) basis.
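    The encoding strategy described above, three data columns mapped to XYZ position and further columns to attributes such as color and size, can be sketched independently of any particular engine. The snippet below is a generic illustration using matplotlib on synthetic data; it is not the Unity-based tool described in the record:

    ```python
    # Generic sketch of mapping a high-dimensional catalog onto a 3D scatter plot:
    # three columns become XYZ, two more become color and marker size.
    # Synthetic data only; this is not the Unity tool described above.
    import numpy as np
    import matplotlib.pyplot as plt

    rng = np.random.default_rng(0)
    catalog = rng.random((1000, 5))        # 1000 hypothetical objects, 5 parameters each

    x, y, z = catalog[:, 0], catalog[:, 1], catalog[:, 2]
    color = catalog[:, 3]                  # 4th dimension -> color map
    size = 10 + 90 * catalog[:, 4]         # 5th dimension -> marker size

    fig = plt.figure()
    ax = fig.add_subplot(projection="3d")
    ax.scatter(x, y, z, c=color, s=size, cmap="viridis")
    ax.set_xlabel("dim 1"); ax.set_ylabel("dim 2"); ax.set_zlabel("dim 3")
    plt.show()
    ```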

  19. The Immersive Virtual Reality Lab: Possibilities for Remote Experimental Manipulations of Autonomic Activity on a Large Scale.

    PubMed

    Juvrud, Joshua; Gredebäck, Gustaf; Åhs, Fredrik; Lerin, Nils; Nyström, Pär; Kastrati, Granit; Rosén, Jörgen

    2018-01-01

    There is a need for large-scale remote data collection in a controlled environment, and the in-home availability of virtual reality (VR) and the commercial availability of eye tracking for VR present unique and exciting opportunities for researchers. We propose and provide a proof-of-concept assessment of a robust system for large-scale in-home testing using consumer products that combines psychophysiological measures and VR, here referred to as a Virtual Lab. For the first time, this method is validated by correlating autonomic responses, skin conductance response (SCR) and pupillary dilation, in response to a spider, a beetle, and a ball using commercially available VR. Participants demonstrated greater SCR and pupillary responses to the spider, and the effect was dependent on the proximity of the stimuli to the participant, with a stronger response when the spider was close to the virtual self. We replicated these effects across two experiments and in separate physical room contexts to mimic variability in home environments. Together, these findings demonstrate the utility of pupil dilation as a marker of autonomic arousal and the feasibility of assessing it with commercially available VR hardware, and they support a robust Virtual Lab tool for massive remote testing.
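    The validation described above rests on relating two autonomic measures across participants and stimuli. A minimal sketch of such a correlation check is shown below; the response values are invented placeholders, not data from the study:

    ```python
    # Minimal sketch of correlating two autonomic measures across participants.
    # Values are invented placeholders, not data from the Virtual Lab study.
    from scipy import stats

    scr_response = [0.42, 0.15, 0.58, 0.33, 0.71, 0.26, 0.49]    # per-participant SCR amplitude (a.u.)
    pupil_response = [0.38, 0.21, 0.55, 0.30, 0.66, 0.31, 0.47]  # per-participant pupil dilation (a.u.)

    r, p = stats.pearsonr(scr_response, pupil_response)
    print(f"Pearson r = {r:.2f}, p = {p:.3f}")
    ```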

  20. The Immersive Virtual Reality Lab: Possibilities for Remote Experimental Manipulations of Autonomic Activity on a Large Scale

    PubMed Central

    Juvrud, Joshua; Gredebäck, Gustaf; Åhs, Fredrik; Lerin, Nils; Nyström, Pär; Kastrati, Granit; Rosén, Jörgen

    2018-01-01

    There is a need for large-scale remote data collection in a controlled environment, and the in-home availability of virtual reality (VR) and the commercial availability of eye tracking for VR present unique and exciting opportunities for researchers. We propose and provide a proof-of-concept assessment of a robust system for large-scale in-home testing using consumer products that combines psychophysiological measures and VR, here referred to as a Virtual Lab. For the first time, this method is validated by correlating autonomic responses, skin conductance response (SCR) and pupillary dilation, in response to a spider, a beetle, and a ball using commercially available VR. Participants demonstrated greater SCR and pupillary responses to the spider, and the effect was dependent on the proximity of the stimuli to the participant, with a stronger response when the spider was close to the virtual self. We replicated these effects across two experiments and in separate physical room contexts to mimic variability in home environments. Together, these findings demonstrate the utility of pupil dilation as a marker of autonomic arousal and the feasibility of assessing it with commercially available VR hardware, and they support a robust Virtual Lab tool for massive remote testing. PMID:29867318
