Cybersickness and Anxiety During Simulated Motion: Implications for VRET.
Bruck, Susan; Watters, Paul
2009-01-01
Some clinicians have suggested using virtual reality environments to deliver psychological interventions to treat anxiety disorders. However, given a significant body of work on cybersickness symptoms that may arise in virtual environments, especially those involving simulated motion, we tested (a) whether exposure to a virtual reality environment alone causes anxiety to increase, and (b) whether exposure to simulated motion in a virtual reality environment increases anxiety. In a repeated measures design, we used Kim's Anxiety Scale questionnaire to compare baseline anxiety, anxiety after virtual environment exposure, and anxiety after simulated motion. While being in a virtual environment with no simulated motion had no significant effect on anxiety, the introduction of simulated motion caused anxiety to increase significantly, though not to a severe or extreme level. The implications of this work for virtual reality exposure therapy (VRET) are discussed.
Virtual environments simulation in research reactor
NASA Astrophysics Data System (ADS)
Muhamad, Shalina Bt. Sheik; Bahrin, Muhammad Hannan Bin
2017-01-01
Virtual reality based simulations are interactive and engaging, and they have useful potential for improving safety training. Virtual reality technology can be used to train workers who are unfamiliar with the physical layout of an area. In this study, a simulation program based on a virtual environment of a research reactor was developed. The virtual simulation was built on the 3DVia platform, taking advantage of its rendering capabilities, its physics for movement and collision, and its interactive navigation features. A real research reactor was modelled virtually, with avatar models used to simulate walking. Collision detection algorithms were developed for various parts of the 3D building and the avatars, to restrain the avatars to certain regions of the virtual environment. A user can control an avatar to move around inside the virtual environment. This work can thus assist in the training of personnel and in evaluating the radiological safety of the research reactor facility.
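As an illustrative aside, the region-restraint idea described in this abstract (keeping avatars inside permitted parts of the 3D building) can be sketched with a simple axis-aligned bounding-box test. The Python below is a hypothetical sketch, not the authors' 3DVia implementation; all names and dimensions are invented.

    from dataclasses import dataclass

    @dataclass
    class Region:
        """Axis-aligned bounding box describing a walkable area (metres)."""
        x_min: float
        x_max: float
        y_min: float
        y_max: float
        z_min: float
        z_max: float

        def contains(self, x, y, z):
            return (self.x_min <= x <= self.x_max and
                    self.y_min <= y <= self.y_max and
                    self.z_min <= z <= self.z_max)

    def step_avatar(position, velocity, dt, allowed_regions):
        """Advance the avatar, but cancel the move if it would leave every allowed region."""
        candidate = tuple(p + v * dt for p, v in zip(position, velocity))
        if any(r.contains(*candidate) for r in allowed_regions):
            return candidate          # move accepted
        return position               # collision: stay put

    # Example: a single walkable corridor around the reactor hall (invented dimensions).
    hall = Region(0.0, 20.0, 0.0, 5.0, 0.0, 3.0)
    pos = (1.0, 1.0, 1.0)
    pos = step_avatar(pos, velocity=(0.5, 0.0, 0.0), dt=0.1, allowed_regions=[hall])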
ERIC Educational Resources Information Center
Martinez, Guadalupe; Naranjo, Francisco L.; Perez, Angel L.; Suero, Maria Isabel; Pardo, Pedro J.
2011-01-01
This study compared the educational effects of computer simulations developed in a hyper-realistic virtual environment with the educational effects of either traditional schematic simulations or a traditional optics laboratory. The virtual environment was constructed on the basis of Java applets complemented with a photorealistic visual output.…
Virtual agents in a simulated virtual training environment
NASA Technical Reports Server (NTRS)
Achorn, Brett; Badler, Norman L.
1993-01-01
A drawback to live-action training simulations is the need to gather a large group of participants in order to train a few individuals. One solution to this difficulty is the use of computer-controlled agents in a virtual training environment. This allows a human participant to be replaced by a virtual, or simulated, agent when only limited responses are needed. Each agent possesses a specified set of behaviors and is capable of limited autonomous action in response to its environment or the direction of a human trainee. The paper describes these agents in the context of a simulated hostage rescue training session, involving two human rescuers assisted by three virtual (computer-controlled) agents and opposed by three other virtual agents.
3D multiplayer virtual pets game using Google Card Board
NASA Astrophysics Data System (ADS)
Herumurti, Darlis; Riskahadi, Dimas; Kuswardayan, Imam
2017-08-01
Virtual Reality (VR) is a technology which allows a user to interact with a virtual environment that is generated and simulated by a computer. This technology can give users the sensation of actually being in the virtual environment. VR presents the virtual environment directly to the user rather than on a conventional screen, but it requires an additional device, known as a Head Mounted Device (HMD), to show the view of the virtual environment. Oculus Rift and Microsoft HoloLens are the most famous HMD devices used in VR, and in 2014 Google Card Board was introduced at the Google I/O developers conference. Google Card Board is a VR platform which allows users to experience VR in a simple and inexpensive way. Raising a pet is a fun activity, but conditions such as the environment, disease, and high cost can make it difficult to do; this research therefore aims to explore and implement Google Card Board in a pet-raising simulation game. Google Card Board is used to create the view of the VR environment, and the view and controls in the VR environment are built using the Unity game engine. The simulation process is designed using a Finite State Machine (FSM), which helps structure the process clearly so that it describes the pet-raising simulation well.
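Since the abstract attributes the pet-raising logic to a Finite State Machine, a minimal FSM sketch may help readers picture the design. The states, events, and transitions below are illustrative assumptions, written in Python rather than the Unity environment the authors used.

    # Minimal finite state machine sketch for a virtual pet.
    # States and transition rules are invented for illustration, not taken from the paper.
    TRANSITIONS = {
        ("idle",   "time_passes"): "hungry",
        ("hungry", "fed"):         "idle",
        ("hungry", "ignored"):     "sick",
        ("sick",   "treated"):     "idle",
        ("idle",   "played_with"): "happy",
        ("happy",  "time_passes"): "idle",
    }

    class VirtualPet:
        def __init__(self, state="idle"):
            self.state = state

        def handle(self, event):
            """Move to the next state if a transition is defined, else stay put."""
            self.state = TRANSITIONS.get((self.state, event), self.state)
            return self.state

    pet = VirtualPet()
    for event in ["time_passes", "ignored", "treated", "played_with"]:
        print(event, "->", pet.handle(event))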
Using smartphone technology to deliver a virtual pedestrian environment: usability and validation.
Schwebel, David C; Severson, Joan; He, Yefei
2017-09-01
Various programs effectively teach children to cross streets more safely, but all are labor- and cost-intensive. Recent developments in mobile phone technology offer opportunity to deliver virtual reality pedestrian environments to mobile smartphone platforms. Such an environment may offer a cost- and labor-effective strategy to teach children to cross streets safely. This study evaluated usability, feasibility, and validity of a smartphone-based virtual pedestrian environment. A total of 68 adults completed 12 virtual crossings within each of two virtual pedestrian environments, one delivered by smartphone and the other a semi-immersive kiosk virtual environment. Participants completed self-report measures of perceived realism and simulator sickness experienced in each virtual environment, plus self-reported demographic and personality characteristics. All participants followed system instructions and used the smartphone-based virtual environment without difficulty. No significant simulator sickness was reported or observed. Users rated the smartphone virtual environment as highly realistic. Convergent validity was detected, with many aspects of pedestrian behavior in the smartphone-based virtual environment matching behavior in the kiosk virtual environment. Anticipated correlations between personality and kiosk virtual reality pedestrian behavior emerged for the smartphone-based system. A smartphone-based virtual environment can be usable and valid. Future research should develop and evaluate such a training system.
Physical Models and Virtual Reality Simulators in Otolaryngology.
Javia, Luv; Sardesai, Maya G
2017-10-01
The increasing role of simulation in the medical education of future otolaryngologists follows the trend seen in other surgical disciplines. Simulators make it possible for the resident to explore and learn in a safe and less stressful environment. The various subspecialties in otolaryngology use both physical simulators and virtual-reality simulators. Whereas physical simulators allow the operator to make direct contact with their components, virtual-reality simulators allow the operator to interact with a computer-generated environment. This article gives an overview of the various types of physical simulators and virtual-reality simulators used in otolaryngology that have been reported in the literature. Copyright © 2017 Elsevier Inc. All rights reserved.
Intelligent Motion and Interaction Within Virtual Environments
NASA Technical Reports Server (NTRS)
Ellis, Stephen R. (Editor); Slater, Mel (Editor); Alexander, Thomas (Editor)
2007-01-01
What makes virtual actors and objects in virtual environments seem real? How can the illusion of their reality be supported? What sorts of training or user-interface applications benefit from realistic user-environment interactions? These are some of the central questions that designers of virtual environments face. To be sure, simulation realism is not necessarily the major, or even a required, goal of a virtual environment intended to communicate specific information. But for some applications in entertainment, marketing, or aspects of vehicle simulation training, realism is essential. The following chapters examine how a sense of truly interacting with dynamic, intelligent agents may arise in users of virtual environments. These chapters are based on presentations at the London conference on Intelligent Motion and Interaction within Virtual Environments, which was held at University College London, U.K., 15-17 September 2003.
A fast simulation method for radiation maps using interpolation in a virtual environment.
Li, Meng-Kun; Liu, Yong-Kuo; Peng, Min-Jun; Xie, Chun-Li; Yang, Li-Qun
2018-05-10
In nuclear decommissioning, virtual simulation technology is a useful tool for achieving an effective work process by using virtual environments to represent the physical and logical scheme of a real decommissioning project. This technology is cost-saving and time-saving, with the capacity to develop various decommissioning scenarios and reduce the risk of retrofitting. The method utilises a radiation map in a virtual simulation as the basis for assessing the exposure of a virtual human. In this paper, we propose a fast simulation method using a known radiation source. The method has a unique advantage over point kernel and Monte Carlo methods because it generates the radiation map using interpolation in a virtual environment. The simulation of the radiation map, including the calculation and the visualisation, was realised using UNITY and MATLAB. The feasibility of the proposed method was tested on a hypothetical case and the results obtained are discussed in this paper.
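As a rough sketch of the kind of interpolation-based radiation mapping the abstract describes (the authors implemented theirs in UNITY and MATLAB), the Python below interpolates a handful of known dose-rate samples onto a grid using inverse-distance weighting. The sample points, dose values, and weighting scheme are illustrative assumptions, not the paper's algorithm.

    import numpy as np

    def idw_dose_map(sample_xy, sample_dose, grid_x, grid_y, power=2.0, eps=1e-9):
        """Inverse-distance-weighted interpolation of measured dose rates onto a 2-D grid."""
        gx, gy = np.meshgrid(grid_x, grid_y)
        grid = np.column_stack([gx.ravel(), gy.ravel()])                      # (M, 2) query points
        d = np.linalg.norm(grid[:, None, :] - sample_xy[None, :, :], axis=2)  # (M, N) distances
        w = 1.0 / (d ** power + eps)
        dose = (w @ sample_dose) / w.sum(axis=1)
        return dose.reshape(gy.shape)

    # A few invented dose-rate samples (x, y in metres; dose in arbitrary units).
    samples = np.array([[1.0, 1.0], [4.0, 2.0], [2.0, 5.0]])
    doses = np.array([10.0, 3.0, 1.5])
    dose_map = idw_dose_map(samples, doses, np.linspace(0, 6, 61), np.linspace(0, 6, 61))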
ERIC Educational Resources Information Center
Barbalios, N.; Ioannidou, I.; Tzionas, P.; Paraskeuopoulos, S.
2013-01-01
This paper introduces a realistic 3D model supported virtual environment for environmental education that highlights the importance of water resource sharing by focusing on the tragedy of the commons dilemma. The proposed virtual environment entails simulations that are controlled by a multi-agent simulation model of a real ecosystem consisting…
Physical environment virtualization for human activities recognition
NASA Astrophysics Data System (ADS)
Poshtkar, Azin; Elangovan, Vinayak; Shirkhodaie, Amir; Chan, Alex; Hu, Shuowen
2015-05-01
Human activity recognition research relies heavily on extensive datasets to verify and validate the performance of activity recognition algorithms. However, obtaining real datasets is expensive and highly time-consuming. A physics-based virtual simulation can accelerate the development of context-based human activity recognition algorithms and techniques by generating relevant training and testing videos simulating diverse operational scenarios. In this paper, we discuss in detail the requisite capabilities of a virtual environment to aid as a test bed for evaluating and enhancing activity recognition algorithms. To demonstrate the numerous advantages of virtual environment development, a newly developed virtual environment simulation modeling (VESM) environment is presented here to generate calibrated multisource imagery datasets suitable for the development and testing of recognition algorithms for context-based human activities. The VESM environment serves as a versatile test bed for generating a vast amount of realistic data for training and testing of sensor processing algorithms. To demonstrate the effectiveness of the VESM environment, we present various simulated scenarios and processed results to infer proper semantic annotations from the high-fidelity imagery data for human-vehicle activity recognition under different operational contexts.
Virtual Collaborative Simulation Environment for Integrated Product and Process Development
NASA Technical Reports Server (NTRS)
Gulli, Michael A.
1997-01-01
Deneb Robotics is a leader in the development of commercially available, leading-edge three-dimensional simulation software tools for virtual prototyping, simulation-based design, manufacturing process simulation, and factory floor simulation and training applications. Deneb has developed and commercially released a preliminary Virtual Collaborative Engineering (VCE) capability for Integrated Product and Process Development (IPPD). This capability allows distributed, real-time visualization and evaluation of design concepts, manufacturing processes, and entire factories and enterprises in one seamless simulation environment.
A virtual therapeutic environment with user projective agents.
Ookita, S Y; Tokuda, H
2001-02-01
Today, we see the Internet as more than just an information infrastructure: it is also a socializing place and a safe outlet for inner feelings. Many personalities develop apart from real-world life because of its anonymous environment. Virtual world interactions are bringing about new psychological illnesses ranging from net addiction to technostress, as well as online personality disorders and conflicts among the multiple identities that exist in the virtual world. Presently, there are no standard therapy models for the virtual environment, and there are very few therapeutic environments or tools made especially for virtual therapy. The goal of our research is to provide a therapy model and middleware tools for psychologists to use in virtual therapeutic environments. We propose the Cyber Therapy Model and Projective Agents, a tool used in the therapeutic environment. To evaluate the effectiveness of the tool, we created a prototype system, called the Virtual Group Counseling System, a therapeutic environment that allows users to participate in group counseling through the eyes of their Projective Agent. Projective Agents inherit the user's personality traits. During virtual group counseling, the user's Projective Agent interacts and collaborates with others to support recovery and psychological growth. The prototype system provides a simulation environment in which psychologists can adjust the parameters and customize their own simulation environment. The model and tool are a first attempt toward simulating online personalities that may exist only online and providing data for observation.
Mixed virtual reality simulation--taking endoscopic simulation one step further.
Courteille, O; Felländer-Tsai, L; Hedman, L; Kjellin, A; Enochsson, L; Lindgren, G; Fors, U
2011-01-01
This pilot study aimed to assess medical students' appraisals of a "mixed" virtual reality simulation for endoscopic surgery (with a virtual patient case in addition to a virtual colonoscopy) as well as the impact of this simulation set-up on students' performance. Findings indicate that virtual patients can enhance contextualization of simulated endoscopy and thus facilitate an authentic learning environment, which is important in order to increase motivation.
NASA Technical Reports Server (NTRS)
1994-01-01
This symposium on measurement and control in robotics included sessions on: (1) rendering, including tactile perception and applied virtual reality; (2) applications in simulated medical procedures and telerobotics; (3) tracking sensors in a virtual environment; (4) displays for virtual reality applications; (5) sensory feedback including a virtual environment application with partial gravity simulation; and (6) applications in education, entertainment, technical writing, and animation.
Feasibility and fidelity of practising surgical fixation on a virtual ulna bone
LeBlanc, Justin; Hutchison, Carol; Hu, Yaoping; Donnon, Tyrone
2013-01-01
Background Surgical simulators provide a safe environment to learn and practise psychomotor skills. A goal for these simulators is to achieve high levels of fidelity. The purpose of this study was to develop a reliable surgical simulator fidelity questionnaire and to assess whether a newly developed virtual haptic simulator for fixation of an ulna has levels of fidelity comparable to Sawbones. Methods Simulator fidelity questionnaires were developed. We performed a stratified randomized study with surgical trainees. They performed fixation of the ulna using a virtual simulator and Sawbones. They completed the fidelity questionnaires after each procedure. Results Twenty-two trainees participated in the study. The reliability of the fidelity questionnaire for each separate domain (environment, equipment, psychological) was Cronbach α greater than 0.70, except for the virtual environment domain. The Sawbones had significantly higher levels of fidelity than the virtual simulator (p < 0.001) with a large effect size difference (Cohen d < 1.3). Conclusion The newly developed fidelity questionnaire is a reliable tool that can potentially be used to determine the fidelity of other surgical simulators. Increasing the fidelity of this virtual simulator is required before its use as a training tool for surgical fixation. The virtual simulator brings with it the added benefits of repeated, independent safe use with immediate, objective feedback and the potential to alter the complexity of the skill. PMID:23883510
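The reliability values quoted above are Cronbach α coefficients. For readers who want to reproduce that statistic, a minimal implementation of the standard formula is sketched below; the example responses are fabricated placeholders, not study data.

    import numpy as np

    def cronbach_alpha(items):
        """items: 2-D array, rows = respondents, columns = questionnaire items."""
        items = np.asarray(items, dtype=float)
        k = items.shape[1]
        item_var = items.var(axis=0, ddof=1).sum()      # sum of per-item variances
        total_var = items.sum(axis=1).var(ddof=1)       # variance of the summed score
        return (k / (k - 1)) * (1.0 - item_var / total_var)

    # Placeholder responses: 5 respondents x 4 Likert items (invented numbers).
    responses = [[4, 5, 4, 4],
                 [3, 3, 4, 3],
                 [5, 5, 5, 4],
                 [2, 3, 2, 3],
                 [4, 4, 5, 4]]
    print(round(cronbach_alpha(responses), 2))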
NASA Technical Reports Server (NTRS)
Lindsey, Patricia F.
1993-01-01
In its search for higher-level computer interfaces and more realistic electronic simulations for measurement and spatial analysis in human factors design, NASA at MSFC is evaluating the functionality of virtual reality (VR) technology. Virtual reality simulation generates a three-dimensional environment in which the participant appears to be enveloped. It is a type of interactive simulation in which humans are not only involved, but included. Virtual reality technology is still in the experimental phase, but it appears to be the next logical step after computer-aided three-dimensional animation in transferring the viewer from a passive to an active role in experiencing and evaluating an environment. There is great potential for using this new technology when designing environments for more successful interaction, both with the environment and with another participant in a remote location. At the University of North Carolina, a VR simulation of the planned Sitterson Hall revealed a flaw in the building's design that had not been observed during examination of the more traditional building plan simulation methods on paper and on a computer-aided design (CAD) workstation. The virtual environment enables multiple participants in remote locations to come together and interact with one another and with the environment. Each participant is capable of seeing herself and the other participants and of interacting with them within the simulated environment.
Virtual community centre for power wheelchair training: Experience of children and clinicians.
Torkia, Caryne; Ryan, Stephen E; Reid, Denise; Boissy, Patrick; Lemay, Martin; Routhier, François; Contardo, Resi; Woodhouse, Janet; Archambault, Phillipe S
2017-11-02
The aims of this study were to: 1) characterize the overall experience of using the McGill immersive wheelchair - community centre (miWe-CC) simulator; and 2) investigate the experience of presence (i.e., the sense of being in the virtual rather than in the real, physical environment) while driving a power wheelchair (PW) in the miWe-CC. A qualitative research design with structured interviews was used. Fifteen clinicians and 11 children were interviewed after driving a PW in the miWe-CC simulator. Data were analyzed using the conventional and directed content analysis approaches. Overall, participants enjoyed using the simulator and experienced a sense of presence in the virtual space. They felt a sense of being in the virtual environment, involved in and focused on driving the virtual PW rather than on the surroundings of the actual room where they were. Participants reported several similarities between the virtual community centre layout and activities of the miWe-CC and the day-to-day reality of paediatric PW users. The simulator replicated participants' expectations of real-life PW use and promises to have an effect on improving the driving skills of new PW users. Implications for rehabilitation: Among young users, the McGill immersive wheelchair (miWe) simulator provides an experience of presence within the virtual environment. This experience of presence is generated by a sense of being in the virtual scene, a sense of being involved, engaged, and focused on interacting within the virtual environment, and by the perception that the virtual environment is consistent with the real world. The miWe is a relevant and accessible approach, complementary to real-world power wheelchair training for young users.
NASA Technical Reports Server (NTRS)
Lehnert, H.; Blauert, Jens; Pompetzki, W.
1991-01-01
In everyday listening, the auditory event perceived by a listener is determined not only by the sound signal that a sound source emits but also by a variety of environmental parameters. These parameters are the position, orientation, and directional characteristics of the sound source; the listener's position and orientation; the geometrical and acoustical properties of surfaces which affect the sound field; and the sound propagation properties of the surrounding fluid. A complete set of these parameters can be called an Acoustic Environment. If the auditory event perceived by a listener is manipulated in such a way that the listener is shifted acoustically into a different acoustic environment without moving physically, a Virtual Acoustic Environment has been created. Here, we deal with a special technique for setting up nearly arbitrary Virtual Acoustic Environments: Binaural Room Simulation. The purpose of Binaural Room Simulation is to compute the binaural impulse response related to a virtual acoustic environment, taking into account all of the parameters mentioned above. One possible way to describe a Virtual Acoustic Environment is the concept of virtual sound sources. Each of the virtual sources emits a certain signal which is correlated, but not necessarily identical, with the signal emitted by the direct sound source. If the source and receiver are not moving, the acoustic environment becomes a linear time-invariant system. Then the Binaural Impulse Response from the source to a listener's eardrums contains all relevant auditory information related to the Virtual Acoustic Environment. Listening into the simulated environment can easily be achieved by convolving the Binaural Impulse Response with dry signals and reproducing the results via headphones.
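The final step described here, convolving the Binaural Impulse Response with dry signals for headphone playback, can be sketched in a few lines of numpy/scipy. The impulse responses below are synthetic placeholders standing in for the BRIRs a Binaural Room Simulation would actually compute.

    import numpy as np
    from scipy.signal import fftconvolve

    fs = 44100
    dry = np.random.randn(fs)                # 1 s of placeholder "dry" source signal

    # Placeholder binaural room impulse response: two decaying noise tails (left/right ear).
    t = np.arange(int(0.3 * fs)) / fs
    brir_left = np.random.randn(t.size) * np.exp(-t / 0.08)
    brir_right = np.random.randn(t.size) * np.exp(-t / 0.07)

    # Auralization: convolve the dry signal with each ear's impulse response.
    left = fftconvolve(dry, brir_left)
    right = fftconvolve(dry, brir_right)
    binaural = np.stack([left, right], axis=1)   # (samples, 2) array ready for headphone playback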
Emerging CAE technologies and their role in Future Ambient Intelligence Environments
NASA Astrophysics Data System (ADS)
Noor, Ahmed K.
2011-03-01
Dramatic improvements are on the horizon in Computer Aided Engineering (CAE) and various simulation technologies. The improvements are due, in part, to developments in a number of leading-edge technologies and their synergistic combination and convergence. The technologies include ubiquitous, cloud, and petascale computing; ultra-high-bandwidth networks; pervasive wireless communication; knowledge-based engineering; networked immersive virtual environments and virtual worlds; novel human-computer interfaces; and powerful game engines and facilities. This paper describes the frontier and emerging simulation technologies and their role in future virtual product creation and learning/training environments. These will be ambient intelligence environments, incorporating a synergistic combination of novel agent-supported visual simulations (with cognitive learning and understanding abilities); immersive 3D virtual world facilities; development chain management systems and facilities (incorporating a synergistic combination of intelligent engineering and management tools); nontraditional methods; intelligent, multimodal and human-like interfaces; and mobile wireless devices. The virtual product creation environment will significantly enhance productivity and stimulate creativity and innovation in future global virtual collaborative enterprises. The facilities in the learning/training environment will provide timely, engaging, personalized/collaborative and tailored visual learning.
Virtual acoustic environments for comprehensive evaluation of model-based hearing devices.
Grimm, Giso; Luberadzka, Joanna; Hohmann, Volker
2018-06-01
The aim was to create virtual acoustic environments (VAEs) with interactive dynamic rendering for applications in audiology. A toolbox for the creation and rendering of dynamic virtual acoustic environments (TASCAR) that allows direct user interaction was developed for application in hearing aid research and audiology. The software architecture and the simulation methods used to produce VAEs are outlined. Example environments are described and analysed. With the proposed software, a tool for the simulation of VAEs is available. A set of VAEs rendered with the proposed software is described.
ERIC Educational Resources Information Center
Keskitalo, Tuulikki
2011-01-01
This research article focuses on virtual reality (VR) and simulation-based training, with a special focus on the pedagogical use of the Virtual Centre of Wellness Campus known as ENVI (Rovaniemi, Finland). In order to clearly understand how teachers perceive teaching and learning in such environments, this research examines the concepts of…
Garretson, Justin R [Albuquerque, NM; Parker, Eric P [Albuquerque, NM; Gladwell, T Scott [Albuquerque, NM; Rigdon, J Brian [Edgewood, NM; Oppel, III, Fred J.
2012-05-29
Apparatus and methods for modifying the operation of a robotic vehicle in a real environment to emulate the operation of the robotic vehicle in a mixed reality environment include a vehicle sensing system having a communications module attached to the robotic vehicle for communicating operating parameters related to the robotic vehicle in a real environment to a simulation controller for simulating the operation of the robotic vehicle in a mixed (live, virtual and constructive) environment, wherein the effects of virtual and constructive entities on the operation of the robotic vehicle (and vice versa) are simulated. These effects are communicated to the vehicle sensing system, which generates a modified control command for the robotic vehicle that includes the effects of virtual and constructive entities, causing the robot in the real environment to behave as if the virtual and constructive entities existed in the real environment.
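As a rough, hypothetical illustration of the idea (not the patented implementation), the sketch below zeroes a real robot's velocity command whenever the commanded motion would intersect a purely virtual obstacle; the obstacle model and command format are invented.

    from dataclasses import dataclass

    @dataclass
    class VirtualObstacle:
        x: float
        y: float
        radius: float

    def modified_command(robot_xy, command_vxy, obstacles, dt=0.1):
        """Return the commanded velocity, zeroed out if it would drive the real robot
        into a constructive/virtual entity that exists only in the simulation."""
        nx = robot_xy[0] + command_vxy[0] * dt
        ny = robot_xy[1] + command_vxy[1] * dt
        for ob in obstacles:
            if (nx - ob.x) ** 2 + (ny - ob.y) ** 2 < ob.radius ** 2:
                return (0.0, 0.0)        # behave as if the virtual obstacle were real
        return command_vxy

    print(modified_command((0.0, 0.0), (1.0, 0.0), [VirtualObstacle(0.05, 0.0, 0.2)]))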
Korocsec, D; Holobar, A; Divjak, M; Zazula, D
2005-12-01
Medicine is a difficult thing to learn. Experimenting with real patients should not be the only option; simulation deserves special attention here. The Virtual Reality Modelling Language (VRML), as a tool for building virtual objects and scenes, has a good record of educational applications in medicine, especially for static and animated visualisations of body parts and organs. However, the level of interactivity and dynamics required to create computer simulations resembling situations in real environments is difficult to achieve. In the present paper we describe some approaches and techniques which we used to push the limits of the current VRML technology further toward dynamic 3D representation of virtual environments (VEs). Our demonstration is based on the implementation of a virtual baby model whose vital signs can be controlled from an external Java application. The main contributions of this work are: (a) an outline and evaluation of the three-level VRML/Java implementation of the dynamic virtual environment, (b) a proposal for a modified VRML TimeSensor node, which greatly improves the overall control of system performance, and (c) the architecture of a prototype distributed virtual environment for training in neonatal resuscitation, comprising the interactive virtual newborn, an active bedside monitor for vital signs, and a full 3D representation of the surgery room.
NASA Technical Reports Server (NTRS)
Hammrs, Stephan R.
2008-01-01
Virtual Satellite (VirtualSat) is a computer program that creates an environment that facilitates the development, verification, and validation of flight software for a single spacecraft or for multiple spacecraft flying in formation. In this environment, enhanced functionality and autonomy of the navigation, guidance, and control systems of a spacecraft are provided by a virtual satellite, that is, a computational model that simulates the dynamic behavior of the spacecraft. Within this environment, it is possible to execute any associated software, the development of which could benefit from knowledge of, and possible interaction (typically, exchange of data) with, the virtual satellite. Examples of associated software include programs for simulating spacecraft power and thermal-management systems. This environment is independent of the flight hardware that will eventually host the flight software, making it possible to develop the software simultaneously with, or even before, delivery of the hardware. Optionally, by use of interfaces included in VirtualSat, real hardware can be used in place of simulated components. The flight software, coded in the C or C++ programming language, is compilable and loadable into VirtualSat without any special modifications. Thus, VirtualSat can serve as a relatively inexpensive software test-bed for development, test, integration, and post-launch maintenance of spacecraft flight software.
ERIC Educational Resources Information Center
Keskitalo, Tuulikki
2012-01-01
Expectations for simulations in healthcare education are high; however, little is known about healthcare students' expectations of the learning process in virtual reality (VR) and simulation-based learning environments (SBLEs). This research aims to describe first-year healthcare students' (N=97) expectations regarding teaching, studying, and…
Physics-based approach to haptic display
NASA Technical Reports Server (NTRS)
Brown, J. Michael; Colgate, J. Edward
1994-01-01
This paper addresses the implementation of complex, multiple-degree-of-freedom virtual environments for haptic display. We suggest that a physics-based approach to rigid body simulation is appropriate for hand tool simulation, but that currently available simulation techniques are not sufficient to guarantee successful implementation. We discuss the desirable features of a virtual environment simulation, specifically highlighting the importance of stability guarantees.
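To make the stability concern concrete, the sketch below renders a penalty-based virtual wall and checks a commonly cited passivity-motivated rule of thumb from the haptics literature, relating device damping b, virtual stiffness K, virtual damping B, and sampling period T (roughly b > KT/2 + |B|). The numbers and the rule as stated here are illustrative assumptions, not results from this paper.

    def wall_force(x, v, wall_pos=0.0, K=2000.0, B=5.0):
        """Penalty-based virtual wall: spring-damper force when the probe penetrates the wall."""
        penetration = wall_pos - x          # positive when the probe is inside the wall (x < wall_pos)
        if penetration <= 0.0:
            return 0.0
        return K * penetration - B * v      # push back, oppose velocity

    def passivity_margin(K, B, b_device, T):
        """Rule-of-thumb check (b_device > K*T/2 + |B|); a positive margin suggests a passive wall."""
        return b_device - (K * T / 2.0 + abs(B))

    # Example: 1 kHz servo loop, stiff wall, modest device damping (all values invented).
    print(passivity_margin(K=2000.0, B=5.0, b_device=7.0, T=0.001))   # 7 - (1 + 5) = 1 > 0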
The Use of Visual-Based Simulated Environments in Teacher Preparation
ERIC Educational Resources Information Center
Judge, Sharon; Bobzien, Jonna; Maydosz, Ann; Gear, Sabra; Katsioloudis, Petros
2013-01-01
While virtual technology for training in the simulation field has a long history in medicine, aviation, and the military, the application of similar emerging and innovative technologies in teacher preparation and education has been limited. TLE TeachLive™ (Teaching Learning Environment, Teaching in a Virtual Environment) [TLE] is an inventive…
Knowledge-Driven Design of Virtual Patient Simulations
ERIC Educational Resources Information Center
Vergara, Victor; Caudell, Thomas; Goldsmith, Timothy; Panaiotis; Alverson, Dale
2009-01-01
Virtual worlds provide unique opportunities for instructors to promote, study, and evaluate student learning and comprehension. In this article, Victor Vergara, Thomas Caudell, Timothy Goldsmith, Panaiotis, and Dale Alverson explore the advantages of using virtual reality environments to create simulations for medical students. Virtual simulations…
Virtual reality simulation in neurosurgery: technologies and evolution.
Chan, Sonny; Conti, François; Salisbury, Kenneth; Blevins, Nikolas H
2013-01-01
Neurosurgeons are faced with the challenge of learning, planning, and performing increasingly complex surgical procedures in which there is little room for error. With improvements in computational power and advances in visual and haptic display technologies, virtual surgical environments can now offer potential benefits for surgical training, planning, and rehearsal in a safe, simulated setting. This article introduces the various classes of surgical simulators and their respective purposes through a brief survey of representative simulation systems in the context of neurosurgery. Many technical challenges currently limit the application of virtual surgical environments. Although we cannot yet expect a digital patient to be indistinguishable from reality, new developments in computational methods and related technology bring us closer every day. We recognize that the design and implementation of an immersive virtual reality surgical simulator require expert knowledge from many disciplines. This article highlights a selection of recent developments in research areas related to virtual reality simulation, including anatomic modeling, computer graphics and visualization, haptics, and physics simulation, and discusses their implication for the simulation of neurosurgery.
Simulation of Physical Experiments in Immersive Virtual Environments
NASA Technical Reports Server (NTRS)
Noor, Ahmed K.; Wasfy, Tamer M.
2001-01-01
An object-oriented, event-driven immersive virtual environment is described for the creation of virtual labs (VLs) for simulating physical experiments. Discussion focuses on a number of aspects of the VLs, including interface devices, software objects, and various applications. The VLs interface with output devices, including immersive stereoscopic screen(s) and stereo speakers, and a variety of input devices, including body tracking (head and hands), haptic gloves, wand, joystick, mouse, microphone, and keyboard. The VL incorporates the following types of primitive software objects: interface objects, support objects, geometric entities, and finite elements. Each object encapsulates a set of properties, methods, and events that define its behavior, appearance, and functions. A container object allows grouping of several objects. Applications of the VLs include viewing the results of the physical experiment, viewing a computer simulation of the physical experiment, simulation of the experiment's procedure, computational steering, and remote control of the physical experiment. In addition, the VL can be used as a risk-free (safe) environment for training. The implementation of virtual structures-testing machines, virtual wind tunnels, and a virtual acoustic testing facility is described.
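To make the object model concrete, here is a hedged sketch of primitive objects that encapsulate properties, methods, and events, plus a container that groups them. The class names mirror the abstract's categories, but the code itself is invented for illustration.

    class VLObject:
        """Base primitive: holds properties and fires named events to registered handlers."""
        def __init__(self, **properties):
            self.properties = properties
            self._handlers = {}

        def on(self, event, handler):
            self._handlers.setdefault(event, []).append(handler)

        def fire(self, event, **data):
            for handler in self._handlers.get(event, []):
                handler(self, **data)

    class InterfaceObject(VLObject): pass     # e.g. wand, glove, tracker
    class SupportObject(VLObject): pass       # e.g. lights, sound sources
    class GeometricEntity(VLObject): pass     # e.g. surfaces, solids
    class FiniteElement(VLObject): pass       # e.g. beams, shells

    class Container(VLObject):
        """Groups several objects so they can be moved or toggled together."""
        def __init__(self, *children, **properties):
            super().__init__(**properties)
            self.children = list(children)

    wand = InterfaceObject(name="wand")
    wand.on("button_pressed", lambda obj, **d: print("picked", d.get("target")))
    wand.fire("button_pressed", target="wind_tunnel_model")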
Higuera-Trujillo, Juan Luis; López-Tarruella Maldonado, Juan; Llinares Millán, Carmen
2017-11-01
Psychological research into human factors frequently uses simulations to study the relationship between human behaviour and the environment. Their validity depends on their similarity to the physical environments. This paper aims to validate three environmental-simulation display formats: photographs, 360° panoramas, and virtual reality. To do this, we compared the psychological and physiological responses evoked by simulated environment set-ups to those from a physical environment set-up; we also assessed the users' sense of presence. Analyses show that 360° panoramas offer the closest-to-reality results according to the participants' psychological responses, and virtual reality according to the physiological responses. Correlations between the feeling of presence and physiological and other psychological responses were also observed. These results may be of interest to researchers using currently available environmental-simulation technologies to replicate the experience of physical environments. Copyright © 2017 Elsevier Ltd. All rights reserved.
VERSE - Virtual Equivalent Real-time Simulation
NASA Technical Reports Server (NTRS)
Zheng, Yang; Martin, Bryan J.; Villaume, Nathaniel
2005-01-01
Distributed real-time simulations provide important timing validation and hardware-in-the-loop results for the spacecraft flight software development cycle. Occasionally, the need for higher-fidelity modeling and more comprehensive debugging capabilities, combined with a limited amount of computational resources, calls for a non-real-time simulation environment that mimics the real-time environment. By creating a non-real-time environment that accommodates simulations and flight software designed for a multi-CPU real-time system, we can save development time, cut mission costs, and reduce the likelihood of errors. This paper presents such a solution: the Virtual Equivalent Real-time Simulation Environment (VERSE). VERSE turns the real-time operating system RTAI (Real-time Application Interface) into an event-driven simulator that runs in virtual real time. Designed to keep the original RTAI architecture as intact as possible, and therefore inheriting RTAI's many capabilities, VERSE was implemented with remarkably little change to the RTAI source code. This small footprint, together with use of the same API, allows users to easily run the same application in both real-time and virtual-time environments. VERSE has been used to build a workstation testbed for NASA's Space Interferometry Mission (SIM PlanetQuest) instrument flight software. With its flexible simulation controls and inexpensive setup and replication costs, VERSE will become an invaluable tool in future mission development.
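The core idea, replacing waits on a wall clock with an event-driven scheduler that jumps a virtual clock forward, can be illustrated with a generic discrete-event sketch; this is not VERSE's RTAI implementation, and all names are hypothetical.

    import heapq

    class VirtualTimeScheduler:
        """Discrete-event loop: instead of waiting on a wall clock, jump the virtual clock
        straight to the next scheduled event, as a non-real-time simulator would."""
        def __init__(self):
            self.now = 0.0
            self._queue = []          # (time, sequence, callback)
            self._seq = 0

        def schedule(self, delay, callback):
            heapq.heappush(self._queue, (self.now + delay, self._seq, callback))
            self._seq += 1

        def run(self, until):
            while self._queue and self._queue[0][0] <= until:
                self.now, _, callback = heapq.heappop(self._queue)
                callback(self)

    def periodic_task(sched, period=0.01):
        print(f"flight software tick at t={sched.now:.2f}s (virtual)")
        sched.schedule(period, lambda s: periodic_task(s, period))

    sched = VirtualTimeScheduler()
    sched.schedule(0.0, periodic_task)
    sched.run(until=0.05)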
ERIC Educational Resources Information Center
McCoy, Lise
2014-01-01
Virtual Patient Simulations (VPS) are web-based exercises involving simulated patients in virtual environments. This study investigates the utility of VPS for increasing medical student clinical reasoning skills, collaboration, and engagement. Many studies indicate that VPS provide medical students with essential practice in clinical decision…
Computer Vision Assisted Virtual Reality Calibration
NASA Technical Reports Server (NTRS)
Kim, W.
1999-01-01
A computer vision assisted semi-automatic virtual reality (VR) calibration technology has been developed that can accurately match a virtual environment of graphically simulated three-dimensional (3-D) models to the video images of the real task environment.
The Influences of the 2D Image-Based Augmented Reality and Virtual Reality on Student Learning
ERIC Educational Resources Information Center
Liou, Hsin-Hun; Yang, Stephen J. H.; Chen, Sherry Y.; Tarng, Wernhuar
2017-01-01
Virtual reality (VR) learning environments can provide students with concepts of the simulated phenomena, but users are not allowed to interact with real elements. Conversely, augmented reality (AR) learning environments blend real-world environments so AR could enhance the effects of computer simulation and promote students' realistic experience.…
NASA Technical Reports Server (NTRS)
Smith, Jeffrey D.; Twombly, I. Alexander; Maese, A. Christopher; Cagle, Yvonne; Boyle, Richard
2003-01-01
The International Space Station demonstrates the greatest capabilities of human ingenuity, international cooperation, and technology development. The complexity of this space structure is unprecedented, and training astronaut crews to maintain all its systems, as well as perform a multitude of research experiments, requires the most advanced training tools and techniques. Computer simulation and virtual environments are currently used by astronauts to train for robotic arm manipulations and extravehicular activities; but now, with the latest computer technologies and recent successes in areas of medical simulation, the capability exists to train astronauts for more hands-on research tasks using immersive virtual environments. We have developed a new technology, the Virtual Glovebox (VGX), for simulation of experimental tasks that astronauts will perform aboard the Space Station. The VGX may also be used by crew support teams for the design of experiments, testing equipment integration capability, and optimizing the procedures astronauts will use. This is done through a 3D, desktop-sized, reach-in virtual environment that can simulate the microgravity environment in space. Additional features of the VGX allow for networking multiple users over the internet and the operation of tele-robotic devices through an intuitive user interface. Although the system was developed for astronaut training and assisting support crews, Earth-bound applications, many emphasizing homeland security, have also been identified. Examples include training experts to handle hazardous biological and/or chemical agents in a safe simulation, operation of tele-robotic systems for assessing and defusing threats such as bombs, and providing remote medical assistance to field personnel through a collaborative virtual environment. Thus, the emerging VGX simulation technology, while developed for space-based applications, can serve a dual use facilitating homeland security here on Earth.
Virtual Learning Environment for Interactive Engagement with Advanced Quantum Mechanics
ERIC Educational Resources Information Center
Pedersen, Mads Kock; Skyum, Birk; Heck, Robert; Müller, Romain; Bason, Mark; Lieberoth, Andreas; Sherson, Jacob F.
2016-01-01
A virtual learning environment can engage university students in the learning process in ways that the traditional lectures and lab formats cannot. We present our virtual learning environment "StudentResearcher," which incorporates simulations, multiple-choice quizzes, video lectures, and gamification into a learning path for quantum…
Learning Relative Motion Concepts in Immersive and Non-Immersive Virtual Environments
ERIC Educational Resources Information Center
Kozhevnikov, Michael; Gurlitt, Johannes; Kozhevnikov, Maria
2013-01-01
The focus of the current study is to understand which unique features of an immersive virtual reality environment have the potential to improve learning relative motion concepts. Thirty-seven undergraduate students learned relative motion concepts using computer simulation either in immersive virtual environment (IVE) or non-immersive desktop…
Object Creation and Human Factors Evaluation for Virtual Environments
NASA Technical Reports Server (NTRS)
Lindsey, Patricia F.
1998-01-01
The main objective of this project is to provide test objects for simulated environments utilized by the recently established Army/NASA Virtual Innovations Lab (ANVIL) at Marshall Space Flight Center, Huntsville, AL. The objective of the ANVIL lab is to provide virtual reality (VR) models and environments and to provide visualization and manipulation methods for the purpose of training and testing. Visualization equipment used in the ANVIL lab includes head-mounted and boom-mounted immersive virtual reality display devices. Objects in the environment are manipulated using a data glove, hand controller, or mouse. These simulated objects are solid or surfaced three-dimensional models. They may be viewed or manipulated from any location within the environment and may be viewed on-screen or via immersive VR. The objects are created using various CAD modeling packages and are converted into the virtual environment using dVise. This enables the object or environment to be viewed from any angle or distance for training or testing purposes.
ERIC Educational Resources Information Center
Kapralos, Bill; Hogan, Michelle; Pribetic, Antonin I.; Dubrowski, Adam
2011-01-01
Purpose: Gaming and interactive virtual simulation environments support a learner-centered educational model allowing learners to work through problems acquiring knowledge through an active, experiential learning approach. To develop effective virtual simulations and serious games, the views and perceptions of learners and educators must be…
Clandestine Message Passing in Virtual Environments
2008-09-01
Fire training in a virtual-reality environment
NASA Astrophysics Data System (ADS)
Freund, Eckhard; Rossmann, Jurgen; Bucken, Arno
2005-03-01
Although fire is very common in our daily environment - as a source of energy at home or as a tool in industry - most people cannot estimate the danger of a conflagration. Therefore it is important to train people in combating fire. Besides training with propane simulators or real fires and real extinguishers, fire training can be performed in virtual reality, which offers a pollution-free and fast way of training. In this paper we describe how to enhance a virtual-reality environment with real-time fire simulation and visualisation in order to establish a realistic emergency-training system. The presented approach supports extinguishing the virtual fire and records performance data as needed in teletraining environments. We show how to produce realistic impressions of fire using advanced particle simulation and how to use the particles to trigger states in a modified cellular automaton used to simulate fire behaviour. Using particle systems that interact with the cellular automaton, it is possible to simulate a developing, spreading fire and its reaction to different extinguishing agents such as water, CO2, or oxygen. The methods proposed in this paper have been implemented and successfully tested on Cosimir, a commercial robot- and VR-simulation system.
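As a toy illustration of the coupling described above (particles triggering cell states in a cellular automaton that models fire behaviour), a minimal 2-D spread rule might look like the following; the ignition probability, grid, and extinguishing step are invented for the sketch.

    import numpy as np

    UNBURNT, BURNING, BURNT = 0, 1, 2

    def step(grid, rng, ignite_prob=0.35):
        """One cellular-automaton step: burning cells may ignite 4-neighbours, then burn out."""
        burning = grid == BURNING
        neighbours = (np.roll(burning, 1, 0) | np.roll(burning, -1, 0) |
                      np.roll(burning, 1, 1) | np.roll(burning, -1, 1))
        ignite = (grid == UNBURNT) & neighbours & (rng.random(grid.shape) < ignite_prob)
        new = grid.copy()
        new[ignite] = BURNING
        new[burning] = BURNT
        return new

    def extinguish(grid, cells):
        """Extinguishing agent (e.g. water particles hitting these cells) quenches them."""
        for r, c in cells:
            if grid[r, c] == BURNING:
                grid[r, c] = BURNT
        return grid

    rng = np.random.default_rng(0)
    grid = np.zeros((20, 20), dtype=int)
    grid[10, 10] = BURNING                      # ignition point
    for _ in range(5):
        grid = step(grid, rng)
    grid = extinguish(grid, [(10, 11), (11, 10)])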
Schmitt, Paul J; Agarwal, Nitin; Prestigiacomo, Charles J
2012-01-01
Military explorations of the practical role of simulators have served as a driving force for much of the virtual reality technology that we have today. The evolution of 3-dimensional and virtual environments from the early flight simulators used during World War II to the sophisticated training simulators in the modern military followed a path that virtual surgical and neurosurgical devices have already begun to parallel. By understanding the evolution of military simulators as well as comparing and contrasting that evolution with current and future surgical simulators, it may be possible to expedite the development of appropriate devices and establish their validity as effective training tools. As such, this article presents a historical perspective examining the progression of neurosurgical simulators, the establishment of effective and appropriate curricula for using them, and the contributions that the military has made during the ongoing maturation of this exciting treatment and training modality. Copyright © 2012. Published by Elsevier Inc.
ERIC Educational Resources Information Center
Cooper, Rory A.; Ding, Dan; Simpson, Richard; Fitzgerald, Shirley G.; Spaeth, Donald M.; Guo, Songfeng; Koontz, Alicia M.; Cooper, Rosemarie; Kim, Jongbae; Boninger, Michael L.
2005-01-01
Some aspects of assistive technology can be enhanced by the application of virtual reality. Although virtual simulation offers a range of new possibilities, learning to navigate in a virtual environment is not equivalent to learning to navigate in the real world. Therefore, virtual reality simulation is advocated as a useful preparation for…
G2H--graphics-to-haptic virtual environment development tool for PC's.
Acosta, E; Temkin, B; Krummel, T M; Heinrichs, W L
2000-01-01
Existing surgical virtual environments for surgical training and preparation have shown great improvement; however, these improvements are mostly in the visual aspect. The incorporation of haptics into virtual reality-based surgical simulations would greatly enhance the sense of realism. To aid in the development of haptic surgical virtual environments, we have created a graphics-to-haptic (G2H) virtual environment development tool. G2H transforms graphical virtual environments (created or imported) into haptic virtual environments without programming. The G2H capability has been demonstrated using the complex 3D pelvic model of Lucy 2.0, the Stanford Visible Female. The pelvis was made haptic using G2H without any further programming effort.
Simulation fidelity of a virtual environment display
NASA Technical Reports Server (NTRS)
Nemire, Kenneth; Jacoby, Richard H.; Ellis, Stephen R.
1994-01-01
We assessed the degree to which a virtual environment system produced a faithful simulation of three-dimensional space by investigating the influence of a pitched optic array on the perception of gravity-referenced eye level (GREL). We compared the results with those obtained in a physical environment. In a within-subjects factorial design, 12 subjects indicated GREL while viewing virtual three-dimensional arrays at different static orientations. A physical array biased GREL more than did a geometrically identical virtual pitched array. However, the addition of two sets of orthogonal parallel lines (a grid) to the virtual pitched array resulted in as large a bias as that obtained with the physical pitched array. The increased bias was caused by the longitudinal, but not the transverse, components of the grid. We discuss the implications of our results for spatial orientation models and for the design of virtual displays.
SIMPAVE : evaluation of virtual environments for pavement construction simulations
DOT National Transportation Integrated Search
2007-05-01
In the last couple of years, the authors have been developing virtual simulations for modeling the construction of asphalt pavements. The simulations are graphically rich, interactive, three-dimensional, with realistic physics, and allow multiple peo...
ERIC Educational Resources Information Center
Lau, Kung Wong; Lee, Pui Yuen
2015-01-01
This paper discusses the roles of simulation in creativity education and how to apply immersive virtual environments to enhance students' learning experiences in university, through the provision of interactive simulations. An empirical study of a simulated virtual reality was carried out in order to investigate the effectiveness of providing…
ERIC Educational Resources Information Center
Standard Smith, Kristy
2008-01-01
The purpose of this qualitative study was to explore the influence a simulated virtual team learning experience had on business school students' leadership competencies. The researcher sought to discover the relationship between filling the leadership role in the simulated virtual environment and developing leadership competencies. A…
Christie, Lorna S.; Goossens, Richard H. M.; de Ridder, Huib; Jakimowicz, Jack J.
2010-01-01
Background The aim of this study is to investigate the influence of the presence of anatomic landmarks on the performance of angled laparoscope navigation on the SimSurgery SEP simulator. Methods Twenty-eight experienced laparoscopic surgeons (familiar with 30° angled laparoscope, >100 basic laparoscopic procedures, >5 advanced laparoscopic procedures) and 23 novices (no laparoscopy experience) performed the Camera Navigation task in an abstract virtual environment (CN-box) and in a virtual representation of the lower abdomen (CN-abdomen). They also rated the realism and added value of the virtual environments on seven-point scales. Results Within both groups, the CN-box task was accomplished in less time and with shorter tip trajectory than the CN-abdomen task (Wilcoxon test, p < 0.05). No significant differences were found between the performances of the experienced participants and the novices on the CN tasks (Mann–Whitney U test, p > 0.05). In both groups, the CN tasks were perceived as hard work and more challenging than anticipated. Conclusions Performance of the angled laparoscope navigation task is influenced by the virtual environment surrounding the exercise. The task was performed better in an abstract environment than in a virtual environment with anatomic landmarks. More insight is required into the influence and function of different types of intrinsic and extrinsic feedback on the effectiveness of preclinical simulator training. PMID:20419318
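For readers wanting to reproduce this style of analysis, the two tests named in the abstract (Wilcoxon signed-rank within groups, Mann-Whitney U between groups) are available in scipy.stats; the task times below are fabricated placeholders, not the study's measurements.

    import numpy as np
    from scipy.stats import wilcoxon, mannwhitneyu

    rng = np.random.default_rng(1)

    # Placeholder task times (seconds): each participant did both virtual environments.
    experts_box = rng.normal(45, 8, 28)
    experts_abdomen = experts_box + rng.normal(10, 5, 28)      # abdomen task takes longer
    novices_box = rng.normal(50, 10, 23)

    # Within-group comparison of the two environments (paired samples).
    stat, p_within = wilcoxon(experts_box, experts_abdomen)

    # Between-group comparison of experts vs novices on the same task (independent samples).
    stat, p_between = mannwhitneyu(experts_box, novices_box)
    print(f"Wilcoxon p={p_within:.3f}, Mann-Whitney U p={p_between:.3f}")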
NASA Technical Reports Server (NTRS)
Twombly, I. Alexander; Smith, Jeffrey; Bruyns, Cynthia; Montgomery, Kevin; Boyle, Richard
2003-01-01
The International Space Station will soon provide an unparalleled research facility for studying the near- and longer-term effects of microgravity on living systems. Using the Space Station Glovebox Facility - a compact, fully contained reach-in environment - astronauts will conduct technically challenging life sciences experiments. Virtual environment technologies are being developed at NASA Ames Research Center to help realize the scientific potential of this unique resource by facilitating the experimental hardware and protocol designs and by assisting the astronauts in training. The Virtual GloveboX (VGX) integrates high-fidelity graphics, force-feedback devices and real-time computer simulation engines to achieve an immersive training environment. Here, we describe the prototype VGX system, the distributed processing architecture used in the simulation environment, and modifications to the visualization pipeline required to accommodate the display configuration.
ERIC Educational Resources Information Center
Barkand, Jonathan; Kush, Joseph
2009-01-01
Virtual Learning Environments (VLEs) are becoming increasingly popular in online education environments and have multiple pedagogical advantages over more traditional approaches to education. VLEs include 3D worlds, such as Second Life, where students can engage in simulated learning activities. According to Claudia L'Amoreaux at Linden Lab, "at…
Virtual Reality Calibration for Telerobotic Servicing
NASA Technical Reports Server (NTRS)
Kim, W.
1994-01-01
A virtual reality calibration technique of matching a virtual environment of simulated graphics models in 3-D geometry and perspective with actual camera views of the remote site task environment has been developed to enable high-fidelity preview/predictive displays with calibrated graphics overlay on live video.
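Matching a simulated 3-D model to a live camera view is, at heart, a camera-pose estimation from 2-D/3-D correspondences. The sketch below illustrates that step with OpenCV's solvePnP; the model points, intrinsics, and synthetic "measured" image points are placeholders, and the actual calibration procedure behind this abstract may differ.

    import numpy as np
    import cv2

    # Eight known 3-D points on the simulated task model (metres, model frame); placeholders.
    object_points = np.array([[0.0, 0.0, 0.0], [0.2, 0.0, 0.0], [0.2, 0.1, 0.0], [0.0, 0.1, 0.0],
                              [0.0, 0.0, 0.05], [0.2, 0.0, 0.05], [0.2, 0.1, 0.05], [0.0, 0.1, 0.05]])

    # Assumed pinhole intrinsics and zero lens distortion.
    K = np.array([[800.0, 0.0, 320.0], [0.0, 800.0, 240.0], [0.0, 0.0, 1.0]])
    dist = np.zeros(5)

    # For the sketch, synthesize the "measured" image points by projecting with a known pose;
    # in practice they would be picked or detected in the live camera view.
    true_rvec = np.array([[0.1], [-0.2], [0.05]])
    true_tvec = np.array([[0.05], [-0.02], [1.0]])
    image_points, _ = cv2.projectPoints(object_points, true_rvec, true_tvec, K, dist)

    # Recover the camera pose that aligns the graphics model with the video view.
    ok, rvec, tvec = cv2.solvePnP(object_points, image_points, K, dist)
    print(ok, rvec.ravel(), tvec.ravel())   # should reproduce the synthetic pose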
The effect of fidelity: how expert behavior changes in a virtual reality environment.
Ioannou, Ioanna; Avery, Alex; Zhou, Yun; Szudek, Jacek; Kennedy, Gregor; O'Leary, Stephen
2014-09-01
We compare the behavior of expert surgeons operating on the "gold standard" of simulation, the cadaveric temporal bone, against a high-fidelity virtual reality (VR) simulation. We aim to determine whether expert behavior changes within the virtual environment and to understand how the fidelity of simulation affects users' behavior. Five expert otologists performed cortical mastoidectomy and cochleostomy on a human cadaveric temporal bone and a VR temporal bone simulator. Hand movement and video recordings were used to derive a range of measures, to facilitate an analysis of surgical technique, and to compare expert behavior between the cadaveric and simulator environments. Drilling time was similar across the two environments. Some measures, such as total time and burr change count, differed predictably due to the ease of switching burrs within the simulator. Surgical strokes were generally longer in distance and duration in VR, but these measures changed proportionally to cadaveric measures across the stages of the procedure. Stroke shape metrics differed, which was attributed to the modeling of burr behavior within the simulator. This will be corrected in future versions. Slight differences in drill interaction between a virtual environment and the real world can have measurable effects on surgical technique, particularly in terms of stroke length, duration, and curvature. It is important to understand these effects when designing and implementing surgical training programs based on VR simulation, and when improving the fidelity of VR simulators to facilitate use of a similar technique in both real and simulated situations. © 2014 The American Laryngological, Rhinological and Otological Society, Inc.
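The stroke measures compared in this study (length, duration, curvature) can be computed directly from sampled tool-tip positions. Below is a hedged numpy sketch of one reasonable set of definitions; the study's exact metric definitions may differ, and the sample trajectory is synthetic.

    import numpy as np

    def stroke_metrics(positions, timestamps):
        """positions: (N, 3) tool-tip samples in mm; timestamps: (N,) seconds."""
        positions = np.asarray(positions, dtype=float)
        seg = np.diff(positions, axis=0)                 # successive displacement vectors
        seg_len = np.linalg.norm(seg, axis=1)
        path_length = seg_len.sum()                      # total distance travelled
        duration = timestamps[-1] - timestamps[0]
        chord = np.linalg.norm(positions[-1] - positions[0])
        # One simple curvature proxy: path length divided by straight-line chord (>= 1).
        curvature_index = path_length / chord if chord > 0 else np.inf
        return {"length_mm": path_length,
                "duration_s": duration,
                "curvature_index": curvature_index}

    t = np.linspace(0.0, 0.5, 50)
    pts = np.column_stack([10 * t, 2 * np.sin(8 * t), np.zeros_like(t)])  # synthetic stroke
    print(stroke_metrics(pts, t))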
Virtual Learning Environments.
ERIC Educational Resources Information Center
Follows, Scott B.
1999-01-01
Illustrates the possibilities and educational benefits of virtual learning environments (VLEs), based on experiences with "Thirst for Knowledge," a VLE that simulates the workplace of a major company. While working in this virtual office world, students walk through the building, attend meetings, read reports, receive e-mail, answer the telephone,…
Virtual reality simulation: basic concepts and use in endoscopic neurosurgery training.
Cohen, Alan R; Lohani, Subash; Manjila, Sunil; Natsupakpong, Suriya; Brown, Nathan; Cavusoglu, M Cenk
2013-08-01
Virtual reality simulation is a promising alternative to training surgical residents outside the operating room. It is also a useful aid to anatomic study, residency training, surgical rehearsal, credentialing, and recertification. Surgical simulation is based on virtual reality with varying degrees of immersion and realism. Simulators provide a no-risk environment for harmless and repeatable practice. Virtual reality has three main components of simulation: graphics/volume rendering, model behavior/tissue deformation, and haptic feedback. The challenge of accurately simulating the forces and tactile sensations experienced in neurosurgery limits the sophistication of a virtual simulator. The limited haptic feedback available in minimally invasive neurosurgery makes it a favorable subject for simulation. Virtual simulators with realistic graphics and force feedback have been developed for ventriculostomy, intraventricular surgery, and transsphenoidal pituitary surgery, thus allowing preoperative study of the individual anatomy and increasing the safety of the procedure. The authors also present experiences with their own virtual simulation of endoscopic third ventriculostomy.
Shared virtual environments for aerospace training
NASA Technical Reports Server (NTRS)
Loftin, R. Bowen; Voss, Mark
1994-01-01
Virtual environments have the potential to significantly enhance the training of NASA astronauts and ground-based personnel for a variety of activities. A critical requirement is the need to share virtual environments, in real or near real time, between remote sites. It has been hypothesized that the training of international astronaut crews could be done more cheaply and effectively by utilizing such shared virtual environments in the early stages of mission preparation. The Software Technology Branch at NASA's Johnson Space Center has developed the capability for multiple users to simultaneously share the same virtual environment. Each user generates the graphics needed to create the virtual environment. All changes of object position and state are communicated to all users so that each virtual environment maintains its 'currency.' Examples of these shared environments will be discussed and plans for the utilization of the Department of Defense's Distributed Interactive Simulation (DIS) protocols for shared virtual environments will be presented. Finally, the impact of this technology on training and education in general will be explored.
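To give a feel for the state-sharing idea described above, here is a minimal sketch of the kind of object-update message each site might broadcast so every copy of the environment keeps its "currency." It is an illustration only, not the Johnson Space Center implementation or the DIS protocol mentioned in the abstract; the message fields, addresses, and object names are invented.

```python
# Hypothetical sketch: broadcast small "object moved / changed state" messages
# so every participant's locally generated copy of the environment stays current.
import json
import socket
from dataclasses import dataclass, asdict

@dataclass
class ObjectUpdate:
    object_id: str   # which shared object changed
    position: tuple  # new (x, y, z) in the shared world frame
    state: str       # e.g. "grasped", "released", "powered_on"

def broadcast_update(update: ObjectUpdate, peers, sock):
    """Send the update to every remote site sharing the environment."""
    payload = json.dumps(asdict(update)).encode("utf-8")
    for host, port in peers:
        sock.sendto(payload, (host, port))

def apply_update(raw: bytes, local_scene: dict):
    """Each site applies incoming updates to its own copy of the scene."""
    msg = json.loads(raw.decode("utf-8"))
    local_scene[msg["object_id"]] = {"position": tuple(msg["position"]),
                                     "state": msg["state"]}

# Example: one trainee opens a hatch; all peers apply the same change locally.
sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
peers = [("127.0.0.1", 9001)]  # placeholder peer addresses
broadcast_update(ObjectUpdate("hatch_handle", (1.2, 0.4, 2.0), "open"), peers, sock)
```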
Virtual reality simulators and training in laparoscopic surgery.
Yiannakopoulou, Eugenia; Nikiteas, Nikolaos; Perrea, Despina; Tsigris, Christos
2015-01-01
Virtual reality simulators provide basic skills training without supervision in a controlled environment, free of the pressure of operating on patients. Skills obtained through virtual reality simulation training can be transferred to the operating room. However, the relevant evidence is limited, with data available only for basic surgical skills and for laparoscopic cholecystectomy. No data exist on the effect of virtual reality simulation on performance of advanced surgical procedures. Evidence suggests that performance on virtual reality simulators reliably distinguishes experienced from novice surgeons. Limited available data suggest that an independent approach to virtual reality simulation training is not different from a proctored approach. The effect of virtual reality simulator training on the acquisition of basic surgical skills does not seem to differ from the effect of physical simulators. Limited data exist on the effect of virtual reality simulation training on the acquisition of visual-spatial perception and stress-coping skills. Undoubtedly, virtual reality simulation training provides an alternative means of improving performance in laparoscopic surgery. However, future research efforts should focus on the effect of virtual reality simulation on performance of advanced surgical procedures, on standardization of training, on the possibility of a synergistic effect of virtual reality simulation training combined with mental training, and on personalized training. Copyright © 2014 Surgical Associates Ltd. Published by Elsevier Ltd. All rights reserved.
A Virtual Reality Simulator Prototype for Learning and Assessing Phaco-sculpting Skills
NASA Astrophysics Data System (ADS)
Choi, Kup-Sze
This paper presents a virtual reality based simulator prototype for learning phacoemulsification in cataract surgery, with a focus on the skills required for making a cross-shaped trench in the cataractous lens with an ultrasound probe during the phaco-sculpting procedure. An immersive virtual environment is created with 3D models of the lens and surgical tools. A haptic device is also used as the 3D user interface. Phaco-sculpting is simulated by interactively deleting the constituent tetrahedrons of the lens model. Collisions between the virtual probe and the lens are effectively identified by partitioning the space containing the lens hierarchically with an octree. The simulator can be programmed to collect real-time quantitative user data for reviewing and assessing a trainee's performance in an objective manner. A game-based learning environment can be created on top of the simulator by incorporating gaming elements based on the quantifiable performance metrics.
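The octree mentioned in the abstract is a standard way to avoid testing the probe against every tetrahedron each frame. The sketch below is a generic illustration of that idea, not code from the simulator; the cell sizes, depth limits, and the point-query interface are assumptions.

```python
# Hypothetical sketch: an octree over the lens tetrahedra so a probe-tip query
# only has to test the few tetrahedra stored in the leaf cell it falls in.
import numpy as np

class Octree:
    def __init__(self, lo, hi, depth=0, max_depth=4, max_items=8):
        self.lo, self.hi = np.asarray(lo, float), np.asarray(hi, float)
        self.depth, self.max_depth, self.max_items = depth, max_depth, max_items
        self.items = []        # (tet_index, box_lo, box_hi)
        self.children = None   # 8 sub-cells once split

    def insert(self, tet_index, box_lo, box_hi):
        if self.children is not None:
            for child in self.children:
                if np.all(box_hi >= child.lo) and np.all(box_lo <= child.hi):
                    child.insert(tet_index, box_lo, box_hi)
            return
        self.items.append((tet_index, box_lo, box_hi))
        if len(self.items) > self.max_items and self.depth < self.max_depth:
            self._split()

    def _split(self):
        mid = 0.5 * (self.lo + self.hi)
        self.children = []
        for dx in (0, 1):
            for dy in (0, 1):
                for dz in (0, 1):
                    lo = np.where([dx, dy, dz], mid, self.lo)
                    hi = np.where([dx, dy, dz], self.hi, mid)
                    self.children.append(Octree(lo, hi, self.depth + 1,
                                                self.max_depth, self.max_items))
        items, self.items = self.items, []
        for it in items:
            self.insert(*it)

    def query(self, point):
        """Return indices of tetrahedra whose bounding boxes contain the point."""
        if self.children is not None:
            for child in self.children:
                if np.all(point >= child.lo) and np.all(point <= child.hi):
                    return child.query(point)
            return []
        return [i for i, lo, hi in self.items
                if np.all(point >= lo) and np.all(point <= hi)]

# Usage: insert each tetrahedron's bounding box once, then query with the
# haptic probe tip every frame; only the returned candidates need an exact
# point-in-tetrahedron test before being "sculpted" away.
tree = Octree(lo=[0, 0, 0], hi=[1, 1, 1])
tree.insert(0, np.array([0.1, 0.1, 0.1]), np.array([0.2, 0.2, 0.2]))
print(tree.query(np.array([0.15, 0.15, 0.12])))  # -> [0]
```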
Sankaranarayanan, Ganesh; Halic, Tansel; Arikatla, Venkata Sreekanth; Lu, Zhonghua; De, Suvranu
2010-01-01
Purpose Surgical simulations require haptic interactions and collaboration in a shared virtual environment. A software framework for decoupled surgical simulation based on a multi-controller and multi-viewer model-view-controller (MVC) pattern was developed and tested. Methods A software framework for multimodal virtual environments was designed, supporting both visual interactions and haptic feedback while providing developers with an integration tool for heterogeneous architectures maintaining high performance, simplicity of implementation, and straightforward extension. The framework uses decoupled simulation with updates of over 1,000 Hz for haptics and accommodates networked simulation with delays of over 1,000 ms without performance penalty. Results The simulation software framework was implemented and was used to support the design of virtual reality-based surgery simulation systems. The framework supports the high level of complexity of such applications and the fast response required for interaction with haptics. The efficacy of the framework was tested by implementation of a minimally invasive surgery simulator. Conclusion A decoupled simulation approach can be implemented as a framework to handle simultaneous processes of the system at the various frame rates each process requires. The framework was successfully used to develop collaborative virtual environments (VEs) involving geographically distributed users connected through a network, with the results comparable to VEs for local users. PMID:20714933
Maciel, Anderson; Sankaranarayanan, Ganesh; Halic, Tansel; Arikatla, Venkata Sreekanth; Lu, Zhonghua; De, Suvranu
2011-07-01
Surgical simulations require haptic interactions and collaboration in a shared virtual environment. A software framework for decoupled surgical simulation based on a multi-controller and multi-viewer model-view-controller (MVC) pattern was developed and tested. A software framework for multimodal virtual environments was designed, supporting both visual interactions and haptic feedback while providing developers with an integration tool for heterogeneous architectures maintaining high performance, simplicity of implementation, and straightforward extension. The framework uses decoupled simulation with updates of over 1,000 Hz for haptics and accommodates networked simulation with delays of over 1,000 ms without performance penalty. The simulation software framework was implemented and was used to support the design of virtual reality-based surgery simulation systems. The framework supports the high level of complexity of such applications and the fast response required for interaction with haptics. The efficacy of the framework was tested by implementation of a minimally invasive surgery simulator. A decoupled simulation approach can be implemented as a framework to handle simultaneous processes of the system at the various frame rates each process requires. The framework was successfully used to develop collaborative virtual environments (VEs) involving geographically distributed users connected through a network, with the results comparable to VEs for local users.
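Both versions of this abstract describe decoupling the haptic and visual update loops so each can run at its own rate. The sketch below illustrates that general idea with two Python threads sharing the latest tool state; it is not the published framework, and plain Python sleeps will not hold a true 1 kHz haptic rate, so the rates here are only nominal.

```python
# Hypothetical sketch: a nominally ~1 kHz haptics loop and a ~60 Hz viewer loop
# sharing the latest tool state through a lock, so neither rate drags down the other.
import threading
import time

class SharedState:
    def __init__(self):
        self.lock = threading.Lock()
        self.tool_position = (0.0, 0.0, 0.0)
        self.force = (0.0, 0.0, 0.0)

def haptics_loop(state, stop, rate_hz=1000):
    period = 1.0 / rate_hz
    t = 0.0
    while not stop.is_set():
        # Placeholder physics: a real simulator would read the haptic device and
        # compute contact forces against the deformable model here.
        with state.lock:
            state.tool_position = (0.1 * t, 0.0, 0.0)
            state.force = (0.0, 0.0, -1.0)
        t += period
        time.sleep(period)

def viewer_loop(state, stop, rate_hz=60):
    period = 1.0 / rate_hz
    while not stop.is_set():
        with state.lock:
            pos = state.tool_position
        # Placeholder rendering: draw the scene at the most recent tool pose.
        print(f"render tool at {pos}")
        time.sleep(period)

stop = threading.Event()
state = SharedState()
threads = [threading.Thread(target=haptics_loop, args=(state, stop)),
           threading.Thread(target=viewer_loop, args=(state, stop))]
for th in threads:
    th.start()
time.sleep(0.1)   # run briefly for the example
stop.set()
for th in threads:
    th.join()
```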
Virtual reality, disability and rehabilitation.
Wilson, P N; Foreman, N; Stanton, D
1997-06-01
Virtual reality, or virtual environment computer technology, generates simulated objects and events with which people can interact. Existing and potential applications for this technology in the field of disability and rehabilitation are discussed. The main benefits identified for disabled people are that they can engage in a range of activities in a simulator relatively free from the limitations imposed by their disability, and they can do so in safety. Evidence that the knowledge and skills acquired by disabled individuals in simulated environments can transfer to the real world is presented. In particular, spatial information and life skills learned in a virtual environment have been shown to transfer to the real world. Applications for visually impaired people are discussed, and the potential for medical interventions and the assessment and treatment of neurological damage are considered. Finally some current limitations of the technology, and ethical concerns in relation to disability, are discussed.
Scientific Assistant Virtual Laboratory (SAVL)
NASA Astrophysics Data System (ADS)
Alaghband, Gita; Fardi, Hamid; Gnabasik, David
2007-03-01
The Scientific Assistant Virtual Laboratory (SAVL) is a scientific discovery environment, an interactive simulated virtual laboratory, for learning physics and mathematics. The purpose of this computer-assisted intervention is to improve middle and high school student interest, insight and scores in physics and mathematics. SAVL develops scientific and mathematical imagination in a visual, symbolic, and experimental simulation environment. It directly addresses the issues of scientific and technological competency by providing critical thinking training through integrated modules. This on-going research provides a virtual laboratory environment in which the student directs the building of the experiment rather than observing a packaged simulation. SAVL: * Engages the persistent interest of young minds in physics and math by visually linking simulation objects and events with mathematical relations. * Teaches integrated concepts by the hands-on exploration and focused visualization of classic physics experiments within software. * Systematically and uniformly assesses and scores students by their ability to answer their own questions within the context of a Master Question Network. We will demonstrate how the Master Question Network uses polymorphic interfaces and C# lambda expressions to manage simulation objects.
Advanced Maintenance Simulation by Means of Hand-Based Haptic Interfaces
NASA Astrophysics Data System (ADS)
Nappi, Michele; Paolino, Luca; Ricciardi, Stefano; Sebillo, Monica; Vitiello, Giuliana
The aerospace industry has been involved in virtual simulation for design and testing since the birth of virtual reality. Today this industry is showing a growing interest in the development of haptic-based maintenance training applications, which represent the most advanced way to simulate maintenance and repair tasks within a virtual environment by means of a visual-haptic approach. The goal is to allow the trainee to experience the service procedures not only as a workflow reproduced at a visual level but also in terms of the kinaesthetic feedback involved in the manipulation of tools and components. This study, conducted in collaboration with aerospace industry specialists, is aimed at the development of an immersive system capable of immersing trainees in a virtual environment where mechanics and technicians can perform maintenance simulation or training tasks by directly manipulating 3D virtual models of aircraft parts while perceiving force feedback through the haptic interface. The proposed system is based on ViRstperson, a virtual reality engine under development at the Italian Center for Aerospace Research (CIRA) to support engineering and technical activities such as design-time maintenance procedure validation and maintenance training. This engine has been extended to support haptic-based interaction, enabling a more complete level of interaction, also in terms of impedance control, and thus fostering the development of haptic knowledge in the user. The user’s “sense of touch” within the immersive virtual environment is simulated through an Immersion CyberForce® hand-based force-feedback device. Preliminary testing of the proposed system seems encouraging.
ERIC Educational Resources Information Center
Huang, Hsiu-Mei; Liaw, Shu-Sheng; Lai, Chung-Min
2016-01-01
Advanced technologies have been widely applied in medical education, including human-patient simulators, immersive virtual reality Cave Automatic Virtual Environment systems, and video conferencing. Evaluating learner acceptance of such virtual reality (VR) learning environments is a critical issue for ensuring that such technologies are used to…
Designing a Virtual-Reality-Based, Gamelike Math Learning Environment
ERIC Educational Resources Information Center
Xu, Xinhao; Ke, Fengfeng
2016-01-01
This exploratory study examined the design issues related to a virtual-reality-based, gamelike learning environment (VRGLE) developed via OpenSimulator, an open-source virtual reality server. The researchers collected qualitative data to examine the VRGLE's usability, playability, and content integration for math learning. They found it important…
Applicability of Virtual Environments as C4ISR Displays
2006-06-01
Alexander, Thomas. FGAN - Research Institute for Communication, Information Processing, and Ergonomics, Wachtberg, Germany. Report excerpt; cites the simulator sickness questionnaire (SSQ): a method for quantifying simulator sickness, International Journal of Aviation Psychology, 3(3):203ff.
The Virtual Environment for Rapid Prototyping of the Intelligent Environment
Bouzouane, Abdenour; Gaboury, Sébastien
2017-01-01
Advances in domains such as sensor networks and electronic and ambient intelligence have allowed us to create intelligent environments (IEs). However, research in IE is being held back by the fact that researchers face major difficulties, such as a lack of resources for their experiments. Indeed, they cannot easily build IEs to evaluate their approaches. This is mainly because of economic and logistical issues. In this paper, we propose a simulator to build virtual IEs. Simulators are a good alternative to physical IEs because they are inexpensive, and experiments can be conducted easily. Our simulator is open source and it provides users with a set of virtual sensors that simulates the behavior of real sensors. This simulator gives the user the capacity to build their own environment, providing a model to edit inhabitants’ behavior and an interactive mode. In this mode, the user can directly act upon IE objects. This simulator gathers data generated by the interactions in order to produce datasets. These datasets can be used by scientists to evaluate several approaches in IEs. PMID:29112175
The Virtual Environment for Rapid Prototyping of the Intelligent Environment.
Francillette, Yannick; Boucher, Eric; Bouzouane, Abdenour; Gaboury, Sébastien
2017-11-07
Advances in domains such as sensor networks and electronic and ambient intelligence have allowed us to create intelligent environments (IEs). However, research in IE is being held back by the fact that researchers face major difficulties, such as a lack of resources for their experiments. Indeed, they cannot easily build IEs to evaluate their approaches. This is mainly because of economic and logistical issues. In this paper, we propose a simulator to build virtual IEs. Simulators are a good alternative to physical IEs because they are inexpensive, and experiments can be conducted easily. Our simulator is open source and it provides users with a set of virtual sensors that simulates the behavior of real sensors. This simulator gives the user the capacity to build their own environment, providing a model to edit inhabitants' behavior and an interactive mode. In this mode, the user can directly act upon IE objects. This simulator gathers data generated by the interactions in order to produce datasets. These datasets can be used by scientists to evaluate several approaches in IEs.
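As a rough illustration of what a "virtual sensor" producing dataset entries might look like, the sketch below defines a toy motion sensor that emits timestamped events as a simulated inhabitant moves. The class names, zone geometry, and event format are invented and do not reflect the simulator's actual API.

```python
# Hypothetical sketch: a virtual motion sensor that emits timestamped events
# when a simulated inhabitant enters its zone, plus a tiny dataset recorder.
import time
from dataclasses import dataclass

@dataclass
class SensorEvent:
    sensor_id: str
    timestamp: float
    value: bool   # True = motion detected

class VirtualMotionSensor:
    def __init__(self, sensor_id, zone):
        self.sensor_id = sensor_id
        self.zone = zone  # ((xmin, xmax), (ymin, ymax)) region it watches

    def sample(self, inhabitant_pos):
        (xmin, xmax), (ymin, ymax) = self.zone
        x, y = inhabitant_pos
        detected = xmin <= x <= xmax and ymin <= y <= ymax
        return SensorEvent(self.sensor_id, time.time(), detected)

# Simulate an inhabitant walking across the kitchen and log the resulting dataset.
sensor = VirtualMotionSensor("kitchen_motion", ((0.0, 3.0), (0.0, 3.0)))
dataset = [sensor.sample(pos) for pos in [(-1.0, 1.0), (1.5, 1.5), (2.9, 0.5)]]
for event in dataset:
    print(event)
```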
A Simulated Learning Environment for Teaching Medicine Dispensing Skills
Styles, Kim; Sewell, Keith; Trinder, Peta; Marriott, Jennifer; Maher, Sheryl; Naidu, Som
2016-01-01
Objective. To develop an authentic simulation of the professional practice dispensary context for students to develop their dispensing skills in a risk-free environment. Design. A development team used an Agile software development method to create MyDispense, a web-based simulation. Modeled on elements of virtual learning environments, the software employed widely available standards-based technologies to create a virtual community pharmacy environment. Assessment. First-year pharmacy students who used the software in their tutorials were surveyed at the end of the second semester on their prior dispensing experience and their perceptions of MyDispense as a tool for learning dispensing skills. Conclusion. The dispensary simulation is an effective tool for helping students develop dispensing competency and knowledge in a safe environment. PMID:26941437
The ALIVE Project: Astronomy Learning in Immersive Virtual Environments
NASA Astrophysics Data System (ADS)
Yu, K. C.; Sahami, K.; Denn, G.
2008-06-01
The Astronomy Learning in Immersive Virtual Environments (ALIVE) project seeks to discover learning modes and optimal teaching strategies using immersive virtual environments (VEs). VEs are computer-generated, three-dimensional environments that can be navigated to provide multiple perspectives. Immersive VEs provide the additional benefit of surrounding a viewer with the simulated reality. ALIVE evaluates the incorporation of an interactive, real-time "virtual universe" into formal college astronomy education. In the experiment, pre-course, post-course, and curriculum tests will be used to determine the efficacy of immersive visualizations presented in a digital planetarium versus the same visual simulations in the non-immersive setting of a normal classroom, as well as a control case using traditional classroom multimedia. To normalize for inter-instructor variability, each ALIVE instructor will teach at least one of each class in each of the three test groups.
Collaboration and Synergy among Government, Industry and Academia in M&S Domain: Turkey’s Approach
2009-10-01
Report excerpt; topics include decision support system design and implementation, simulation output analysis, statistical data analysis, and virtual reality; live, virtual, and constructive visual simulation systems as well as integrated advanced analytical models; simulation systems that are ready to use, credible, and integrated with C4ISR systems; and the creation of synthetic environments and/or virtual prototypes of concepts.
The Evolution of Constructivist Learning Environments: Immersion in Distributed, Virtual Worlds.
ERIC Educational Resources Information Center
Dede, Chris
1995-01-01
Discusses the evolution of constructivist learning environments and examines the collaboration of simulated software models, virtual environments, and evolving mental models via immersion in artificial realities. A sidebar gives a realistic example of a student navigating through cyberspace. (JMV)
Nature and origins of virtual environments - A bibliographical essay
NASA Technical Reports Server (NTRS)
Ellis, S. R.
1991-01-01
Virtual environments presented via head-mounted, computer-driven displays provide a new medium for communication. They may be analyzed by considering: (1) what may be meant by an environment; (2) what is meant by the process of virtualization; and (3) some aspects of human performance that constrain environmental design. Their origins are traced from previous work in vehicle simulation and multimedia research. Pointers are provided to key technical references, in the dispersed, archival literature, that are relevant to the development and evaluation of virtual-environment interface systems.
Closed Environment Module - Modularization and extension of the Virtual Habitat
NASA Astrophysics Data System (ADS)
Plötner, Peter; Czupalla, Markus; Zhukov, Anton
2013-12-01
The Virtual Habitat (V-HAB) is a Life Support System (LSS) simulation created to perform dynamic simulation of LSSs for future human spaceflight missions. It allows the testing of LSS robustness by means of computer simulations, e.g. of worst-case scenarios.
Challenges to the development of complex virtual reality surgical simulations.
Seymour, N E; Røtnes, J S
2006-11-01
Virtual reality simulation in surgical training has become more widely used and intensely investigated in an effort to develop safer, more efficient, measurable training processes. The development of virtual reality simulation of surgical procedures has begun, but well-described technical obstacles must be overcome to permit varied training in a clinically realistic computer-generated environment. These challenges include development of realistic surgical interfaces and physical objects within the computer-generated environment, modeling of realistic interactions between objects, rendering of the surgical field, and development of signal processing for complex events associated with surgery. Of these, the realistic modeling of tissue objects that are fully responsive to surgical manipulations is the most challenging. Threats to early success include relatively limited resources for development and procurement, as well as smaller potential for return on investment than in other simulation industries that face similar problems. Despite these difficulties, steady progress continues to be made in these areas. If executed properly, virtual reality offers inherent advantages over other training systems in creating a realistic surgical environment and facilitating measurement of surgeon performance. Once developed, complex new virtual reality training devices must be validated for their usefulness in formative training and assessment of skill to be established.
Realistic terrain visualization based on 3D virtual world technology
NASA Astrophysics Data System (ADS)
Huang, Fengru; Lin, Hui; Chen, Bin; Xiao, Cai
2009-09-01
The rapid advances in information technologies, e.g., network, graphics processing, and virtual world, have provided challenges and opportunities for new capabilities in information systems, Internet applications, and virtual geographic environments, especially geographic visualization and collaboration. In order to achieve meaningful geographic capabilities, we need to explore and understand how these technologies can be used to construct virtual geographic environments that help to engage geographic research. The generation of three-dimensional (3D) terrain plays an important part in geographical visualization, computer simulation, and virtual geographic environment applications. The paper introduces concepts and technologies of virtual worlds and virtual geographic environments, and explores the integration of realistic terrain with other geographic objects and phenomena of the natural geographic environment based on SL/OpenSim virtual world technologies. Realistic 3D terrain visualization is a foundation for constructing a mirror world or a sandbox model of the Earth's landscape and geographic environment. The capabilities of interaction and collaboration on geographic information are discussed as well. Further virtual geographic applications can be developed based on this foundation work of realistic terrain visualization in virtual environments.
Virtual Gaming Simulation in Nursing Education: A Focus Group Study.
Verkuyl, Margaret; Hughes, Michelle; Tsui, Joyce; Betts, Lorraine; St-Amant, Oona; Lapum, Jennifer L
2017-05-01
The use of serious gaming in a virtual world is a novel pedagogical approach in nursing education. A virtual gaming simulation was implemented in a health assessment class that focused on mental health and interpersonal violence. The study's purpose was to explore students' experiences of the virtual gaming simulation. Three focus groups were conducted with a convenience sample of 20 first-year nursing students after they completed the virtual gaming simulation. Analysis yielded five themes: (a) Experiential Learning, (b) The Learning Process, (c) Personal Versus Professional, (d) Self-Efficacy, and (e) Knowledge. Virtual gaming simulation can provide experiential learning opportunities that promote engagement and allow learners to acquire and apply new knowledge while practicing skills in a safe and realistic environment. [J Nurs Educ. 2017;56(5):274-280.]. Copyright 2017, SLACK Incorporated.
Foreign language learning in immersive virtual environments
NASA Astrophysics Data System (ADS)
Chang, Benjamin; Sheldon, Lee; Si, Mei; Hand, Anton
2012-03-01
Virtual reality has long been used for training simulations in fields from medicine to welding to vehicular operation, but simulations involving more complex cognitive skills present new design challenges. Foreign language learning, for example, is increasingly vital in the global economy, but computer-assisted education is still in its early stages. Immersive virtual reality is a promising avenue for language learning as a way of dynamically creating believable scenes for conversational training and role-play simulation. Visual immersion alone, however, only provides a starting point. We suggest that the addition of social interactions and motivated engagement through narrative gameplay can lead to truly effective language learning in virtual environments. In this paper, we describe the development of a novel application for teaching Mandarin using CAVE-like VR, physical props, human actors and intelligent virtual agents, all within a semester-long multiplayer mystery game. Students travel (virtually) to China on a class field trip, which soon becomes complicated with intrigue and mystery surrounding the lost manuscript of an early Chinese literary classic. Virtual reality environments such as the Forbidden City and a Beijing teahouse provide the setting for learning language, cultural traditions, and social customs, as well as the discovery of clues through conversation in Mandarin with characters in the game.
NASA Astrophysics Data System (ADS)
Herbuś, K.; Ociepka, P.
2016-08-01
The development of computer-aided design and engineering methods allows virtual tests to be conducted, among others concerning the motion simulation of technical systems. The paper presents a method of integrating an object in the form of a virtual model of a Stewart platform with an avatar of a vehicle moving in a virtual environment. The problem area includes issues related to the fidelity with which the operation of the analyzed technical system is mapped. The main object of investigation is a 3D model of a Stewart platform, which is a subsystem of a simulator designed for driver training for disabled persons. The model of the platform, prepared for motion simulation, was created in the “Motion Simulation” module of the CAD/CAE-class system Siemens PLM NX, whereas the virtual environment in which the avatar of the passenger car moves was developed in the VR-class system EON Studio. The element integrating the two software environments is a purpose-built application that reads information from the virtual reality (VR) scene about the current position of the car avatar and then, based on the accepted algorithm, sends control signals to the respective joints of the Stewart platform model (CAD).
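The coupling described above, reading the car avatar's pose from the VR scene and commanding the platform's joints, rests on the standard Stewart-platform inverse kinematics: each leg length is the distance between its base attachment point and the corresponding platform attachment point transformed by the commanded pose. The sketch below shows that calculation with illustrative attachment geometry and pose values; it is not the geometry or control algorithm of the simulator in the paper.

```python
# Hypothetical sketch of the usual Stewart-platform inverse kinematics used to
# turn a pose read from the VR scene into six actuator lengths. The attachment
# geometry and pose values are illustrative only.
import numpy as np

def rotation_matrix(roll, pitch, yaw):
    """Z-Y-X Euler angles (radians) to a rotation matrix."""
    cr, sr = np.cos(roll), np.sin(roll)
    cp, sp = np.cos(pitch), np.sin(pitch)
    cy, sy = np.cos(yaw), np.sin(yaw)
    Rz = np.array([[cy, -sy, 0], [sy, cy, 0], [0, 0, 1]])
    Ry = np.array([[cp, 0, sp], [0, 1, 0], [-sp, 0, cp]])
    Rx = np.array([[1, 0, 0], [0, cr, -sr], [0, sr, cr]])
    return Rz @ Ry @ Rx

def leg_lengths(base_pts, plat_pts, translation, rpy):
    """Length of each leg: | R * p_i + t - b_i | for the six attachment pairs."""
    R = rotation_matrix(*rpy)
    t = np.asarray(translation, float)
    return np.linalg.norm((plat_pts @ R.T) + t - base_pts, axis=1)

# Illustrative attachment points (metres): six pairs roughly on two circles.
angles = np.deg2rad([0, 60, 120, 180, 240, 300])
base_pts = np.c_[1.0 * np.cos(angles), 1.0 * np.sin(angles), np.zeros(6)]
plat_pts = np.c_[0.6 * np.cos(angles + 0.3), 0.6 * np.sin(angles + 0.3), np.zeros(6)]

# Pose read from the car avatar in the VR scene (assumed values).
lengths = leg_lengths(base_pts, plat_pts, translation=(0.0, 0.0, 1.2),
                      rpy=(np.deg2rad(2.0), np.deg2rad(-1.5), 0.0))
print(np.round(lengths, 3))  # length commands for the six platform joints
```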
Tal, Aner; Wansink, Brian
2011-01-01
Virtual reality (VR) provides a potentially powerful tool for researchers seeking to investigate eating and physical activity. Some unique conditions are necessary to ensure that the psychological processes that influence real eating behavior also influence behavior in VR environments. Accounting for these conditions is critical if VR-assisted research is to accurately reflect real-world situations. The current work discusses key considerations VR researchers must take into account to ensure similar psychological functioning in virtual and actual reality and does so by focusing on the process of spontaneous mental simulation. Spontaneous mental simulation is prevalent under real-world conditions but may be absent under VR conditions, potentially leading to differences in judgment and behavior between virtual and actual reality. For simulation to occur, the virtual environment must be perceived as being available for action. A useful chart is supplied as a reference to help researchers to investigate eating and physical activity more effectively. PMID:21527088
Tal, Aner; Wansink, Brian
2011-03-01
Virtual reality (VR) provides a potentially powerful tool for researchers seeking to investigate eating and physical activity. Some unique conditions are necessary to ensure that the psychological processes that influence real eating behavior also influence behavior in VR environments. Accounting for these conditions is critical if VR-assisted research is to accurately reflect real-world situations. The current work discusses key considerations VR researchers must take into account to ensure similar psychological functioning in virtual and actual reality and does so by focusing on the process of spontaneous mental simulation. Spontaneous mental simulation is prevalent under real-world conditions but may be absent under VR conditions, potentially leading to differences in judgment and behavior between virtual and actual reality. For simulation to occur, the virtual environment must be perceived as being available for action. A useful chart is supplied as a reference to help researchers to investigate eating and physical activity more effectively. © 2011 Diabetes Technology Society.
Distributed collaborative environments for virtual capability-based planning
NASA Astrophysics Data System (ADS)
McQuay, William K.
2003-09-01
Distributed collaboration is an emerging technology that will significantly change how decisions are made in the 21st century. Collaboration involves two or more geographically dispersed individuals working together to share and exchange data, information, knowledge, and actions. The marriage of information, collaboration, and simulation technologies provides the decision maker with a collaborative virtual environment for planning and decision support. This paper reviews research focused on applying an open-standards, agent-based framework with integrated modeling and simulation to a new Air Force initiative in capability-based planning, and on the ability to implement it in a distributed virtual environment. The Virtual Capability Planning effort will provide decision-quality knowledge for Air Force resource allocation and investment planning, including examination of proposed capabilities and the cost of alternative approaches, the impact of technologies, identification of primary risk drivers, and creation of executable acquisition strategies. The transformed Air Force business processes are enabled by iterative use of constructive and virtual modeling, simulation, and analysis together with information technology. These tools are applied collaboratively via a technical framework by all the affected stakeholders - warfighter, laboratory, product center, logistics center, test center, and primary contractor.
Virtual Representations in 3D Learning Environments
ERIC Educational Resources Information Center
Shonfeld, Miri; Kritz, Miki
2013-01-01
This research explores the extent to which virtual worlds can serve as online collaborative learning environments for students by increasing social presence and engagement. 3D environments enable learning, which simulates face-to-face encounters while retaining the advantages of online learning. Students in Education departments created avatars…
NASA Astrophysics Data System (ADS)
Chen, ChuXin; Trivedi, Mohan M.
1992-03-01
This research is focused on enhancing the overall productivity of an integrated human-robot system. A simulation, animation, visualization, and interactive control (SAVIC) environment has been developed for the design and operation of an integrated robotic manipulator system. This unique system possesses the abilities for multisensor simulation, kinematics and locomotion animation, dynamic motion and manipulation animation, transformation between real and virtual modes within the same graphics system, ease in exchanging software modules and hardware devices between real and virtual world operations, and interfacing with a real robotic system. This paper describes a working system and illustrates the concepts by presenting the simulation, animation, and control methodologies for a unique mobile robot with articulated tracks, a manipulator, and sensory modules.
Using CLIPS to represent knowledge in a VR simulation
NASA Technical Reports Server (NTRS)
Engelberg, Mark L.
1994-01-01
Virtual reality (VR) is an exciting use of advanced hardware and software technologies to achieve an immersive simulation. Until recently, the majority of virtual environments were merely 'fly-throughs' in which a user could freely explore a 3-dimensional world or a visualized dataset. Now that the underlying technologies are reaching a level of maturity, programmers are seeking ways to increase the complexity and interactivity of immersive simulations. In most cases, interactivity in a virtual environment can be specified in the form 'whenever such-and-such happens to object X, it reacts in the following manner.' CLIPS and COOL provide a simple and elegant framework for representing this knowledge-base in an efficient manner that can be extended incrementally. The complexity of a detailed simulation becomes more manageable when the control flow is governed by CLIPS' rule-based inference engine as opposed to by traditional procedural mechanisms. Examples in this paper will illustrate an effective way to represent VR information in CLIPS, and to tie this knowledge base to the input and output C routines of a typical virtual environment.
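The quoted rule pattern ("whenever such-and-such happens to object X, it reacts in the following manner") is the heart of the approach. Purely as an illustration of that pattern, and written in Python rather than CLIPS/COOL syntax, a tiny event-to-reaction dispatcher might look like the sketch below; the object names and reactions are invented.

```python
# Hypothetical Python illustration of the "whenever X happens to object Y,
# react" pattern that the paper encodes as CLIPS rules. Not CLIPS/COOL syntax.
class RuleEngine:
    def __init__(self):
        self.rules = []  # (event_type, object_id, reaction) triples

    def whenever(self, event_type, object_id, reaction):
        self.rules.append((event_type, object_id, reaction))

    def assert_event(self, event_type, object_id, **details):
        """Called by the VR input routines whenever something happens."""
        for ev, obj, reaction in self.rules:
            if ev == event_type and obj == object_id:
                reaction(**details)   # drives the VR output routines

engine = RuleEngine()
engine.whenever("grabbed", "door_handle",
                lambda **d: print("play door-opening animation"))
engine.whenever("collided", "alarm_panel",
                lambda **d: print("sound alarm, flash panel red"))

# Events arriving from the virtual environment's input routines:
engine.assert_event("grabbed", "door_handle", hand="right")
engine.assert_event("collided", "alarm_panel", speed=1.4)
```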
Kim, Hyun K; Park, Jaehyun; Choi, Yeongcheol; Choe, Mungyeong
2018-05-01
This study aims to develop a motion sickness measurement index in a virtual reality (VR) environment. The VR market is in an early stage of market formation and technological development, and thus, research on the side effects of VR devices such as simulator motion sickness is lacking. In this study, we used the simulator sickness questionnaire (SSQ), which has been traditionally used for simulator motion sickness measurement. To measure the motion sickness in a VR environment, 24 users performed target selection tasks using a VR device. The SSQ was administered immediately after each task, and the order of work was determined using the Latin square design. The existing SSQ was revised to develop a VR sickness questionnaire, which is used as the measurement index in a VR environment. In addition, the target selection method and button size were found to be significant factors that affect motion sickness in a VR environment. The results of this study are expected to be used for measuring and designing simulator sickness using VR devices in future studies. Copyright © 2018 Elsevier Ltd. All rights reserved.
Virtual Reality Simulation Training for Ebola Deployment.
Ragazzoni, Luca; Ingrassia, Pier Luigi; Echeverri, Lina; Maccapani, Fabio; Berryman, Lizzy; Burkle, Frederick M; Della Corte, Francesco
2015-10-01
Both virtual and hybrid simulation training offer a realistic and effective educational framework and opportunity to provide virtual exposure to operational public health skills that are essential for infection control and Ebola treatment management. This training is designed to increase staff safety and create a safe and realistic environment where trainees can gain essential basic and advanced skills.
De Leo, Gianluca; Diggs, Leigh A; Radici, Elena; Mastaglio, Thomas W
2014-02-01
Virtual-reality solutions have successfully been used to train distributed teams. This study aimed to investigate the correlation between user characteristics and sense of presence in an online virtual-reality environment where distributed teams are trained. A greater sense of presence has the potential to make training in the virtual environment more effective, leading to the formation of teams that perform better in a real environment. Being able to identify, before starting online training, those user characteristics that are predictors of a greater sense of presence can lead to the selection of trainees who would benefit most from the online simulated training. This is an observational study with a retrospective postsurvey of participants' user characteristics and degree of sense of presence. Twenty-nine members from 3 Air Force National Guard Medical Service expeditionary medical support teams participated in an online virtual environment training exercise and completed the Independent Television Commission-Sense of Presence Inventory survey, which measures sense of presence and user characteristics. Nonparametric statistics were applied to determine the statistical significance of user characteristics to sense of presence. Comparing user characteristics to the 4 scales of the Independent Television Commission-Sense of Presence Inventory using Kendall τ test gave the following results: the user characteristics "how often you play video games" (τ(26)=-0.458, P<0.01) and "television/film production knowledge" (τ(27)=-0.516, P<0.01) were significantly related to negative effects. Negative effects refer to adverse physiologic reactions owing to the virtual environment experience such as dizziness, nausea, headache, and eyestrain. The user characteristic "knowledge of virtual reality" was significantly related to engagement (τ(26)=0.463, P<0.01) and negative effects (τ(26)=-0.404, P<0.05). Individuals who have knowledge about virtual environments and experience with gaming environments report a higher sense of presence that indicates that they will likely benefit more from online virtual training. Future research studies could include a larger population of expeditionary medical support, and the results obtained could be used to create a model that predicts the level of presence based on the user characteristics. To maximize results and minimize costs, only those individuals who, based on their characteristics, are supposed to have a higher sense of presence and less negative effects could be selected for online simulated virtual environment training.
Research on Intelligent Synthesis Environments
NASA Technical Reports Server (NTRS)
Noor, Ahmed K.; Lobeck, William E.
2002-01-01
Four research activities related to Intelligent Synthesis Environment (ISE) have been performed under this grant. The four activities are: 1) non-deterministic approaches that incorporate technologies such as intelligent software agents, visual simulations and other ISE technologies; 2) virtual labs that leverage modeling, simulation and information technologies to create an immersive, highly interactive virtual environment tailored to the needs of researchers and learners; 3) advanced learning modules that incorporate advanced instructional, user interface and intelligent agent technologies; and 4) assessment and continuous improvement of engineering team effectiveness in distributed collaborative environments.
Research on Intelligent Synthesis Environments
NASA Astrophysics Data System (ADS)
Noor, Ahmed K.; Loftin, R. Bowen
2002-12-01
Four research activities related to Intelligent Synthesis Environment (ISE) have been performed under this grant. The four activities are: 1) non-deterministic approaches that incorporate technologies such as intelligent software agents, visual simulations and other ISE technologies; 2) virtual labs that leverage modeling, simulation and information technologies to create an immersive, highly interactive virtual environment tailored to the needs of researchers and learners; 3) advanced learning modules that incorporate advanced instructional, user interface and intelligent agent technologies; and 4) assessment and continuous improvement of engineering team effectiveness in distributed collaborative environments.
Bringing the Real World in: Reflection on Building a Virtual Learning Environment
ERIC Educational Resources Information Center
Mundkur, Anuradha; Ellickson, Cara
2012-01-01
We reflect on translating participatory and experiential learning methodologies into an online teaching environment through a Virtual Learning Environment (VLE) that simulates the "real-world" contexts of international development in order to develop an applied critical understanding of gender analysis and gender mainstreaming. Rather than being…
Validation of virtual reality as a tool to understand and prevent child pedestrian injury.
Schwebel, David C; Gaines, Joanna; Severson, Joan
2008-07-01
In recent years, virtual reality has emerged as an innovative tool for health-related education and training. Among the many benefits of virtual reality is the opportunity for novice users to engage unsupervised in a safe environment when the real environment might be dangerous. Virtual environments are only useful for health-related research, however, if behavior in the virtual world validly matches behavior in the real world. This study was designed to test the validity of an immersive, interactive virtual pedestrian environment. A sample of 102 children and 74 adults was recruited to complete simulated road-crossings in both the virtual environment and the identical real environment. In both the child and adult samples, construct validity was demonstrated via significant correlations between behavior in the virtual and real worlds. Results also indicate construct validity through developmental differences in behavior; convergent validity by showing correlations between parent-reported child temperament and behavior in the virtual world; internal reliability of various measures of pedestrian safety in the virtual world; and face validity, as measured by users' self-reported perception of realism in the virtual world. We discuss issues of generalizability to other virtual environments, and the implications for application of virtual reality to understanding and preventing pediatric pedestrian injuries.
Gutiérrez-Maldonado, José; Ferrer-García, Marta; Caqueo-Urízar, Alejandra; Letosa-Porta, Alex
2006-10-01
The aim of this study was to assess the usefulness of virtual environments representing situations that are emotionally significant to subjects with eating disorders (ED). These environments may be applied with both evaluative and therapeutic aims and in simulation procedures to carry out a range of experimental studies. This paper is part of a wider research project analyzing the influence of the situation to which subjects are exposed on their performance on body image estimation tasks. Thirty female patients with eating disorders were exposed to six virtual environments: a living-room (neutral situation), a kitchen with high-calorie food, a kitchen with low-calorie food, a restaurant with high-calorie food, a restaurant with low-calorie food, and a swimming-pool. After exposure to each environment the STAI-S (a measurement of state anxiety) and the CDB (a measurement of depression) were administered to all subjects. The results show that virtual reality instruments are particularly useful for simulating everyday situations that may provoke emotional reactions such as anxiety and depression, in patients with ED. Virtual environments in which subjects are obliged to ingest high-calorie food provoke the highest levels of state anxiety and depression.
Extending MAM5 Meta-Model and JaCalIVE Framework to Integrate Smart Devices from Real Environments.
Rincon, J A; Poza-Lujan, Jose-Luis; Julian, V; Posadas-Yagüe, Juan-Luis; Carrascosa, C
2016-01-01
This paper presents the extension of a meta-model (MAM5) and a framework based on the model (JaCalIVE) for developing intelligent virtual environments. The goal of this extension is to develop augmented mirror worlds that represent a coupled real and virtual world, so that the virtual world not only reflects the real one, but also complements it. A new component called a smart resource artifact, which enables the modelling and development of devices to access the real physical world, and a human-in-the-loop agent to place a human in the system have been included in the meta-model and framework. The proposed extension of MAM5 has been tested by simulating a light control system where agents can access both virtual and real sensors/actuators through the smart resources developed. The results show that the use of real-environment interactive elements (smart resource artifacts) in agent-based simulations makes it possible to minimize the error between the simulated and the real system.
Extending MAM5 Meta-Model and JaCalIVE Framework to Integrate Smart Devices from Real Environments
2016-01-01
This paper presents the extension of a meta-model (MAM5) and a framework based on the model (JaCalIVE) for developing intelligent virtual environments. The goal of this extension is to develop augmented mirror worlds that represent a coupled real and virtual world, so that the virtual world not only reflects the real one, but also complements it. A new component called a smart resource artifact, which enables the modelling and development of devices to access the real physical world, and a human-in-the-loop agent to place a human in the system have been included in the meta-model and framework. The proposed extension of MAM5 has been tested by simulating a light control system where agents can access both virtual and real sensors/actuators through the smart resources developed. The results show that the use of real-environment interactive elements (smart resource artifacts) in agent-based simulations makes it possible to minimize the error between the simulated and the real system. PMID:26926691
Human Machine Interfaces for Teleoperators and Virtual Environments Conference
NASA Technical Reports Server (NTRS)
1990-01-01
In a teleoperator system the human operator senses, moves within, and operates upon a remote or hazardous environment by means of a slave mechanism (a mechanism often referred to as a teleoperator). In a virtual environment system the interactive human machine interface is retained but the slave mechanism and its environment are replaced by a computer simulation. Video is replaced by computer graphics. The auditory and force sensations imparted to the human operator are similarly computer generated. In contrast to a teleoperator system, where the purpose is to extend the operator's sensorimotor system in a manner that facilitates exploration and manipulation of the physical environment, in a virtual environment system, the purpose is to train, inform, alter, or study the human operator to modify the state of the computer and the information environment. A major application in which the human operator is the target is that of flight simulation. Although flight simulators have been around for more than a decade, they had little impact outside aviation presumably because the application was so specialized and so expensive.
Development of a virtual speaking simulator using Image Based Rendering.
Lee, J M; Kim, H; Oh, M J; Ku, J H; Jang, D P; Kim, I Y; Kim, S I
2002-01-01
The fear of speaking is often cited as the world's most common social phobia. The rapid growth of computer technology has enabled the use of virtual reality (VR) for the treatment of the fear of public speaking. There are two techniques for building virtual environments for the treatment of this fear: a model-based and a movie-based method. Both methods have the weakness that they are unrealistic and cannot be controlled individually. To address these disadvantages, this paper presents a virtual environment produced with Image Based Rendering (IBR) and chroma-keying used together. IBR enables the creation of realistic virtual environments in which images taken from a digital camera are stitched into a panorama, and the use of chroma-keys puts virtual audience members under individual control in the environment. In addition, a real-time capture technique is used in constructing the virtual environments, enabling spoken interaction between the subject and a therapist or another subject.
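The chroma-key step described above is a standard color-mask composite: pixels matching the backdrop color are replaced by the rendered panorama, leaving the filmed audience member under individual control. Below is a generic OpenCV sketch of that step; the file names, image sizes, and HSV thresholds are placeholders, not values from the paper.

```python
# Hypothetical sketch of the chroma-key step: mask the green backdrop behind a
# filmed audience member and composite the cut-out onto an IBR panorama frame.
# File names and HSV thresholds are placeholders; both images are assumed to
# have the same size.
import cv2
import numpy as np

audience = cv2.imread("audience_member.png")     # filmed against a green screen
panorama = cv2.imread("lecture_hall_view.png")   # stitched panorama frame

hsv = cv2.cvtColor(audience, cv2.COLOR_BGR2HSV)
green_lo = np.array([40, 60, 60])    # assumed range for the studio green
green_hi = np.array([80, 255, 255])
backdrop = cv2.inRange(hsv, green_lo, green_hi)   # 255 where the screen is
person = cv2.bitwise_not(backdrop)                # 255 where the person is

# Keep the panorama where the backdrop was, the filmed person everywhere else.
composite = np.where(person[..., None] > 0, audience, panorama)
cv2.imwrite("composited_frame.png", composite)
```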
Digital evaluation of sitting posture comfort in human-vehicle system under Industry 4.0 framework
NASA Astrophysics Data System (ADS)
Tao, Qing; Kang, Jinsheng; Sun, Wenlei; Li, Zhaobo; Huo, Xiao
2016-09-01
Most previous studies on the vibration ride comfort of the human-vehicle system focused on only one or two aspects of the investigation. A hybrid approach is described that integrates several investigation methods across a real environment and a virtual environment. The real experimental environment includes the WBV (whole-body vibration) test, questionnaires on human subjective sensation, and motion capture. The virtual experimental environment includes theoretical calculation on a simplified 5-DOF human body vibration model, vibration simulation and analysis within the ADAMS/Vibration module, and digital human biomechanics and occupational health analysis in Jack software. While the real experimental environment provides realistic and accurate test results, it also serves as the core of, and validation for, the virtual experimental environment. The virtual experimental environment takes full advantage of currently available vibration simulation and digital human modelling software, and makes it possible to evaluate sitting posture comfort in a human-vehicle system for various human anthropometric parameters. How this digital evaluation system for car seat comfort design fits into the Industry 4.0 framework is also discussed.
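To make the "theoretical calculation" step concrete, the sketch below integrates a deliberately simplified base-excited mass-spring-damper model of a seated occupant (a single degree of freedom rather than the paper's 5-DOF model) and reports an RMS acceleration of the kind used in ride-comfort assessment. All parameter values are illustrative assumptions.

```python
# Hypothetical, much-simplified stand-in for the seat-to-body vibration model:
# one mass-spring-damper excited by a sinusoidal seat displacement.
# Parameter values are illustrative only.
import numpy as np
from scipy.integrate import solve_ivp

m, c, k = 60.0, 1500.0, 45000.0      # body mass [kg], damping [N s/m], stiffness [N/m]
freq_hz, amp = 4.0, 0.005            # seat vibration: 4 Hz, 5 mm amplitude

def seat_motion(t):
    w = 2 * np.pi * freq_hz
    return amp * np.sin(w * t), amp * w * np.cos(w * t)   # displacement, velocity

def dynamics(t, y):
    x, v = y                          # body displacement and velocity
    xs, vs = seat_motion(t)
    a = (-k * (x - xs) - c * (v - vs)) / m
    return [v, a]

sol = solve_ivp(dynamics, (0.0, 5.0), [0.0, 0.0], max_step=1e-3)
accel = np.gradient(sol.y[1], sol.t)  # body acceleration over time
print("RMS body acceleration [m/s^2]:", np.sqrt(np.mean(accel**2)))
```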
Transduction between worlds: using virtual and mixed reality for earth and planetary science
NASA Astrophysics Data System (ADS)
Hedley, N.; Lochhead, I.; Aagesen, S.; Lonergan, C. D.; Benoy, N.
2017-12-01
Virtual reality (VR) and augmented reality (AR) have the potential to transform the way we visualize multidimensional geospatial datasets in support of geoscience research, exploration and analysis. The beauty of virtual environments is that they can be built at any scale, users can view them at many levels of abstraction, move through them in unconventional ways, and experience spatial phenomena as if they had superpowers. Similarly, augmented reality allows you to bring the power of virtual 3D data visualizations into everyday spaces. Spliced together, these interface technologies hold incredible potential to support 21st-century geoscience. In my ongoing research, my team and I have made significant advances to connect data and virtual simulations with real geographic spaces, using virtual environments, geospatial augmented reality and mixed reality. These research efforts have yielded new capabilities to connect users with spatial data and phenomena. These innovations include: geospatial x-ray vision; flexible mixed reality; augmented 3D GIS; situated augmented reality 3D simulations of tsunamis and other phenomena interacting with real geomorphology; augmented visual analytics; and immersive GIS. These new modalities redefine the ways in which we can connect digital spaces of spatial analysis, simulation and geovisualization, with geographic spaces of data collection, fieldwork, interpretation and communication. In a way, we are talking about transduction between real and virtual worlds. Taking a mixed reality approach to this, we can link real and virtual worlds. This paper presents a selection of our 3D geovisual interface projects in terrestrial, coastal, underwater and other environments. Using rigorous applied geoscience data, analyses and simulations, our research aims to transform the novelty of virtual and augmented reality interface technologies into game-changing mixed reality geoscience.
Virtual reality simulation: using three-dimensional technology to teach nursing students.
Jenson, Carole E; Forsyth, Diane McNally
2012-06-01
The use of computerized technology is rapidly growing in the classroom and in healthcare. An emerging computer technology strategy for nursing education is the use of virtual reality simulation. This computer-based three-dimensional educational tool simulates real-life patient experiences in a risk-free environment, allows for repeated practice sessions, requires clinical decision making, exposes students to diverse patient conditions, provides immediate feedback, and is portable. The purpose of this article was to review the importance of virtual reality simulation as a computerized teaching strategy. In addition, a project to explore readiness of nursing faculty at one major Midwestern university for the use of virtual reality simulation as a computerized teaching strategy is described where faculty thought virtual reality simulation would increase students' knowledge of an intravenous line insertion procedure. Faculty who practiced intravenous catheter insertion via virtual reality simulation expressed a wide range of learning experiences from using virtual reality simulation that is congruent with the literature regarding the barriers to student learning. Innovative teaching strategies, such as virtual reality simulation, address barriers of increasing patient acuity, high student-to-faculty ratio, patient safety concerns from faculty, and student anxiety and can offer rapid feedback to students.
The Persistent Issue of Simulator Sickness in Naval Aviation Training.
Geyer, Daniel J; Biggs, Adam T
2018-04-01
Virtual simulations offer nearly unlimited training potential for naval aviation due to the wide array of scenarios that can be simulated in a safe, reliable, and cost-effective environment. This versatility has created substantial interest in using existing and emerging virtual technology to enhance training scenarios. However, the virtual simulations themselves may hinder training initiatives by inducing simulator sickness among the trainees, which is a series of symptoms similar to motion sickness that can arise from simulator use. Simulator sickness has been a problem for military aviation since the first simulators were introduced. The problem has also persisted despite the increasing fidelity and sense of immersion offered by new generations of simulators. As such, it is essential to understand the various problems so that trainers can ensure the best possible use of the simulators. This review will examine simulator sickness as it pertains to naval aviation training. Topics include: the prevailing theories on why symptoms develop, methods of measurement, contributing factors, effects on training, effects when used shipboard, aftereffects, countermeasures, and recommendations for future research involving virtual simulations in an aviation training environment.Geyer DJ, Biggs AT. The persistent issue of simulator sickness in naval aviation training. Aerosp Med Hum Perform. 2018; 89(4):396-405.
Novel graphical environment for virtual and real-world operations of tracked mobile manipulators
NASA Astrophysics Data System (ADS)
Chen, ChuXin; Trivedi, Mohan M.; Azam, Mir; Lassiter, Nils T.
1993-08-01
A simulation, animation, visualization and interactive control (SAVIC) environment has been developed for the design and operation of an integrated mobile manipulator system. This unique system possesses the abilities for (1) multi-sensor simulation, (2) kinematics and locomotion animation, (3) dynamic motion and manipulation animation, (4) transformation between real and virtual modes within the same graphics system, (5) ease in exchanging software modules and hardware devices between real and virtual world operations, and (6) interfacing with a real robotic system. This paper describes a working system and illustrates the concepts by presenting the simulation, animation and control methodologies for a unique mobile robot with articulated tracks, a manipulator, and sensory modules.
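Abilities (4) and (5) above amount to driving the same planning and animation modules from either a simulated or a real robot. The following is a minimal, hypothetical sketch of that mode-switching pattern; it is not the SAVIC code, and every class and method name is an assumption.

    # Hypothetical sketch of the mode-switching idea: one planning/animation loop
    # drives either a simulated (virtual) or a real mobile manipulator through a
    # shared interface. All class and method names are illustrative assumptions.
    from abc import ABC, abstractmethod

    class RobotBackend(ABC):
        @abstractmethod
        def send_joint_targets(self, q): ...
        @abstractmethod
        def read_joint_state(self): ...

    class SimulatedRobot(RobotBackend):
        def __init__(self, n_joints=6):
            self.q = [0.0] * n_joints
        def send_joint_targets(self, q):
            self.q = list(q)          # ideal kinematic update drives the animation
        def read_joint_state(self):
            return list(self.q)

    class RealRobot(RobotBackend):
        def send_joint_targets(self, q):
            pass                      # would forward the command to hardware drivers
        def read_joint_state(self):
            return []                 # would read encoders over the robot bus

    def run_cycle(backend, planner):
        """One loop iteration shared by the virtual and real modes."""
        state = backend.read_joint_state()
        backend.send_joint_targets(planner(state))

    planner = lambda q: [qi + 0.01 for qi in q]   # toy planner: nudge every joint
    robot = SimulatedRobot()
    run_cycle(robot, planner)                     # virtual mode; RealRobot() would be the real mode
    print(robot.read_joint_state()[:3])           # [0.01, 0.01, 0.01]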
Virtual environment application with partial gravity simulation
NASA Technical Reports Server (NTRS)
Ray, David M.; Vanchau, Michael N.
1994-01-01
To support manned missions to the surface of Mars and missions requiring manipulation of payloads and locomotion in space, a training facility is required to simulate the conditions of both partial and microgravity. A partial gravity simulator (Pogo) which uses pneumatic suspension is being studied for use in virtual reality training. Pogo maintains a constant partial gravity simulation with a variation of simulated body force between 2.2 and 10 percent, depending on the type of locomotion inputs. This paper is based on the concept and application of a virtual environment system with Pogo, including a head-mounted display and glove. The reality engine consists of a high-end SGI workstation and PCs which drive Pogo's sensors and the data acquisition hardware used for tracking and control. The tracking system is a hybrid of magnetic and optical trackers integrated for this application.
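As a rough back-of-the-envelope illustration (not taken from the paper), the suspension must carry the fraction of body weight above the target gravity level; the sketch below computes that offload force, with the quoted 2.2-10 percent figure then describing how far the realized body force deviates from the ideal during locomotion.

    # Rough, back-of-the-envelope sketch (not from the paper): the pneumatic
    # suspension must carry the fraction of body weight above the target gravity
    # level so that the net downward force matches, e.g., Mars gravity.
    G_EARTH = 9.81    # m/s^2

    def suspension_force(mass_kg, target_g_fraction):
        """Upward force the suspension supplies for a given partial-gravity level."""
        return mass_kg * G_EARTH * (1.0 - target_g_fraction)

    m = 80.0                                   # trainee mass in kg, illustrative
    print(round(suspension_force(m, 0.38), 1)) # ~486.6 N to simulate Mars gravity (0.38 g)
    # The 2.2-10 percent figure quoted above would then describe how far the realized
    # body force deviates from m * G_EARTH * 0.38 during different locomotion inputs.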
Productive confusions: learning from simulations of pandemic virus outbreaks in Second Life
NASA Astrophysics Data System (ADS)
Cárdenas, Micha; Greci, Laura S.; Hurst, Samantha; Garman, Karen; Hoffman, Helene; Huang, Ricky; Gates, Michael; Kho, Kristen; Mehrmand, Elle; Porteous, Todd; Calvitti, Alan; Higginbotham, Erin; Agha, Zia
2011-03-01
Users of immersive virtual reality environments have reported a wide variety of side and after effects including the confusion of characteristics of the real and virtual worlds. Perhaps this side effect of confusing the virtual and real can be turned around to explore the possibilities for immersion with minimal technological support in virtual world group training simulations. This paper will describe observations from my time working as an artist/researcher with the UCSD School of Medicine (SoM) and Veterans Administration San Diego Healthcare System (VASDHS) to develop trainings for nurses, doctors and Hospital Incident Command staff that simulate pandemic virus outbreaks. By examining moments of slippage between realities, both into and out of the virtual environment, moments of the confusion of boundaries between real and virtual, we can better understand methods for creating immersion. I will use the mixing of realities as a transversal line of inquiry, borrowing from virtual reality studies, game studies, and anthropological studies to better understand the mechanisms of immersion in virtual worlds. Focusing on drills conducted in Second Life, I will examine moments of training to learn the software interface, moments within the drill and interviews after the drill.
DHM simulation in virtual environments: a case-study on control room design.
Zamberlan, M; Santos, V; Streit, P; Oliveira, J; Cury, R; Negri, T; Pastura, F; Guimarães, C; Cid, G
2012-01-01
This paper presents the workflow developed for the application of serious games in the design of complex cooperative work settings. The project was based on ergonomic studies and the development of a control room within a participative design process. Our main concerns were the 3D virtual human representation acquired from 3D scanning, human interaction, workspace layout, and equipment designed according to ergonomics standards. Using the Unity3D platform to build the virtual environment, the virtual human model can be controlled by users in a dynamic scenario in order to evaluate the new work settings and simulate work activities. The results showed that this virtual technology can drastically change the design process by improving the level of interaction between final users, managers, and the human factors team.
Smoking cues in a virtual world provoke craving in cigarette smokers.
Baumann, Stephen B; Sayette, Michael A
2006-12-01
Twenty smoking-deprived cigarette smokers participated in a study to test the ability of smoking cues within a virtual world to provoke self-reported craving to smoke. Participants were exposed to 2 virtual-reality simulations displayed on a computer monitor: a control environment not containing any intentional smoking stimuli and a cue-exposure environment containing smoking stimuli. At various points, participants rated their urge to smoke on a scale of 0-100. Results indicated that baseline urge ratings were equivalent in both conditions, but the maximum increase in urge ratings was significantly higher in the cue-exposure environment than in the control environment. This is comparable to what in vivo studies have reported, but with the advantage of simulating more naturalistic and complex settings in a controlled environment. (c) 2006 APA, all rights reserved
Environments for online maritime simulators with cloud computing capabilities
NASA Astrophysics Data System (ADS)
Raicu, Gabriel; Raicu, Alexandra
2016-12-01
This paper presents the cloud computing environments, network principles and methods for graphical development in realistic naval simulation, naval robotics and virtual interactions. The aim of this approach is to achieve a good simulation quality in large networked environments using open source solutions designed for educational purposes. Realistic rendering of maritime environments requires near real-time frameworks with enhanced computing capabilities during distance interactions. E-Navigation concepts coupled with the latest achievements in virtual and augmented reality will enhance the overall experience, leading to new developments and innovations. We address this multiprocessing setting with advanced technologies and distributed applications, including remote ship scenarios and the automation of ship operations.
Using immersive simulation for training first responders for mass casualty incidents.
Wilkerson, William; Avstreih, Dan; Gruppen, Larry; Beier, Klaus-Peter; Woolliscroft, James
2008-11-01
A descriptive study was performed to better understand the possible utility of immersive virtual reality simulation for training first responders in a mass casualty event. Utilizing a virtual reality cave automatic virtual environment (CAVE) and a high-fidelity human patient simulator (HPS), a group of experts modeled a football stadium that experienced a terrorist explosion during a football game. Avatars (virtual patients) were developed by expert consensus that demonstrated a spectrum of injuries ranging from death to minor lacerations. A group of paramedics was assessed by observation for decisions made and actions taken. A critical action checklist was created and used for direct observation and viewing videotaped recordings. Of the 12 participants, only 35.7% identified the type of incident they encountered. None identified a secondary device that was easily visible. All participants were enthusiastic about the simulation and provided valuable comments and insights. Learner feedback and expert performance review suggest that immersive training in a virtual environment has the potential to be a powerful tool to train first responders for high-acuity, low-frequency events, such as a terrorist attack.
Zhou, Y; Murata, T; Defanti, T A
2000-01-01
Despite their attractive properties, networked virtual environments (net-VEs) are notoriously difficult to design, implement, and test due to the concurrency, real-time and networking features in these systems. Net-VEs place high quality-of-service (QoS) requirements on the network to maintain natural, real-time interactions among users. The current practice for net-VE design is basically empirical trial and error, and totally lacks formal methods. This paper proposes to apply a Petri net formal modeling technique to a net-VE, NICE (narrative immersive constructionist/collaborative environment), to predict net-VE performance based on simulation and to improve that performance. NICE is essentially a network of collaborative virtual reality systems called the CAVE (CAVE automatic virtual environment). First, we introduce extended fuzzy-timing Petri net (EFTN) modeling and analysis techniques. Then, we present EFTN models of the CAVE, NICE, and the transport layer protocol used in NICE, the transmission control protocol (TCP). We show the possibility analysis based on the EFTN model of the CAVE. Then, using these models and Design/CPN as the simulation tool, we conducted various simulations to study the real-time behavior, network effects and performance (latencies and jitter) of NICE. Our simulation results are consistent with experimental data.
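For readers unfamiliar with the approach, the sketch below shows a drastically simplified timed Petri net with deterministic delays, only to illustrate how transition timing supports latency estimates; the paper itself uses extended fuzzy-timing Petri nets and Design/CPN, and nothing below reproduces its models.

    # Deliberately simplified timed Petri net with deterministic delays, only to
    # illustrate how transition timing supports latency estimates; the paper uses
    # extended fuzzy-timing Petri nets and Design/CPN, which are not reproduced here.
    import heapq

    def simulate(places, transitions, horizon=1.0):
        """places: dict name -> tokens; transitions: list of (inputs, outputs, delay)."""
        t, pending = 0.0, []
        while t <= horizon:
            for inputs, outputs, delay in transitions:
                if all(places[p] > 0 for p in inputs):         # transition enabled
                    for p in inputs:
                        places[p] -= 1                         # consume input tokens
                    heapq.heappush(pending, (t + delay, tuple(outputs)))
            if not pending:
                break
            t, outputs = heapq.heappop(pending)                # fire after its delay
            for p in outputs:
                places[p] += 1
        return t

    # Toy update pipeline: tracker -> network (TCP) -> render, illustrative delays in seconds
    places = {"tracked": 1, "sent": 0, "rendered": 0}
    net = [(["tracked"], ["sent"], 0.015),
           (["sent"], ["rendered"], 0.040)]
    print(simulate(places, net))   # ~0.055 s end-to-end latency for one state update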
Can virtual reality be used to conduct mass prophylaxis clinic training? A pilot program.
Yellowlees, Peter; Cook, James N; Marks, Shayna L; Wolfe, Daniel; Mangin, Elanor
2008-03-01
To create and evaluate a pilot bioterrorism defense training environment using virtual reality technology. The present pilot project used Second Life, an internet-based virtual world system, to construct a virtual reality environment to mimic an actual setting that might be used as a Strategic National Stockpile (SNS) distribution site for northern California in the event of a bioterrorist attack. Scripted characters were integrated into the system as mock patients to analyze various clinic workflow scenarios. Users tested the virtual environment over two sessions. Thirteen users who toured the environment were asked to complete an evaluation survey. Respondents reported that the virtual reality system was relevant to their practice and had potential as a method of bioterrorism defense training. Computer simulations of bioterrorism defense training scenarios are feasible with existing personal computer technology. The use of internet-connected virtual environments holds promise for bioterrorism defense training. Recommendations are made for public health agencies regarding the implementation and benefits of using virtual reality for mass prophylaxis clinic training.
Constructing Virtual Training Demonstrations
2008-12-01
virtual environments have been shown to be effective for training, and distributed game-based architectures contribute an added benefit of wide... investigation of how a demonstration authoring toolset can be constructed from existing virtual training environments using 3-D multiplayer gaming... intelligent agents project to create AI middleware for simulations and videogames. The result was SimBionic®, which enables users to graphically author
Guidelines for developing distributed virtual environment applications
NASA Astrophysics Data System (ADS)
Stytz, Martin R.; Banks, Sheila B.
1998-08-01
We have conducted a variety of projects that served to investigate the limits of virtual environments and distributed virtual environment (DVE) technology for the military and medical professions. The projects include an application that allows the user to interactively explore a high-fidelity, dynamic scale model of the Solar System and a high-fidelity, photorealistic, rapidly reconfigurable aircraft simulator. Additional projects are a project for observing, analyzing, and understanding the activity in a military distributed virtual environment, a project to develop a distributed threat simulator for training Air Force pilots, a virtual spaceplane to determine user interface requirements for a planned military spaceplane system, and an automated wingman for use in supplementing or replacing human-controlled systems in a DVE. The last two projects are a virtual environment user interface framework and a project for training hospital emergency department personnel. In the process of designing and assembling the DVE applications in support of these projects, we have developed rules of thumb and insights into assembling DVE applications and the environment itself. In this paper, we open with a brief review of the applications that were the source for our insights and then present the lessons learned as a result of these projects. The lessons we have learned fall primarily into five areas. These areas are requirements development, software architecture, human-computer interaction, graphical database modeling, and construction of computer-generated forces.
Virtual Reality in Schools: The Ultimate Educational Technology.
ERIC Educational Resources Information Center
Reid, Robert D.; Sykes, Wylmarie
1999-01-01
Discusses the use of virtual reality as an educational tool. Highlights include examples of virtual reality in public schools that lead to a more active learning process, simulated environments, integrating virtual reality into any curriculum, benefits to teachers and students, and overcoming barriers to implementation. (LRW)
Virtually-augmented interfaces for tactical aircraft.
Haas, M W
1995-05-01
The term Fusion Interface is defined as a class of interface which integrally incorporates both virtual and non-virtual concepts and devices across the visual, auditory and haptic sensory modalities. A fusion interface is a multi-sensory virtually-augmented synthetic environment. A new facility has been developed within the Human Engineering Division of the Armstrong Laboratory dedicated to exploratory development of fusion-interface concepts. One of the virtual concepts to be investigated in the Fusion Interfaces for Tactical Environments facility (FITE) is the application of EEG and other physiological measures for virtual control of functions within the flight environment. FITE is a specialized flight simulator which allows efficient concept development through the use of rapid prototyping followed by direct experience of new fusion concepts. The FITE facility also supports evaluation of fusion concepts by operational fighter pilots in a high fidelity simulated air combat environment. The facility was utilized by a multi-disciplinary team composed of operational pilots, human-factors engineers, electronics engineers, computer scientists, and experimental psychologists to prototype and evaluate the first multi-sensory, virtually-augmented cockpit. The cockpit employed LCD-based head-down displays, a helmet-mounted display, three-dimensionally localized audio displays, and a haptic display. This paper will endeavor to describe the FITE facility architecture, some of the characteristics of the FITE virtual display and control devices, and the potential application of EEG and other physiological measures within the FITE facility.
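One way such physiological control could work, offered purely as a speculative illustration rather than the FITE implementation, is to treat a band-power threshold crossing in the EEG as a discrete virtual switch:

    # Speculative illustration only (not the FITE implementation): estimate alpha-band
    # EEG power and treat a threshold crossing as a discrete "virtual switch".
    import numpy as np

    def band_power(signal, fs, lo=8.0, hi=12.0):
        """Mean spectral power of the signal in the [lo, hi] Hz band (FFT periodogram)."""
        freqs = np.fft.rfftfreq(signal.size, d=1.0 / fs)
        psd = np.abs(np.fft.rfft(signal)) ** 2 / signal.size
        band = (freqs >= lo) & (freqs <= hi)
        return float(psd[band].mean())

    def virtual_switch(signal, fs, threshold):
        """Trigger a cockpit function when band power exceeds a calibrated threshold."""
        return band_power(signal, fs) > threshold

    fs = 256
    t = np.arange(0, 2.0, 1.0 / fs)
    eeg = 20 * np.sin(2 * np.pi * 10 * t) + np.random.normal(0, 5, t.size)  # strong 10 Hz alpha
    print(virtual_switch(eeg, fs, threshold=100.0))   # True: alpha activity flips the virtual switch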
Height effects in real and virtual environments.
Simeonov, Peter I; Hsiao, Hongwei; Dotson, Brian W; Ammons, Douglas E
2005-01-01
The study compared human perceptions of height, danger, and anxiety, as well as skin conductance and heart rate responses and postural instability effects, in real and virtual height environments. The 24 participants (12 men, 12 women), whose average age was 23.6 years, performed "lean-over-the-railing" and standing tasks on real and comparable virtual balconies, using a surround-screen virtual reality (SSVR) system. The results indicate that the virtual display of elevation provided realistic perceptual experience and induced some physiological responses and postural instability effects comparable to those found in a real environment. It appears that a simulation of elevated work environment in a SSVR system, although with reduced visual fidelity, is a valid tool for safety research. Potential applications of this study include the design of virtual environments that will help in safe evaluation of human performance at elevation, identification of risk factors leading to fall incidents, and assessment of new fall prevention strategies.
Distributed virtual environment for emergency medical training
NASA Astrophysics Data System (ADS)
Stytz, Martin R.; Banks, Sheila B.; Garcia, Brian W.; Godsell-Stytz, Gayl M.
1997-07-01
In many professions where individuals must work in a team in a high stress environment to accomplish a time-critical task, individual and team performance can benefit from joint training using distributed virtual environments (DVEs). One professional field that lacks but needs a high-fidelity team training environment is the field of emergency medicine. Currently, emergency department (ED) medical personnel train by using words to create a mental picture of a situation for the physician and staff, who then cooperate to solve the problems portrayed by the word picture. The need in emergency medicine for realistic virtual team training is critical because ED staff typically encounter rarely occurring but life-threatening situations only once in their careers and because ED teams currently have no realistic environment in which to practice their team skills. The resulting lack of experience and teamwork makes diagnosis and treatment more difficult. Virtual environment based training has the potential to redress these shortfalls. The objective of our research is to develop a state-of-the-art virtual environment for emergency medicine team training. The virtual emergency room (VER) allows ED physicians and medical staff to realistically prepare for emergency medical situations by performing triage, diagnosis, and treatment on virtual patients within an environment that provides them with the tools they require and the team environment they need to realistically perform these three tasks. There are several issues that must be addressed before this vision is realized. The key issues deal with distribution of computations; the doctor and staff interface to the virtual patient and ED equipment; the accurate simulation of individual patient organs' response to injury, medication, and treatment; and an accurate modeling of the symptoms and appearance of the patient while maintaining a real-time interaction capability. Our ongoing work addresses all of these issues. In this paper we report on our prototype VER system and its distributed system architecture for emergency medical staff training. The virtual environment enables emergency department physicians and staff to develop their diagnostic and treatment skills using the virtual tools they need to perform diagnostic and treatment tasks. Virtual human imagery and real-time virtual human response are used to create the virtual patient and present a scenario. Patient vital signs are available to the emergency department team as they manage the virtual case. The work reported here consists of the system architectures we developed for the distributed components of the virtual emergency room. The architectures we describe consist of the network level architecture as well as the software architecture for each actor within the virtual emergency room. We describe the role of distributed interactive simulation and other enabling technologies within the virtual emergency room project.
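A distributed virtual emergency room of this kind hinges on every station sharing a consistent view of the virtual patient. The sketch below is an illustrative, DIS-style state-update message, not the authors' protocol; all field names are assumptions.

    # Illustrative, DIS-style state update (not the authors' protocol): each actor
    # broadcasts the virtual patient's state so every station shares one consistent
    # view. All field names are assumptions.
    import json, time
    from dataclasses import dataclass, asdict

    @dataclass
    class PatientStateUpdate:
        patient_id: str
        heart_rate: int
        blood_pressure: tuple   # (systolic, diastolic)
        spo2: float
        timestamp: float

    def encode(update):
        """Serialize one update for broadcast over the simulation network."""
        return json.dumps(asdict(update)).encode("utf-8")

    def apply_update(payload, local_state):
        """Merge a received update into this station's local copy of the patient."""
        msg = json.loads(payload)
        if msg["timestamp"] >= local_state.get("timestamp", 0.0):   # drop stale packets
            local_state.update(msg)

    state = {}
    apply_update(encode(PatientStateUpdate("pt-01", 118, (85, 50), 0.91, time.time())), state)
    print(state["heart_rate"])   # 118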
ERIC Educational Resources Information Center
Tsai, Fu-Hsing
2018-01-01
This study developed a computer-simulated science inquiry environment, called the Science Detective Squad, to engage students in investigating an electricity problem that may happen in daily life. The environment combined the simulation of scientific instruments and a virtual environment, including gamified elements, such as points and a story for…
OR fire virtual training simulator: design and face validity.
Dorozhkin, Denis; Olasky, Jaisa; Jones, Daniel B; Schwaitzberg, Steven D; Jones, Stephanie B; Cao, Caroline G L; Molina, Marcos; Henriques, Steven; Wang, Jinling; Flinn, Jeff; De, Suvranu
2017-09-01
The Virtual Electrosurgical Skill Trainer is a tool for training surgeons the safe operation of electrosurgery tools in both open and minimally invasive surgery. This training includes a dedicated team-training module that focuses on operating room (OR) fire prevention and response. The module was developed to allow trainees, practicing surgeons, anesthesiologists, and nurses to interact with a virtual OR environment, which includes anesthesia apparatus, electrosurgical equipment, a virtual patient, and a fire extinguisher. Wearing a head-mounted display, participants must correctly identify the "fire triangle" elements and then successfully contain an OR fire. Within these virtual reality scenarios, trainees learn to react appropriately to the simulated emergency. A study targeted at establishing the face validity of the virtual OR fire simulator was undertaken at the 2015 Society of American Gastrointestinal and Endoscopic Surgeons conference. Forty-nine subjects with varying experience participated in this Institutional Review Board-approved study. The subjects were asked to complete the OR fire training/prevention sequence in the VEST simulator. Subjects were then asked to answer a subjective preference questionnaire consisting of sixteen questions, focused on the usefulness and fidelity of the simulator. On a 5-point scale, 12 of 13 questions were rated at a mean of 3 or greater (92%). Five questions were rated above 4 (38%), particularly those focusing on the simulator effectiveness and its usefulness in OR fire safety training. A total of 33 of the 49 participants (67%) chose the virtual OR fire trainer over the traditional training methods such as a textbook or an animal model. Training for OR fire emergencies in fully immersive VR environments, such as the VEST trainer, may be the ideal training modality. The face validity of the OR fire training module of the VEST simulator was successfully established on many aspects of the simulation.
Effects on Training Using Illumination in Virtual Environments
NASA Technical Reports Server (NTRS)
Maida, James C.; Novak, M. S. Jennifer; Mueller, Kristian
1999-01-01
Camera-based tasks are commonly performed during orbital operations, and orbital lighting conditions, such as high-contrast shadowing and glare, are a factor in performance. Computer-based training using virtual environments is a common tool used to make and keep CTW members proficient. If computer-based training included some of these harsh lighting conditions, would the crew increase their proficiency? The project goal was to determine whether computer-based training increases proficiency if one trains for a camera-based task using computer-generated virtual environments with enhanced lighting conditions such as shadows and glare rather than the color-shaded computer images normally used in simulators. Previous experiments were conducted using a two-degree-of-freedom docking system. Test subjects had to align a boresight camera using a hand controller spanning the system's two degrees of freedom. Two sets of subjects were trained on two computer simulations using computer-generated virtual environments, one with simulated lighting and one without. Results revealed that when subjects were constrained by time and accuracy, those who trained with simulated lighting conditions performed significantly better than those who did not. To reinforce these results for speed and accuracy, the task complexity was increased.
Lorenz, D; Armbruster, W; Vogelgesang, C; Hoffmann, H; Pattar, A; Schmidt, D; Volk, T; Kubulus, D
2016-09-01
Chief emergency physicians are regarded as an important element in the care of the injured and sick following mass casualty accidents. Their education is very theoretical; practical content, in contrast, often falls short. The limitations are usually the very high costs of realistic (large-scale) exercises, poor reproducibility of the scenarios, and correspondingly poor results. Given the complexity of mass casualty accidents, substantially improving the educational level requires modified training concepts that teach not only the theoretical but above all the practical skills considerably more intensively than at present. Modern training concepts should make it possible for the learner to realistically simulate decision processes. This article examines how interactive virtual environments are applicable for the education of emergency personnel and how they could be designed. Virtual simulation and training environments offer the possibility of simulating complex situations in an adequately realistic manner. The so-called virtual reality (VR) used in this context is an interface technology that enables free interaction in addition to a stereoscopic and spatial representation of virtual large-scale emergencies in a virtual environment. Variables in scenarios, such as the weather, the number of wounded, and the availability of resources, can be changed at any time. The trainees are able to practice the procedures in many virtual accident scenes and act them out repeatedly, thereby testing the different variants. With the aid of the "InSitu" project, it is possible to train in a virtual reality with realistically reproduced accident situations. These integrated, interactive training environments can depict very complex situations on a scale of 1:1. Because of the highly developed interactivity, the trainees can feel as if they are a direct part of the accident scene and therefore identify much more with the virtual world than is possible with desktop systems. Interactive, identifiable, and realistic training environments based on projector systems could in future enable repetitive exercises with changes within a decision tree, with reproducibility, and across different occupational groups. With one hardware and software environment, numerous accident situations can be depicted and practiced. The main expense is the creation of the virtual accident scenes. As the appropriate city models and other three-dimensional geographical data are already available, this expenditure is very low compared with the planning costs of a large-scale exercise.
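The scenario variables mentioned above (weather, number of wounded, available resources) are the kind of parameters such a training environment would expose for repeated, reproducible exercises. The sketch below is illustrative only and does not reflect the InSitu project's actual data model.

    # Illustrative only; these are the kinds of scenario parameters such an
    # environment would expose, not the InSitu project's actual data model.
    from dataclasses import dataclass, replace

    @dataclass(frozen=True)
    class ExerciseScenario:
        weather: str       # e.g. "clear", "rain", "fog"
        casualties: int    # number of wounded avatars
        ambulances: int    # available transport resources
        daylight: bool

    baseline = ExerciseScenario(weather="clear", casualties=20, ambulances=4, daylight=True)
    escalated = replace(baseline, weather="fog", casualties=45, ambulances=3)  # same scene, harder variant
    print(escalated)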
Collaborative voxel-based surgical virtual environments.
Acosta, Eric; Muniz, Gilbert; Armonda, Rocco; Bowyer, Mark; Liu, Alan
2008-01-01
Virtual Reality-based surgical simulators can utilize Collaborative Virtual Environments (C-VEs) to provide team-based training. To support real-time interactions, C-VEs are typically replicated on each user's local computer and a synchronization method helps keep all local copies consistent. This approach does not work well for voxel-based C-VEs since large and frequent volumetric updates make synchronization difficult. This paper describes a method that allows multiple users to interact within a voxel-based C-VE for a craniotomy simulator being developed. Our C-VE method requires smaller update sizes and provides faster synchronization update rates than volumetric-based methods. Additionally, we address network bandwidth/latency issues to simulate networked haptic and bone drilling tool interactions with a voxel-based skull C-VE.
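The general idea of keeping voxel updates small can be sketched as follows: send only the indices and new values of voxels the tool changed, rather than the whole volume. This is a hedged illustration of the principle, not the authors' implementation.

    # Hedged sketch of the general principle (not the authors' implementation):
    # transmit only the voxels the drilling tool changed instead of the full volume.
    import numpy as np

    def voxel_delta(before, after):
        """Return (flat indices, new values) for every voxel that changed."""
        changed = np.flatnonzero(before != after)
        return changed, after.ravel()[changed]

    def apply_delta(volume, indices, values):
        """Apply a received delta to this user's local copy of the volume."""
        volume.ravel()[indices] = values

    skull = np.ones((64, 64, 64), dtype=np.uint8)   # toy density volume
    drilled = skull.copy()
    drilled[30:32, 30:32, 30:32] = 0                # the tool removed a small region

    idx, vals = voxel_delta(skull, drilled)
    apply_delta(skull, idx, vals)                   # the remote copy now matches
    print(idx.size, np.array_equal(skull, drilled)) # 8 True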
Grasping trajectories in a virtual environment adhere to Weber's law.
Ozana, Aviad; Berman, Sigal; Ganel, Tzvi
2018-06-01
Virtual-reality and telerobotic devices simulate local motor control of virtual objects within computerized environments. Here, we explored grasping kinematics within a virtual environment and tested whether, as in normal 3D grasping, trajectories in the virtual environment are performed analytically, violating Weber's law with respect to the object's size. Participants were asked to grasp a series of 2D objects using a haptic system, which projected their movements to a virtual space presented on a computer screen. The apparatus also provided object-specific haptic information upon "touching" the edges of the virtual targets. The results showed that grasping movements performed within the virtual environment did not produce the typical analytical trajectory pattern obtained during 3D grasping. Unlike in 3D grasping, grasping trajectories in the virtual environment adhered to Weber's law, which indicates relative resolution in size processing. In addition, the trajectory patterns differed from typical trajectories obtained during 3D grasping, with longer times to complete the movement, and with maximum grip apertures appearing relatively early in the movement. The results suggest that grasping movements within a virtual environment can differ from those performed in real space and are subject to irrelevant effects of perceptual information. Such an atypical pattern of visuomotor control may be mediated by the lack of complete transparency between the interface and the virtual environment in terms of the provided visual and haptic feedback. Possible implications of the findings for movement control within robotic and virtual environments are further discussed.
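Adherence to Weber's law is typically assessed by checking whether the trial-to-trial variability of the maximum grip aperture grows in proportion to object size. The sketch below illustrates that analysis on toy numbers; it is not the authors' code or data.

    # Illustrative analysis on toy numbers (not the authors' code or data): under
    # Weber's law the trial-to-trial variability of the maximum grip aperture (MGA)
    # grows roughly in proportion to object size.
    import numpy as np

    def mga(aperture_trace):
        """Maximum grip aperture of one trial's thumb-index aperture trace (mm)."""
        return float(np.max(aperture_trace))

    def weber_fraction(object_sizes_mm, mga_sd_mm):
        """Slope of MGA variability against object size; roughly constant under Weber's law."""
        slope, _ = np.polyfit(object_sizes_mm, mga_sd_mm, 1)
        return slope

    sizes = np.array([20.0, 40.0, 60.0])   # object widths
    sd_mga = np.array([1.1, 2.0, 3.1])     # variability increasing with size, as in the virtual setup
    print(round(weber_fraction(sizes, sd_mga), 3))   # ~0.05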
Virtual Reality: Emerging Applications and Future Directions
ERIC Educational Resources Information Center
Ludlow, Barbara L.
2015-01-01
Virtual reality is an emerging technology that has resulted in rapid expansion in the development of virtual immersive environments for use as educational simulations in schools, colleges and universities. This article presents an overview of virtual reality, describes a number of applications currently being used by special educators for…
Cognitive evaluation by tasks in a virtual reality environment in multiple sclerosis.
Lamargue-Hamel, Delphine; Deloire, Mathilde; Saubusse, Aurore; Ruet, Aurélie; Taillard, Jacques; Philip, Pierre; Brochet, Bruno
2015-12-15
The assessment of cognitive impairment in multiple sclerosis (MS) requires large neuropsychological batteries that assess numerous domains. The relevance of these assessments to daily cognitive functioning is not well established. Cognitive ecological evaluation has not been frequently studied in MS. The aim of this study was to determine the value of cognitive evaluation in a virtual reality environment in a sample of persons with MS with cognitive deficits. Thirty persons with MS with at least moderate cognitive impairment were assessed with two ecological evaluations: an in-house developed task in a virtual reality environment (Urban DailyCog®) and a divided attention task in a driving simulator. Classical neuropsychological testing was also used. Fifty-two percent of the persons with MS failed the driving simulator task and 80% failed the Urban DailyCog®. Virtual reality assessments are promising in identifying cognitive impairment in MS. Copyright © 2015 Elsevier B.V. All rights reserved.
Headphone and Head-Mounted Visual Displays for Virtual Environments
NASA Technical Reports Server (NTRS)
Begault, Durand R.; Ellis, Stephen R.; Wenzel, Elizabeth M.; Trejo, Leonard J. (Technical Monitor)
1998-01-01
A realistic auditory environment can contribute to both the overall subjective sense of presence in a virtual display, and to a quantitative metric predicting human performance. Here, the role of audio in a virtual display and the importance of auditory-visual interaction are examined. Conjectures are proposed regarding the effectiveness of audio compared to visual information for creating a sensation of immersion, the frame of reference within a virtual display, and the compensation of visual fidelity by supplying auditory information. Future areas of research are outlined for improving simulations of virtual visual and acoustic spaces. This paper will describe some of the intersensory phenomena that arise during operator interaction within combined visual and auditory virtual environments. Conjectures regarding audio-visual interaction will be proposed.
Development of a virtual flight simulator.
Kuntz Rangel, Rodrigo; Guimarães, Lamartine N F; de Assis Correa, Francisco
2002-10-01
We present the development of a flight simulator that allows the user to interact in a created environment by means of virtual reality devices. This environment simulates the view of a pilot in an airplane cockpit. The environment is projected in a helmet visor and allows the pilot to see inside as well as outside the cockpit. The movement of the airplane is independent of the movement of the pilot's head, which means that the airplane might travel in one direction while the pilot is looking at a 30-degree angle with respect to the direction of travel. In this environment, the pilot will be able to take off, fly, and land the airplane. So far, the objects in the environment are geometrical figures. This is an ongoing project, and only partial results are available now.
ERIC Educational Resources Information Center
Huang, Hsiu-Mei; Rauch, Ulrich; Liaw, Shu-Sheng
2010-01-01
The use of animation and multimedia for learning is now further extended by the provision of entire Virtual Reality Learning Environments (VRLE). This highlights a shift in Web-based learning from conventional multimedia to a more immersive, interactive, intuitive and exciting VR learning environment. VRLEs simulate the real world through the…
Korayem, Moharam Habibnejad; Hoshiar, Ali Kafash; Ghofrani, Maedeh
2017-08-01
With the expansion of nanotechnology, robots based on the atomic force microscope (AFM) have been widely used as effective tools for displacing nanoparticles and constructing nanostructures. One of the most limiting factors in AFM-based manipulation procedures is the inability to simultaneously observe the controlled pushing and displacement of nanoparticles while performing the operation. To deal with this limitation, a virtual reality environment has been used in this paper for observing the manipulation operation. In the simulations performed in this paper, first, the images acquired by the atomic force microscope have been processed and the positions and dimensions of the nanoparticles have been determined. Then, by dynamically modelling the transfer of nanoparticles and simulating the critical force-time diagrams, a controlled displacement of nanoparticles has been accomplished. The simulations have been further developed for the use of rectangular, V-shape and dagger-shape cantilevers. The established virtual reality environment has made it possible to simulate the manipulation of biological particles in a liquid medium. Copyright © 2017 Elsevier Inc. All rights reserved.
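The first step described above, locating nanoparticles in the AFM image, can be illustrated with simple thresholding and connected-component labelling; this is a hedged stand-in, not the authors' actual image-processing pipeline.

    # Hedged stand-in for the particle-localization step (the paper's actual
    # pipeline is not shown): threshold the AFM height image and label blobs.
    import numpy as np
    from scipy import ndimage

    def find_particles(height_map, threshold):
        """Return centre-of-mass positions and pixel areas of above-threshold blobs."""
        mask = height_map > threshold
        labels, n = ndimage.label(mask)
        index = list(range(1, n + 1))
        centers = ndimage.center_of_mass(mask, labels, index)
        areas = ndimage.sum(mask, labels, index)
        return centers, areas

    img = np.zeros((64, 64))
    img[10:14, 10:14] = 5.0      # synthetic 4x4-pixel particle
    img[40:43, 50:53] = 4.0      # synthetic 3x3-pixel particle
    centers, areas = find_particles(img, threshold=1.0)
    print(centers, areas)        # two blobs, centred near (11.5, 11.5) and (41.0, 51.0)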
Virtual environment display for a 3D audio room simulation
NASA Astrophysics Data System (ADS)
Chapin, William L.; Foster, Scott
1992-06-01
Recent developments in virtual 3D audio and synthetic aural environments have produced a complex acoustical room simulation. The acoustical simulation models a room with walls, ceiling, and floor of selected sound reflecting/absorbing characteristics and unlimited independent localizable sound sources. This non-visual acoustic simulation, implemented with four audio Convolvotrons™ by Crystal River Engineering and coupled to the listener with a Polhemus Isotrak™ tracking the listener's head position and orientation, and stereo headphones returning binaural sound, is quite compelling to most listeners with eyes closed. This immersive effect should be reinforced when properly integrated into a full, multi-sensory virtual environment presentation. This paper discusses the design of an interactive, visual virtual environment, complementing the acoustic model and specified to: 1) allow the listener to freely move about the space, a room of manipulable size, shape, and audio character, while interactively relocating the sound sources; 2) reinforce the listener's feeling of telepresence in the acoustical environment with visual and proprioceptive sensations; 3) enhance the audio with the graphic and interactive components, rather than overwhelm or reduce it; and 4) serve as a research testbed and technology transfer demonstration. The hardware/software design of two demonstration systems, one installed and one portable, is discussed through the development of four iterative configurations. The installed system implements a head-coupled, wide-angle, stereo-optic tracker/viewer and multi-computer simulation control. The portable demonstration system implements a head-mounted wide-angle, stereo-optic display, separate head and pointer electro-magnetic position trackers, a heterogeneous parallel graphics processing system, and object-oriented C++ program code.
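As a toy stand-in for the head-tracked spatialization described above (the real system convolves each source with head-related transfer functions on Convolvotron hardware), one can compute the source azimuth relative to the tracked head and derive a crude interaural time difference for headphone playback:

    # Toy stand-in for head-tracked spatialization (the real system convolves each
    # source with head-related transfer functions on Convolvotron hardware): compute
    # the source azimuth relative to the tracked head and a crude interaural time
    # difference (ITD) for headphone playback.
    import math

    HEAD_RADIUS = 0.0875     # metres, average adult head
    SPEED_OF_SOUND = 343.0   # m/s

    def source_azimuth(head_pos, head_yaw_rad, source_pos):
        """Azimuth of the source in head coordinates (0 = straight ahead)."""
        dx, dy = source_pos[0] - head_pos[0], source_pos[1] - head_pos[1]
        return math.atan2(dy, dx) - head_yaw_rad

    def itd_seconds(azimuth_rad):
        """Woodworth's spherical-head approximation of the interaural time difference."""
        a = math.asin(max(-1.0, min(1.0, math.sin(azimuth_rad))))   # fold into [-pi/2, pi/2]
        return (HEAD_RADIUS / SPEED_OF_SOUND) * (a + math.sin(a))

    az = source_azimuth((0.0, 0.0), 0.0, (1.0, 1.0))      # source 45 degrees off the forward axis
    print(round(itd_seconds(az) * 1e6), "microseconds")   # ~381 us lead at the nearer ear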
Virtual odors to transmit emotions in virtual agents
NASA Astrophysics Data System (ADS)
Delgado-Mata, Carlos; Aylett, Ruth
2003-04-01
In this paper we describe an emotional-behavioral architecture. The emotional engine sits at a higher layer than the behavior system and can alter behavior patterns. The engine is designed to simulate emotionally intelligent agents in a virtual environment, where each agent senses its own emotions and other creatures' emotions through a virtual smell sensor, senses obstacles and other moving creatures in the environment, and reacts to them. The architecture consists of an emotion engine, a behavior synthesis system, a motor layer and a library of sensors.
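A minimal sketch of such a layered agent, with an emotion engine sitting above the behaviour layer and a virtual smell sensor reading other creatures' broadcast emotions, might look as follows; all names are illustrative and none of this is the authors' code.

    # Minimal layered-agent sketch; all names are illustrative and none of this is
    # the authors' code. The emotion engine sits above the behaviour layer and a
    # virtual smell sensor reads emotions broadcast by nearby creatures.
    import math

    class VirtualSmellSensor:
        def sense(self, agent, others, radius=5.0):
            """Return the fear levels of creatures within smelling range."""
            near = [o for o in others if math.dist(agent.pos, o.pos) <= radius]
            return [o.emotions["fear"] for o in near]

    class EmotionEngine:
        def update(self, agent, smelled_fear):
            # Fear is "contagious": partially adopt the strongest smelled fear.
            if smelled_fear:
                agent.emotions["fear"] = max(agent.emotions["fear"], 0.8 * max(smelled_fear))

    class Agent:
        def __init__(self, pos):
            self.pos = pos
            self.emotions = {"fear": 0.0}
        def behave(self):
            # Behaviour layer: the emotional state alters which pattern is selected.
            return "flee" if self.emotions["fear"] > 0.5 else "graze"

    flock = [Agent((0, 0)), Agent((1, 1))]
    flock[1].emotions["fear"] = 1.0
    EmotionEngine().update(flock[0], VirtualSmellSensor().sense(flock[0], flock[1:]))
    print(flock[0].behave())   # "flee": fear propagated through the virtual smell sensor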
Virtual Glovebox (VGX) Aids Astronauts in Pre-Flight Training
NASA Technical Reports Server (NTRS)
2003-01-01
NASA's Virtual Glovebox (VGX) was developed to allow astronauts on Earth to train for complex biology research tasks in space. The astronauts may reach into the virtual environment, naturally manipulating specimens, tools, equipment, and accessories in a simulated microgravity environment as they would do in space. Such virtual reality technology also provides engineers and space operations staff with rapid prototyping, planning, and human performance modeling capabilities. Other Earth based applications being explored for this technology include biomedical procedural training and training for disarming bio-terrorism weapons.
Virtual Control Systems Environment (VCSE)
Atkins, Will
2018-02-14
Will Atkins, a Sandia National Laboratories computer engineer discusses cybersecurity research work for process control systems. Will explains his work on the Virtual Control Systems Environment project to develop a modeling and simulation framework of the U.S. electric grid in order to study and mitigate possible cyberattacks on infrastructure.
NASA Astrophysics Data System (ADS)
Kobayashi, Hayato; Osaki, Tsugutoyo; Okuyama, Tetsuro; Gramm, Joshua; Ishino, Akira; Shinohara, Ayumi
This paper describes an interactive experimental environment for autonomous soccer robots, which is a soccer field augmented by utilizing camera input and projector output. This environment, in a sense, plays an intermediate role between simulated environments and real environments. We can simulate some parts of real environments, e.g., real objects such as robots or a ball, and reflect simulated data into the real environments, e.g., to visualize positions on the field, so as to create a situation that allows easy debugging of robot programs. The significant point compared with analogous work is that virtual objects are touchable in this system owing to the projectors. We also show a portable version of our system that does not require ceiling cameras. As an application in the augmented environment, we address the learning of goalie strategies on real quadruped robots in penalty kicks. We make our robots use virtual balls so that only the quadruped locomotion, which is quite difficult to simulate accurately, is performed in the real environment. Our robots autonomously learn and acquire more beneficial strategies, without human intervention, in our augmented environment than they do in a fully simulated environment.
NASA Technical Reports Server (NTRS)
Smith, Jeffrey
2003-01-01
The Bio-Visualization, Imaging and Simulation (BioVIS) Technology Center at NASA's Ames Research Center is dedicated to developing and applying advanced visualization, computation and simulation technologies to support NASA Space Life Sciences research and the objectives of the Fundamental Biology Program. Research ranges from high resolution 3D cell imaging and structure analysis, virtual environment simulation of fine sensory-motor tasks, computational neuroscience and biophysics to biomedical/clinical applications. Computer simulation research focuses on the development of advanced computational tools for astronaut training and education. Virtual Reality (VR) and Virtual Environment (VE) simulation systems have become important training tools in many fields from flight simulation to, more recently, surgical simulation. The type and quality of training provided by these computer-based tools ranges widely, but the value of real-time VE computer simulation as a method of preparing individuals for real-world tasks is well established. Astronauts routinely use VE systems for various training tasks, including Space Shuttle landings, robot arm manipulations and extravehicular activities (space walks). Currently, there are no VE systems to train astronauts for basic and applied research experiments which are an important part of many missions. The Virtual Glovebox (VGX) is a prototype VE system for real-time physically-based simulation of the Life Sciences Glovebox where astronauts will perform many complex tasks supporting research experiments aboard the International Space Station. The VGX consists of a physical display system utilizing dual LCD projectors and circular polarization to produce a desktop-sized 3D virtual workspace. Physically-based modeling tools (Arachi Inc.) provide real-time collision detection, rigid body dynamics, physical properties and force-based controls for objects. The human-computer interface consists of two magnetic tracking devices (Ascension Inc.) attached to instrumented gloves (Immersion Inc.), which co-locate the user's hands with hand/forearm representations in the virtual workspace. Force-feedback is possible in a work volume defined by a Phantom Desktop device (SensAble Inc.). Graphics are written in OpenGL. The system runs on a 2.2 GHz Pentium 4 PC. The prototype VGX provides astronauts and support personnel with a real-time physically-based VE system to simulate basic research tasks both on Earth and in the microgravity of Space. The immersive virtual environment of the VGX also makes it a useful tool for virtual engineering applications including CAD development, procedure design and simulation of human-system interaction in a desktop-sized work volume.
NASA Technical Reports Server (NTRS)
Lehmer, R.; Ingram, C.; Jovic, S.; Alderete, J.; Brown, D.; Carpenter, D.; LaForce, S.; Panda, R.; Walker, J.; Chaplin, P.;
2006-01-01
The Virtual Airspace Simulation Technology - Real-Time (VAST-RT) Project, an element of NASA's Virtual Airspace Modeling and Simulation (VAMS) Project, has been developing a distributed simulation capability that supports an extensible and expandable real-time, human-in-the-loop airspace simulation environment. The VAST-RT system architecture is based on the DoD High Level Architecture (HLA) and the VAST-RT HLA Toolbox, a common interface implementation that incorporates a number of novel design features. The scope of the initial VAST-RT integration activity (Capability 1) included the high-fidelity human-in-the-loop simulation facilities located at NASA/Ames Research Center and medium-fidelity pseudo-piloted target generators, such as the Airspace Traffic Generator (ATG) being developed as part of VAST-RT, as well as other real-time tools. This capability has been demonstrated in a gate-to-gate simulation. VAST-RT's Capability 2A has recently been completed, and this paper will discuss the improved integration of the real-time assets into VAST-RT, including the development of tools to integrate data collected across the simulation environment into a single data set for the researcher. Current plans for the completion of the VAST-RT distributed simulation environment (Capability 2B) and its use to evaluate future airspace capacity-enhancing concepts being developed by VAMS will be discussed. Additionally, the simulation environment's application to other airspace and airport research projects is addressed.
Cooper, Natalia; Milella, Ferdinando; Pinto, Carlo; Cant, Iain; White, Mark; Meyer, Georg
2018-01-01
Objective and subjective measures of performance in virtual reality environments increase as more sensory cues are delivered and as simulation fidelity increases. Some cues (colour or sound) are easier to present than others (object weight, vestibular cues) so that substitute cues can be used to enhance informational content in a simulation at the expense of simulation fidelity. This study evaluates how substituting cues in one modality by alternative cues in another modality affects subjective and objective performance measures in a highly immersive virtual reality environment. Participants performed a wheel change in a virtual reality (VR) environment. Auditory, haptic and visual cues, signalling critical events in the simulation, were manipulated in a factorial design. Subjective ratings were recorded via questionnaires. The time taken to complete the task was used as an objective performance measure. The results show that participants performed best and felt an increased sense of immersion and involvement, collectively referred to as 'presence', when substitute multimodal sensory feedback was provided. Significant main effects of audio and tactile cues on task performance and on participants' subjective ratings were found. A significant negative relationship was found between the objective (overall completion times) and subjective (ratings of presence) performance measures. We conclude that increasing informational content, even if it disrupts fidelity, enhances performance and user's overall experience. On this basis we advocate the use of substitute cues in VR environments as an efficient method to enhance performance and user experience.
Virtual operating room for team training in surgery.
Abelson, Jonathan S; Silverman, Elliott; Banfelder, Jason; Naides, Alexandra; Costa, Ricardo; Dakin, Gregory
2015-09-01
We proposed to develop a novel virtual reality (VR) team training system. The objective of this study was to determine the feasibility of creating a VR operating room to simulate a surgical crisis scenario and evaluate the simulator for construct and face validity. We modified ICE STORM (Integrated Clinical Environment; Systems, Training, Operations, Research, Methods), a VR-based system capable of modeling a variety of health care personnel and environments. ICE STORM was used to simulate a standardized surgical crisis scenario, whereby participants needed to correct 4 elements responsible for loss of laparoscopic visualization. The construct and face validity of the environment were measured. Thirty-three participants completed the VR simulation. Attendings completed the simulation in less time than trainees (271 vs 201 seconds, P = .032). Participants felt the training environment was realistic and had a favorable impression of the simulation. All participants felt the workload of the simulation was low. Creation of a VR-based operating room for team training in surgery is feasible and can afford a realistic team training environment. Copyright © 2015 Elsevier Inc. All rights reserved.
Validation of smoking-related virtual environments for cue exposure therapy.
García-Rodríguez, Olaya; Pericot-Valverde, Irene; Gutiérrez-Maldonado, José; Ferrer-García, Marta; Secades-Villa, Roberto
2012-06-01
Craving is considered one of the main factors responsible for relapse after smoking cessation. Cue exposure therapy (CET) consists of controlled and repeated exposure to drug-related stimuli in order to extinguish associated responses. The main objective of this study was to assess the validity of 7 virtual reality environments for producing craving in smokers that can be used within the CET paradigm. Forty-six smokers and 44 never-smokers were exposed to 7 complex virtual environments with smoking-related cues that reproduce typical situations in which people smoke, and to a neutral virtual environment without smoking cues. Self-reported subjective craving and psychophysiological measures were recorded during the exposure. All virtual environments with smoking-related cues were able to generate subjective craving in smokers, while no increase was observed for the neutral environment. The most sensitive psychophysiological variable to craving increases was heart rate. The findings provide evidence of the utility of virtual reality for simulating real situations capable of eliciting craving. We also discuss how CET for smoking cessation can be improved through these virtual tools. Copyright © 2012 Elsevier Ltd. All rights reserved.
Bombari, Dario; Schmid Mast, Marianne; Canadas, Elena; Bachmann, Manuel
2015-01-01
The goal of the present review is to explain how immersive virtual environment technology (IVET) can be used for the study of social interactions and how the use of virtual humans in immersive virtual environments can advance research and application in many different fields. Researchers studying individual differences in social interactions are typically interested in keeping the behavior and the appearance of the interaction partner constant across participants. With IVET researchers have full control over the interaction partners, can standardize them while still keeping the simulation realistic. Virtual simulations are valid: growing evidence shows that indeed studies conducted with IVET can replicate some well-known findings of social psychology. Moreover, IVET allows researchers to subtly manipulate characteristics of the environment (e.g., visual cues to prime participants) or of the social partner (e.g., his/her race) to investigate their influences on participants' behavior and cognition. Furthermore, manipulations that would be difficult or impossible in real life (e.g., changing participants' height) can be easily obtained with IVET. Beside the advantages for theoretical research, we explore the most recent training and clinical applications of IVET, its integration with other technologies (e.g., social sensing) and future challenges for researchers (e.g., making the communication between virtual humans and participants smoother).
Mission Simulation Facility: Simulation Support for Autonomy Development
NASA Technical Reports Server (NTRS)
Pisanich, Greg; Plice, Laura; Neukom, Christian; Flueckiger, Lorenzo; Wagner, Michael
2003-01-01
The Mission Simulation Facility (MSF) supports research in autonomy technology for planetary exploration vehicles. Using HLA (High Level Architecture) across distributed computers, the MSF connects users' autonomy algorithms with provided or third-party simulations of robotic vehicles and planetary surface environments, including onboard components and scientific instruments. Simulation fidelity is variable to meet changing needs as autonomy technology advances in Technology Readiness Level (TRL). A virtual robot operating in a virtual environment offers numerous advantages over actual hardware, including availability, simplicity, and risk mitigation. The MSF is in use by researchers at NASA Ames Research Center (ARC) and has demonstrated basic functionality. Continuing work will support the needs of a broader user base.
NASA Technical Reports Server (NTRS)
Jex, Henry R.
1991-01-01
A review is given of a wide range of simulations in which operator steering control of a vehicle is involved, and the dominant cues, closed-loop bandwidth, measured operator effective time delay, and ratio of bandwidth to inverse delay are summarized. A correlation of kinetosis with dynamic scene field-of-view is shown. The use of moving-base simulators to improve the validity of locomotion teleoperations is discussed. Some rules of thumb for good 'feel-system' simulation, such as for control manipulanda, are given. Finally, simulation tests of teleoperators and virtual environments should include three types of measures: system performance, operator (or robot) 'behavior', and mental workload evaluations.
A haptic interface for virtual simulation of endoscopic surgery.
Rosenberg, L B; Stredney, D
1996-01-01
Virtual reality can be described as a convincingly realistic and naturally interactive simulation in which the user is given a first-person illusion of being immersed within a computer-generated environment. While virtual reality systems offer great potential to reduce the cost and increase the quality of medical training, many technical challenges must be overcome before such simulation platforms offer effective alternatives to more traditional training means. A primary challenge in developing effective virtual reality systems is designing the human interface hardware which allows rich sensory information to be presented to users in natural ways. When simulating a given manual procedure, task-specific human interface requirements dictate task-specific human interface hardware. The following paper explores the design of human interface hardware that satisfies the task-specific requirements of virtual reality simulation of endoscopic surgical procedures. Design parameters were derived through direct cadaver studies and interviews with surgeons. The final hardware design is presented.
ERIC Educational Resources Information Center
Short, Daniel
2016-01-01
The tragedy of the commons is one of the principal tenets of ecology. Recent developments in experiential computer-based simulation of the tragedy of the commons are described. A virtual learning environment is developed using the popular video game "Minecraft". The virtual learning environment is used to experience first-hand depletion…
ERIC Educational Resources Information Center
Kim, Heesung; Ke, Fengfeng
2016-01-01
The pedagogical and design considerations for the use of a virtual reality (VR) learning environment are important for prospective and current teachers. However, empirical research investigating how preservice teachers interact with transformative content representation, facilitation, and learning activities in a VR educational simulation is still…
Global Village as Virtual Community (On Writing, Thinking, and Teacher Education).
ERIC Educational Resources Information Center
Polin, Linda
1993-01-01
Describes virtual communities known as Multi-User Simulated Environment (MUSE) or Multi-User Object Oriented environment (MOO): text-based computer "communities" whose inhabitants are a combination of real people and constructed objects that people agree to treat as real. Describes their uses in the classroom. (SR)
Education about Hallucinations Using an Internet Virtual Reality System: A Qualitative Survey
ERIC Educational Resources Information Center
Yellowlees, Peter M.; Cook, James N.
2006-01-01
Objective: The authors evaluate an Internet virtual reality technology as an education tool about the hallucinations of psychosis. Method: This is a pilot project using Second Life, an Internet-based virtual reality system, in which a virtual reality environment was constructed to simulate the auditory and visual hallucinations of two patients…
Evaluation of glucose controllers in virtual environment: methodology and sample application.
Chassin, Ludovic J; Wilinska, Malgorzata E; Hovorka, Roman
2004-11-01
Adaptive systems to deliver medical treatment in humans are safety-critical systems and require particular care in both the testing and the evaluation phase, which are time-consuming, costly, and confounded by ethical issues. The objective of the present work is to develop a methodology to test glucose controllers of an artificial pancreas in a simulated (virtual) environment. A virtual environment comprising a model of the carbohydrate metabolism and models of the insulin pump and the glucose sensor is employed to simulate individual glucose excursions in subjects with type 1 diabetes. The performance of the control algorithm within the virtual environment is evaluated by considering treatment and operational scenarios. The developed methodology includes two dimensions: testing in relation to specific life style conditions, i.e. fasting, post-prandial, and life style (metabolic) disturbances; and testing in relation to various operating conditions, i.e. expected operating conditions, adverse operating conditions, and system failure. We define safety and efficacy criteria and describe the measures to be taken prior to clinical testing. The use of the methodology is exemplified by tuning and evaluating a model predictive glucose controller being developed for a wearable artificial pancreas focused on fasting conditions. Our methodology to test glucose controllers in a virtual environment is instrumental in anticipating the results of real clinical tests for different physiological conditions and for different operating conditions. The thorough testing in the virtual environment reduces costs and speeds up the development process.
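As a rough illustration of the two-dimensional test matrix described above (life-style conditions crossed with operating conditions), the sketch below enumerates the scenarios and computes two simple safety/efficacy measures from a glucose trace. The function names, thresholds, and metrics are illustrative assumptions, not the authors' implementation.

```python
# Illustrative sketch only (not the authors' code): crossing the life-style and
# operating conditions named in the abstract into evaluation scenarios, plus two
# simple safety/efficacy measures computed from a simulated glucose trace.
from itertools import product

LIFESTYLE_CONDITIONS = ["fasting", "post-prandial", "metabolic_disturbance"]
OPERATING_CONDITIONS = ["expected", "adverse", "system_failure"]

def build_scenarios():
    """One evaluation scenario per (life-style, operating) combination."""
    return [{"lifestyle": ls, "operating": op}
            for ls, op in product(LIFESTYLE_CONDITIONS, OPERATING_CONDITIONS)]

def assess(glucose_mmol_per_l):
    """Example criteria (assumed thresholds): fraction of samples in the
    3.9-10.0 mmol/L target range and number of hypoglycaemic samples (<3.0)."""
    n = len(glucose_mmol_per_l)
    in_range = sum(3.9 <= g <= 10.0 for g in glucose_mmol_per_l) / n
    hypo = sum(g < 3.0 for g in glucose_mmol_per_l)
    return {"time_in_range": in_range, "hypo_samples": hypo}

if __name__ == "__main__":
    print(len(build_scenarios()), "scenarios")
    print(assess([5.2, 6.8, 9.5, 11.2, 3.4, 2.8]))
```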
Using Virtual Simulations in the Design of 21st Century Space Science Environments
NASA Technical Reports Server (NTRS)
Hutchinson, Sonya L.; Alves, Jeffery R.
1996-01-01
Space technology has advanced rapidly in the past decade. This can be attributed to the future construction of the International Space Station (ISS). New innovations must constantly be engineered to make the ISS the safest, highest-quality research facility in space. Since space science data must often be gathered by crew members, more attention must be paid to the crew's safety and comfort. Virtual simulations are now being used to design environments that crew members can live in for long periods of time without harmful effects to their bodies. This paper gives a few examples of the ergonomic design problems that arise on manned space flights, and design solutions that follow NASA's strategic commitment to customer satisfaction. The conclusions show that virtual simulations are a great asset to 21st century design.
Mentally simulated movements in virtual reality: does Fitts's law hold in motor imagery?
Decety, J; Jeannerod, M
1995-12-14
This study was designed to investigate mentally simulated actions in a virtual reality environment. Naive human subjects (n = 15) were instructed to imagine themselves walking in a three-dimensional virtual environment toward gates of different apparent widths placed at three different apparent distances. Each subject performed nine blocks of six trials in a randomised order. The response time (reaction time and mental walking time) was measured as the duration between an acoustic go signal and a motor signal produced by the subject. There was a combined effect on response time of both gate width and distance. Response time increased for decreasing apparent gate widths when the gate was placed at different distances. These results support the notion that mentally simulated actions are governed by central motor rules.
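The finding that mental walking time depends jointly on gate width and distance echoes Fitts's law for executed movements. For reference, the classic formulation is sketched below; the constants a and b are empirically fitted and are not values reported by this study.

```latex
% Fitts's law in its classic form: movement time MT grows linearly with the
% index of difficulty ID, which combines target distance D and width W.
\[
  ID = \log_2\!\left(\frac{2D}{W}\right), \qquad MT = a + b\,ID
\]
% Narrower gates (smaller W) and larger distances D raise ID and hence the
% predicted (real or imagined) movement time.
```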
NASA Technical Reports Server (NTRS)
Lunsford, Myrtis Leigh
1998-01-01
The Army-NASA Virtual Innovations Laboratory (ANVIL) was recently created to provide virtual reality tools for performing Human Engineering and operations analysis for both NASA and the Army. The author's summer research project consisted of developing and refining these tools for NASA's Reusable Launch Vehicle (RLV) program. Several general simulations were developed for use by the ANVIL for the evaluation of the X34 Engine Changeout procedure. These simulations were developed with the software tool dVISE 4.0.0 produced by Division Inc. All software was run on an SGI Indigo2 High Impact. This paper describes the simulations, various problems encountered with the simulations, other summer activities, and possible future work. We begin with a brief description of virtual reality systems.
Virtual VMASC: A 3D Game Environment
NASA Technical Reports Server (NTRS)
Manepalli, Suchitra; Shen, Yuzhong; Garcia, Hector M.; Lawsure, Kaleen
2010-01-01
The advantages of creating interactive 3D simulations that allow viewing, exploring, and interacting with land improvements, such as buildings, in digital form are manifold and range from allowing individuals from anywhere in the world to explore those virtual land improvements online, to training military personnel in dealing with war-time environments, and to making those land improvements available in virtual worlds such as Second Life. While we haven't fully explored the true potential of such simulations, we have identified a requirement within our organization to use simulations like those to replace our front-desk personnel and allow visitors to query, navigate, and communicate virtually with various entities within the building. We implemented the Virtual VMASC 3D simulation of the Virginia Modeling Analysis and Simulation Center (VMASC) office building to not only meet our front-desk requirement but also to evaluate the effort required in designing such a simulation and, thereby, leverage the experience we gained in future projects of this kind. This paper describes the goals we set for our implementation, the software approach taken, the modeling contribution made, and the technologies used such as XNA Game Studio, .NET framework, Autodesk software packages, and, finally, the applicability of our implementation on a variety of architectures including Xbox 360 and PC. This paper also summarizes the result of our evaluation and the lessons learned from our effort.
NASA Astrophysics Data System (ADS)
Simonnet, Mathieu; Jacobson, Dan; Vieilledent, Stephane; Tisseau, Jacques
Navigating consists of coordinating egocentric and allocentric spatial frames of reference. Virtual environments have afforded researchers in the spatial community tools to investigate the learning of space. The issue of the transfer between virtual and real situations is not trivial. A central question is the role of frames of reference in mediating spatial knowledge transfer to external surroundings, as is the effect of different sensory modalities accessed in simulated and real worlds. This challenges the capacity of blind people to use virtual reality to explore a scene without graphics. The present experiment involves a haptic and auditory maritime virtual environment. In triangulation tasks, we measure systematic errors, and preliminary results show an ability to learn configurational knowledge and to navigate through it without vision. Subjects appeared to take advantage of getting lost in an egocentric “haptic” view in the virtual environment to improve performance in the real environment.
Youngblood, Patricia; Harter, Phillip M; Srivastava, Sakti; Moffett, Shannon; Heinrichs, Wm LeRoy; Dev, Parvati
2008-01-01
Training interdisciplinary trauma teams to work effectively together using simulation technology has led to a reduction in medical errors in emergency department, operating room, and delivery room contexts. High-fidelity patient simulators (PSs), the predominant method for training healthcare teams, are expensive to develop and implement and require that trainees be present in the same place at the same time. In contrast, online computer-based simulators are more cost effective and allow simultaneous participation by students in different locations and time zones. In this pilot study, the researchers created an online virtual emergency department (Virtual ED) for team training in crisis management, and compared the effectiveness of the Virtual ED with the PS. We hypothesized that there would be no difference in learning outcomes for graduating medical students trained with each method. In this pilot study, we used a pretest-posttest control group experimental design in which 30 subjects were randomly assigned to either the Virtual ED or the PS system. In the Virtual ED each subject logged into the online environment and took the role of a team member. Four-person teams worked together in the Virtual ED, communicating in real time with live voice over Internet protocol, to manage computer-controlled patients who exhibited signs and symptoms of physical trauma. Each subject had the opportunity to be the team leader. The subjects' leadership behavior as demonstrated in both a pretest case and a posttest case was assessed by 3 raters, using a behaviorally anchored scale. In the PS environment, 4-person teams followed the same research protocol, using the same clinical scenarios in a Simulation Center. Guided by the Emergency Medicine Crisis Resource Management curriculum, both the Virtual ED and the PS groups applied the basic principles of team leadership and trauma management (Advanced Trauma Life Support) to manage 6 trauma cases: a pretest case, 4 training cases, and a posttest case. The subjects in each group were assessed individually with the same simulation method that they used for the training cases. Subjects who used either the Virtual ED or the PS showed significant improvement in performance between pretest and posttest cases (P < 0.05). In addition, there was no significant difference in subjects' performance between the 2 types of simulation, suggesting that the online Virtual ED may be as effective for learning team skills as the PS, the method widely used in Simulation Centers. Data on usability and attitudes toward both simulation methods as learning tools were equally positive. This study shows the potential value of using virtual learning environments for developing medical students' and resident physicians' team leadership and crisis management skills.
Virtual Reality: A New Learning Environment.
ERIC Educational Resources Information Center
Ferrington, Gary; Loge, Kenneth
1992-01-01
Discusses virtual reality (VR) technology and its possible uses in military training, medical education, industrial design and development, the media industry, and education. Three primary applications of VR in the learning process--visualization, simulation, and construction of virtual worlds--are described, and pedagogical and moral issues are…
Parallel-distributed mobile robot simulator
NASA Astrophysics Data System (ADS)
Okada, Hiroyuki; Sekiguchi, Minoru; Watanabe, Nobuo
1996-06-01
The aim of this project is to achieve an autonomous learning and growth function based on active interaction with the real world. It should also be able to autonomously acquire knowledge about the context in which jobs take place, and how the jobs are executed. This article describes a parallel distributed movable robot system simulator with an autonomous learning and growth function. The autonomous learning and growth function which we are proposing is characterized by its ability to learn and grow through interaction with the real world. When the movable robot interacts with the real world, the system compares the virtual environment simulation with the interaction result in the real world. The system then improves the virtual environment to match the real-world result more closely. In this way the system learns and grows. It is very important that such a simulation is time-realistic. The parallel distributed movable robot simulator was developed to simulate the space of a movable robot system with an autonomous learning and growth function. The simulator constructs a virtual space faithful to the real world and also integrates the interfaces between the user, the actual movable robot and the virtual movable robot. Using an ultrafast CG (computer graphics) system (FUJITSU AG series), time-realistic 3D CG is displayed.
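The learn-and-grow loop described above can be pictured as repeatedly comparing simulated and real outcomes and nudging simulator parameters to shrink the gap. The sketch below is a minimal stand-in under that assumption (a finite-difference parameter correction on a toy model), not the original implementation; all names and the toy dynamics are invented for illustration.

```python
# Minimal stand-in for the learn-and-grow loop (not the original implementation):
# run the same action in the virtual and real environments, compare outcomes,
# and adjust simulator parameters to reduce the squared discrepancy.
import numpy as np

def correction_step(params, real_outcome, simulate, lr=0.1, eps=1e-3):
    """One update via a finite-difference gradient of the squared sim/real gap."""
    params = np.asarray(params, dtype=float)
    err = float(np.sum((simulate(params) - real_outcome) ** 2))
    grad = np.zeros_like(params)
    for i in range(len(params)):
        p = params.copy()
        p[i] += eps
        grad[i] = (np.sum((simulate(p) - real_outcome) ** 2) - err) / eps
    return params - lr * grad, err

if __name__ == "__main__":
    # Toy "real world" outcome and an imperfect simulator parameterised by p.
    real = np.array([1.0, 0.5])
    simulate = lambda p: np.array([p[0] ** 2, p[1] + 0.1 * p[0]])
    params = np.array([0.3, 0.3])
    for _ in range(200):
        params, err = correction_step(params, real, simulate)
    print("remaining squared discrepancy:", err)
```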
Evolving a Neural Olfactorimotor System in Virtual and Real Olfactory Environments
Rhodes, Paul A.; Anderson, Todd O.
2012-01-01
To provide a platform to enable the study of simulated olfactory circuitry in context, we have integrated a simulated neural olfactorimotor system with a virtual world which simulates both computational fluid dynamics as well as a robotic agent capable of exploring the simulated plumes. A number of the elements which we developed for this purpose have not, to our knowledge, been previously assembled into an integrated system, including: control of a simulated agent by a neural olfactorimotor system; continuous interaction between the simulated robot and the virtual plume; the inclusion of multiple distinct odorant plumes and background odor; the systematic use of artificial evolution driven by olfactorimotor performance (e.g., time to locate a plume source) to specify parameter values; the incorporation of the realities of an imperfect physical robot using a hybrid model where a physical robot encounters a simulated plume. We close by describing ongoing work toward engineering a high dimensional, reversible, low power electronic olfactory sensor which will allow olfactorimotor neural circuitry evolved in the virtual world to control an autonomous olfactory robot in the physical world. The platform described here is intended to better test theories of olfactory circuit function, as well as provide robust odor source localization in realistic environments. PMID:23112772
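The artificial-evolution step mentioned above (parameters selected by olfactorimotor performance, such as time to locate the plume source) can be sketched as a simple truncation-selection loop. Everything below is a toy stand-in: the trial function is a placeholder for the plume/agent simulation, and the population sizes and mutation scale are arbitrary.

```python
# Minimal sketch, not the authors' platform: evolving controller parameters
# with fitness = (lower is better) simulated time to locate a plume source.
import random

def run_trial(params):
    """Stand-in for a full plume-tracking trial: returns a fake 'time to
    source' that shrinks as params approach an arbitrary optimum. In the real
    system this would run the neural olfactorimotor model in the virtual plume."""
    optimum = [0.7, -0.2, 1.5]
    return sum((p - o) ** 2 for p, o in zip(params, optimum)) + 1.0

def evolve(pop_size=30, generations=40, sigma=0.2):
    population = [[random.uniform(-2, 2) for _ in range(3)] for _ in range(pop_size)]
    for _ in range(generations):
        scored = sorted(population, key=run_trial)      # lower time = fitter
        parents = scored[: pop_size // 4]               # truncation selection
        population = [[g + random.gauss(0, sigma) for g in random.choice(parents)]
                      for _ in range(pop_size)]
    return min(population, key=run_trial)

if __name__ == "__main__":
    best = evolve()
    print("best parameters:", best, "time-to-source:", run_trial(best))
```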
McCorkle, Doug
2017-12-27
Ames Laboratory scientist Doug McCorkle explains osgBullet, 3-D virtual simulation software, and how it helps engineers design complex products and systems in a realistic, real-time virtual environment.
Distracting people from sources of discomfort in a simulated aircraft environment.
Lewis, Laura; Patel, Harshada; Cobb, Sue; D'Cruz, Mirabelle; Bues, Matthias; Stefani, Oliver; Grobler, Tredeaux
2016-07-19
Comfort is an important factor in the acceptance of transport systems. In 2010 and 2011, the European Commission (EC) put forward its vision for air travel in the year 2050, which envisaged the use of in-flight virtual reality. This paper addressed the EC vision by investigating the effect of virtual environments on comfort. Research has shown that virtual environments can provide entertaining experiences and can be effective distracters from painful experiences. The objective was to determine the extent to which a virtual environment could distract people from sources of discomfort. Experiments involved inducing discomfort commonly experienced in-flight (e.g. limited space, noise) and measuring the extent to which viewing a virtual environment could distract people from it. Virtual environments can fully or partially distract people from sources of discomfort, becoming more effective when they are interesting. They are also more effective at distracting people from discomfort caused by restricted space than noise disturbances. Virtual environments have the potential to enhance passenger comfort by providing positive distractions from sources of discomfort. Further research is required to understand more fully the reasons why the effect was stronger for one source of discomfort than the other.
Surgery applications of virtual reality
NASA Technical Reports Server (NTRS)
Rosen, Joseph
1994-01-01
Virtual reality is a computer-generated technology which allows information to be displayed in a simulated, but lifelike, environment. In this simulated 'world', users can move and interact as if they were actually a part of that world. This new technology will be useful in many different fields, including the field of surgery. Virtual reality systems can be used to teach surgical anatomy, diagnose surgical problems, plan operations, simulate and perform surgical procedures (telesurgery), and predict the outcomes of surgery. The authors of this paper describe the basic components of a virtual reality surgical system. These components include: the virtual world, the virtual tools, the anatomical model, the software platform, the host computer, the interface, and the head-coupled display. They also review the progress towards using virtual reality for surgical training, planning, telesurgery, and predicting outcomes. Finally, the authors present a training system being developed for the practice of new procedures in abdominal surgery.
Advances in edge-diffraction modeling for virtual-acoustic simulations
NASA Astrophysics Data System (ADS)
Calamia, Paul Thomas
In recent years there has been growing interest in modeling sound propagation in complex, three-dimensional (3D) virtual environments. With diverse applications for the military, the gaming industry, psychoacoustics researchers, architectural acousticians, and others, advances in computing power and 3D audio-rendering techniques have driven research and development aimed at closing the gap between the auralization and visualization of virtual spaces. To this end, this thesis focuses on improving the physical and perceptual realism of sound-field simulations in virtual environments through advances in edge-diffraction modeling. To model sound propagation in virtual environments, acoustical simulation tools commonly rely on geometrical-acoustics (GA) techniques that assume asymptotically high frequencies, large flat surfaces, and infinitely thin ray-like propagation paths. Such techniques can be augmented with diffraction modeling to compensate for the effect of surface size on the strength and directivity of a reflection, to allow for propagation around obstacles and into shadow zones, and to maintain soundfield continuity across reflection and shadow boundaries. Using a time-domain, line-integral formulation of the Biot-Tolstoy-Medwin (BTM) diffraction expression, this thesis explores various aspects of diffraction calculations for virtual-acoustic simulations. Specifically, we first analyze the periodic singularity of the BTM integrand and describe the relationship between the singularities and higher-order reflections within wedges with open angle less than 180°. Coupled with analytical approximations for the BTM expression, this analysis allows for accurate numerical computations and a continuous sound field in the vicinity of an arbitrary wedge geometry insonified by a point source. Second, we describe an edge-subdivision strategy that allows for fast diffraction calculations with low error relative to a numerically more accurate solution. Third, to address the considerable increase in propagation paths due to diffraction, we describe a simple procedure for identifying and culling insignificant diffraction components during a virtual-acoustic simulation. Finally, we present a novel method to find GA components using diffraction parameters that ensures continuity at reflection and shadow boundaries.
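The culling step mentioned in the abstract can be illustrated, under assumed heuristics, as dropping diffraction paths whose rough amplitude estimate falls far below the strongest component before the expensive BTM integral is evaluated. The spreading model, per-edge loss, and threshold below are illustrative assumptions, not values from the thesis.

```python
# Hedged illustration (not the thesis implementation): cull diffraction paths
# whose crude amplitude estimate is far below the strongest component, before
# running the costly BTM line-integral evaluation on the survivors.
import math

def rough_amplitude(path_length_m, edges_crossed):
    """Crude proxy: 1/r geometric spreading plus a fixed per-edge attenuation.
    The 6 dB-per-edge figure is an assumption for illustration only."""
    spreading = 1.0 / max(path_length_m, 1e-6)
    per_edge_loss = 10 ** (-6.0 * edges_crossed / 20.0)
    return spreading * per_edge_loss

def cull(paths, threshold_db=-40.0):
    """Keep only paths within `threshold_db` of the strongest estimate."""
    amps = [rough_amplitude(p["length"], p["edges"]) for p in paths]
    ref = max(amps)
    return [p for p, a in zip(paths, amps)
            if 20.0 * math.log10(a / ref) >= threshold_db]

if __name__ == "__main__":
    paths = [{"length": 3.0, "edges": 1}, {"length": 25.0, "edges": 2},
             {"length": 120.0, "edges": 3}]
    print(len(cull(paths)), "of", len(paths), "paths kept")
```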
Piromchai, Patorn; Avery, Alex; Laopaiboon, Malinee; Kennedy, Gregor; O'Leary, Stephen
2015-09-09
Virtual reality simulation uses computer-generated imagery to present a simulated training environment for learners. This review seeks to examine whether there is evidence to support the introduction of virtual reality surgical simulation into ear, nose and throat surgical training programmes. 1. To assess whether surgeons undertaking virtual reality simulation-based training achieve surgical ('patient') outcomes that are at least as good as, or better than, those achieved through conventional training methods. 2. To assess whether there is evidence from either the operating theatre, or from controlled (simulation centre-based) environments, that virtual reality-based surgical training leads to surgical skills that are comparable to, or better than, those achieved through conventional training. The Cochrane Ear, Nose and Throat Disorders Group (CENTDG) Trials Search Co-ordinator searched the CENTDG Trials Register; Central Register of Controlled Trials (CENTRAL 2015, Issue 6); PubMed; EMBASE; ERIC; CINAHL; Web of Science; ClinicalTrials.gov; ICTRP and additional sources for published and unpublished trials. The date of the search was 27 July 2015. We included all randomised controlled trials and controlled trials comparing virtual reality training and any other method of training in ear, nose or throat surgery. We used the standard methodological procedures expected by The Cochrane Collaboration. We evaluated both technical and non-technical aspects of skill competency. We included nine studies involving 210 participants. Out of these, four studies (involving 61 residents) assessed technical skills in the operating theatre (primary outcomes). Five studies (comprising 149 residents and medical students) assessed technical skills in controlled environments (secondary outcomes). The majority of the trials were at high risk of bias. We assessed the GRADE quality of evidence for most outcomes across studies as 'low'. In the operating theatre environment (primary outcomes), there were no studies that examined two of three primary outcomes: real world patient outcomes and acquisition of non-technical skills. The third primary outcome (technical skills in the operating theatre) was evaluated in two studies comparing virtual reality endoscopic sinus surgery training with conventional training. In one study, psychomotor skill (which relates to operative technique or the physical co-ordination associated with instrument handling) was assessed on a 10-point scale. A second study evaluated the procedural outcome of time-on-task. The virtual reality group performance was significantly better, with a better psychomotor score (mean difference (MD) 1.66, 95% CI 0.52 to 2.81; 10-point scale) and a shorter time taken to complete the operation (MD -5.50 minutes, 95% CI -9.97 to -1.03). In controlled training environments (secondary outcomes), five studies evaluated the technical skills of surgical trainees (one study) and medical students (three studies). One study was excluded from the analysis. Surgical trainees: One study (80 participants) evaluated the technical performance of surgical trainees during temporal bone surgery, where the outcome was the quality of the final dissection. There was no difference in the end-product scores between virtual reality and cadaveric temporal bone training. Medical students: Two other studies (40 participants) evaluated technical skills achieved by medical students in the temporal bone laboratory.
Learners' knowledge of the flow of the operative procedure (procedural score) was better after virtual reality than conventional training (SMD 1.11, 95% CI 0.44 to 1.79). There was also a significant difference in end-product score between the virtual reality and conventional training groups (SMD 2.60, 95% CI 1.71 to 3.49). One study (17 participants) revealed that medical students acquired anatomical knowledge (on a scale of 0 to 10) better during virtual reality than during conventional training (MD 4.3, 95% CI 2.05 to 6.55). No studies in a controlled training environment assessed non-technical skills. There is limited evidence to support the inclusion of virtual reality surgical simulation into surgical training programmes, on the basis that it can allow trainees to develop technical skills that are at least as good as those achieved through conventional training. Further investigations are required to determine whether virtual reality training is associated with better real world outcomes for patients and the development of non-technical skills. Virtual reality simulation may be considered as an additional learning tool for medical students.
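For readers unfamiliar with the effect sizes quoted above, the standardized mean difference in its basic (Cohen's d) form is shown below; Cochrane analyses typically also apply a small-sample correction (Hedges' g), so this is a reminder of the measure rather than the review's exact computation.

```latex
% Standardized mean difference (basic Cohen's d form) between the virtual
% reality and conventional training groups, using the pooled standard deviation.
\[
  \mathrm{SMD} = \frac{\bar{x}_{\text{VR}} - \bar{x}_{\text{conv}}}{s_{\text{pooled}}},
  \qquad
  s_{\text{pooled}} = \sqrt{\frac{(n_1 - 1)s_1^2 + (n_2 - 1)s_2^2}{n_1 + n_2 - 2}}
\]
```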
Effective Student Learning of Fractions with an Interactive Simulation
ERIC Educational Resources Information Center
Hensberry, Karina K. R.; Moore, Emily B.; Perkins, Katherine K.
2015-01-01
Computer technology, when coupled with reform-based teaching practices, has been shown to be an effective way to support student learning of mathematics. The quality of the technology itself, as well as how it is used, impacts how much students learn. Interactive simulations are dynamic virtual environments similar to virtual manipulatives that…
ERIC Educational Resources Information Center
Gunn, Therese; Jones, Lee; Bridge, Pete; Rowntree, Pam; Nissen, Lisa
2018-01-01
In recent years, simulation has increasingly underpinned the acquisition of pre-clinical skills by undergraduate medical imaging (diagnostic radiography) students. This project aimed to evaluate the impact of an innovative virtual reality (VR) learning environment on the development of technical proficiency by students. The study assessed the…
ERIC Educational Resources Information Center
Andrews, Dee H.; Dineen, Toni; Bell, Herbert H.
1999-01-01
Discusses the use of constructive modeling and virtual simulation in team training; describes a military application of constructive modeling, including technology issues and communication protocols; considers possible improvements; and discusses applications in team-learning environments other than military, including industry and education. (LRW)
ERIC Educational Resources Information Center
Peterson-Ahmad, Maria
2018-01-01
To meet the ever-increasing teaching standards, pre-service special educators need extensive and advanced opportunities for pedagogical preparation prior to entering the classroom. Providing opportunities for pre-service special educators to practice such strategies within a virtual simulation environment offers teacher preparation programs a way…
Spatial considerations for instructional development in a virtual environment
NASA Technical Reports Server (NTRS)
Mccarthy, Laurie; Pontecorvo, Michael; Grant, Frances; Stiles, Randy
1993-01-01
In this paper we discuss spatial considerations for instructional development in a virtual environment. For both the instructional developer and the student, the important spatial criteria are perspective, orientation, scale, level of visual detail, and granularity of simulation. Developing a representation that allows an instructional developer to specify spatial criteria and enables intelligent agents to reason about a given instructional problem is of paramount importance to the success of instruction delivered in a virtual environment, especially one that supports dynamic exploration or spans more than one scale of operation.
ERIC Educational Resources Information Center
Oser, Rachel; Fraser, Barry J.
2015-01-01
As society becomes increasingly global and experiential, research suggests that students can benefit from alternative learning environments that extend beyond the classroom. In providing students with laboratory experiences that otherwise would not be possible in high-school settings, virtual laboratories can simulate real laboratories and…
ERIC Educational Resources Information Center
Kim, Heesung; Ke, Fengfeng
2017-01-01
This experimental study was intended to examine whether the integration of game characteristics in the OpenSimulator-supported virtual reality (VR) learning environment can improve mathematical achievement for elementary school students. In this pre- and posttest experimental comparison study, data were collected from 132 fourth graders through an…
ERIC Educational Resources Information Center
Kim, Heesung; Ke, Fengfeng; Paek, Insu
2017-01-01
This experimental study was intended to examine whether game-based learning (GBL) that encompasses four particular game characteristics (challenges, a storyline, immediate rewards and the integration of game-play with learning content) in an OpenSimulator-supported virtual reality learning environment can improve perceived motivational quality of…
Using Immersive Virtual Environments for Certification
NASA Technical Reports Server (NTRS)
Lutz, R.; Cruz-Neira, C.
1998-01-01
Immersive virtual environments (VEs) technology has matured to the point where it can be utilized as a scientific and engineering problem solving tool. In particular, VEs are starting to be used to design and evaluate safety-critical systems that involve human operators, such as flight and driving simulators, complex machinery training, and emergency rescue strategies.
A Virtual Environment for Process Management. A Step by Step Implementation
ERIC Educational Resources Information Center
Mayer, Sergio Valenzuela
2003-01-01
In this paper it is presented a virtual organizational environment, conceived with the integration of three computer programs: a manufacturing simulation package, an automation of businesses processes (workflows), and business intelligence (Balanced Scorecard) software. It was created as a supporting tool for teaching IE, its purpose is to give…
Learning Relative Motion Concepts in Immersive and Non-immersive Virtual Environments
NASA Astrophysics Data System (ADS)
Kozhevnikov, Michael; Gurlitt, Johannes; Kozhevnikov, Maria
2013-12-01
The focus of the current study is to understand which unique features of an immersive virtual reality environment have the potential to improve learning relative motion concepts. Thirty-seven undergraduate students learned relative motion concepts using computer simulation either in immersive virtual environment (IVE) or non-immersive desktop virtual environment (DVE) conditions. Our results show that after the simulation activities, both IVE and DVE groups exhibited a significant shift toward a scientific understanding in their conceptual models and epistemological beliefs about the nature of relative motion, and also a significant improvement on relative motion problem-solving tests. In addition, we analyzed students' performance on one-dimensional and two-dimensional questions in the relative motion problem-solving test separately and found that after training in the simulation, the IVE group performed significantly better than the DVE group on solving two-dimensional relative motion problems. We suggest that egocentric encoding of the scene in IVE (where the learner constitutes a part of a scene they are immersed in), as compared to allocentric encoding on a computer screen in DVE (where the learner is looking at the scene from "outside"), is more beneficial than DVE for studying more complex (two-dimensional) relative motion problems. Overall, our findings suggest that such aspects of virtual realities as immersivity, first-hand experience, and the possibility of changing different frames of reference can facilitate understanding abstract scientific phenomena and help in displacing intuitive misconceptions with more accurate mental models.
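The relative motion content at issue reduces to the frame-transformation relation below, given here only as a reminder of the concept the simulations teach; it is not reproduced from the paper itself.

```latex
% Velocity of object A as seen from frame B is the vector difference of their
% velocities measured in a common (e.g. ground) frame.
\[
  \vec{v}_{A/B} = \vec{v}_{A} - \vec{v}_{B}
\]
% In one dimension this is a signed scalar difference; in two dimensions both
% components (or magnitude and direction) must be combined vectorially.
```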
NASA Astrophysics Data System (ADS)
Demir, I.
2015-12-01
Recent developments in internet technologies make it possible to manage and visualize large data on the web. Novel visualization techniques and interactive user interfaces allow users to create realistic environments and to interact with data to gain insight from simulations and environmental observations. This presentation showcases information communication interfaces, games, and virtual and immersive reality applications for supporting teaching and learning of concepts in atmospheric and hydrological sciences. The information communication platform utilizes the latest web technologies and allows accessing and visualizing large-scale data on the web. The simulation system is a web-based 3D interactive learning environment for teaching hydrological and atmospheric processes and concepts. It provides a visually striking platform with realistic terrain, weather information, and water simulation, and gives students an environment in which to learn about earth science processes and the effects of development and human activity on the terrain. Users can access the system in three visualization modes: virtual reality, augmented reality, and immersive reality using a heads-up display.
Williamon, Aaron; Aufegger, Lisa; Eiholzer, Hubert
2014-01-01
Musicians typically rehearse far away from their audiences and in practice rooms that differ significantly from the concert venues in which they aspire to perform. Due to the high costs and inaccessibility of such venues, much current international music training lacks repeated exposure to realistic performance situations, with students learning all too late (or not at all) how to manage performance stress and the demands of their audiences. Virtual environments have been shown to be an effective training tool in the fields of medicine and sport, offering practitioners access to real-life performance scenarios but with lower risk of negative evaluation and outcomes. The aim of this research was to design and test the efficacy of simulated performance environments in which conditions of "real" performance could be recreated. Advanced violin students (n = 11) were recruited to perform in two simulations: a solo recital with a small virtual audience and an audition situation with three "expert" virtual judges. Each simulation contained back-stage and on-stage areas, life-sized interactive virtual observers, and pre- and post-performance protocols designed to match those found at leading international performance venues. Participants completed a questionnaire on their experiences of using the simulations. Results show that both simulated environments offered realistic experience of performance contexts and were rated particularly useful for developing performance skills. For a subset of 7 violinists, state anxiety and electrocardiographic data were collected during the simulated audition and an actual audition with real judges. Results display comparable levels of reported state anxiety and patterns of heart rate variability in both situations, suggesting that responses to the simulated audition closely approximate those of a real audition. The findings are discussed in relation to their implications, both generalizable and individual-specific, for performance training.
Virtual alternative to the oral examination for emergency medicine residents.
McGrath, Jillian; Kman, Nicholas; Danforth, Douglas; Bahner, David P; Khandelwal, Sorabh; Martin, Daniel R; Nagel, Rollin; Verbeck, Nicole; Way, David P; Nelson, Richard
2015-03-01
The oral examination is a traditional method for assessing the developing physician's medical knowledge, clinical reasoning and interpersonal skills. The typical oral examination is a face-to-face encounter in which examiners quiz examinees on how they would confront a patient case. The advantage of the oral exam is that the examiner can adapt questions to the examinee's response. The disadvantage is the potential for examiner bias and intimidation. Computer-based virtual simulation technology has been widely used in the gaming industry. We wondered whether virtual simulation could serve as a practical format for delivery of an oral examination. For this project, we compared the attitudes and performance of emergency medicine (EM) residents who took our traditional oral exam to those who took the exam using virtual simulation. EM residents (n=35) were randomized to a traditional oral examination format (n=17) or a simulated virtual examination format (n=18) conducted within an immersive learning environment, Second Life (SL). Proctors scored residents using the American Board of Emergency Medicine oral examination assessment instruments, which included execution of critical actions and ratings on eight competency categories (1-8 scale). Study participants were also surveyed about their oral examination experience. We observed no differences between virtual and traditional groups on critical action scores or scores on eight competency categories. However, we noted moderate effect sizes favoring the Second Life group on the clinical competence score. Examinees from both groups thought that their assessment was realistic, fair, objective, and efficient. Examinees from the virtual group reported a preference for the virtual format and felt that the format was less intimidating. The virtual simulated oral examination was shown to be a feasible alternative to the traditional oral examination format for assessing EM residents. Virtual environments for oral examinations should continue to be explored, particularly since they offer an inexpensive, more comfortable, yet equally rigorous alternative.
Ketelhut, Diane Jass; Niemi, Steven M
2007-01-01
This article examines several new and exciting communication technologies. Many of the technologies were developed by the entertainment industry; however, other industries are adopting and modifying them for their own needs. These new technologies allow people to collaborate across distance and time and to learn in simulated work contexts. The article explores the potential utility of these technologies for advancing laboratory animal care and use through better education and training. Descriptions include emerging technologies such as augmented reality and multi-user virtual environments, which offer new approaches with different capabilities. Augmented reality interfaces, characterized by the use of handheld computers to infuse the virtual world into the real one, result in deeply immersive simulations. In these simulations, users can access virtual resources and communicate with real and virtual participants. Multi-user virtual environments enable multiple participants to simultaneously access computer-based three-dimensional virtual spaces, called "worlds," and to interact with digital tools. They allow for authentic experiences that promote collaboration, mentoring, and communication. Because individuals may learn or train differently, it is advantageous to combine the capabilities of these technologies and applications with more traditional methods to increase the number of students who are served by using current methods alone. The use of these technologies in animal care and use programs can create detailed training and education environments that allow students to learn the procedures more effectively, teachers to assess their progress more objectively, and researchers to gain insights into animal care.
The Application of Modeling and Simulation to the Behavioral Deficit of Autism
NASA Technical Reports Server (NTRS)
Anton, John J.
2010-01-01
This abstract describes a research effort to apply technological advances in virtual reality simulation and computer-based games to create behavioral modification programs for individuals with Autism Spectrum Disorder (ASD). The research investigates virtual social skills training within a 3D game environment to diminish the impact of ASD social impairments and to increase learning capacity for optimal intellectual capability. Individuals with autism will encounter prototypical social contexts via computer interface and will interact with 3D avatars with predefined roles within a game-like environment. Incremental learning objectives will combine to form a collaborative social environment. A secondary goal of the effort is to begin the research and development of virtual reality exercises aimed at triggering the release of neurotransmitters to promote critical aspects of synaptic maturation at an early age to change the course of the disease.
Using Immersive Virtual Reality for Electrical Substation Training
ERIC Educational Resources Information Center
Tanaka, Eduardo H.; Paludo, Juliana A.; Cordeiro, Carlúcio S.; Domingues, Leonardo R.; Gadbem, Edgar V.; Euflausino, Adriana
2015-01-01
Usually, distribution electricians are called upon to solve technical problems found in electrical substations. In this project, we apply problem-based learning to a training program for electricians, with the help of a virtual reality environment that simulates a real substation. Using this virtual substation, users may safely practice maneuvers…
Proposal of Modification Strategy of NC Program in the Virtual Manufacturing Environment
NASA Astrophysics Data System (ADS)
Narita, Hirohisa; Chen, Lian-Yi; Fujimoto, Hideo; Shirase, Keiichi; Arai, Eiji
Virtual manufacturing will be a key technology in process planning, because there are no evaluation tools for cutting conditions. Therefore, a virtual machining simulator (VMSim), which can predict end milling processes, has been developed. A modification strategy for NC programs using VMSim is proposed in this paper.
Simulating Geriatric Home Safety Assessments in a Three-Dimensional Virtual World
ERIC Educational Resources Information Center
Andrade, Allen D.; Cifuentes, Pedro; Mintzer, Michael J.; Roos, Bernard A.; Anam, Ramanakumar; Ruiz, Jorge G.
2012-01-01
Virtual worlds could offer inexpensive and safe three-dimensional environments in which medical trainees can learn to identify home safety hazards. Our aim was to evaluate the feasibility, usability, and acceptability of virtual worlds for geriatric home safety assessments and to correlate performance efficiency in hazard identification with…
Efficacy of a Virtual Teaching Assistant in an Open Laboratory Environment for Electric Circuits
ERIC Educational Resources Information Center
Saleheen, Firdous; Wang, Zicong; Picone, Joseph; Butz, Brian P.; Won, Chang-Hee
2018-01-01
In order to provide an on-demand, open electrical engineering laboratory, we developed an innovative software-based Virtual Open Laboratory Teaching Assistant (VOLTA). This web-based virtual assistant provides laboratory instructions, equipment usage videos, circuit simulation assistance, and hardware implementation diagnostics. VOLTA allows…
Real-time, interactive, visually updated simulator system for telepresence
NASA Technical Reports Server (NTRS)
Schebor, Frederick S.; Turney, Jerry L.; Marzwell, Neville I.
1991-01-01
Time delays and limited sensory feedback of remote telerobotic systems tend to disorient teleoperators and dramatically decrease the operator's performance. To remove the effects of time delays, key components were designed and developed of a prototype forward simulation subsystem, the Global-Local Environment Telerobotic Simulator (GLETS) that buffers the operator from the remote task. GLETS totally immerses an operator in a real-time, interactive, simulated, visually updated artificial environment of the remote telerobotic site. Using GLETS, the operator will, in effect, enter into a telerobotic virtual reality and can easily form a gestalt of the virtual 'local site' that matches the operator's normal interactions with the remote site. In addition to use in space based telerobotics, GLETS, due to its extendable architecture, can also be used in other teleoperational environments such as toxic material handling, construction, and undersea exploration.
Assessment of radiation awareness training in immersive virtual environments
NASA Astrophysics Data System (ADS)
Whisker, Vaughn E., III
The prospect of new nuclear power plant orders in the near future and the graying of the current workforce create a need to train new personnel faster and better. Immersive virtual reality (VR) may offer a solution to the training challenge. VR technology presented in a CAVE Automatic Virtual Environment (CAVE) provides a high-fidelity, one-to-one scale environment where areas of the power plant can be recreated and virtual radiation environments can be simulated, making it possible to safely expose workers to virtual radiation in the context of the actual work environment. The use of virtual reality for training is supported by many educational theories, constructivism and discovery learning in particular. Educational theory describes the importance of matching the training to the task. Plant access training and radiation worker training, common forms of training in the nuclear industry, rely on computer-based training methods in most cases, which effectively transfer declarative knowledge, but are poor at transferring skills. If an activity were to be added, the training would provide personnel with the opportunity to develop skills and apply their knowledge so they could be more effective when working in the radiation environment. An experiment was developed to test immersive virtual reality's suitability for training radiation awareness. Using a mixed methodology of quantitative and qualitative measures, the subjects' performances before and after training were assessed. First, subjects completed a pre-test to measure their knowledge prior to completing any training. Next they completed unsupervised computer-based training, which consisted of a PowerPoint presentation and a PDF document. After completing a brief orientation activity in the virtual environment, one group of participants received supplemental radiation awareness training in a simulated radiation environment presented in the CAVE, while a second group, the control group, moved directly to the assessment phase of the experiment. The CAVE supplied an activity-based training environment where learners were able to use a virtual survey meter to explore the properties of radiation sources and the effects of time and distance on radiation exposure. Once the training stage had ended, the subjects completed an assessment activity where they were asked to complete four tasks in a simulated radiation environment in the CAVE, which was designed to provide a more authentic assessment than simply testing understanding using a quiz. After the practicum, the subjects completed a post-test. Survey information was also collected to assist the researcher with interpretation of the collected data. Response to the training was measured by completion time, radiation exposure received, successful completion of the four tasks in the practicum, and scores on the post-test. These results were combined to create a radiation awareness score. In addition, observational data was collected as the subjects completed the tasks. The radiation awareness scores of the control group and the group that received supplemental training in the virtual environment were compared. T-tests showed that the effect of the supplemental training was not significant; however, calculation of the effect size showed a small-to-medium effect of the training. The CAVE group received significantly less radiation exposure during the assessment activity, and they completed the activities an average of one minute faster.
These results indicate that the training was effective, primarily for instilling radiation sensitivity. Observational data collected during the assessment supports this conclusion. The training environment provided by the immersive virtual reality recreated a radiation environment where learners could apply knowledge they had been taught by computer-based training. Activity-based training has been shown to be a more effective way to transfer skills because of the similarity between the training environment and the application environment. Virtual reality enables the training environment to look and feel like the application environment. Because of this, radiation awareness training in an immersive virtual environment should be considered by the nuclear industry, which is supported by the results of this experiment.
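The time and distance relationships that the virtual survey meter is meant to convey follow, for an idealized point source, the inverse-square law. The sketch below illustrates this standard approximation only; the source strength and durations are arbitrary, and shielding and scatter are ignored.

```python
# Simple sketch of the relationship the virtual survey meter teaches: for an
# idealized point source, dose rate falls with the square of distance, and
# accumulated exposure scales with the time spent at that distance.
def dose_rate(rate_at_1m_mSv_per_h, distance_m):
    """Point-source, inverse-square approximation (ignores shielding/scatter)."""
    return rate_at_1m_mSv_per_h / (distance_m ** 2)

def accumulated_dose(rate_at_1m_mSv_per_h, distance_m, minutes):
    return dose_rate(rate_at_1m_mSv_per_h, distance_m) * (minutes / 60.0)

if __name__ == "__main__":
    # Doubling the distance cuts the dose rate to a quarter; halving the time
    # halves the accumulated dose. The 8 mSv/h source strength is arbitrary.
    for d in (1.0, 2.0, 4.0):
        print(f"{d} m: {accumulated_dose(8.0, d, 30):.2f} mSv in 30 min")
```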
Simple force feedback for small virtual environments
NASA Astrophysics Data System (ADS)
Schiefele, Jens; Albert, Oliver; van Lier, Volker; Huschka, Carsten
1998-08-01
In today's civil flight training simulators only the cockpit and all its interaction devices exist as physical mockups. All other elements such as flight behavior, motion, sound, and the visual system are virtual. As an extension to this approach, `Virtual Flight Simulation' tries to substitute the cockpit mockup with a 3D computer-generated image. The complete cockpit including the exterior view is displayed on a Head Mounted Display (HMD), a BOOM, or a Cave Automatic Virtual Environment (CAVE). In most applications a dataglove or virtual pointers are used as input devices. A basic problem of such a Virtual Cockpit simulation is missing force feedback. A pilot cannot touch and feel the buttons, knobs, dials, etc. that he tries to manipulate. As a result, it is very difficult to generate realistic inputs into VC systems. `Seating Bucks' are used in the automotive industry to overcome the problem of missing force feedback. Only a seat, steering wheel, pedal, stick shift, and radio panel are physically available. All other geometry is virtual and therefore untouchable but visible in the output device. As an extension of this concept, a `Seating Buck' for commercial transport aircraft cockpits was developed. Pilot seat, side stick, pedals, thrust levers, and flaps lever are physically available. All other panels are simulated by simple flat plastic panels. They are located at the same locations as their real counterparts but lack the real input devices. A pilot sees the entire photorealistic cockpit in an HMD as 3D geometry but can only touch the physical parts and plastic panels. In order to determine task performance with the developed Seating Buck, a test series was conducted. Users pressed buttons, adjusted dials, and turned knobs. In the first test, a completely virtual environment was used. The second setting had a plastic panel replacing all input devices. Finally, as a cross-reference, the participants repeated the test with a complete physical mockup of the input devices. All panels and physical devices can be easily relocated to simulate a different type of cockpit. At most 30 minutes are needed for a complete adaptation. So far, an Airbus A340 and a generic cockpit are supported.
The Effects of Virtual Weather on Presence
NASA Astrophysics Data System (ADS)
Wissmath, Bartholomäus; Weibel, David; Mast, Fred W.
In modern societies, people tend to spend more time in front of computer screens than outdoors. Along with an increasing degree of realism in digital environments, simulated weather appears more and more realistic and is implemented more and more often in digital environments. Research has found that actual weather influences behavior and mood. In this paper we experimentally examine the effects of virtual weather on the sense of presence. We found that individuals (N=30) immerse more deeply in digital environments displaying fair weather conditions than in environments displaying bad weather. We also investigate whether virtual weather can influence behavior. The possible implications of these findings for presence theory as well as for digital environment designers are discussed.
Baig, Hasan; Madsen, Jan
2017-01-15
Simulation and behavioral analysis of genetic circuits is a standard approach to functional verification prior to their physical implementation. Many software tools have been developed to perform in silico analysis for this purpose, but none of them allow users to interact with the model during runtime. Runtime interaction gives the user the feeling of being in the lab performing a real-world experiment. In this work, we present a user-friendly software tool named D-VASim (Dynamic Virtual Analyzer and Simulator), which provides a virtual laboratory environment to simulate and analyze the behavior of genetic logic circuit models represented in the Systems Biology Markup Language (SBML). Hence, SBML models developed in other software environments can be analyzed and simulated in D-VASim. D-VASim offers deterministic as well as stochastic simulation and differs from other software tools by being able to extract and validate the Boolean logic from the SBML model. D-VASim is also capable of analyzing the threshold value and propagation delay of a genetic circuit model. D-VASim is available for Windows and Mac OS and can be downloaded from bda.compute.dtu.dk/downloads/.
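As a rough illustration of the kind of SBML model handling D-VASim performs, the sketch below uses python-libsbml to load and inspect a genetic-circuit model before any simulation. The file name is a placeholder and the code is not part of D-VASim itself.

```python
# Minimal sketch of loading and inspecting an SBML genetic-circuit model with
# python-libsbml, the kind of model D-VASim consumes. "circuit.xml" is a
# placeholder path; this is not D-VASim's own code.
import libsbml

doc = libsbml.readSBML("circuit.xml")
if doc.getNumErrors() > 0:
    doc.printErrors()
    raise SystemExit("SBML file could not be parsed cleanly")

model = doc.getModel()

print("Species in the circuit model:")
for i in range(model.getNumSpecies()):
    s = model.getSpecies(i)
    print(f"  {s.getId()} (initial amount: {s.getInitialAmount()})")

print("Reactions (candidate logic-gate interactions):")
for i in range(model.getNumReactions()):
    r = model.getReaction(i)
    reactants = [r.getReactant(j).getSpecies() for j in range(r.getNumReactants())]
    products = [r.getProduct(j).getSpecies() for j in range(r.getNumProducts())]
    print(f"  {r.getId()}: {reactants} -> {products}")
```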
Virtual Habitat -a dynamic simulation of closed life support systems -human model status
NASA Astrophysics Data System (ADS)
Markus Czupalla, M. Sc.; Zhukov, Anton; Hwang, Su-Au; Schnaitmann, Jonas
In order to optimize Life Support Systems (LSS) on a system level, stability questions must be investigated. To do so, the exploration group of the Technical University of Munich (TUM) is developing the "Virtual Habitat" (V-HAB) dynamic LSS simulation software. V-HAB shall provide the possibility to conduct dynamic simulations of entire mission scenarios for any given LSS configuration. The Virtual Habitat simulation tool consists of four main modules: • Closed Environment Module (CEM) - monitoring of compounds in a closed environment • Crew Module (CM) - dynamic human simulation • P/C Systems Module (PCSM) - dynamic P/C subsystems • Plant Module (PM) - dynamic plant simulation. The core module of the simulation is the dynamic and environment-sensitive human module. Introduced in its basic version in 2008, the human module has been significantly updated since, increasing its capabilities and maturity. In this paper three newly added human model subsystems (thermal regulation, digestion, and schedule controller) are introduced, touching also on the human stress subsystem which is currently under development. Upon the introduction of these new subsystems, their integration into the overall V-HAB human model is discussed, highlighting the impact on the most important interfaces. The overall human model capabilities shall further be summarized and presented based on meaningful test cases. In addition to the presentation of the results, the correlation strategy for the Virtual Habitat human model shall be introduced, assessing the model's current confidence level and giving an outlook on the future correlation strategy. Last but not least, the remaining V-HAB modules shall be introduced briefly, showing how the human model is integrated into the overall simulation.
ERIC Educational Resources Information Center
Jones, Greg; Warren, Scott
2009-01-01
Using video games, virtual simulations, and other digital spaces for learning can be a time-consuming process; aside from technical issues that may absorb class time, students take longer to achieve gains in learning in virtual environments. Greg Jones and Scott Warren describe how intelligent agents, in-game characters that respond to the context…
ERIC Educational Resources Information Center
Read, Stephen J.; Miller, Lynn C.; Appleby, Paul Robert; Nwosu, Mary E.; Reynaldo, Sadina; Lauren, Ada; Putcha, Anila
2006-01-01
A socially optimized learning approach, which integrates diverse theoretical perspectives, places men who have sex with men (MSM) in an interactive virtual environment designed to simulate the emotional, interpersonal, and contextual narrative of an actual sexual encounter while challenging and changing MSM's more automatic patterns of risky…
ERIC Educational Resources Information Center
Ausburn, Lynna J.; Ausburn, Floyd B.
2004-01-01
Virtual Reality has been defined in many different ways and now means different things in various contexts. VR can range from simple environments presented on a desktop computer to fully immersive multisensory environments experienced through complex headgear and bodysuits. In all of its manifestations, VR is basically a way of simulating or…
Avatars Go to Class: A Virtual Environment Soil Science Activity
ERIC Educational Resources Information Center
Mamo, M.; Namuth-Covert, D.; Guru, A.; Nugent, G.; Phillips, L.; Sandall, L.; Kettler, T.; McCallister, D.
2011-01-01
Web 2.0 technology is expanding rapidly from social and gaming uses into the educational applications. Specifically, the multi-user virtual environment (MUVE), such as SecondLife, allows educators to fill the gap of first-hand experience by creating simulated realistic evolving problems/games. In a pilot study, a team of educators at the…
Virtual Learning Environment for Interactive Engagement with Advanced Quantum Mechanics
NASA Astrophysics Data System (ADS)
Pedersen, Mads Kock; Skyum, Birk; Heck, Robert; Müller, Romain; Bason, Mark; Lieberoth, Andreas; Sherson, Jacob F.
2016-06-01
A virtual learning environment can engage university students in the learning process in ways that the traditional lectures and lab formats cannot. We present our virtual learning environment StudentResearcher, which incorporates simulations, multiple-choice quizzes, video lectures, and gamification into a learning path for quantum mechanics at the advanced university level. StudentResearcher is built upon the experiences gathered from workshops with the citizen science game Quantum Moves at the high-school and university level, where the games were used extensively to illustrate the basic concepts of quantum mechanics. The first test of this new virtual learning environment was a 2014 course in advanced quantum mechanics at Aarhus University with 47 enrolled students. We found increased learning for the students who were more active on the platform independent of their previous performances.
Towards Gesture-Based Multi-User Interactions in Collaborative Virtual Environments
NASA Astrophysics Data System (ADS)
Pretto, N.; Poiesi, F.
2017-11-01
We present a virtual reality (VR) setup that enables multiple users to participate in collaborative virtual environments and interact via gestures. A collaborative VR session is established through a network of users that is composed of a server and a set of clients. The server manages the communication amongst clients and is created by one of the users. Each user's VR setup consists of a Head Mounted Display (HMD) for immersive visualisation, a hand tracking system to interact with virtual objects and a single-hand joypad to move in the virtual environment. We use Google Cardboard as an HMD for the VR experience and a Leap Motion for hand tracking, thus making our solution low cost. We evaluate our VR setup through a forensics use case, where real-world objects pertaining to a simulated crime scene are included in a VR environment, acquired using a smartphone-based 3D reconstruction pipeline. Users can interact using virtual gesture-based tools such as pointers and rulers.
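The server role described above is essentially a relay that rebroadcasts each client's updates to the other clients. The sketch below shows one minimal way such a relay could look; the port number and the newline-delimited message framing are assumptions for illustration, not details taken from the paper.

```python
# Minimal sketch of a relay server for a collaborative VR session: accept
# clients and rebroadcast each pose/gesture update to every other client.
# Port and framing (one newline-delimited update per line) are assumed.
import socket
import threading

HOST, PORT = "0.0.0.0", 9000
clients = []
lock = threading.Lock()

def handle(conn):
    with conn, conn.makefile("r") as stream:
        for line in stream:                  # one update per line from this client
            with lock:
                for other in clients:
                    if other is not conn:
                        try:
                            other.sendall(line.encode())
                        except OSError:
                            pass             # ignore send errors to a dying client
    with lock:
        clients.remove(conn)                 # connection closed; stop relaying to it

def main():
    with socket.create_server((HOST, PORT)) as server:
        while True:
            conn, _addr = server.accept()
            with lock:
                clients.append(conn)
            threading.Thread(target=handle, args=(conn,), daemon=True).start()

if __name__ == "__main__":
    main()
```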
Translating the simulation of procedural drilling techniques for interactive neurosurgical training.
Stredney, Don; Rezai, Ali R; Prevedello, Daniel M; Elder, J Bradley; Kerwin, Thomas; Hittle, Bradley; Wiet, Gregory J
2013-10-01
Through previous efforts we have developed a fully virtual environment to provide procedural training of otologic surgical technique. The virtual environment is based on high-resolution volumetric data of the regional anatomy. These volumetric data help drive an interactive multisensory, ie, visual (stereo), aural (stereo), and tactile, simulation environment. Subsequently, we have extended our efforts to support the training of neurosurgical procedural technique as part of the Congress of Neurological Surgeons simulation initiative. To deliberately study the integration of simulation technologies into the neurosurgical curriculum and to determine their efficacy in teaching minimally invasive cranial and skull base approaches. We discuss issues of biofidelity and our methods to provide objective, quantitative and automated assessment for the residents. We conclude with a discussion of our experiences by reporting preliminary formative pilot studies and proposed approaches to take the simulation to the next level through additional validation studies. We have presented our efforts to translate an otologic simulation environment for use in the neurosurgical curriculum. We have demonstrated the initial proof of principles and define the steps to integrate and validate the system as an adjuvant to the neurosurgical curriculum.
Innovative application of virtual display technique in virtual museum
NASA Astrophysics Data System (ADS)
Zhang, Jiankang
2017-09-01
A virtual museum displays and simulates the functions of a real museum on the Internet in the form of 3D virtual reality through interactive programs. Based on the Virtual Reality Modeling Language (VRML), building a virtual museum and achieving effective interaction with the offline museum depend on making full use of the 3D panorama technique, virtual reality technique, and augmented reality technique, and on innovatively applying dynamic environment modeling, real-time 3D graphics generation, system integration, and other key virtual reality techniques in the overall design of the virtual museum. The 3D panorama technique, also known as panoramic photography or virtual reality, is based on static images of reality. The virtual reality technique is a computer simulation system that can create, and let users experience, an interactive, dynamic 3D visual world. Augmented reality, also known as mixed reality, is a technique that simulates and mixes information (visual, sound, taste, touch, etc.) that is difficult for humans to experience in reality. These technologies make the virtual museum possible. It will not only bring a better experience and convenience to the public, but also help improve the influence and cultural functions of the real museum.
Simulating Humans as Integral Parts of Spacecraft Missions
NASA Technical Reports Server (NTRS)
Bruins, Anthony C.; Rice, Robert; Nguyen, Lac; Nguyen, Heidi; Saito, Tim; Russell, Elaine
2006-01-01
The Collaborative-Virtual Environment Simulation Tool (C-VEST) software was developed for use in a NASA project entitled "3-D Interactive Digital Virtual Human." The project is oriented toward the use of a comprehensive suite of advanced software tools in computational simulations for the purposes of human-centered design of spacecraft missions and of the spacecraft, space suits, and other equipment to be used on the missions. The C-VEST software affords an unprecedented suite of capabilities for three-dimensional virtual-environment simulations with plug-in interfaces for physiological data, haptic interfaces, plug-and-play software, realtime control, and/or playback control. Mathematical models of the mechanics of the human body and of the aforementioned equipment are implemented in software and integrated to simulate forces exerted on and by astronauts as they work. The computational results can then support the iterative processes of design, building, and testing in applied systems engineering and integration. The results of the simulations provide guidance for devising measures to counteract effects of microgravity on the human body and for the rapid development of virtual (that is, simulated) prototypes of advanced space suits, cockpits, and robots to enhance the productivity, comfort, and safety of astronauts. The unique ability to implement human-in-the-loop immersion also makes the C-VEST software potentially valuable for use in commercial and academic settings beyond the original space-mission setting.
What makes virtual agents believable?
NASA Astrophysics Data System (ADS)
Bogdanovych, Anton; Trescak, Tomas; Simoff, Simeon
2016-01-01
In this paper we investigate the concept of believability and attempt to isolate individual characteristics (features) that contribute to making virtual characters believable. As a result of this investigation we have produced a formalisation of believability and, based on this formalisation, built a computational framework focused on simulation of believable virtual agents that possess the identified features. In order to test whether the identified features are, in fact, responsible for agents being perceived as more believable, we conducted a user study. In this study we tested user reactions towards virtual characters that were created for a simulation of aboriginal inhabitants of a particular area of Sydney, Australia in 1770 A.D. The participants of our user study were exposed to short simulated scenes, in which virtual agents performed some behaviour in two different ways (possessing a certain aspect of believability vs. not possessing it). The results of the study indicate that virtual agents that appear resource-bounded, that are aware of their environment, of their own interaction capabilities, and of their state in the world, that can adapt to changes in the environment, and that exist in a correct social context are the ones perceived as more believable. Further in the paper we discuss these and other believability features and provide a quantitative analysis of the level of contribution of each such feature to the overall perceived believability of a virtual agent.
2003-06-01
NASA’s Virtual Glovebox (VGX) was developed to allow astronauts on Earth to train for complex biology research tasks in space. The astronauts may reach into the virtual environment, naturally manipulating specimens, tools, equipment, and accessories in a simulated microgravity environment as they would do in space. Such virtual reality technology also provides engineers and space operations staff with rapid prototyping, planning, and human performance modeling capabilities. Other Earth based applications being explored for this technology include biomedical procedural training and training for disarming bio-terrorism weapons.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Markidis, S.; Rizwan, U.
The use of a virtual nuclear control room can be an effective and powerful tool for training personnel working in nuclear power plants. Operators could experience and simulate the functioning of the plant, even in critical situations, without being in a real power plant or running any risk. 3D models can be exported to Virtual Reality formats and then displayed in the Virtual Reality environment, providing an immersive 3D experience. However, two major limitations of this approach are that 3D models exhibit static textures and are not fully interactive, and therefore cannot be used effectively in training personnel. In this paper we first describe a possible solution for embedding the output of a computer application in a 3D virtual scene, coupling real-world applications and VR systems. The VR system reported here grabs the output of an application running on an X server, creates a texture with the output, and then displays it on a screen or a wall in the virtual reality environment. We then propose a simple model for providing interaction between the user in the VR system and the running simulator. This approach is based on the use of an internet-based application that can be commanded from a laptop or tablet PC added to the virtual environment. (authors)
A review of haptic simulator for oral and maxillofacial surgery based on virtual reality.
Chen, Xiaojun; Hu, Junlei
2018-06-01
Traditional medical training in oral and maxillofacial surgery (OMFS) may be limited by its low efficiency and high price due to the shortage of cadaver resources. With the combination of visual rendering and force feedback, surgery simulators are becoming increasingly popular in hospitals and medical schools as an alternative to traditional training. Areas covered: The major goal of this review is to provide a comprehensive reference source on current and future developments of haptic OMFS simulators based on virtual reality (VR) for relevant researchers. Expert commentary: Visual rendering, haptic rendering, tissue deformation, and evaluation are key components of a haptic surgery simulator based on VR. Compared with traditional medical training, the visual and tactile fusion of the virtual environment in a surgery simulator enables a considerably vivid sensation, and operators have more opportunities to practice surgical skills and receive objective evaluation for reference.
Simulating the decentralized processes of the human immune system in a virtual anatomy model.
Sarpe, Vladimir; Jacob, Christian
2013-01-01
Many physiological processes within the human body can be perceived and modeled as large systems of interacting particles or swarming agents. The complex processes of the human immune system prove to be challenging to capture and illustrate without proper reference to the spatial distribution of immune-related organs and systems. Our work focuses on physical aspects of immune system processes, which we implement through swarms of agents. This is our first prototype for integrating different immune processes into one comprehensive virtual physiology simulation. Using agent-based methodology and a 3-dimensional modeling and visualization environment (LINDSAY Composer), we present an agent-based simulation of the decentralized processes in the human immune system. The agents in our model - such as immune cells, viruses and cytokines - interact through simulated physics in two different, compartmentalized and decentralized 3-dimensional environments, namely (1) within the tissue and (2) inside a lymph node. While the two environments are separated and perform their computations asynchronously, an abstract form of communication is allowed in order to replicate the exchange, transportation and interaction of immune system agents between these sites. The distribution of simulated processes, which can communicate across multiple local CPUs or through a network of machines, provides a starting point to build decentralized systems that replicate larger-scale processes within the human body, thus creating integrated simulations with other physiological systems, such as the circulatory, endocrine, or nervous system. Ultimately, this system integration across scales is our goal for the LINDSAY Virtual Human project. Our current immune system simulations extend our previous work on agent-based simulations by introducing advanced visualizations within the context of a virtual human anatomy model. We also demonstrate how to distribute a collection of connected simulations over a network of computers. As a future endeavour, we plan to use parameter-tuning techniques on our model to further enhance its biological credibility. We consider these in silico experiments and their associated modeling and optimization techniques as essential components in further enhancing our capabilities of simulating a whole-body, decentralized immune system, to be used both for medical education and research as well as for virtual studies in immunoinformatics.
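To make the two-compartment, queue-coupled structure described above concrete, the toy sketch below steps a "tissue" and a "lymph node" population independently and exchanges migrating agents through queues. The agent behaviour is an invented placeholder (a random migration rule), not the LINDSAY model's physics.

```python
# Toy sketch of two asynchronously stepped compartments exchanging agents
# through message queues, loosely mirroring the tissue / lymph-node split
# described above. Agent behaviour is invented for illustration only.
import random
from collections import deque

class Compartment:
    def __init__(self, name, agents):
        self.name = name
        self.agents = list(agents)      # e.g. "immune_cell", "virus", "cytokine"
        self.outbox = deque()           # agents leaving for the other compartment

    def step(self, migration_prob=0.05):
        staying = []
        for agent in self.agents:
            if random.random() < migration_prob:
                self.outbox.append(agent)   # agent migrates to the other site
            else:
                staying.append(agent)
        self.agents = staying

    def receive(self, inbox):
        while inbox:
            self.agents.append(inbox.popleft())

tissue = Compartment("tissue", ["immune_cell"] * 50 + ["virus"] * 10)
lymph_node = Compartment("lymph_node", ["immune_cell"] * 30)

for tick in range(100):
    # Each compartment computes on its own; only the queues couple them.
    tissue.step()
    lymph_node.step()
    lymph_node.receive(tissue.outbox)
    tissue.receive(lymph_node.outbox)

print(len(tissue.agents), "agents in tissue,", len(lymph_node.agents), "in lymph node")
```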
An Overview of Virtual Acoustic Simulation of Aircraft Flyover Noise
NASA Technical Reports Server (NTRS)
Rizzi, Stephen A.
2013-01-01
Methods for testing human subject response to aircraft flyover noise have greatly advanced in recent years as a result of advances in simulation technology. Capabilities have been developed which now allow subjects to be immersed both visually and aurally in a three-dimensional virtual environment. While suitable for displaying recorded aircraft noise, the true potential is found when synthesizing aircraft flyover noise, because synthesis allows the flexibility and freedom to study sounds from aircraft not yet flown. A virtual acoustic simulation method is described which is built upon prediction-based source noise synthesis, engineering-based propagation modeling, and empirically-based receiver modeling. This source-path-receiver paradigm allows complete control over all aspects of flyover auralization. With this capability, it is now possible to assess human response to flyover noise by systematically evaluating source noise reductions within the context of a system-level simulation. Examples of auralized flyover noise and movie clips representative of an immersive aircraft flyover environment are included in the presentation.
A Data Management System for International Space Station Simulation Tools
NASA Technical Reports Server (NTRS)
Betts, Bradley J.; DelMundo, Rommel; Elcott, Sharif; McIntosh, Dawn; Niehaus, Brian; Papasin, Richard; Mah, Robert W.; Clancy, Daniel (Technical Monitor)
2002-01-01
Groups associated with the design, operational, and training aspects of the International Space Station make extensive use of modeling and simulation tools. Users of these tools often need to access and manipulate large quantities of data associated with the station, ranging from design documents to wiring diagrams. Retrieving and manipulating this data directly within the simulation and modeling environment can provide substantial benefit to users. An approach for providing these kinds of data management services, including a database schema and class structure, is presented. Implementation details are also provided as a data management system is integrated into the Intelligent Virtual Station, a modeling and simulation tool developed by the NASA Ames Smart Systems Research Laboratory. One use of the Intelligent Virtual Station is generating station-related training procedures in a virtual environment. The data management component allows users to quickly and easily retrieve information related to objects on the station, enhancing their ability to generate accurate procedures. Users can associate new information with objects and have that information stored in a database.
Modeling, validation and analysis of a Whegs robot in the USARSim environment
NASA Astrophysics Data System (ADS)
Taylor, Brian K.; Balakirsky, Stephen; Messina, Elena; Quinn, Roger D.
2008-04-01
Simulation of robots in a virtual domain has multiple benefits. End users can use the simulation as a training tool to increase their skill with the vehicle without risking damage to the robot or surrounding environment. Simulation allows researchers and developers to benchmark robot performance in a range of scenarios without having the physical robot or environment present. The simulation can also help guide and generate new design concepts. USARSim (Unified System for Automation and Robot Simulation) is a tool that is being used to accomplish these goals, particularly within the realm of search and rescue. It is based on the Unreal Tournament 2004 gaming engine, which approximates the physics of how a robot interacts with its environment. One family of vehicles that can benefit from simulation in USARSim is the Whegs™ robots. Developed in the Biorobotics Laboratory at Case Western Reserve University, Whegs™ robots are highly mobile ground vehicles that use abstracted biological principles to achieve a robust level of locomotion, including passive gait adaptation and enhanced climbing abilities. This paper describes a Whegs™ robot model that was constructed in USARSim. The model was configured with the same kinds of behavioral characteristics found in real Whegs™ vehicles. Once these traits were implemented, a validation study was performed using identical performance metrics measured on both the virtual and real vehicles to quantify vehicle performance and to ensure that the virtual robot's performance matched that of the real robot.
NASA Astrophysics Data System (ADS)
Chen, Xiuhong; Huang, Xianglei; Jiao, Chaoyi; Flanner, Mark G.; Raeker, Todd; Palen, Brock
2017-01-01
The suites of numerical models used for simulating the climate of our planet are usually run on dedicated high-performance computing (HPC) resources. This study investigates an alternative to the usual approach, i.e., carrying out climate model simulations in a commercially available cloud computing environment. We test the performance and reliability of running the CESM (Community Earth System Model), a flagship climate model in the United States developed by the National Center for Atmospheric Research (NCAR), on Amazon Web Services (AWS) EC2, the cloud computing environment provided by Amazon.com, Inc. StarCluster is used to create a virtual computing cluster on AWS EC2 for the CESM simulations. The wall-clock time for one year of CESM simulation on the AWS EC2 virtual cluster is comparable to the time spent for the same simulation on a local dedicated high-performance computing cluster with InfiniBand connections. The CESM simulation can be efficiently scaled with the number of CPU cores on the AWS EC2 virtual cluster environment up to 64 cores. For the standard configuration of the CESM at a spatial resolution of 1.9° latitude by 2.5° longitude, increasing the number of cores from 16 to 64 reduces the wall-clock running time by more than 50% and the scaling is nearly linear. Beyond 64 cores, the communication latency starts to outweigh the benefit of distributed computing and the parallel speedup becomes nearly unchanged.
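The scaling behaviour quoted above can be summarized with the usual speedup and parallel-efficiency calculation relative to a baseline core count. The sketch below does this for hypothetical wall-clock times, not the paper's measurements.

```python
# Back-of-the-envelope parallel scaling check of the kind reported above:
# speedup and efficiency relative to a 16-core baseline. The wall-clock
# times are hypothetical placeholders, not the paper's measurements.
baseline_cores = 16
wall_clock_hours = {16: 10.0, 32: 5.6, 64: 3.4, 128: 3.3}  # assumed values

t_base = wall_clock_hours[baseline_cores]
for cores, t in sorted(wall_clock_hours.items()):
    speedup = t_base / t
    ideal = cores / baseline_cores
    efficiency = speedup / ideal
    print(f"{cores:4d} cores: speedup {speedup:4.2f}x (ideal {ideal:4.1f}x), "
          f"efficiency {efficiency:5.1%}")
```

A near-linear regime shows efficiencies close to 100%, while the flattening beyond 64 cores described above shows up as efficiency dropping sharply.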
A serious gaming/immersion environment to teach clinical cancer genetics.
Nosek, Thomas M; Cohen, Mark; Matthews, Anne; Papp, Klara; Wolf, Nancy; Wrenn, Gregg; Sher, Andrew; Coulter, Kenneth; Martin, Jessica; Wiesner, Georgia L
2007-01-01
We are creating an interactive, simulated "Cancer Genetics Tower" for the self-paced learning of Clinical Cancer Genetics by medical students (go to: http://casemed.case.edu/cancergenetics). The environment uses gaming theory to engage the students into achieving specific learning objectives. The first few levels contain virtual laboratories where students achieve the basic underpinnings of Cancer Genetics. The next levels apply these principles to clinical practice. A virtual attending physician and four virtual patients, available for questioning through virtual video conferencing, enrich each floor. The pinnacle clinical simulation challenges the learner to integrate all information and demonstrate mastery, thus "winning" the game. A pilot test of the program by 17 medical students yielded very favorable feedback; the students found the Tower a "great way to teach", it held their attention, and it made learning fun. A majority of the students preferred the Tower over other resources to learn Cancer Genetics.
Training software using virtual-reality technology and pre-calculated effective dose data.
Ding, Aiping; Zhang, Di; Xu, X George
2009-05-01
This paper describes the development of a software package, called VR Dose Simulator, which aims to provide interactive radiation safety and ALARA training to radiation workers using virtual-reality (VR) simulations. Combined with a pre-calculated effective dose equivalent (EDE) database, a virtual radiation environment was constructed in the VR authoring software EON Studio using 3-D models of a real nuclear power plant building. Models of avatars representing two workers were adopted, with the arms and legs of the avatars controlled in the software to simulate walking and other postures. Collision detection algorithms were developed for various parts of the 3-D power plant building and avatars to confine the avatars to certain regions of the virtual environment. Ten different camera viewpoints were assigned to conveniently cover the entire virtual scene from different viewing angles. A user can control the avatar to carry out radiological engineering tasks using two modes of avatar navigation. A user can also specify two types of radiation source: Cs and Co. The location of the avatar inside the virtual environment during the course of the avatar's movement is linked to the EDE database. The accumulated dose is calculated and displayed on the screen in real time. Based on the final accumulated dose and the completion status of all virtual tasks, a score is given to evaluate the performance of the user. The paper concludes that VR-based simulation technologies are interactive and engaging, and thus potentially useful in improving the quality of radiation safety training. The paper also summarizes several challenges: more streamlined data conversion, more realistic avatar movement and posture, more intuitive implementation of the data communication between EON Studio and VB.NET, and more versatile utilization of EDE data (such as for a source near the body), all of which need to be addressed in future efforts to develop this type of software.
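The real-time dose accumulation described above amounts to repeatedly looking up a pre-calculated dose rate for the avatar's current location and integrating it over time. The sketch below illustrates that loop with an invented dose-rate grid and walking path; it is not the VR Dose Simulator's code, and the numbers are placeholders.

```python
# Minimal sketch of dose accumulation against a pre-calculated dose-rate map:
# at each time step, look up the dose rate for the avatar's current grid cell
# and add it to the running total. Grid, values and path are all assumed.
import numpy as np

# Pre-calculated effective dose rates (mSv/h) on a coarse 2-D floor grid.
dose_rate_grid = np.array([
    [0.01, 0.02, 0.05, 0.02],
    [0.02, 0.10, 0.80, 0.10],   # hot spot near a source in cell (1, 2)
    [0.01, 0.05, 0.10, 0.05],
])
cell_size_m = 2.0
dt_s = 1.0                       # simulation time step

def accumulate_dose(path_xy_m):
    """Sum dose along a walking path given as (x, y) positions in metres."""
    total_mSv = 0.0
    for x, y in path_xy_m:
        row = min(int(y // cell_size_m), dose_rate_grid.shape[0] - 1)
        col = min(int(x // cell_size_m), dose_rate_grid.shape[1] - 1)
        total_mSv += dose_rate_grid[row, col] * dt_s / 3600.0
    return total_mSv

# Avatar walks past the hot spot at 1 m/s for 8 seconds.
path = [(i * 1.0, 3.0) for i in range(8)]
print(f"Accumulated dose: {accumulate_dose(path) * 1000:.3f} microsieverts")
```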
NASA Technical Reports Server (NTRS)
Dumas, Joseph D., II
1998-01-01
Several virtual reality I/O peripherals were successfully configured and integrated as part of the author's 1997 Summer Faculty Fellowship work. These devices, which were not supported by the developers of VR software packages, use new software drivers and configuration files developed by the author to allow them to be used with simulations developed using those software packages. The successful integration of these devices has added significant capability to the ANVIL lab at MSFC. In addition, the author was able to complete the integration of a networked virtual reality simulation of the Space Shuttle Remote Manipulator System docking Space Station modules which was begun as part of his 1996 Fellowship. The successful integration of this simulation demonstrates the feasibility of using VR technology for ground-based training as well as on-orbit operations.
NASA Astrophysics Data System (ADS)
Ciunel, St.; Tica, B.
2016-08-01
The paper presents studies made on a biomechanical model composed of the neck, head, and thorax bones. The models were defined in a CAD environment that includes the Adams algorithm for dynamic simulations. The virtual models and the entire morphology were obtained starting from CT images made on a living human subject. The main movements analyzed were: axial rotation (left-right), lateral bending (left-right), and flexion-extension. After simulation, the entire biomechanical behavior was obtained in the form of data tables and diagrams. The virtual model composed of the neck and head can be included in a complex system (such as a car system) and subjected to several impact simulations (virtual crash tests). Our research team also built the main components of a testing device for a dummy neck-head system for car crash tests, using anatomical data.
Cobbett, Shelley; Snelgrove-Clarke, Erna
2016-10-01
Clinical simulations can provide students with realistic clinical learning environments to increase their knowledge and self-confidence and decrease their anxiety prior to entering clinical practice settings. The objective was to compare the effectiveness of two maternal newborn clinical simulation scenarios: virtual clinical simulation and face-to-face high-fidelity manikin simulation. A randomized pretest-posttest design was used at a public research university in Canada with fifty-six third-year Bachelor of Science in Nursing students. Participants were randomized to either face-to-face or virtual clinical simulation and then to dyads for completion of two clinical simulations. Measures included: (1) the Nursing Anxiety and Self-Confidence with Clinical Decision Making Scale (NASC-CDM) (White, 2011), (2) a knowledge pretest and post-test related to preeclampsia and group B strep, and (3) a Simulation Completion Questionnaire. Students completed a knowledge test and the NASC-CDM before and after each simulation, and the Simulation Completion Questionnaire at study completion. There were no statistically significant differences in student knowledge and self-confidence between face-to-face and virtual clinical simulations. Anxiety scores were higher for students in the virtual clinical simulation than for those in the face-to-face simulation. Students' self-reported preference was for face-to-face simulation, citing the similarities to practicing in a 'real' situation and the immediate debrief. Students who did not like the virtual clinical simulation most often cited technological issues as their rationale. Given the equivalency of knowledge and self-confidence identified in this trial when undergraduate nursing students participate in either face-to-face or virtual maternal newborn clinical simulation, it is important to take into consideration the costs and benefits/risks of simulation implementation.
VIPER: Virtual Intelligent Planetary Exploration Rover
NASA Technical Reports Server (NTRS)
Edwards, Laurence; Flueckiger, Lorenzo; Nguyen, Laurent; Washington, Richard
2001-01-01
Simulation and visualization of rover behavior are critical capabilities for scientists and rover operators to construct, test, and validate plans for commanding a remote rover. The VIPER system links these capabilities, using a high-fidelity virtual-reality (VR) environment, a kinematically accurate simulator, and a flexible plan executive to allow users to simulate and visualize possible execution outcomes of a plan under development. This work is part of a larger vision of a science-centered rover control environment, where a scientist may inspect and explore the environment via VR tools, specify science goals, and visualize the expected and actual behavior of the remote rover. The VIPER system is constructed from three generic systems, linked together via a minimal amount of customization into the integrated system. The complete system points out the power of combining plan execution, simulation, and visualization for envisioning rover behavior; it also demonstrates the utility of developing generic technologies, which can be combined in novel and useful ways.
Stanton, D; Foreman, N; Wilson, P N
1998-01-01
In this chapter we review some of the ways in which the skills learned in virtual environments (VEs) transfer to real situations, and in particular how information about the spatial layouts of virtual buildings acquired from the exploration of three-dimensional computer-simulations transfers to their real equivalents. Four experiments are briefly described which examined VR use by disabled children. We conclude that spatial information of the kind required for navigation transfers effectively from virtual to real situations. Spatial skills in disabled children showed progressive improvement with repeated exploration of virtual environments. The results are discussed in relation to the potential future benefits of VR in special needs education and training.
ERIC Educational Resources Information Center
Plumert, Jodie M.; Kearney, Joseph K.; Cremer, James F.
2004-01-01
This study examined gap choices and crossing behavior in children and adults using an immersive, interactive bicycling simulator. Ten- and 12-year-olds and adults rode a bicycle mounted on a stationary trainer through a virtual environment consisting of a street with 6 intersections. Participants faced continuous cross traffic traveling at 25mph…
Live Virtual Constructive Distributed Test Environment Characterization Report
NASA Technical Reports Server (NTRS)
Murphy, Jim; Kim, Sam K.
2013-01-01
This report documents message latencies observed over various Live, Virtual, Constructive (LVC) simulation environment configurations designed to emulate possible system architectures for the Unmanned Aircraft Systems (UAS) Integration in the National Airspace System (NAS) Project integrated tests. For each configuration, four scenarios with progressively increasing air traffic loads were used to determine system throughput and bandwidth impacts on message latency.
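A latency characterization of this kind reduces to differencing paired send and receive timestamps and summarizing the resulting distribution, as in the sketch below. The timestamp values are made up for illustration, not data from the report.

```python
# Sketch of per-message latency computation from paired send/receive
# timestamps, plus a few summary statistics. Timestamps are hypothetical.
import statistics

send_times_s    = [0.000, 0.050, 0.100, 0.150, 0.200, 0.250]
receive_times_s = [0.021, 0.074, 0.119, 0.182, 0.226, 0.268]

latencies_ms = [(rx - tx) * 1000.0 for tx, rx in zip(send_times_s, receive_times_s)]

print(f"mean latency:    {statistics.mean(latencies_ms):6.1f} ms")
print(f"median latency:  {statistics.median(latencies_ms):6.1f} ms")
print(f"max latency:     {max(latencies_ms):6.1f} ms")
print(f"95th percentile: {statistics.quantiles(latencies_ms, n=20)[-1]:6.1f} ms")
```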
Manually locating physical and virtual reality objects.
Chen, Karen B; Kimmel, Ryan A; Bartholomew, Aaron; Ponto, Kevin; Gleicher, Michael L; Radwin, Robert G
2014-09-01
In this study, we compared how users locate physical objects and equivalent three-dimensional images of virtual objects in a cave automatic virtual environment (CAVE) using the hand, to examine how human performance (accuracy, time, and approach) is affected by object size, location, and distance. Virtual reality (VR) offers the promise of flexibly simulating arbitrary environments for studying human performance. Previously, VR researchers primarily considered differences between virtual and physical distance estimation rather than reaching for close-up objects. Fourteen participants completed manual targeting tasks that involved reaching for corners on equivalent physical and virtual boxes of three different sizes. Predicted errors were calculated from a geometric model based on user interpupillary distance, eye location, distance from the eyes to the projector screen, and object location. Users were 1.64 times less accurate (p < .001) and spent 1.49 times more time (p = .01) targeting virtual versus physical box corners using the hands. Predicted virtual targeting errors were on average 1.53 times (p < .05) greater than the observed errors for farther virtual targets but not significantly different for close-up virtual targets. Target size, location, and distance, in addition to binocular disparity, affected virtual object targeting inaccuracy. Observed virtual box inaccuracy was less than predicted for farther locations, suggesting a possible influence of cues other than binocular vision. Human physical interaction with objects in VR for simulation, training, and prototyping involving reaching and manually handling virtual objects in a CAVE is more accurate than predicted when locating farther objects.
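A geometric prediction of this sort typically rests on simple stereoscopic parallax geometry: a disparity rendered for one interpupillary distance (IPD) is fused at a different depth by a viewer whose actual IPD or eye position differs. The sketch below works through that simplified on-axis geometry with assumed values; it is not the authors' exact model.

```python
# Simplified on-axis stereo geometry of the kind such a prediction model rests
# on (an assumption, not the authors' exact formulation). A target at depth z,
# rendered for one interpupillary distance (IPD), is perceived at a different
# depth by a viewer with another IPD, yielding a predictable depth error.
def screen_disparity(z, ipd, screen_dist):
    """Signed on-screen separation of the two projections of a point at depth z."""
    return ipd * (screen_dist - z) / z          # positive = in front of the screen

def perceived_depth(disparity, ipd, screen_dist):
    """Depth at which a viewer with this IPD fuses the given screen disparity."""
    return ipd * screen_dist / (ipd + disparity)

rendered_ipd = 0.065      # IPD assumed by the rendering software (m)
viewer_ipd = 0.058        # the user's actual IPD (m)
screen_dist = 1.5         # eyes-to-projection-screen distance (m)

for z_true in (0.5, 0.8, 1.2):                  # virtual target depths (m)
    d = screen_disparity(z_true, rendered_ipd, screen_dist)
    z_seen = perceived_depth(d, viewer_ipd, screen_dist)
    print(f"target at {z_true:.2f} m -> perceived at {z_seen:.2f} m "
          f"(error {1000 * (z_seen - z_true):+.0f} mm)")
```

When the viewer's IPD equals the rendered IPD the perceived depth reduces to the true depth, which is a quick sanity check on the two formulas.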
Adaptive space warping to enhance passive haptics in an arthroscopy surgical simulator.
Spillmann, Jonas; Tuchschmid, Stefan; Harders, Matthias
2013-04-01
Passive haptics, also known as tactile augmentation, denotes the use of a physical counterpart to a virtual environment to provide tactile feedback. Employing passive haptics can result in more realistic touch sensations than those from active force feedback, especially for rigid contacts. However, changes in the virtual environment would necessitate modifications of the physical counterparts. In recent work space warping has been proposed as one solution to overcome this limitation. In this technique virtual space is distorted such that a variety of virtual models can be mapped onto one single physical object. In this paper, we propose as an extension adaptive space warping; we show how this technique can be employed in a mixed-reality surgical training simulator in order to map different virtual patients onto one physical anatomical model. We developed methods to warp different organ geometries onto one physical mock-up, to handle different mechanical behaviors of the virtual patients, and to allow interactive modifications of the virtual structures, while the physical counterparts remain unchanged. Various practical examples underline the wide applicability of our approach. To the best of our knowledge this is the first practical usage of such a technique in the specific context of interactive medical training.
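The core idea of space warping can be pictured with a toy radial warp that pulls the virtual organ surface onto the physical prop surface and fades to identity farther away, so distant space stays undistorted. The sketch below is an assumption-level illustration only, not the simulator's algorithm; all radii are invented.

```python
# Toy illustration of the space-warping idea: remap virtual points radially so
# a virtual surface of radius r_virtual coincides with a physical mock-up of
# radius r_physical, with the correction fading to zero beyond `falloff`.
# All values and the warp shape are assumptions for illustration.
import numpy as np

r_virtual = 0.12    # virtual anatomy surface radius (m) -- assumed
r_physical = 0.10   # physical mock-up surface radius (m) -- assumed
falloff = 0.30      # beyond this radius the warp has no effect (m)

def warp(point):
    point = np.asarray(point, dtype=float)
    r = np.linalg.norm(point)
    if r == 0.0 or r >= falloff:
        return point
    # Full correction at or inside the virtual surface, fading to zero at `falloff`.
    if r <= r_virtual:
        weight = 1.0
    else:
        weight = (falloff - r) / (falloff - r_virtual)
    corrected_r = r + weight * (r_physical - r_virtual)
    return point * (corrected_r / r)

print(warp([0.12, 0.0, 0.0]))   # a point on the virtual surface lands on the prop
print(warp([0.25, 0.0, 0.0]))   # a more distant point is displaced much less
```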
NASA Technical Reports Server (NTRS)
Schnase, John L.; Tamkin, Glenn S.; Ripley, W. David III; Stong, Savannah; Gill, Roger; Duffy, Daniel Q.
2012-01-01
Scientific data services are becoming an important part of the NASA Center for Climate Simulation's mission. Our technological response to this expanding role is built around the concept of a Virtual Climate Data Server (vCDS), repetitive provisioning, image-based deployment and distribution, and virtualization-as-a-service. The vCDS is an iRODS-based data server specialized to the needs of a particular data-centric application. We use RPM scripts to build vCDS images in our local computing environment, our local Virtual Machine Environment, NASA's Nebula Cloud Services, and Amazon's Elastic Compute Cloud. Once provisioned into one or more of these virtualized resource classes, vCDSs can use iRODS's federation capabilities to create an integrated ecosystem of managed collections that is scalable and adaptable to changing resource requirements. This approach enables platform- or software-as-a-service deployment of vCDS and allows the NCCS to offer virtualization-as-a-service: a capacity to respond in an agile way to new customer requests for data services.
Liaw, Sok Ying; Chan, Sally Wai-Chi; Chen, Fun-Gee; Hooi, Shing Chuan; Siau, Chiang
2014-09-17
Virtual patient simulation has grown substantially in health care education. A virtual patient simulation was developed as a refresher training course to reinforce nursing clinical performance in assessing and managing deteriorating patients. The objective of this study was to describe the development of the virtual patient simulation and evaluate its efficacy, by comparing it with a conventional mannequin-based simulation, for improving the nursing students' performance in assessing and managing patients with clinical deterioration. A randomized controlled study was conducted with 57 third-year nursing students who were recruited through email. After a baseline evaluation of all participants' clinical performance in a simulated environment, the experimental group received a 2-hour fully automated virtual patient simulation while the control group received 2-hour facilitator-led mannequin-based simulation training. All participants were then re-tested one day (first post-test) and 2.5 months (second post-test) after the intervention. The participants from the experimental group completed a survey to evaluate their learning experiences with the newly developed virtual patient simulation. Compared to their baseline scores, both experimental and control groups demonstrated significant improvements (P<.001) in first and second post-test scores. While the experimental group had significantly lower (P<.05) second post-test scores compared with the first post-test scores, no significant difference (P=.94) was found between these two scores for the control group. The scores between groups did not differ significantly over time (P=.17). The virtual patient simulation was rated positively. A virtual patient simulation for a refresher training course on assessing and managing clinical deterioration was developed. Although the randomized controlled study did not show that the virtual patient simulation was superior to mannequin-based simulation, both simulations were demonstrated to be effective refresher learning strategies for improving nursing students' clinical performance. Given the greater resource requirements of mannequin-based simulation, the virtual patient simulation provides a promising alternative learning strategy to mitigate the decay of clinical performance over time.
NASA Technical Reports Server (NTRS)
Ellis, Stephen R.
1991-01-01
Natural environments have a content, i.e., the objects in them; a geometry, i.e., a pattern of rules for positioning and displacing the objects; and a dynamics, i.e., a system of rules describing the effects of forces acting on the objects. Human interaction with most common natural environments has been optimized by centuries of evolution. Virtual environments created through the human-computer interface similarly have a content, geometry, and dynamics, but the arbitrary character of the computer simulation creating them does not insure that human interaction with these virtual environments will be natural. The interaction, indeed, could be supernatural but it also could be impossible. An important determinant of the comprehensibility of a virtual environment is the correspondence between the environmental frames of reference and those associated with the control of environmental objects. The effects of rotation and displacement of control frames of reference with respect to corresponding environmental references differ depending upon whether perceptual judgement or manual tracking performance is measured. The perceptual effects of frame of reference displacement may be analyzed in terms of distortions in the process of virtualizing the synthetic environment space. The effects of frame of reference displacement and rotation have been studied by asking subjects to estimate exocentric direction in a virtual space.
NASA Technical Reports Server (NTRS)
1990-01-01
While a new technology called 'virtual reality' is still at the 'ground floor' level, one of its basic components, 3D computer graphics, is already in wide commercial use and expanding. Other components that permit a human operator to 'virtually' explore an artificial environment and to interact with it are being demonstrated routinely at Ames and elsewhere. Virtual reality might be defined as an environment capable of being virtually entered - telepresence, it is called - or interacted with by a human. The Virtual Interface Environment Workstation (VIEW) is a head-mounted stereoscopic display system in which the display may be an artificial computer-generated environment or a real environment relayed from remote video cameras. The operator can 'step into' this environment and interact with it. The DataGlove has a series of fiber optic cables and sensors that detect any movement of the wearer's fingers and transmit the information to a host computer; a computer-generated image of the hand will move exactly as the operator moves his gloved hand. With appropriate software, the operator can use the glove to interact with the computer scene by grasping an object. The DataSuit is a sensor-equipped full-body garment that greatly increases the sphere of performance for virtual reality simulations.
ERIC Educational Resources Information Center
Vine, Juliet
2015-01-01
The Work-Integrated Simulation for Translators module is part of a three year undergraduate degree in translation. The semester long module aims to simulate several aspects of the translation process using the Blackboard virtual learning environment's Wikis as the interface for completing translation tasks. For each translation task, one of the…
NASA Technical Reports Server (NTRS)
Murphy, James R.; Otto, Neil M.
2017-01-01
NASA's Unmanned Aircraft Systems Integration in the National Airspace System Project is conducting human-in-the-loop simulations and flight testing intended to reduce barriers associated with enabling routine airspace access for unmanned aircraft. The primary focus of these tests is the interaction of the unmanned aircraft pilot with the display of detect-and-avoid alerting and guidance information. The project's integrated test and evaluation team was charged with developing the test infrastructure. As with any development effort, compromises in the underlying system architecture and design were made to allow for the rapid prototyping and open-ended nature of the research. In order to accommodate these design choices, a distributed test environment was developed incorporating Live, Virtual, Constructive (LVC) concepts. The LVC components form the core infrastructure supporting simulation of UAS operations by integrating live and virtual aircraft in a realistic air traffic environment. This LVC infrastructure enables efficient testing by leveraging the use of existing assets distributed across multiple NASA Centers. Using standard LVC concepts enables future integration with existing simulation infrastructure.
The expert surgical assistant. An intelligent virtual environment with multimodal input.
Billinghurst, M; Savage, J; Oppenheimer, P; Edmond, C
1996-01-01
Virtual Reality has made computer interfaces more intuitive but not more intelligent. This paper shows how an expert system can be coupled with multimodal input in a virtual environment to provide an intelligent simulation tool or surgical assistant. This is accomplished in three steps. First, voice and gestural input is interpreted and represented in a common semantic form. Second, a rule-based expert system is used to infer context and user actions from this semantic representation. Finally, the inferred user actions are matched against steps in a surgical procedure to monitor the user's progress and provide automatic feedback. In addition, the system can respond immediately to multimodal commands for navigational assistance and/or identification of critical anatomical structures. To show how these methods are used we present a prototype sinus surgery interface. The approach described here may easily be extended to a wide variety of medical and non-medical training applications by making simple changes to the expert system database and virtual environment models. Successful implementation of an expert system in both simulated and real surgery has enormous potential for the surgeon both in training and clinical practice.
Simulation Exploration through Immersive Parallel Planes
DOE Office of Scientific and Technical Information (OSTI.GOV)
Brunhart-Lupo, Nicholas J; Bush, Brian W; Gruchalla, Kenny M
We present a visualization-driven simulation system that tightly couples systems dynamics simulations with an immersive virtual environment to allow analysts to rapidly develop and test hypotheses in a high-dimensional parameter space. To accomplish this, we generalize the two-dimensional parallel-coordinates statistical graphic as an immersive 'parallel-planes' visualization for multivariate time series emitted by simulations running in parallel with the visualization. In contrast to traditional parallel coordinates' mapping of the multivariate dimensions onto coordinate axes represented by a series of parallel lines, we map pairs of the multivariate dimensions onto a series of parallel rectangles. As in the case of parallel coordinates, each individual observation in the dataset is mapped to a polyline whose vertices coincide with its coordinate values. Regions of the rectangles can be 'brushed' to highlight and select observations of interest; a 'slider' control allows the user to filter the observations by their time coordinate. In an immersive virtual environment, users interact with the parallel planes using a joystick that can select regions on the planes, manipulate selection, and filter time. The brushing and selection actions are used both to explore existing data and to launch additional simulations corresponding to the visually selected portions of the input parameter space. As soon as the new simulations complete, their resulting observations are displayed in the virtual environment. This tight feedback loop between simulation and immersive analytics accelerates users' realization of insights about the simulation and its output.
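Stripped of the immersive display, the brushing and time-slider filtering described above reduce to interval selections over a multivariate array; the selected rows identify parameter combinations that could seed further simulation runs. The sketch below illustrates that logic with invented data and dimension names.

```python
# Minimal sketch of brushing and time filtering over multivariate simulation
# output: observations are rows, a "brush" is an interval on one dimension,
# and the time slider is just one more interval. Data and dimension names
# are invented; the actual system does this interactively on parallel planes.
import numpy as np

rng = np.random.default_rng(0)
n_obs = 1000
data = {
    "time":       rng.uniform(0, 24, n_obs),     # hours
    "fuel_price": rng.uniform(2, 5, n_obs),
    "demand":     rng.uniform(50, 150, n_obs),
    "emissions":  rng.uniform(0, 1, n_obs),
}

def brush(selection, dimension, low, high):
    """Refine a boolean selection mask with an interval brush on one dimension."""
    return selection & (data[dimension] >= low) & (data[dimension] <= high)

selected = np.ones(n_obs, dtype=bool)
selected = brush(selected, "time", 6, 12)          # the time-slider filter
selected = brush(selected, "fuel_price", 3.5, 5)   # a brushed region on one plane
selected = brush(selected, "emissions", 0.0, 0.3)

picked = np.flatnonzero(selected)
print(f"{picked.size} observations selected; these parameter combinations "
      f"could seed the next batch of simulations")
```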
Ergonomic approaches to designing educational materials for immersive multi-projection system
NASA Astrophysics Data System (ADS)
Shibata, Takashi; Lee, JaeLin; Inoue, Tetsuri
2014-02-01
Rapid advances in computer and display technologies have made it possible to present high-quality virtual reality (VR) environments. To use such virtual environments effectively, research should be performed into how users perceive and react to a virtual environment in view of particular human factors. We created a VR simulation of sea fish for science education, and we conducted an experiment to examine how observers perceive the size and depth of an object within their reach and evaluated their visual fatigue. We chose a multi-projection system for presenting the educational VR simulation, because this system can provide actual-size objects and produce stereo images located close to the observer. The results of the experiment show that estimation of size and depth was relatively accurate when subjects used physical actions to assess them. Presenting images within the observer's reach is therefore suggested to be useful for education in VR environments. The evaluation of visual fatigue shows that the level of symptoms from viewing stereo images with a large disparity in the VR environment was low over a short viewing time.
Virtual geotechnical laboratory experiments using a simulator
NASA Astrophysics Data System (ADS)
Penumadu, Dayakar; Zhao, Rongda; Frost, David
2000-04-01
The details of a test simulator that provides a realistic environment for performing virtual laboratory experiments in soil mechanics are presented. A computer program, Geo-Sim, that can be used to perform virtual experiments and allows for real-time observation of material response is described. The results of experiments, for a given set of input parameters, are obtained with the test simulator using well-trained artificial-neural-network-based soil models for different soil types and stress paths. Multimedia capabilities are integrated in Geo-Sim using software that links and controls a laser disc player with real-time parallel processing ability. During the simulation of a virtual experiment, relevant portions of the video image of a previously recorded test on an actual soil specimen are displayed along with the graphical presentation of the response predicted by the feedforward ANN model. The pilot simulator developed to date includes all aspects related to performing a triaxial test on cohesionless soil under undrained and drained conditions. The benefits of the test simulator are also presented.
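The simulator's use of trained feedforward neural-network soil models can be pictured as repeatedly querying a small network for a stress increment given the current state and loading step. The sketch below shows only that forward-pass structure; the architecture, input choice, and (random, untrained) weights are illustrative assumptions, not Geo-Sim's actual models.

```python
# Structural sketch of a feedforward neural-network material model of the kind
# such a simulator queries: state and loading-step inputs, one hidden layer,
# and a predicted stress increment. Architecture and weights are invented.
import numpy as np

rng = np.random.default_rng(42)

class TinySoilNet:
    def __init__(self, n_in=3, n_hidden=8):
        self.w1 = rng.normal(scale=0.5, size=(n_in, n_hidden))
        self.b1 = np.zeros(n_hidden)
        self.w2 = rng.normal(scale=0.5, size=(n_hidden, 1))
        self.b2 = np.zeros(1)

    def predict(self, x):
        h = np.tanh(x @ self.w1 + self.b1)            # hidden layer
        return (h @ self.w2 + self.b2).item()         # predicted stress increment (kPa)

net = TinySoilNet()

# Drive a fictitious drained triaxial test: accumulate predicted stress
# increments over a sequence of axial strain increments.
confining_kPa, deviator_kPa = 100.0, 0.0
for step in range(10):
    strain_increment = 0.001
    x = np.array([strain_increment, confining_kPa / 100.0, deviator_kPa / 100.0])
    deviator_kPa += net.predict(x)
    print(f"step {step}: deviator stress = {deviator_kPa:7.3f} kPa")
```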
Virtual Reality for Pediatric Sedation: A Randomized Controlled Trial Using Simulation.
Zaveri, Pavan P; Davis, Aisha B; O'Connell, Karen J; Willner, Emily; Aronson Schinasi, Dana A; Ottolini, Mary
2016-02-09
Team training for procedural sedation for pediatric residents has traditionally consisted of didactic presentations and simulated scenarios using high-fidelity mannequins. We assessed the effectiveness of a virtual reality module in teaching preparation for and management of sedation for procedures. After developing a virtual reality environment in Second Life® (Linden Lab, San Francisco, CA) where providers perform and recover patients from procedural sedation, we conducted a randomized controlled trial to assess the effectiveness of the virtual reality module versus a traditional web-based educational module. A 20-question pre- and post-test was administered to assess knowledge change. All subjects participated in a simulated pediatric procedural sedation scenario that was video-recorded for review and assessed using a 32-point checklist. A brief survey elicited feedback on the virtual reality module and the simulation scenario. The median score on the assessment checklist was 75% for the intervention group and 70% for the control group (P = 0.32). For the knowledge tests, there was no statistically significant difference between the groups (P = 0.14). Users gave excellent reviews of the virtual reality module and reported that the module added to their education. Pediatric residents performed similarly in simulation and on a knowledge test after a virtual reality module compared with a traditional web-based module on procedural sedation. Although users enjoyed the virtual reality experience, these results question the value virtual reality adds in improving the performance of trainees. Further inquiry is needed into how virtual reality provides true value in simulation-based education.
Beavis, A; Saunderson, J; Ward, J
2012-06-01
Recently there has been great interest in the use of simulation training, with the view to enhance safety within radiotherapy practice. We have developed a Virtual Environment for Radiotherapy Training (VERT) which facilitates this, including the simulation of a number of 'Physics practices'. One such process is the calibration of an ionisation chamber for use in Linac photon beams. The VERT system was used to provide a life-sized 3D virtual environment within which we were able to simulate the calibration of a departmental chamber for 6 MV and 15 MV beams following the UK 1990 Code of Practice. The characteristics of the beams are fixed parameters in the simulation, whereas the default (absorbed dose to water) correction factors of the chambers are configurable, thereby dictating their response in the virtual x-ray beam. When the simulation is started, a random, realistic temperature and pressure are assigned to the bunker. Measurement and chamber positional errors are assigned to the chambers. A virtual water phantom was placed on the Linac couch and irradiated through the side using a 10 × 10 field. With a chamber at the appropriate depths and irradiated iso-centrically, the Quality Indices (QI) of the beams were obtained. The two chambers were 'inter-compared', allowing the departmental chamber calibration factor to be calculated from that of the reference chamber. For the virtual 6 MV/15 MV beams, the QI were found to be 0.668/0.761 and the inter-comparison ratios 0.4408/0.4402 respectively. The departmental chamber calibration factors were calculated; applying these and appropriate environmental corrections allowed the output of the Linac to be confirmed. We have shown how a virtual training environment can be used to demonstrate practical processes and reinforce learning. The UK CoP was used here; however, any relevant protocol could be demonstrated. Two of the authors (Beavis and Ward) are Founders of Vertual Ltd, a spin-out company created to commercialise the research presented in this abstract. © 2012 American Association of Physicists in Medicine.
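A hedged sketch of the arithmetic such a cross-calibration exercises may be helpful: correct the open-chamber readings for air density, then transfer the reference chamber's calibration factor via the inter-comparison ratio. The reference conditions (20 degC, 101.325 kPa) are the usual UK values and the function names are illustrative, not the VERT implementation.

```python
# Air-density (temperature/pressure) correction and calibration-factor transfer for an
# ionisation chamber inter-comparison; a minimal sketch, not the simulated workflow.
def k_tp(temp_c, pressure_kpa, ref_temp_c=20.0, ref_pressure_kpa=101.325):
    """Air-density correction factor for an open (vented) ionisation chamber."""
    return ((273.15 + temp_c) / (273.15 + ref_temp_c)) * (ref_pressure_kpa / pressure_kpa)

def chamber_dose(reading_nc, n_dw, temp_c, pressure_kpa):
    """Absorbed dose to water at the calibration depth: D = M * k_TP * N_D,w."""
    return reading_nc * k_tp(temp_c, pressure_kpa) * n_dw

def field_chamber_factor(n_ref, m_ref, m_field):
    """Transfer the reference chamber's factor via the inter-comparison ratio M_ref/M_field
    (both chambers read under the same beam and environmental conditions)."""
    return n_ref * (m_ref / m_field)
```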
Virtual worlds and team training.
Dev, Parvati; Youngblood, Patricia; Heinrichs, W Leroy; Kusumoto, Laura
2007-06-01
An important component of all emergency medicine residency programs is managing trauma effectively as a member of an emergency medicine team, but practice on live patients is often impractical and mannequin-based simulators are expensive and require all trainees to be physically present at the same location. This article describes a project to develop and evaluate a computer-based simulator (the Virtual Emergency Department) for distance training in teamwork and leadership in trauma management. The virtual environment provides repeated practice opportunities with life-threatening trauma cases in a safe and reproducible setting.
What can virtual patient simulation offer mental health nursing education?
Guise, V; Chambers, M; Välimäki, M
2012-06-01
This paper discusses the use of simulation in nursing education and training, including potential benefits and barriers associated with its use. In particular, it addresses the hitherto scant application of diverse simulation devices and dedicated simulation scenarios in psychiatric and mental health nursing. It goes on to describe a low-cost, narrative-based virtual patient simulation technique which has the potential for wide application within health and social care education. An example of the implementation of this technology in a web-based pilot course for acute mental health nurses is given. This particular virtual patient technique is a simulation type ideally suited to promoting essential mental health nursing skills such as critical thinking, communication and decision making. Furthermore, it is argued that it is particularly amenable to e-learning and blended learning environments, as well as being an apt tool where multilingual simulations are required. The continued development, implementation and evaluation of narrative virtual patient simulations across a variety of health and social care programmes would help ascertain their success as an educational tool. © 2011 Blackwell Publishing.
Zapf, Marc P; Matteucci, Paul B; Lovell, Nigel H; Zheng, Steven; Suaning, Gregg J
2014-01-01
Simulated prosthetic vision (SPV) in normally sighted subjects is an established way of investigating the prospective efficacy of visual prosthesis designs in visually guided tasks such as mobility. To perform meaningful SPV mobility studies in computer-based environments, a credible representation of both the virtual scene to navigate and the experienced artificial vision has to be established. It is therefore prudent to make optimal use of existing hardware and software solutions when establishing a testing framework. The authors aimed to improve the realism and immersion of SPV by integrating state-of-the-art yet low-cost consumer technology. The feasibility of body motion tracking to control movement in photo-realistic virtual environments was evaluated in a pilot study. Five subjects were recruited and performed an obstacle avoidance and wayfinding task using either keyboard and mouse, gamepad, or Kinect motion tracking. Walking speed and collisions were analyzed as basic measures of task performance. Kinect motion tracking resulted in lower performance compared to classical input methods, yet results were more uniform across vision conditions. The chosen framework was successfully applied in a basic virtual task and is suited to realistically simulating real-world scenes under SPV in mobility research. Classical input peripherals remain a feasible and effective way of controlling virtual movement. Motion tracking, despite its limitations and early state of implementation, is intuitive and can eliminate between-subject differences due to familiarity with established input methods.
NASA Astrophysics Data System (ADS)
Rohde, Mitchell M.; Crawford, Justin; Toschlog, Matthew; Iagnemma, Karl D.; Kewlani, Guarav; Cummins, Christopher L.; Jones, Randolph A.; Horner, David A.
2009-05-01
It is widely recognized that simulation is pivotal to vehicle development, whether manned or unmanned. There are few dedicated choices, however, for those wishing to perform realistic, end-to-end simulations of unmanned ground vehicles (UGVs). The Virtual Autonomous Navigation Environment (VANE), under development by US Army Engineer Research and Development Center (ERDC), provides such capabilities but utilizes a High Performance Computing (HPC) Computational Testbed (CTB) and is not intended for on-line, real-time performance. A product of the VANE HPC research is a real-time desktop simulation application under development by the authors that provides a portal into the HPC environment as well as interaction with wider-scope semi-automated force simulations (e.g. OneSAF). This VANE desktop application, dubbed the Autonomous Navigation Virtual Environment Laboratory (ANVEL), enables analysis and testing of autonomous vehicle dynamics and terrain/obstacle interaction in real-time with the capability to interact within the HPC constructive geo-environmental CTB for high fidelity sensor evaluations. ANVEL leverages rigorous physics-based vehicle and vehicle-terrain interaction models in conjunction with high-quality, multimedia visualization techniques to form an intuitive, accurate engineering tool. The system provides an adaptable and customizable simulation platform that allows developers a controlled, repeatable testbed for advanced simulations. ANVEL leverages several key technologies not common to traditional engineering simulators, including techniques from the commercial video-game industry. These enable ANVEL to run on inexpensive commercial, off-the-shelf (COTS) hardware. In this paper, the authors describe key aspects of ANVEL and its development, as well as several initial applications of the system.
NASA Astrophysics Data System (ADS)
Beckhaus, Steffi
Virtual Reality aims at creating an artificial environment that can be perceived as a substitute for a real setting. Much effort in research and development goes into the creation of virtual environments that are, for the most part, perceivable only by the eyes and hands. The multisensory nature of our perception, however, allows and, arguably, also expects more than that. As long as we are not able to simulate and deliver a fully sensory, believable virtual environment to a user, we could make use of the fully sensory, multi-modal nature of real objects to fill in for this deficiency. The idea is to purposefully integrate real artifacts into the application and interaction, instead of dismissing anything real as hindering the virtual experience. The term virtual reality, denoting the goal rather than the technology, then shifts from a core virtual reality to an “enriched” reality, technologically encompassing both computer-generated and real, physical artifacts. Together, either simultaneously or in a hybrid way, real and virtual jointly provide stimuli that are perceived by users through their senses and are later formed into an experience by the user's mind.
Virtual Tissues and Developmental Systems Biology (book chapter)
Virtual tissue (VT) models provide an in silico environment to simulate cross-scale properties in specific tissues or organs based on knowledge of the underlying biological networks. These integrative models capture the fundamental interactions in a biological system and enable ...
Skill training in multimodal virtual environments.
Gopher, Daniel
2012-01-01
Multimodal, immersive virtual reality (VR) techniques open new perspectives for perceptual-motor skill trainers. They also introduce new risks and dangers. This paper describes the benefits and pitfalls of multimodal training and the cognitive building blocks of multimodal VR training simulators.
On validating remote sensing simulations using coincident real data
NASA Astrophysics Data System (ADS)
Wang, Mingming; Yao, Wei; Brown, Scott; Goodenough, Adam; van Aardt, Jan
2016-05-01
The remote sensing community often requires data simulation, either via spectral/spatial downsampling or through virtual, physics-based models, to assess systems and algorithms. The Digital Imaging and Remote Sensing Image Generation (DIRSIG) model is one such first-principles, physics-based model for simulating imagery for a range of modalities. Complex simulation of vegetation environments has subsequently become possible as scene rendering technology and software have advanced. This in turn has created questions related to the validity of such complex models, with potential multiple scattering, bidirectional reflectance distribution function (BRDF), and other phenomena that could impact results in the case of complex vegetation scenes. We selected three sites, located in the Pacific Southwest domain (Fresno, CA) of the National Ecological Observatory Network (NEON). These sites represent oak savanna, hardwood forests, and conifer-manzanita-mixed forests. We constructed corresponding virtual scenes, using airborne LiDAR and imaging spectroscopy data from NEON, ground-based LiDAR data, and field-collected spectra to characterize the scenes. Imaging spectroscopy data for these virtual sites then were generated using the DIRSIG simulation environment. This simulated imagery was compared to real AVIRIS imagery (15 m spatial resolution; 12 pixels/scene) and NEON Airborne Observation Platform (AOP) data (1 m spatial resolution; 180 pixels/scene). These tests were performed using a distribution-comparison approach for select spectral statistics, e.g., statistics that establish the spectra's shape, for each simulated-versus-real distribution pair. The initial comparison results of the spectral distributions indicated that the shapes of spectra between the virtual and real sites were closely matched.
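One way to make the distribution-comparison idea concrete is sketched below: compute a per-pixel spectral statistic for the simulated and real image cubes, then compare the two empirical distributions. The choice of statistic (a simple spectral slope) and the Kolmogorov-Smirnov test are assumptions for illustration, not the paper's exact procedure.

```python
# Compare the distribution of a per-pixel spectral statistic between a simulated and a
# real hyperspectral cube; a minimal sketch of the validation approach described above.
import numpy as np
from scipy.stats import ks_2samp

def spectral_slope(cube, wavelengths):
    """Per-pixel least-squares slope of reflectance vs wavelength for an (H, W, B) cube."""
    pixels = cube.reshape(-1, cube.shape[-1])
    slopes = np.polyfit(wavelengths, pixels.T, deg=1)[0]   # slope coefficient per pixel
    return slopes

def compare_sites(sim_cube, real_cube, wavelengths):
    stat, p_value = ks_2samp(spectral_slope(sim_cube, wavelengths),
                             spectral_slope(real_cube, wavelengths))
    return stat, p_value
```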
Indirect Measures of Learning Transfer between Real and Virtual Environments
ERIC Educational Resources Information Center
Garrett, Michael; McMahon, Mark
2013-01-01
This paper reports on research undertaken to determine the effectiveness of a 3D simulation environment used to train mining personnel in emergency evacuation procedures, designated the Fires in Underground Mines Evacuation Simulator (FUMES). Owing to the operational constraints of the mining facility, methods for measuring learning transfer were…
Networking Labs in the Online Environment: Indicators for Success
ERIC Educational Resources Information Center
Lahoud, Hilmi A.; Krichen, Jack P.
2010-01-01
Several techniques have been used to provide hands-on educational experiences to online learners, including remote labs, simulation software, and virtual labs, which offer a more structured environment, including simulations and scheduled asynchronous access to physical resources. This exploratory study investigated how these methods can be used…
Advanced helmet mounted display (AHMD)
NASA Astrophysics Data System (ADS)
Sisodia, Ashok; Bayer, Michael; Townley-Smith, Paul; Nash, Brian; Little, Jay; Cassarly, William; Gupta, Anurag
2007-04-01
Due to significantly increased U.S. military involvement in deterrent, observer, security, peacekeeping and combat roles around the world, the military expects significant future growth in the demand for deployable virtual reality trainers with networked simulation capability for the battle space visualization process. The use of HMD technology in simulated virtual environments has been driven by the demand for more effective training tools. The AHMD overlays computer-generated data (symbology, synthetic imagery, enhanced imagery) onto the actual and simulated visible environment. The AHMD can be used to support deployable, reconfigurable training solutions as well as traditional simulation requirements, UAV augmented reality, air traffic control and Command, Control, Communications, Computers, Intelligence, Surveillance, and Reconnaissance (C4ISR) applications. This paper describes the design improvements implemented for production of the AHMD system.
The experiment editor: supporting inquiry-based learning with virtual labs
NASA Astrophysics Data System (ADS)
Galan, D.; Heradio, R.; de la Torre, L.; Dormido, S.; Esquembre, F.
2017-05-01
Inquiry-based learning is a pedagogical approach where students are motivated to pose their own questions when facing problems or scenarios. In physics learning, students are turned into scientists who carry out experiments, collect and analyze data, formulate and evaluate hypotheses, and so on. Lab experimentation is essential for inquiry-based learning, yet a drawback of traditional hands-on labs is the high cost associated with equipment, space, and maintenance staff. Virtual laboratories are helpful to reduce these costs. This paper enriches the virtual lab ecosystem by providing an integrated environment to automate experimentation tasks. In particular, our environment supports: (i) scripting and running experiments on virtual labs, and (ii) collecting and analyzing data from the experiments. The current implementation of our environment supports virtual labs created with the authoring tool Easy Java/Javascript Simulations. Since there are public repositories with hundreds of freely available labs created with this tool, the potential applicability of our environment is considerable.
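The kind of experiment script such an environment automates might look like the sketch below: sweep a lab parameter, run the simulation for a fixed time, and collect an output variable. The `lab` interface (reset/set/run/get) and the pendulum example are stand-in abstractions, not the actual Easy Java/Javascript Simulations API.

```python
# Hedged sketch of a parameter-sweep experiment over a virtual lab.
def sweep_experiment(lab, parameter, values, observe, run_time=10.0):
    results = []
    for v in values:
        lab.reset()
        lab.set(parameter, v)
        lab.run(run_time)                      # advance the virtual lab
        results.append((v, lab.get(observe)))  # collect the observed output
    return results

# e.g. sweep_experiment(pendulum_lab, "length", [0.5, 1.0, 1.5], observe="period")
```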
How to avoid simulation sickness in virtual environments during user displacement
NASA Astrophysics Data System (ADS)
Kemeny, A.; Colombet, F.; Denoual, T.
2015-03-01
Driving simulation (DS) and Virtual Reality (VR) share the same technologies for visualization and 3D vision and may use the same techniques for head movement tracking. They also experience similar difficulties when rendering the displacements of the observer in virtual environments, especially when these displacements are carried out using driver commands, including steering wheels, joysticks and nomad devices. High values of transport delay (the time lag between an action and the corresponding rendering cues) and/or visual-vestibular conflict (the discrepancies perceived by the human visual and vestibular systems when driving or displacing using a control device) induce the so-called simulation sickness. While the visual transport delay can be efficiently reduced using a high frame rate, the visual-vestibular conflict is inherent to VR when motion platforms are not used. In order to study the impact of displacements on simulation sickness, we have tested various driving scenarios in Renault's 5-sided ultra-high resolution CAVE. First results indicate that low-speed displacements with longitudinal and lateral accelerations below given perception thresholds are well accepted by a large number of users, whereas relatively high values are accepted only by experienced users and induce VR-induced symptoms and effects (VRISE) in novice users, with a worst-case scenario corresponding to rotational displacements. These results will be used for optimization techniques at Arts et Métiers ParisTech for motion sickness reduction in virtual environments for industrial, research, educational or gaming applications.
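A minimal sketch of the mitigation these results suggest is given below: clamp commanded longitudinal and lateral accelerations below perception thresholds during virtual displacement. The threshold values are placeholders, not the study's measured ones.

```python
# Keep commanded accelerations (m/s^2) under assumed perception thresholds to reduce
# visual-vestibular conflict; an illustrative sketch only.
def limit_acceleration(ax, ay, ax_max=0.5, ay_max=0.3):
    """Clamp longitudinal (ax) and lateral (ay) acceleration."""
    ax = max(-ax_max, min(ax_max, ax))
    ay = max(-ay_max, min(ay_max, ay))
    return ax, ay
```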
Apfelbaum, Henry; Pelah, Adar; Peli, Eli
2007-01-01
Virtual reality locomotion simulators are a promising tool for evaluating the effectiveness of vision aids to mobility for people with low vision. This study examined two factors to gain insight into the verisimilitude requirements of the test environment: the effects of treadmill walking and the suitability of using controls as surrogate patients. Ten "tunnel vision" patients with retinitis pigmentosa (RP) were tasked with identifying which side of a clearly visible obstacle their heading through the virtual environment would lead them, and were scored both on accuracy and on their distance from the obstacle when they responded. They were tested both while walking on a treadmill and while standing, as they viewed a scene representing progress through a shopping mall. Control subjects, each wearing a head-mounted field restriction to simulate the vision of a paired patient, were also tested. At wide angles of approach, controls and patients performed with a comparably high degree of accuracy, and made their choices at comparable distances from the obstacle. At narrow angles of approach, patients' accuracy increased when walking, while controls' accuracy decreased. When walking, both patients and controls delayed their decisions until closer to the obstacle. We conclude that a head-mounted field restriction is not sufficient for simulating tunnel vision, but that the improved performance observed for walking compared to standing suggests that a walking interface (such as a treadmill) may be essential for eliciting natural perceptually-guided behavior in virtual reality locomotion simulators.
NASA Technical Reports Server (NTRS)
Kaplan, Michael L.; Lin, Yuh-Lang
2004-01-01
During the research project, sounding datasets were generated for the region surrounding 9 major airports: Dallas, TX; Boston, MA; New York, NY; Chicago, IL; St. Louis, MO; Atlanta, GA; Miami, FL; San Francisco, CA; and Los Angeles, CA. The numerical simulation of winter and summer environments during which no instrument flight rule impact was occurring at these 9 terminals was performed using the most contemporary version of the Terminal Area PBL Prediction System (TAPPS) model, nested from 36 km to 6 km to 1 km horizontal resolution with very detailed vertical resolution in the planetary boundary layer. The soundings from the 1 km model were archived at 30-minute intervals for a 24-hour period, and the vertical dependent variables as well as derived quantities, i.e., 3-dimensional wind components, temperatures, pressures, mixing ratios, turbulence kinetic energy and eddy dissipation rates, were then interpolated to 5 m vertical resolution up to 1000 m elevation above ground level. After partial validation against field experiment datasets for Dallas, as well as larger-scale and much coarser resolution observations at the other 8 airports, these sounding datasets were sent to NASA for use in the Virtual Air Space and Modeling program. The purpose of these datasets is to determine representative airport weather environments in order to diagnose the response of simulated wake vortices to realistic atmospheric environments. These virtual datasets are based on large-scale observed atmospheric initial conditions that are dynamically interpolated in space and time. The 1 km nested-grid simulated datasets provide a very coarse and highly smoothed representation of airport environment meteorological conditions; details concerning the airport surface forcing are virtually absent from these simulated datasets, although the observed background atmospheric processes were compared to the simulated fields and the fields were found to accurately replicate the flows surrounding the airport where coarse verification data were available as well as where airport-scale datasets were available.
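The vertical-regridding step described above (interpolating each sounding variable onto a uniform 5 m grid up to 1000 m AGL) can be sketched as follows; the variable names are illustrative, and np.interp assumes monotonically increasing heights.

```python
# Interpolate model sounding variables onto a uniform 5 m vertical grid; a minimal
# sketch of the regridding described in the abstract, not the TAPPS post-processing code.
import numpy as np

def regrid_sounding(heights_m, variables, dz=5.0, top=1000.0):
    """variables: dict of name -> values on the model's native levels."""
    target = np.arange(0.0, top + dz, dz)
    regridded = {name: np.interp(target, heights_m, vals) for name, vals in variables.items()}
    return regridded, target
```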
ERIC Educational Resources Information Center
Thies, Anna-Lena; Weissenstein, Anne; Haulsen, Ivo; Marschall, Bernhard; Friederichs, Hendrik
2014-01-01
Simulation as a tool for medical education has gained considerable importance in the past years. Various studies have shown that the mastering of basic skills happens best if taught in a realistic and workplace-based context. It is necessary that simulation itself takes place in the realistic background of a genuine clinical or in an accordingly…
The Potential of Simulated Environments in Teacher Education: Current and Future Possibilities
ERIC Educational Resources Information Center
Dieker, Lisa A.; Rodriguez, Jacqueline A.; Lignugaris/Kraft, Benjamin; Hynes, Michael C.; Hughes, Charles E.
2014-01-01
The future of virtual environments is evident in many fields but is just emerging in the field of teacher education. In this article, the authors provide a summary of the evolution of simulation in the field of teacher education and three factors that need to be considered as these environments further develop. The authors provide a specific…
NASA Astrophysics Data System (ADS)
Demir, I.
2014-12-01
Recent developments in internet technologies make it possible to manage and visualize large data on the web. Novel visualization techniques and interactive user interfaces allow users to create realistic environments and interact with data to gain insight from simulations and environmental observations. The hydrological simulation system is a web-based 3D interactive learning environment for teaching hydrological processes and concepts. The simulation system provides a visually striking platform with realistic terrain information and water simulation. Students can create or load predefined scenarios, control environmental parameters, and evaluate environmental mitigation alternatives. The web-based simulation system provides an environment for students to learn about hydrological processes (e.g. flooding and flood damage) and the effects of development and human activity in the floodplain. The system utilizes the latest web technologies and the graphics processing unit (GPU) for water simulation and object collisions on the terrain. Users can access the system in three visualization modes: virtual reality, augmented reality, and immersive reality using a heads-up display. The system provides various scenarios customized to fit the age and education level of various users. This presentation provides an overview of the web-based flood simulation system and demonstrates the capabilities of the system for various visualization and interaction modes.
Kim, Aram; Darakjian, Nora; Finley, James M
2017-02-21
Virtual reality (VR) has recently been explored as a tool for neurorehabilitation to enable individuals with Parkinson's disease (PD) to practice challenging skills in a safe environment. Current technological advances have enabled the use of affordable, fully immersive head-mounted displays (HMDs) for potential therapeutic applications. However, while previous studies have used HMDs in individuals with PD, these were only used for short bouts of walking. Clinical applications of VR for gait training would likely involve an extended exposure to the virtual environment, which has the potential to cause individuals with PD to experience simulator-related adverse effects due to their age or pathology. Thus, our objective was to evaluate the safety of using an HMD for longer bouts of walking in fully immersive VR for older adults and individuals with PD. Thirty-three participants (11 healthy young adults, 11 healthy older adults, and 11 individuals with PD) were recruited for this study. Participants walked for 20 min while viewing a virtual city scene through an HMD (Oculus Rift DK2). Safety was evaluated using the mini-BESTest, measures of center of pressure (CoP) excursion, and questionnaires addressing symptoms of simulator sickness (SSQ) and measures of stress and arousal. Most participants successfully completed all trials without any discomfort. There were no significant changes for any of the groups in symptoms of simulator sickness or measures of static and dynamic balance after exposure to the virtual environment. Surprisingly, measures of stress decreased in all groups, while the PD group also showed an increased level of arousal after exposure. Older adults and individuals with PD were able to successfully use immersive VR during walking without adverse effects. This provides systematic evidence supporting the safety of immersive VR for gait training in these populations.
Virtual reality training improves students' knowledge structures of medical concepts.
Stevens, Susan M; Goldsmith, Timothy E; Summers, Kenneth L; Sherstyuk, Andrei; Kihmm, Kathleen; Holten, James R; Davis, Christopher; Speitel, Daniel; Maris, Christina; Stewart, Randall; Wilks, David; Saland, Linda; Wax, Diane; Panaiotis; Saiki, Stanley; Alverson, Dale; Caudell, Thomas P
2005-01-01
Virtual environments can provide training that is difficult to achieve under normal circumstances. Medical students can work on high-risk cases in a realistic, time-critical environment, where students practice skills in a cognitively demanding and emotionally compelling situation. Research from cognitive science has shown that as students acquire domain expertise, their semantic organization of core domain concepts becomes more similar to that of an expert. In the current study, we hypothesized that students' knowledge structures would become more expert-like as a result of their diagnosing and treating a patient experiencing a hematoma within a virtual environment. Forty-eight medical students diagnosed and treated a hematoma case within a fully immersed virtual environment. Students' semantic organization of 25 case-related concepts was assessed prior to and after training. Students' knowledge structures became more integrated and similar to an expert knowledge structure of the concepts as a result of the learning experience. The methods used here for eliciting, representing, and evaluating knowledge structures offer a sensitive and objective means for evaluating student learning in virtual environments and medical simulations.
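One simple way to score how "expert-like" a knowledge structure is would be to correlate the student's pairwise concept-relatedness ratings with an expert's, as sketched below. The study may well have used Pathfinder-style network measures instead; this correlation is an illustrative stand-in.

```python
# Score similarity of a student's concept-relatedness matrix to an expert's; a hedged
# sketch of the knowledge-structure comparison, not the authors' exact measure.
import numpy as np

def structure_similarity(student_ratings, expert_ratings):
    """Both inputs are symmetric (n_concepts x n_concepts) relatedness matrices."""
    iu = np.triu_indices_from(student_ratings, k=1)       # unique concept pairs
    return np.corrcoef(student_ratings[iu], expert_ratings[iu])[0, 1]
```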
Cater, J P; Huffman, S D
1995-01-01
This paper presents a unique virtual reality training and assessment tool developed under a NASA grant, "Research in Human Factors Aspects of Enhanced Virtual Environments for Extravehicular Activity (EVA) Training and Simulation." The Remote Access Virtual Environment Network (RAVEN) was created to train and evaluate the verbal, mental and physical coordination required between the intravehicular (IVA) astronaut operating the Remote Manipulator System (RMS) arm and the EVA astronaut standing in foot restraints on the end of the RMS. The RAVEN system currently allows the EVA astronaut to approach the Hubble Space Telescope (HST) under control of the IVA astronaut and grasp, remove, and replace the Wide Field Planetary Camera drawer from its location in the HST. Two viewpoints, one stereoscopic and one monoscopic, were created and linked by Ethernet, providing the two trainees with the appropriate training environments.
Design Concerns in the Engineering of Virtual Worlds for Learning
ERIC Educational Resources Information Center
Rapanotti, Lucia; Hall, Jon G.
2011-01-01
The convergence of 3D simulation and social networking into current multi-user virtual environments has opened the door to new forms of interaction for learning in order to complement the face-to-face and Web 2.0-based systems. Yet, despite a growing user community, design knowledge for virtual worlds remains patchy, particularly when it comes to…
Levy
1996-08-01
New interactive computer technologies are having a significant influence on medical education, training, and practice. The newest innovation in computer technology, virtual reality, allows an individual to be immersed in a dynamic computer-generated, three-dimensional environment and can provide realistic simulations of surgical procedures. A new virtual reality hysteroscope passes through a sensing device that synchronizes movements with a three-dimensional model of a uterus. Force feedback is incorporated into this model, so the user actually experiences the collision of an instrument against the uterine wall or the sensation of the resistance or drag of a resectoscope as it cuts through a myoma in a virtual environment. A variety of intrauterine pathologies and procedures are simulated, including hyperplasia, cancer, resection of a uterine septum, polyp, or myoma, and endometrial ablation. This technology will be incorporated into comprehensive training programs that will objectively assess hand-eye coordination and procedural skills. It is possible that by incorporating virtual reality into hysteroscopic training programs, a decrease in the learning curve and the number of complications presently associated with the procedures may be realized. Prospective studies are required to assess these potential benefits.
Modeling Behavior and Variation for Crowd Animation
2009-08-01
Realistic simulation of crowds of virtual characters is needed for applications such as films, games, and virtual reality environments.
Design of an immersive simulator for assisted power wheelchair driving.
Devigne, Louise; Babel, Marie; Nouviale, Florian; Narayanan, Vishnu K; Pasteau, Francois; Gallien, Philippe
2017-07-01
Driving a power wheelchair is a difficult and complex visual-cognitive task. As a result, some people with visual and/or cognitive disabilities cannot access the benefits of a power wheelchair because their impairments prevent them from driving safely. In order to improve their access to mobility, we have previously designed a semi-autonomous assistive wheelchair system which progressively corrects the trajectory as the user manually drives the wheelchair and smoothly avoids obstacles. Developing and testing such systems for wheelchair driving assistance requires a significant amount of material resources and clinician time. With Virtual Reality technology, prototypes can be developed and tested in a risk-free and highly flexible Virtual Environment before equipping and testing a physical prototype. Additionally, users can "virtually" test and train more easily during the development process. In this paper, we introduce a power wheelchair driving simulator allowing the user to navigate with a standard wheelchair in an immersive 3D Virtual Environment. The simulation framework is designed to be flexible so that we can use different control inputs. In order to validate the framework, we first performed tests on the simulator with able-bodied participants during which the user's Quality of Experience (QoE) was assessed through a set of questionnaires. Results show that the simulator is a promising tool for future works as it generates a good sense of presence and requires rather low cognitive effort from users.
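The progressive-correction behaviour described above can be illustrated with a small blending sketch: the user's joystick command is mixed with an avoidance command, with the blending weight growing as the nearest obstacle gets closer. This is not the authors' controller; the distances and interface are placeholders.

```python
# Blend user and obstacle-avoidance commands for a semi-autonomous wheelchair;
# an illustrative sketch of progressive trajectory correction.
def blend_command(user_v, user_w, avoid_v, avoid_w, obstacle_dist, d_safe=0.3, d_far=1.5):
    """v: linear velocity, w: angular velocity. Returns the corrected command."""
    if obstacle_dist >= d_far:
        alpha = 0.0                        # far away: user keeps full control
    elif obstacle_dist <= d_safe:
        alpha = 1.0                        # too close: correction dominates
    else:
        alpha = (d_far - obstacle_dist) / (d_far - d_safe)
    return ((1 - alpha) * user_v + alpha * avoid_v,
            (1 - alpha) * user_w + alpha * avoid_w)
```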
NPSNET: Aural cues for virtual world immersion
NASA Astrophysics Data System (ADS)
Dahl, Leif A.
1992-09-01
NPSNET is a low-cost visual and aural simulation system designed and implemented at the Naval Postgraduate School. NPSNET is an example of a virtual world simulation environment that incorporates real-time aural cues through software-hardware interaction. In the current implementation of NPSNET, a graphics workstation functions in the sound server role which involves sending and receiving networked sound message packets across a Local Area Network, composed of multiple graphics workstations. The network messages contain sound file identification information that is transmitted from the sound server across an RS-422 protocol communication line to a serial to Musical Instrument Digital Interface (MIDI) converter. The MIDI converter, in turn relays the sound byte to a sampler, an electronic recording and playback device. The sampler correlates the hexadecimal input to a specific note or stored sound and sends it as an audio signal to speakers via an amplifier. The realism of a simulation is improved by involving multiple participant senses and removing external distractions. This thesis describes the incorporation of sound as aural cues, and the enhancement they provide in the virtual simulation environment of NPSNET.
Salimi, Zohreh; Ferguson-Pell, Martin
2018-06-01
Although wheelchair ergometers provide a safe and controlled environment for studying or training wheelchair users, until recently they had a major disadvantage in being capable of simulating only straight-line wheelchair propulsion. Virtual reality has helped overcome this problem and broaden the usability of wheelchair ergometers. However, for a wheelchair ergometer to be validly used in research studies, it needs to be able to simulate the biomechanics of real-world wheelchair propulsion. In this paper, three versions of a wheelchair simulator were developed. They provide a sophisticated wheelchair ergometer in an immersive virtual reality environment. They are intended for manual wheelchair propulsion, and all are able to simulate simple translational inertia. In addition, each of the systems reported uses a different approach to simulate wheelchair rotation and accommodate rotational inertial effects. The first system does not provide extra resistance against rotation and relies merely on linear inertia, under the hypothesis that it can provide an acceptable replication of the biomechanics of wheelchair maneuvers. The second and third systems, however, are designed to simulate rotational inertia. System II uses mechanical compensation, and System III uses visual compensation, simulating the influence that rotational inertia has on the visual perception of wheelchair movement in response to rotation at different speeds.
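A hedged sketch of the visual-compensation idea (System III as described) follows: yaw dynamics including a rotational-inertia term are computed from the measured wheel torques, and only the rendered view is rotated accordingly, with no extra mechanical resistance. All parameters are placeholders, not the authors' values.

```python
# One integration step of simulated yaw dynamics used only to drive the rendered
# (visual) rotation in VR; illustrative assumptions throughout.
def yaw_step(yaw_rate, torque_left, torque_right, dt,
             track_width=0.55, i_z=8.0, damping=1.0):
    """Return the updated yaw rate (rad/s) given left/right wheel torques (N*m)."""
    yaw_torque = (torque_right - torque_left) * track_width / 2.0
    yaw_accel = (yaw_torque - damping * yaw_rate) / i_z
    return yaw_rate + yaw_accel * dt       # feed into the camera orientation in VR
```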
Simulating geriatric home safety assessments in a three-dimensional virtual world.
Andrade, Allen D; Cifuentes, Pedro; Mintzer, Michael J; Roos, Bernard A; Anam, Ramanakumar; Ruiz, Jorge G
2012-01-01
Virtual worlds could offer inexpensive and safe three-dimensional environments in which medical trainees can learn to identify home safety hazards. Our aim was to evaluate the feasibility, usability, and acceptability of virtual worlds for geriatric home safety assessments and to correlate performance efficiency in hazard identification with spatial ability, self-efficacy, cognitive load, and presence. In this study, 30 medical trainees found the home safety simulation easy to use, and their self-efficacy was improved. Men performed better than women in hazard identification. Presence and spatial ability were correlated significantly with performance. Educators should consider spatial ability and gender differences when implementing virtual world training for geriatric home safety assessments.
NASA Astrophysics Data System (ADS)
Pedro Sánchez, Juan; Sáenz, Jacobo; de la Torre, Luis; Carreras, Carmen; Yuste, Manuel; Heradio, Rubén; Dormido, Sebastián
2016-05-01
This work describes two experiments: "study of the diffraction of light: Fraunhofer approximation" and "the photoelectric effect". Both of them include a virtual, simulated version of the experiment as well as a real one that can be operated remotely. The two aforementioned virtual and remote labs (built using Easy Java(script) Simulations) are integrated in UNILabs, a network of online interactive laboratories based on the free Learning Management System Moodle. In this web environment, students can find not only the virtual and remote labs but also manuals with related theory, the user interface description for each application, and so on.
Virtual environment display for a 3D audio room simulation
NASA Technical Reports Server (NTRS)
Chapin, William L.; Foster, Scott H.
1992-01-01
The development of a virtual environment simulation system integrating a 3D acoustic audio model with an immersive 3D visual scene is discussed. The system complements the acoustic model and is specified to: allow the listener to freely move about the space, a room of manipulable size, shape, and audio character, while interactively relocating the sound sources; reinforce the listener's feeling of telepresence in the acoustical environment with visual and proprioceptive sensations; enhance the audio with the graphic and interactive components, rather than overwhelm or reduce it; and serve as a research testbed and technology transfer demonstration. The hardware/software design of two demonstration systems, one installed and one portable, are discussed through the development of four iterative configurations.
Zary, Nabil; Johnson, Gunilla; Boberg, Jonas; Fors, Uno G H
2006-02-21
The Web-based Simulation of Patients (Web-SP) project was initiated in order to facilitate the use of realistic and interactive virtual patients (VP) in medicine and healthcare education. Web-SP focuses on moving beyond the technology-savvy teachers when integrating simulation-based education into health sciences curricula, by making the creation and use of virtual patients easier. The project strives to provide a common generic platform for design/creation, management, evaluation and sharing of web-based virtual patients. The aim of this study was to evaluate whether it was possible to develop a web-based virtual patient case simulation environment where the entire case authoring process might be handled by teachers and which would be flexible enough to be used in different healthcare disciplines. The Web-SP system was constructed to support easy authoring, management and presentation of virtual patient cases. The case authoring environment was found to enable teachers to create full-fledged patient cases without the assistance of computer specialists. Web-SP was successfully implemented at several universities by taking into account key factors such as cost, access, security, scalability and flexibility. Pilot evaluations in medical, dentistry and pharmacy courses show that students regarded Web-SP as easy to use, engaging and of educational value. Cases adapted for all three disciplines were judged to be of significant educational value by the course leaders. The Web-SP system seems to fulfil the aim of providing a common generic platform for creation, management and evaluation of web-based virtual patient cases. The responses regarding the authoring environment indicated that the system might be user-friendly enough to appeal to a majority of the academic staff. In terms of implementation strengths, Web-SP seems to fulfil most needs of course directors and teachers from various educational institutions and disciplines. The system is currently in use or under implementation in several healthcare disciplines at more than ten universities worldwide. Future aims include structuring the exchange of cases between teachers and academic institutions by building a VP library function. We intend to follow up the positive results presented in this paper with other studies looking at learning outcomes, critical thinking and patient management. Studying the potential of Web-SP as an assessment tool will also be performed. More information about Web-SP: http://websp.lime.ki.se.
Medical simulation technology: educational overview, industry leaders, and what's missing.
Spooner, Nicholas; Hurst, Stephen; Khadra, Mohamed
2012-01-01
Modern medical simulation technology (MST) debuted in 1960 with the development of Resusci Annie (Laerdal 2007), which assisted students in the acquisition of proper ventilation and compression techniques used during basic life support. Following a steady stream of subsequent technological advances and innovations, MST manufacturers are now able to offer training aids capable of facilitating innovative learning in such diverse areas as human patient simulators, simulated clinical environments, virtual procedure stations, virtual medical environments, electronic tutors, and performance recording. The authors list a number of the most popular MSTs presently available while citing evaluative efforts undertaken to date regarding the efficacy of MST to the medical profession. They conclude by proposing a variety of simulation innovations of prospective interest to both medical and technology personnel while offering healthcare administrators a series of recommended considerations when planning to integrate MST into existing medical systems.
NASA Astrophysics Data System (ADS)
Chuah, Kee Man; Chen, Chwen Jen; Teh, Chee Siong
Virtual reality (VR) has been prevalently used as a tool to help students learn and to simulate situations that are too hazardous to practice in real life. The present study aims to explore the capability of VR to achieve these two purposes and demonstrate a novel application of the result: using VR to help school students learn road safety skills, which would be impractical to practice in real-life situations. This paper describes the system design of the VR-based learning environment known as Virtual Simulated Traffics for Road Safety Education (ViSTREET) and its various features. An overview of the technical procedures for its development is also included. Ultimately, this paper highlights the potential use of VR in addressing the learning problem concerning the road safety education programme in Malaysia.
Alverson, Dale C; Saiki, Stanley M; Jacobs, Joshua; Saland, Linda; Keep, Marcus F; Norenberg, Jeffrey; Baker, Rex; Nakatsu, Curtis; Kalishman, Summers; Lindberg, Marlene; Wax, Diane; Mowafi, Moad; Summers, Kenneth L; Holten, James R; Greenfield, John A; Aalseth, Edward; Nickles, David; Sherstyuk, Andrei; Haines, Karen; Caudell, Thomas P
2004-01-01
Medical knowledge and skills essential for tomorrow's healthcare professionals continue to change faster than ever before creating new demands in medical education. Project TOUCH (Telehealth Outreach for Unified Community Health) has been developing methods to enhance learning by coupling innovations in medical education with advanced technology in high performance computing and next generation Internet2 embedded in virtual reality environments (VRE), artificial intelligence and experiential active learning. Simulations have been used in education and training to allow learners to make mistakes safely in lieu of real-life situations, learn from those mistakes and ultimately improve performance by subsequent avoidance of those mistakes. Distributed virtual interactive environments are used over distance to enable learning and participation in dynamic, problem-based, clinical, artificial intelligence rules-based, virtual simulations. The virtual reality patient is programmed to dynamically change over time and respond to the manipulations by the learner. Participants are fully immersed within the VRE platform using a head-mounted display and tracker system. Navigation, locomotion and handling of objects are accomplished using a joy-wand. Distribution is managed via the Internet2 Access Grid using point-to-point or multi-casting connectivity through which the participants can interact. Medical students in Hawaii and New Mexico (NM) participated collaboratively in problem solving and managing of a simulated patient with a closed head injury in VRE; dividing tasks, handing off objects, and functioning as a team. Students stated that opportunities to make mistakes and repeat actions in the VRE were extremely helpful in learning specific principles. VRE created higher performance expectations and some anxiety among VRE users. VRE orientation was adequate but students needed time to adapt and practice in order to improve efficiency. This was also demonstrated successfully between Western Australia and UNM. We successfully demonstrated the ability to fully immerse participants in a distributed virtual environment independent of distance for collaborative team interaction in medical simulation designed for education and training. The ability to make mistakes in a safe environment is well received by students and has a positive impact on their understanding, as well as memory of the principles involved in correcting those mistakes. Bringing people together as virtual teams for interactive experiential learning and collaborative training, independent of distance, provides a platform for distributed "just-in-time" training, performance assessment and credentialing. Further validation is necessary to determine the potential value of the distributed VRE in knowledge transfer, improved future performance and should entail training participants to competence in using these tools.
The Virtual Environment for Reactor Applications (VERA): Design and architecture
DOE Office of Scientific and Technical Information (OSTI.GOV)
Turner, John A., E-mail: turnerja@ornl.gov; Clarno, Kevin; Sieger, Matt
VERA, the Virtual Environment for Reactor Applications, is the system of physics capabilities being developed and deployed by the Consortium for Advanced Simulation of Light Water Reactors (CASL). CASL was established for the modeling and simulation of commercial nuclear reactors. VERA consists of integrating and interfacing software together with a suite of physics components adapted and/or refactored to simulate relevant physical phenomena in a coupled manner. VERA also includes the software development environment and computational infrastructure needed for these components to be effectively used. We describe the architecture of VERA from both software and numerical perspectives, along with the goals and constraints that drove major design decisions, and their implications. We explain why VERA is an environment rather than a framework or toolkit, why these distinctions are relevant (particularly for coupled physics applications), and provide an overview of results that demonstrate the use of VERA tools for a variety of challenging applications within the nuclear industry.
Development of a low-cost virtual reality workstation for training and education
NASA Technical Reports Server (NTRS)
Phillips, James A.
1996-01-01
Virtual Reality (VR) is a set of breakthrough technologies that allow a human being to enter and fully experience a 3-dimensional, computer simulated environment. A true virtual reality experience meets three criteria: (1) it involves 3-dimensional computer graphics; (2) it includes real-time feedback and response to user actions; and (3) it must provide a sense of immersion. Good examples of a virtual reality simulator are the flight simulators used by all branches of the military to train pilots for combat in high performance jet fighters. The fidelity of such simulators is extremely high -- but so is the price tag, typically millions of dollars. Virtual reality teaching and training methods are manifestly effective, but the high cost of VR technology has limited its practical application to fields with big budgets, such as military combat simulation, commercial pilot training, and certain projects within the space program. However, in the last year there has been a revolution in the cost of VR technology. The speed of inexpensive personal computers has increased dramatically, especially with the introduction of the Pentium processor and the PCI bus for IBM-compatibles, and the cost of high-quality virtual reality peripherals has plummeted. The result is that many public schools, colleges, and universities can afford a PC-based workstation capable of running immersive virtual reality applications. My goal this summer was to assemble and evaluate such a system.
An intersubject variable regional anesthesia simulator with a virtual patient architecture.
Ullrich, Sebastian; Grottke, Oliver; Fried, Eduard; Frommen, Thorsten; Liao, Wei; Rossaint, Rolf; Kuhlen, Torsten; Deserno, Thomas M
2009-11-01
The main purpose is to provide an intuitive VR-based training environment for regional anesthesia (RA). The research question is how to process subject-specific datasets, how to organize them in a meaningful way, and how to perform the simulation for peripheral regions. We propose a flexible virtual patient architecture and methods to process datasets. Image acquisition, image processing (especially segmentation), interactive nerve modeling and permutations (nerve instantiation) are described in detail. The simulation of electric impulse stimulation and the corresponding responses is essential for the training of peripheral RA and is solved by an approach based on the electric distance. We have created an XML-based virtual patient database with several subjects. Prototypes of the simulation are implemented and run on multimodal VR hardware (e.g., stereoscopic display and haptic device). A first user pilot study has confirmed our approach. The virtual patient architecture enables support for arbitrary scenarios on different subjects. This concept can also be used for other simulators. In future work, we plan to extend the simulation and conduct further evaluations in order to provide a tool for routine training for RA.
Lytton, William W; Neymotin, Samuel A; Hines, Michael L
2008-06-30
In an effort to design a simulation environment that is more similar to that of neurophysiology, we introduce a virtual slice setup in the NEURON simulator. The virtual slice setup runs continuously and permits parameter changes, including changes to synaptic weights and time courses and to intrinsic cell properties. The virtual slice setup permits shocks to be applied at chosen locations and activity to be sampled intra- or extracellularly from chosen locations. By default, a summed population display is shown during a run to indicate the level of activity, and no states are saved. Simulations can run for hours of model time, so it is not practical to save all of the state variables. These, in any case, are primarily of interest at discrete times when experiments are being run: the simulation can be stopped momentarily at such times to save activity patterns. The virtual slice setup maintains an automated notebook recording shocks and parameter changes as well as user comments. We demonstrate how interaction with a continuously running simulation encourages experimental prototyping and can suggest additional dynamical features such as ligand wash-in and wash-out, alternatives to typical instantaneous parameter changes. The virtual slice setup currently uses event-driven cells and runs at approximately 2 min/h on a laptop.
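A much-simplified sketch of the underlying idea, running a NEURON simulation in chunks and changing a parameter on the fly rather than restarting, is shown below. It uses a single Hodgkin-Huxley cell and a current clamp instead of the event-driven cells and shock protocol of the published virtual slice; it is illustrative only.

```python
from neuron import h
h.load_file("stdrun.hoc")          # standard run system (continuerun, etc.)

soma = h.Section(name="soma")      # one toy cell instead of the event-driven population
soma.L = soma.diam = 20
soma.insert("hh")                  # Hodgkin-Huxley channels

stim = h.IClamp(soma(0.5))         # ongoing "shock" applied at a chosen location
stim.delay, stim.dur, stim.amp = 5, 1e9, 0.1

v = h.Vector()                     # sample activity at a chosen location
v.record(soma(0.5)._ref_v)
t = h.Vector()
t.record(h._ref_t)

h.finitialize(-65)
h.continuerun(200)                 # run for a while...
stim.amp = 0.3                     # ...change a parameter without stopping, as in the virtual slice
h.continuerun(400)                 # ...and keep running from where we left off

print(f"recorded {int(t.size())} samples; peak Vm = {v.max():.1f} mV")
```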
Simulation Environment Synchronizing Real Equipment for Manufacturing Cell
NASA Astrophysics Data System (ADS)
Inukai, Toshihiro; Hibino, Hironori; Fukuda, Yoshiro
Recently, manufacturing industries have faced various problems, such as shorter product life cycles and more diversified customer needs. In this situation, it is very important to reduce the lead-time of manufacturing system construction. At the manufacturing system implementation stage, it is important to rapidly make and evaluate facility control programs for a manufacturing cell, such as ladder programs for programmable logic controllers (PLCs). However, before manufacturing systems are implemented, there have been no methods to evaluate the facility control programs for the equipment while mixing and synchronizing real equipment with virtual factory models on computers. This difficulty is caused by the complexity of manufacturing systems composed of a great variety of equipment, and it has prevented precise and rapid support of the manufacturing engineering process. In this paper, a manufacturing engineering environment (MEE) to support manufacturing engineering processes using simulation technologies is proposed. MEE consists of a manufacturing cell simulation environment (MCSE) and a distributed simulation environment (DSE). MCSE, which consists of a manufacturing cell simulator and a soft-wiring system, is described in detail. MCSE enables facility control programs to be made and evaluated using virtual factory models on computers before manufacturing systems are implemented.
Taming Wild Horses: The Need for Virtual Time-based Scheduling of VMs in Network Simulations
DOE Office of Scientific and Technical Information (OSTI.GOV)
Yoginath, Srikanth B; Perumalla, Kalyan S; Henz, Brian J
2012-01-01
The next generation of scalable network simulators employ virtual machines (VMs) to act as high-fidelity models of traffic producer/consumer nodes in simulated networks. However, network simulations could be inaccurate if VMs are not scheduled according to virtual time, especially when many VMs are hosted per simulator core in a multi-core simulator environment. Since VMs are by default free-running, at the outset it is not clear if, and to what extent, their untamed execution affects the results in simulated scenarios. Here, we provide the first quantitative basis for establishing the need for generalized virtual time scheduling of VMs in network simulators, based on actual prototype implementations. To exercise breadth, our system is tested with multiple disparate applications: (a) a set of message passing parallel programs, (b) a computer worm propagation phenomenon, and (c) a mobile ad-hoc wireless network simulation. We define and use error metrics and benchmarks in scaled tests to empirically report the poor match of traditional, fairness-based VM scheduling to VM-based network simulation, and also clearly show the better performance of our simulation-specific scheduler, with up to 64 VMs hosted on a 12-core simulator node.
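A minimal sketch of the core idea (not the authors' scheduler): ordering VM execution by per-VM virtual clocks held in a priority queue, so that no guest races ahead of the others in simulated time. All names are illustrative.

```python
import heapq

class VirtualTimeScheduler:
    """Toy scheduler: always run the VM whose virtual clock is furthest behind."""

    def __init__(self, vm_ids, quantum=1.0):
        self.quantum = quantum
        # heap of (virtual_time, vm_id); all VMs start at virtual time 0
        self.heap = [(0.0, vm) for vm in vm_ids]
        heapq.heapify(self.heap)

    def step(self, run_vm):
        """Pick the least-advanced VM and run it for one quantum of virtual time."""
        vtime, vm = heapq.heappop(self.heap)
        run_vm(vm, vtime, self.quantum)             # advance this guest by `quantum`
        heapq.heappush(self.heap, (vtime + self.quantum, vm))
        return vm, vtime

# Example: VMs never drift more than one quantum apart in virtual time.
sched = VirtualTimeScheduler(["vm0", "vm1", "vm2"], quantum=0.5)
trace = [sched.step(lambda vm, t, q: None)[0] for _ in range(9)]
print(trace)   # round-robin in virtual time: vm0, vm1, vm2, vm0, ...
```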
A Cooperative Approach to Virtual Machine Based Fault Injection
DOE Office of Scientific and Technical Information (OSTI.GOV)
Naughton III, Thomas J; Engelmann, Christian; Vallee, Geoffroy R
Resilience investigations often employ fault injection (FI) tools to study the effects of simulated errors on a target system. It is important to keep the target system under test (SUT) isolated from the controlling environment in order to maintain control of the experiment. Virtual machines (VMs) have been used to aid these investigations due to the strong isolation properties of system-level virtualization. A key challenge in fault injection tools is to gain proper insight and context about the SUT. In VM-based FI tools, this challenge of target context is increased due to the separation between host and guest (VM). We discuss an approach to VM-based FI that leverages virtual machine introspection (VMI) methods to gain insight into the target's context running within the VM. The key to this environment is the ability to provide basic information to the FI system that can be used to create a map of the target environment. We describe a proof-of-concept implementation and a demonstration of its use to introduce simulated soft errors into an iterative solver benchmark running in user-space of a guest VM.
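As a toy illustration of the end effect only (not of the VMI mechanism), the sketch below flips one bit in the IEEE-754 representation of a value partway through a Jacobi iteration and compares the resulting residual against a clean run. The solver, matrix, and injection point are all made up for demonstration.

```python
import struct
import numpy as np

def flip_bit(x, bit):
    """Flip one bit in the IEEE-754 representation of a float (simulated soft error)."""
    (bits,) = struct.unpack("<Q", struct.pack("<d", x))
    (y,) = struct.unpack("<d", struct.pack("<Q", bits ^ (1 << bit)))
    return y

def jacobi(A, b, iters=100, inject_at=None, bit=52):
    """Jacobi iterations; optionally corrupt one entry of the iterate at step inject_at."""
    x = np.zeros_like(b)
    D = np.diag(A)
    R = A - np.diagflat(D)
    for k in range(iters):
        x = (b - R @ x) / D
        if inject_at is not None and k == inject_at:
            x[0] = flip_bit(x[0], bit)     # the injected soft error
    return np.linalg.norm(A @ x - b)       # final residual

A = np.array([[4.0, 1.0], [1.0, 3.0]])
b = np.array([1.0, 2.0])
print("clean residual   :", jacobi(A, b))
print("injected residual:", jacobi(A, b, inject_at=95))
```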
An efficient and scalable deformable model for virtual reality-based medical applications.
Choi, Kup-Sze; Sun, Hanqiu; Heng, Pheng-Ann
2004-09-01
Modeling of tissue deformation is of great importance to virtual reality (VR)-based medical simulations. Considerable effort has been dedicated to the development of interactively deformable virtual tissues. In this paper, an efficient and scalable deformable model is presented for virtual-reality-based medical applications. It considers deformation as a localized force transmittal process which is governed by algorithms based on breadth-first search (BFS). The computational speed is scalable to facilitate real-time interaction by adjusting the penetration depth. Simulated annealing (SA) algorithms are developed to optimize the model parameters by using the reference data generated with the linear static finite element method (FEM). The mechanical behavior and timing performance of the model have been evaluated. The model has been applied to simulate the typical behavior of living tissues and anisotropic materials. Integration with a haptic device has also been achieved on a generic personal computer (PC) platform. The proposed technique provides a feasible solution for VR-based medical simulations and has the potential for multi-user collaborative work in virtual environment.
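A minimal sketch of the breadth-first force-transmittal idea described above (not the authors' implementation): a displacement applied at the contact node is propagated outward through the mesh graph layer by layer, attenuated at each layer, and cut off at a chosen penetration depth, which is the scalability knob mentioned in the abstract. The attenuation factor and mesh are illustrative.

```python
from collections import deque

def bfs_deform(adjacency, contact_node, displacement, depth_limit, attenuation=0.5):
    """Propagate a displacement from the contact node through the mesh graph.

    adjacency   : dict node -> list of neighbouring nodes
    depth_limit : penetration depth; larger values trade speed for accuracy
    attenuation : fraction of the parent's displacement passed to the next layer
    """
    offsets = {contact_node: displacement}
    visited = {contact_node}
    queue = deque([(contact_node, 0)])
    while queue:
        node, depth = queue.popleft()
        if depth == depth_limit:
            continue                       # stop at the penetration depth
        for nb in adjacency[node]:
            if nb not in visited:
                visited.add(nb)
                offsets[nb] = offsets[node] * attenuation
                queue.append((nb, depth + 1))
    return offsets                         # per-node displacement added to rest positions

# Tiny 1-D "tissue" chain: node i is connected to i-1 and i+1.
chain = {i: [j for j in (i - 1, i + 1) if 0 <= j < 8] for i in range(8)}
print(bfs_deform(chain, contact_node=3, displacement=1.0, depth_limit=2))
```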
NASA Astrophysics Data System (ADS)
Li, Jing; Wu, Huayi; Yang, Chaowei; Wong, David W.; Xie, Jibo
2011-09-01
Geoscientists build dynamic models to simulate various natural phenomena for a better understanding of our planet. Interactive visualizations of these geoscience models and their outputs through virtual globes on the Internet can help the public understand the dynamic phenomena related to the Earth more intuitively. However, challenges arise when the volume of four-dimensional data (4D), 3D in space plus time, is huge for rendering. Datasets loaded from geographically distributed data servers require synchronization between ingesting and rendering data. Also the visualization capability of display clients varies significantly in such an online visualization environment; some may not have high-end graphic cards. To enhance the efficiency of visualizing dynamic volumetric data in virtual globes, this paper proposes a systematic framework, in which an octree-based multiresolution data structure is implemented to organize time series 3D geospatial data to be used in virtual globe environments. This framework includes a view-dependent continuous level of detail (LOD) strategy formulated as a synchronized part of the virtual globe rendering process. Through the octree-based data retrieval process, the LOD strategy enables the rendering of the 4D simulation at a consistent and acceptable frame rate. To demonstrate the capabilities of this framework, data of a simulated dust storm event are rendered in World Wind, an open source virtual globe. The rendering performances with and without the octree-based LOD strategy are compared. The experimental results show that using the proposed data structure and processing strategy significantly enhances the visualization performance when rendering dynamic geospatial phenomena in virtual globes.
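A hedged sketch of the view-dependent, octree-based LOD selection described above (the node layout, brick names, and error threshold are hypothetical, not the paper's data structure): a node is refined only while its size projected against viewer distance is too coarse.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class OctreeNode:
    center: tuple            # (x, y, z) of this block of the volume
    size: float              # edge length of the block
    data_url: str            # where the brick at this resolution is served from
    children: List["OctreeNode"] = field(default_factory=list)

def select_lod(node, viewpoint, screen_error=1.0):
    """View-dependent LOD: descend only while the node's projected size is too coarse."""
    dx, dy, dz = (c - v for c, v in zip(node.center, viewpoint))
    distance = max((dx * dx + dy * dy + dz * dz) ** 0.5, 1e-6)
    if not node.children or node.size / distance < screen_error:
        return [node]                        # render this brick as-is
    bricks = []
    for child in node.children:              # otherwise descend to finer bricks
        bricks.extend(select_lod(child, viewpoint, screen_error))
    return bricks

# Hypothetical two-level octree: one coarse root brick, eight finer children.
root = OctreeNode((0, 0, 0), 8.0, "dust_l0.raw",
                  [OctreeNode((x, y, z), 4.0, f"dust_l1_{x}{y}{z}.raw")
                   for x in (-2, 2) for y in (-2, 2) for z in (-2, 2)])
print(len(select_lod(root, viewpoint=(20, 0, 0))))   # far away -> 1 coarse brick
print(len(select_lod(root, viewpoint=(3, 0, 0))))    # close up -> finer bricks
```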
Technically Speaking: Why Should You Use Virtual Grower?
USDA-ARS?s Scientific Manuscript database
Virtual Grower is a free, easy-to-use software program that every grower who heats their greenhouse should install on their computer. The program enables growers to simulate their own greenhouse and predict how changes or investments could impact the growing environment, heating costs, and crop res...
The development of the virtual reality system for the treatment of the fears of public speaking.
Jo, H J; Ku, J H; Jang, D P; Shin, M B; Ahn, H B; Lee, J M; Cho, B H; Kim, S I
2001-01-01
The fear of public speaking is a kind of social phobia. Patients with a fear of public speaking show symptoms such as shame and timidity in daily personal relationships. They are afraid that the other person will be puzzled or feel insulted, and they also fear being underestimated for their mistakes. For the treatment of the fear of public speaking, cognitive-behavioral therapy has generally been used. Cognitive-behavioral therapy is a method in which patients gradually experience fear-inducing situations and eventually overcome them. Recently, virtual reality technology has been introduced as an alternative method for providing phobic situations. In this study, we developed a public speaking simulator and virtual environments for the treatment of the fear of public speaking. A head-mounted display, a head tracker, and a 3-dimensional sound system were used for the immersive virtual environment. The imagery of the virtual environment consists of a seminar room and 8 virtual audience members. The patient speaks in front of these virtual audience members, and the therapist can control the motions, facial expressions, sounds, and voices of each one.
Kiryu, Tohru; So, Richard H Y
2007-09-25
Around three years ago, in the special issue on augmented and virtual reality in rehabilitation, the topic of simulator sickness was briefly discussed in relation to vestibular rehabilitation. Simulator sickness with virtual reality applications has also been referred to as visually induced motion sickness or cybersickness. Recently, studies on cybersickness have been reported in entertainment, training, gaming, and medical environments in several journals. Virtual stimuli can enlarge the sensation of presence, but they sometimes also evoke unpleasant sensations. In order to safely apply augmented and virtual reality to long-term rehabilitation treatment, the sensation of presence and cybersickness should be appropriately controlled. This issue presents the results of five studies conducted to evaluate visually induced effects and to speculate on the influences of virtual rehabilitation. In particular, the influence of visual and vestibular stimuli on cardiovascular responses is reported in terms of academic contribution.
Rapid prototyping 3D virtual world interfaces within a virtual factory environment
NASA Technical Reports Server (NTRS)
Kosta, Charles Paul; Krolak, Patrick D.
1993-01-01
On-going work into user requirements analysis using CLIPS (NASA/JSC) expert systems as an intelligent event simulator has led to research into three-dimensional (3D) interfaces. Previous work involved CLIPS and two-dimensional (2D) models. Integral to this work was the development of the University of Massachusetts Lowell parallel version of CLIPS, called PCLIPS. This allowed us to create both a Software Bus and a group problem-solving environment for expert systems development. By shifting the PCLIPS paradigm to use the VEOS messaging protocol we have merged VEOS (HITL/Seattle) and CLIPS into a distributed virtual worlds prototyping environment (VCLIPS). VCLIPS uses the VEOS protocol layer to allow multiple experts to cooperate on a single problem. We have begun to look at the control of a virtual factory. In the virtual factory there are actors and objects as found in our Lincoln Logs Factory of the Future project. In this artificial reality architecture there are three VCLIPS entities in action. One entity is responsible for display and user events in the 3D virtual world. Another is responsible for either simulating the virtual factory or communicating with the real factory. The third is a user interface expert. The interface expert maps user input levels, within the current prototype, to control information for the factory. The interface to the virtual factory is based on a camera paradigm. The graphics subsystem generates camera views of the factory on standard X-Window displays. The camera allows for view control and object control. Control of the factory is accomplished by the user reaching into the camera views to perform object interactions. All communication between the separate CLIPS expert systems is done through VEOS.
Makransky, Guido; Bonde, Mads T; Wulff, Julie S G; Wandall, Jakob; Hood, Michelle; Creed, Peter A; Bache, Iben; Silahtaroglu, Asli; Nørremølle, Anne
2016-03-25
Simulation-based learning environments are designed to improve the quality of medical education by allowing students to interact with patients, diagnostic laboratory procedures, and patient data in a virtual environment. However, few studies have evaluated whether simulation-based learning environments increase students' knowledge, intrinsic motivation, and self-efficacy, and help them generalize from laboratory analyses to clinical practice and health decision-making. An entire class of 300 University of Copenhagen first-year undergraduate students, most with a major in medicine, received a 2-h training session in a simulation-based learning environment. The main outcomes were pre- to post-test changes in knowledge, intrinsic motivation, and self-efficacy, together with a post-intervention evaluation of the effect of the simulation on students' understanding of everyday clinical practice. Knowledge (Cohen's d = 0.73), intrinsic motivation (d = 0.24), and self-efficacy (d = 0.46) significantly increased from the pre- to post-test. Low-knowledge students showed the greatest increases in knowledge (d = 3.35) and self-efficacy (d = 0.61), but a non-significant increase in intrinsic motivation (d = 0.22). The medium- and high-knowledge students showed significant increases in knowledge (d = 1.45 and 0.36, respectively), motivation (d = 0.22 and 0.31, respectively), and self-efficacy (d = 0.36 and 0.52, respectively). Additionally, 90% of students reported a greater understanding of medical genetics, 82% thought that medical genetics was more interesting, 93% indicated that they were more interested and motivated, and had gained confidence by having worked on a case story that resembled the real working situation of a doctor, and 78% indicated that they would feel more confident counseling a patient after the simulation. The simulation-based learning environment increased students' learning, intrinsic motivation, and self-efficacy (although the strength of these effects differed depending on their pre-test knowledge), and increased the perceived relevance of medical educational activities. The results suggest that simulations can help future generations of doctors transfer a new understanding of disease mechanisms gained in virtual laboratory settings into everyday clinical practice.
Carman, Margaret; Xu, Shu; Rushton, Sharron; Smallheer, Benjamin A; Williams, Denise; Amarasekara, Sathya; Oermann, Marilyn H
Acute care nurse practitioner (ACNP) programs that use high-fidelity simulation as a teaching tool need to consider innovative strategies to provide distance-based students with learning experiences that are comparable to those in a simulation laboratory. The purpose of this article is to describe the use of virtual simulations in a distance-based ACNP program and student performance in the simulations. Virtual simulations using iSimulate were integrated into the ACNP course to promote the translation of content into a clinical context and enable students to develop their knowledge and decision-making skills. With these simulations, students worked as a team, even though they were at different sites from each other and from the faculty, to manage care of an acutely ill patient. The students were assigned to simulation groups of 4 students each. One week before the simulation, they reviewed past medical records. The virtual simulation sessions were recorded and then evaluated. The evaluation tools assessed 8 areas of performance and included key behaviors in each of these areas to be performed by students in the simulation. More than 80% of the student groups performed the key behaviors. Virtual simulations provide a learning platform that allows live interaction between students and faculty, at a distance, and application of content to clinical situations. With simulation, learners have an opportunity to practice assessment and decision-making in emergency and high-risk situations. Simulations not only are valuable for student learning but also provide a nonthreatening environment for staff to practice, receive feedback on their skills, and improve their confidence.
Meyer, Georg F.; Wong, Li Ting; Timson, Emma; Perfect, Philip; White, Mark D.
2012-01-01
We argue that objective fidelity evaluation of virtual environments, such as flight simulation, should be human-performance-centred and task-specific rather than measure the match between simulation and physical reality. We show how principled experimental paradigms and behavioural models for quantifying human performance in simulated environments, which have emerged from research in multisensory perception, provide a framework for the objective evaluation of the contribution of individual cues to human performance measures of fidelity. We present three examples in a flight simulation environment as a case study: Experiment 1: Detection and categorisation of auditory and kinematic motion cues; Experiment 2: Performance evaluation in a target-tracking task; Experiment 3: Transferrable learning of auditory motion cues. We show how the contribution of individual cues to human performance can be robustly evaluated for each task and that the contribution is highly task dependent. The same auditory cues that can be discriminated and are optimally integrated in Experiment 1 do not contribute to target-tracking performance in an in-flight refuelling simulation without training (Experiment 2). In Experiment 3, however, we demonstrate that the auditory cue leads to significant, transferrable performance improvements with training. We conclude that objective fidelity evaluation requires a task-specific analysis of the contribution of individual cues. PMID:22957068
Translating the Simulation of Procedural Drilling Techniques for Interactive Neurosurgical Training
Stredney, Don; Rezai, Ali R.; Prevedello, Daniel M.; Elder, J. Bradley; Kerwin, Thomas; Hittle, Bradley; Wiet, Gregory J.
2014-01-01
Background: Through previous and concurrent efforts, we have developed a fully virtual environment to provide procedural training of otologic surgical technique. The virtual environment is based on high-resolution volumetric data of the regional anatomy. This volumetric data helps drive an interactive multi-sensory, i.e., visual (stereo), aural (stereo), and tactile simulation environment. Subsequently, we have extended our efforts to support the training of neurosurgical procedural technique as part of the CNS simulation initiative. Objective: The goal of this multi-level development is to deliberately study the integration of simulation technologies into the neurosurgical curriculum and to determine their efficacy in teaching minimally invasive cranial and skull base approaches. Methods: We discuss issues of biofidelity as well as our methods to provide objective, quantitative automated assessment for the residents. Results: We conclude with a discussion of our experiences by reporting on preliminary formative pilot studies and proposed approaches to take the simulation to the next level through additional validation studies. Conclusion: We have presented our efforts to translate an otologic simulation environment for use in the neurosurgical curriculum. We have demonstrated the initial proof of principles and define the steps to integrate and validate the system as an adjuvant to the neurosurgical curriculum. PMID:24051887
Role of virtual reality for cerebral palsy management.
Weiss, Patrice L Tamar; Tirosh, Emanuel; Fehlings, Darcy
2014-08-01
Virtual reality is the use of interactive simulations to present users with opportunities to perform in virtual environments that appear, sound, and less frequently, feel similar to real-world objects and events. Interactive computer play refers to the use of a game where a child interacts and plays with virtual objects in a computer-generated environment. Because of their distinctive attributes that provide ecologically realistic and motivating opportunities for active learning, these technologies have been used in pediatric rehabilitation over the past 15 years. The ability of virtual reality to create opportunities for active repetitive motor/sensory practice adds to its potential for neuroplasticity and learning in individuals with neurologic disorders. The objectives of this article are to provide an overview of how virtual reality and gaming are used clinically, to present the results of several example studies that demonstrate their use in research, and to briefly remark on future developments. © The Author(s) 2014.
A training system of orientation and mobility for blind people using acoustic virtual reality.
Seki, Yoshikazu; Sato, Tetsuji
2011-02-01
A new auditory orientation training system was developed for blind people using acoustic virtual reality (VR) based on a head-related transfer function (HRTF) simulation. The present training system can reproduce a virtual training environment for orientation and mobility (O&M) instruction, and the trainee can walk through the virtual training environment safely by listening to sounds such as vehicles, stores, ambient noise, etc., three-dimensionally through headphones. The system can reproduce not only sound sources but also sound reflection and insulation, so that the trainee can learn both sound localization and obstacle perception skills. The virtual training environment is described in extensible markup language (XML), and the O&M instructor can edit it easily according to the training curriculum. Evaluation experiments were conducted to test the efficiency of some features of the system. Thirty subjects who had not acquired O&M skills attended the experiments. The subjects were separated into three groups: a no-training group, a virtual-training group using the present system, and a real-training group trained in real environments. The results suggested that virtual training can reduce "veering" more than real training and can also reduce stress as much as real training. The subjective technical and anxiety scores also improved.
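The abstract notes that the virtual training environment is described in XML so the O&M instructor can edit it, but the actual schema is not given. The element and attribute names below are therefore hypothetical; this is only a sketch of generating such a scene description programmatically.

```python
import xml.etree.ElementTree as ET

# Hypothetical schema: the real system's element and attribute names are not published here.
env = ET.Element("environment", name="crossing_practice")

src = ET.SubElement(env, "sound_source", type="vehicle")
ET.SubElement(src, "position", x="10.0", y="0.0", z="2.0")
ET.SubElement(src, "file", path="bus_idle.wav")

wall = ET.SubElement(env, "obstacle", kind="wall", reflective="true")
ET.SubElement(wall, "position", x="0.0", y="0.0", z="-1.5")
ET.SubElement(wall, "extent", width="6.0", height="3.0")

ET.indent(env)                       # pretty-print (Python 3.9+)
print(ET.tostring(env, encoding="unicode"))
```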
NASA Astrophysics Data System (ADS)
Cheng, D. L. C.; Quinn, J. D.; Larour, E. Y.; Halkides, D. J.
2017-12-01
The Virtual Earth System Laboratory (VESL) is a Web application, under continued development at the Jet Propulsion Laboratory and UC Irvine, for the visualization of Earth System data and process simulations. As with any project of its size, we have encountered both successes and challenges during the course of development. Our principal point of success is the fact that VESL users can interact seamlessly with our earth science simulations within their own Web browser. Some of the challenges we have faced include retrofitting the VESL Web application to respond to touch gestures, reducing page load time (especially as the application has grown), and accounting for the differences between the various Web browsers and computing platforms.
LVC interaction within a mixed-reality training system
NASA Astrophysics Data System (ADS)
Pollock, Brice; Winer, Eliot; Gilbert, Stephen; de la Cruz, Julio
2012-03-01
The United States military is increasingly pursuing advanced live, virtual, and constructive (LVC) training systems for reduced cost, greater training flexibility, and decreased training times. Combining the advantages of realistic training environments and virtual worlds, mixed reality LVC training systems can enable live and virtual trainee interaction as if co-located. However, LVC interaction in these systems often requires constructing immersive environments, developing hardware for live-virtual interaction, tracking in occluded environments, and an architecture that supports real-time transfer of entity information across many systems. This paper discusses a system that overcomes these challenges to empower LVC interaction in a reconfigurable, mixed reality environment. This system was developed and tested in an immersive, reconfigurable, and mixed reality LVC training system for the dismounted warfighter at ISU, known as the Veldt, to overcome LVC interaction challenges and as a test bed for cutting-edge technology to meet future U.S. Army battlefield requirements. Trainees interact physically in the Veldt and virtually through commercial and developed game engines. Evaluation involving military trained personnel found this system to be effective, immersive, and useful for developing the critical decision-making skills necessary for the battlefield. Procedural terrain modeling, model-matching database techniques, and a central communication server process all live and virtual entity data from system components to create a cohesive virtual world across all distributed simulators and game engines in real-time. This system achieves rare LVC interaction within multiple physical and virtual immersive environments for training in real-time across many distributed systems.
Dynamic simulation of perturbation responses in a closed-loop virtual arm model.
Du, Yu-Fan; He, Xin; Lan, Ning
2010-01-01
A closed-loop virtual arm (VA) model has been developed in the SIMULINK environment by adding spinal reflex circuits and propriospinal neural networks to the open-loop VA model developed in an earlier study [1]. An improved virtual muscle model (VM4.0) is used to speed up simulation and to generate more precise recruitment of muscle force at low levels of muscle activation. Time delays in the reflex loops are determined by their synaptic connections and afferent transmission back to the spinal cord. Reflex gains are properly selected so that closed-loop responses are stable. With the closed-loop VA model, we are developing an approach to evaluate system behaviors by dynamic simulation of perturbation responses. Joint stiffness is calculated from simulated perturbation responses by a least-squares algorithm in MATLAB. This method of dynamic simulation will be essential for further evaluation of feedforward and reflex control of arm movement and position.
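A minimal sketch of the kind of least-squares fit described above (the original uses MATLAB and the full virtual arm model; here the perturbation data are synthetic and the model is a simple stiffness-plus-damping regression, purely for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic perturbation response: torque = K*dtheta + B*dtheta_dot + noise
t = np.linspace(0, 1, 500)
dtheta = 0.05 * np.sin(2 * np.pi * 3 * t)          # angular displacement (rad)
dtheta_dot = np.gradient(dtheta, t)                # angular velocity (rad/s)
K_true, B_true = 12.0, 0.8
torque = K_true * dtheta + B_true * dtheta_dot + 0.02 * rng.standard_normal(t.size)

# Least-squares fit of [K, B] from the simulated perturbation response
X = np.column_stack([dtheta, dtheta_dot])
(K_est, B_est), *_ = np.linalg.lstsq(X, torque, rcond=None)
print(f"estimated stiffness K = {K_est:.2f} Nm/rad, damping B = {B_est:.3f} Nm*s/rad")
```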
Perceptual effects in auralization of virtual rooms
NASA Astrophysics Data System (ADS)
Kleiner, Mendel; Larsson, Pontus; Vastfjall, Daniel; Torres, Rendell R.
2002-05-01
By using various types of binaural simulation (or "auralization") of physical environments, it is now possible to study basic perceptual issues relevant to room acoustics, as well as to simulate the acoustic conditions found in concert halls and other auditoria. Binaural simulation of physical spaces in general is also important to virtual reality systems. This presentation will begin with an overview of the issues encountered in the auralization of rooms and other environments. We will then discuss the influence of various approximations in room modeling, in particular edge and surface scattering, on the perceived room response. Finally, we will discuss cross-modal effects, such as the influence of visual cues on the perception of auditory cues, and the influence of cross-modal effects on the judgement of "perceived presence" and the rating of room acoustic quality.
VEDA: a web-based virtual environment for dynamic atomic force microscopy.
Melcher, John; Hu, Shuiqing; Raman, Arvind
2008-06-01
We describe here the theory and applications of virtual environment dynamic atomic force microscopy (VEDA), a suite of state-of-the-art simulation tools deployed on nanoHUB (www.nanohub.org) for the accurate simulation of tip motion in dynamic atomic force microscopy (dAFM) over organic and inorganic samples. VEDA takes advantage of nanoHUB's cyberinfrastructure to run high-fidelity dAFM tip dynamics computations on local clusters and the TeraGrid. Consequently, these tools are freely accessible and the dAFM simulations are run using standard web-based browsers without requiring additional software. A wide range of issues in dAFM, ranging from optimal probe choice, probe stability, tip-sample interaction forces, and power dissipation to material property extraction and scanning dynamics over heterogeneous samples, can be addressed.
Utah Virtual Lab: JAVA interactivity for teaching science and statistics on line.
Malloy, T E; Jensen, G C
2001-05-01
The Utah on-line Virtual Lab is a JAVA program run dynamically off a database. It is embedded in StatCenter (www.psych.utah.edu/learn/statsampler.html), an on-line collection of tools and text for teaching and learning statistics. Instructors author a statistical virtual reality that simulates theories and data in a specific research focus area by defining independent, predictor, and dependent variables and the relations among them. Students work in an on-line virtual environment to discover the principles of this simulated reality: They go to a library, read theoretical overviews and scientific puzzles, and then go to a lab, design a study, collect and analyze data, and write a report. Each student's design and data analysis decisions are computer-graded and recorded in a database; the written research report can be read by the instructor or by other students in peer groups simulating scientific conventions.
A systematic review of phacoemulsification cataract surgery in virtual reality simulators.
Lam, Chee Kiang; Sundaraj, Kenneth; Sulaiman, Mohd Nazri
2013-01-01
The aim of this study was to review the capability of virtual reality simulators in the application of phacoemulsification cataract surgery training. Our review included the scientific publications on cataract surgery simulators that had been developed by different groups of researchers, along with commercialized surgical training products such as EYESI® and PhacoVision®. The review covers the simulation of the main cataract surgery procedures, i.e., corneal incision, capsulorrhexis, phacosculpting, and intraocular lens implantation, in various virtual reality surgery simulators. Haptics realism and visual realism of the procedures are the main elements in imitating the actual surgical environment. The involvement of ophthalmology in research on virtual reality since the early 1990s has made a great impact on the development of surgical simulators. Most of the latest cataract surgery training systems are able to offer high fidelity in visual and haptic feedback, but visual realism, such as the rotational movements of an eyeball in response to the force applied by surgical instruments, is still lacking in some of them. The assessment of the surgical tasks carried out on the simulators showed a significant difference in performance before and after training.
NASA Astrophysics Data System (ADS)
Berland, Matthew W.
As scientists use the tools of computational and complex systems theory to broaden science perspectives (e.g., Bar-Yam, 1997; Holland, 1995; Wolfram, 2002), so can middle-school students broaden their perspectives using appropriate tools. The goals of this dissertation project are to build, study, evaluate, and compare activities designed to foster both computational and complex systems fluencies through collaborative constructionist virtual and physical robotics. In these activities, each student builds an agent (e.g., a robot-bird) that must interact with fellow students' agents to generate a complex aggregate (e.g., a flock of robot-birds) in a participatory simulation environment (Wilensky & Stroup, 1999a). In a participatory simulation, students collaborate by acting in a common space, teaching each other, and discussing content with one another. As a result, the students improve both their computational fluency and their complex systems fluency, where fluency is defined as the ability to both consume and produce relevant content (DiSessa, 2000). To date, several systems have been designed to foster computational and complex systems fluencies through computer programming and collaborative play (e.g., Hancock, 2003; Wilensky & Stroup, 1999b); this study suggests that, by supporting the relevant fluencies through collaborative play, they become mutually reinforcing. In this work, I will present both the design of the VBOT virtual/physical constructionist robotics learning environment and a comparative study of student interaction with the virtual and physical environments across four middle-school classrooms, focusing on the contrast in systems perspectives differently afforded by the two environments. In particular, I found that while performance gains were similar overall, the physical environment supported agent perspectives on aggregate behavior, and the virtual environment supported aggregate perspectives on agent behavior. The primary research questions are: (1) What are the relative affordances of virtual and physical constructionist robotics systems towards computational and complex systems fluencies? (2) What can middle school students learn using computational/complex systems learning environments in a collaborative setting? (3) In what ways are these environments and activities effective in teaching students computational and complex systems fluencies?
Laparoscopic skills acquisition: a study of simulation and traditional training.
Marlow, Nicholas; Altree, Meryl; Babidge, Wendy; Field, John; Hewett, Peter; Maddern, Guy J
2014-12-01
Training in basic laparoscopic skills can be undertaken using traditional methods, where trainees are educated by experienced surgeons through a process of graduated responsibility or by simulation-based training. This study aimed to assess whether simulation trained individuals reach the same level of proficiency in basic laparoscopic skills as traditional trained participants when assessed in a simulated environment. A prospective study was undertaken. Participants were allocated to one of two cohorts according to surgical experience. Participants from the inexperienced cohort were randomized to receive training in basic laparoscopic skills on either a box trainer or a virtual reality simulator. They were then assessed on the simulator on which they did not receive training. Participants from the experienced cohort, considered to have received traditional training in basic laparoscopic skills, did not receive simulation training and were randomized to either the box trainer or virtual reality simulator for skills assessment. The assessment scores from different cohorts on either simulator were then compared. A total of 138 participants completed the assessment session, 101 in the inexperienced simulation-trained cohort and 37 on the experienced traditionally trained cohort. There was no statistically significant difference between the training outcomes of simulation and traditionally trained participants, irrespective of the simulator type used. The results demonstrated that participants trained on either a box trainer or virtual reality simulator achieved a level of basic laparoscopic skills assessed in a simulated environment that was not significantly different from participants who had been traditionally trained in basic laparoscopic skills. © 2013 Royal Australasian College of Surgeons.
Adamovich, Sergei; Fluet, Gerard G.; Merians, Alma S.; Mathai, Abraham; Qiu, Qinyin
2010-01-01
Current neuroscience has identified several constructs to increase the effectiveness of upper extremity rehabilitation. One is the use of progressive, skill acquisition-oriented training. Another approach emphasizes the use of bilateral activities. Building on these principles, this paper describes the design and feasibility testing of a robotic / virtual environment system designed to train the arm of persons who have had strokes. The system provides a variety of assistance modes, scalable workspaces and hand-robot interfaces allowing persons with strokes to train multiple joints in three dimensions. The simulations utilize assistance algorithms that adjust task difficulty both online and offline in relation to subject performance. Several distinctive haptic effects have been incorporated into the simulations. An adaptive master-slave relationship between the unimpaired and impaired arm encourages active movement of the subject's hemiparetic arm during a bimanual task. Adaptive anti-gravity support and damping stabilize the arm during virtual reaching and placement tasks. An adaptive virtual spring provides assistance to complete the movement if the subject is unable to complete the task in time. Finally, haptically rendered virtual objects help to shape the movement trajectory during a virtual placement task. A proof of concept study demonstrated this system to be safe, feasible and worthy of further study. PMID:19666345
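A hedged sketch of one of the haptic effects described above, an adaptive virtual spring that assists task completion when the subject runs out of time. The gains, ramp duration, and function names are illustrative, not the system's actual algorithm.

```python
import numpy as np

def assist_force(hand_pos, target_pos, elapsed, time_limit, k_max=40.0):
    """Virtual spring toward the target; stiffness ramps up only after the time limit."""
    overtime = max(0.0, elapsed - time_limit)
    k = k_max * min(1.0, overtime / 2.0)        # ramp stiffness over 2 s of overtime
    return k * (np.asarray(target_pos) - np.asarray(hand_pos))

# Before the time limit the spring is off; afterwards it pulls the hand toward the target.
print(assist_force([0.1, 0.0, 0.2], [0.3, 0.1, 0.2], elapsed=4.0, time_limit=5.0))
print(assist_force([0.1, 0.0, 0.2], [0.3, 0.1, 0.2], elapsed=6.0, time_limit=5.0))
```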
NASA Astrophysics Data System (ADS)
Terzopoulos, Demetri; Qureshi, Faisal Z.
Computer vision and sensor networks researchers are increasingly motivated to investigate complex multi-camera sensing and control issues that arise in the automatic visual surveillance of extensive, highly populated public spaces such as airports and train stations. However, they often encounter serious impediments to deploying and experimenting with large-scale physical camera networks in such real-world environments. We propose an alternative approach called "Virtual Vision", which facilitates this type of research through the virtual reality simulation of populated urban spaces, camera sensor networks, and computer vision on commodity computers. We demonstrate the usefulness of our approach by developing two highly automated surveillance systems comprising passive and active pan/tilt/zoom cameras that are deployed in a virtual train station environment populated by autonomous, lifelike virtual pedestrians. The easily reconfigurable virtual cameras distributed in this environment generate synthetic video feeds that emulate those acquired by real surveillance cameras monitoring public spaces. The novel multi-camera control strategies that we describe enable the cameras to collaborate in persistently observing pedestrians of interest and in acquiring close-up videos of pedestrians in designated areas.
Virtual Simulated Care Coordination Rounds for Nursing Students.
Badowski, Donna M
Implementation of the Affordable Care Act has prompted nursing education to reflect on paradigm shifts needed to prepare nursing students for the evolving health care environment. The traditional focus of nursing education on nursing care in acute care settings does not provide learning experiences in care coordination and transitional care management skills. Virtual simulated care coordination rounds, using the National League for Nursing Advancing Care Excellence resources, offer nursing students an innovative experience in care coordination and transition care management.
Design and implementation of dynamic hybrid Honeypot network
NASA Astrophysics Data System (ADS)
Qiao, Peili; Hu, Shan-Shan; Zhai, Ji-Qiang
2013-05-01
A method of constructing a dynamic and self-adaptive virtual network is suggested to confuse adversaries, delay and divert attacks, exhaust attacker resources, and collect attack information. The concepts of the honeypot and of Honeyd, a framework for virtual honeypots, are introduced. Network scanning techniques, including active fingerprint recognition, are analyzed. A dynamic virtual network system is designed and implemented. In this system, a virtual network similar to the real network topology is built from messages collected in real environments. By doing this, the system can confuse attackers during an attack and can be used to further analyze and study the attacks. Tests of the system show that the design can successfully simulate a real network environment and can be used in network security analysis.
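Honeyd itself is configured through templates bound to unused IP addresses. As a much simpler, generic illustration of the idea of collecting attack information (not Honeyd's configuration language or API), a minimal TCP listener that presents a fake banner and logs connection attempts might look like the sketch below; the port and banner are arbitrary.

```python
import socket
import datetime

HOST, PORT = "0.0.0.0", 2222          # illustrative: pretend-SSH service on a spare port
BANNER = b"SSH-2.0-OpenSSH_7.4\r\n"   # fake banner so the service looks real to a scanner

with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as srv:
    srv.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
    srv.bind((HOST, PORT))
    srv.listen()
    while True:
        conn, addr = srv.accept()
        with conn:
            conn.sendall(BANNER)
            data = conn.recv(1024)                       # capture the attacker's first bytes
            print(datetime.datetime.now().isoformat(),
                  addr[0], addr[1], data[:80])           # log for later analysis
```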
Kolarik, Andrew J; Cirstea, Silvia; Pardhan, Shahina
2013-02-01
Totally blind listeners often demonstrate better than normal capabilities when performing spatial hearing tasks. Accurate representation of three-dimensional auditory space requires the processing of available distance information between the listener and the sound source; however, auditory distance cues vary greatly depending upon the acoustic properties of the environment, and it is not known which distance cues are important to totally blind listeners. Our data show that totally blind listeners display better performance compared to sighted age-matched controls for distance discrimination tasks in anechoic and reverberant virtual rooms simulated using a room-image procedure. Totally blind listeners use two major auditory distance cues to stationary sound sources, level and direct-to-reverberant ratio, more effectively than sighted controls for many of the virtual distances tested. These results show that significant compensation among totally blind listeners for virtual auditory spatial distance leads to benefits across a range of simulated acoustic environments. No significant differences in performance were observed between listeners with partial non-correctable visual losses and sighted controls, suggesting that sensory compensation for virtual distance does not occur for listeners with partial vision loss.
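The direct-to-reverberant ratio cue mentioned above can be computed from a room impulse response by comparing the energy in a short window around the direct-path arrival with the energy in the remaining tail. A small sketch follows; the window length and the synthetic impulse response are illustrative, not the study's stimuli.

```python
import numpy as np

def direct_to_reverberant_ratio(h, fs, window_ms=2.5):
    """DRR in dB: energy around the direct sound vs. energy in the reverberant tail."""
    direct_idx = int(np.argmax(np.abs(h)))            # direct-path arrival
    half = int(fs * window_ms / 1000)
    lo, hi = max(0, direct_idx - half), direct_idx + half
    direct_energy = np.sum(h[lo:hi] ** 2)
    reverb_energy = np.sum(h[hi:] ** 2) + 1e-12
    return 10 * np.log10(direct_energy / reverb_energy)

# Toy impulse response: one direct spike plus an exponentially decaying noise tail.
fs = 16000
t = np.arange(0, 0.4, 1 / fs)
rng = np.random.default_rng(1)
h = np.exp(-t / 0.08) * rng.standard_normal(t.size) * 0.05
h[0] = 1.0
print(f"DRR = {direct_to_reverberant_ratio(h, fs):.1f} dB")
```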
Can walking motions improve visually induced rotational self-motion illusions in virtual reality?
Riecke, Bernhard E; Freiberg, Jacob B; Grechkin, Timofey Y
2015-02-04
Illusions of self-motion (vection) can provide compelling sensations of moving through virtual environments without the need for complex motion simulators or large tracked physical walking spaces. Here we explore the interaction between biomechanical cues (stepping along a rotating circular treadmill) and visual cues (viewing simulated self-rotation) for providing stationary users a compelling sensation of rotational self-motion (circular vection). When tested individually, biomechanical and visual cues were similarly effective in eliciting self-motion illusions. However, in combination they yielded significantly more intense self-motion illusions. These findings provide the first compelling evidence that walking motions can be used to significantly enhance visually induced rotational self-motion perception in virtual environments (and vice versa) without having to provide for physical self-motion or motion platforms. This is noteworthy, as linear treadmills have been found to actually impair visually induced translational self-motion perception (Ash, Palmisano, Apthorp, & Allison, 2013). Given the predominant focus on linear walking interfaces for virtual-reality locomotion, our findings suggest that investigating circular and curvilinear walking interfaces offers a promising direction for future research and development and can help to enhance self-motion illusions, presence and immersion in virtual-reality systems. © 2015 ARVO.
NASA Astrophysics Data System (ADS)
Ribeiro, Allan; Santos, Helen
With the advent of new information and communication technologies (ICTs), communicative interaction is changing the way people act and relate, and at the same time changing how work activities related to education are carried out. Among the possibilities provided by the advancement of computational resources, virtual reality (VR) and augmented reality (AR) stand out as new forms of information visualization in computer applications. While VR allows user interaction with a virtual environment that is totally computer generated, in AR virtual images are inserted into the real environment; both create new opportunities to support teaching and learning in formal and informal contexts. Such technologies are able to express representations of reality or of the imagination, such as systems at the nanoscale and of low dimensionality, making it imperative to explore, in the most diverse areas of knowledge, the potential offered by ICTs and emerging technologies. In this sense, this work presents computer applications of virtual and augmented reality developed with the use of modeling and simulation in computational approaches to topics related to nanoscience and nanotechnology, articulated with innovative pedagogical practices.
HVS: an image-based approach for constructing virtual environments
NASA Astrophysics Data System (ADS)
Zhang, Maojun; Zhong, Li; Sun, Lifeng; Li, Yunhao
1998-09-01
Virtual Reality Systems can construct virtual environments which provide an interactive walkthrough experience. Traditionally, walkthrough is performed by modeling and rendering 3D computer graphics in real-time. Despite the rapid advance of computer graphics techniques, the rendering engine usually places a limit on scene complexity and rendering quality. This paper presents an approach which uses real-world or synthesized images to compose a virtual environment. A real-world image can be recorded by a camera, while a synthesized image can be produced by off-line multispectral image processing of Landsat TM (Thematic Mapper) imagery and SPOT HRV imagery. The images are digitally warped on-the-fly to simulate walking forward/backward, turning left/right, and 360-degree looking around. We have developed a system, HVS (Hyper Video System), based on these principles. HVS improves upon QuickTime VR and Surround Video in walking forward/backward.
Real Time Bicycle Simulation Study of Bicyclists’ Behaviors and their Implication on Safety
DOT National Transportation Integrated Search
2017-06-30
The main goal of this study was to build a bicycle simulator and study the interaction between cyclists and other roadway users. The simulator developed was used in conjunction with Oculus Rift goggles to create a virtual cycling environment. The vir...
A proposal for an open source graphical environment for simulating x-ray optics
NASA Astrophysics Data System (ADS)
Sanchez del Rio, Manuel; Rebuffi, Luca; Demsar, Janez; Canestrari, Niccolo; Chubar, Oleg
2014-09-01
A new graphical environment to drive X-ray optics simulation packages such as SHADOW and SRW is proposed. The aim is to simulate a virtual experiment, including describing the electron beam and simulating the emitted radiation, the optics, the scattering by the sample, and the radiation detection. Python is chosen as the common interaction language. The ingredients of the new application are presented: a glossary of variables for optical components, the selection of visualization tools, and the integration of all these components in a high-level workflow environment built on Orange.
MASCARET: creating virtual learning environments from system modelling
NASA Astrophysics Data System (ADS)
Querrec, Ronan; Vallejo, Paola; Buche, Cédric
2013-03-01
The design process for a Virtual Learning Environment (VLE) such as that put forward in the SIFORAS project (SImulation FOR training and ASsistance) means that system specifications can be differentiated from pedagogical specifications. System specifications can also be obtained directly from the specialists' expertise; that is to say, directly from Product Lifecycle Management (PLM) tools. To do this, the system model needs to be considered as a piece of VLE data. In this paper we present Mascaret, a meta-model which can be used to represent such system models. In order to ensure that the meta-model is capable of describing, representing and simulating such systems, MASCARET is based on SysML, a standard defined by the OMG.
Virtual humans and formative assessment to train diagnostic skills in bulimia nervosa.
Gutiérrez-Maldonado, José; Ferrer-Garcia, Marta; Pla, Joana; Andrés-Pueyo, Antonio
2014-01-01
Carrying out a diagnostic interview requires skills that need to be taught in a controlled environment. Virtual reality (VR) environments are increasingly used in the training of professionals, as they offer the most realistic alternative while not requiring students to face situations for which they are as yet unprepared. The results of training diagnostic skills can also be generalized to any other situation in which effective communication skills play a major role. Our aim in this study has been to develop a procedure of formative assessment in order to increase the effectiveness of virtual learning simulation systems and then to assess their efficacy.
Airlift Operation Modeling Using Discrete Event Simulation (DES)
2009-12-01
[Front matter extracted with this record: an acronym list (JRE, Java Runtime Environment; JVM, Java Virtual Machine; lbs, pounds; LAM, Load Allocation Mode; LRM, Landing Spot Reassignment Mode; LEGO, Listener Event ...) and the opening of the Software Development Environment section, which lists Java and Simkit among the software tools used to construct the models.]
NASA Astrophysics Data System (ADS)
Navvab, Mojtaba; Bisegna, Fabio; Gugliermetti, Franco
2013-05-01
Saint Rocco Museum, a historical building in Venice, Italy, is used as a case study to explore the performance of its lighting system and the impact of visible light on viewing large-size artworks. The transition from three-dimensional architectural rendering to three-dimensional virtual luminance mapping and visualization within a virtual environment is described as an integrated optical method for its application toward preservation of the cultural heritage of the space. Lighting simulation programs represent color as RGB triplets in a device-dependent color space such as ITU-R BT.709. A prerequisite for this is a 3D model, which can be created within this computer-aided virtual environment. The surface luminance, chromaticity, and spectral data measured on site were used as input to an established real-time indirect illumination algorithm and physically based algorithms to produce the best RGB approximation for generating images of the objects. Conversion of RGB to and from spectra has been a major undertaking, since an infinite number of spectra can produce the same colors defined by RGB in the program. The ability to simulate light intensity, candlepower, and spectral power distributions provides an opportunity to examine the impact of color inter-reflections on historical paintings. VR offers an effective technique to quantify the impact of visible light on human visual performance under a precisely controlled representation of the light spectrum that can be experienced in 3D in a virtual environment, as well as in historical visual archives. The system can easily be expanded to include other measurements and stimuli.
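For reference, the device-dependent RGB triplets mentioned above (ITU-R BT.709 primaries) relate to relative luminance through fixed weights applied to the linear components. A small sketch follows, assuming an sRGB transfer function for the gamma step; the sample pixel values are arbitrary.

```python
def srgb_to_linear(c):
    """Undo the sRGB transfer function for one channel value in [0, 1]."""
    return c / 12.92 if c <= 0.04045 else ((c + 0.055) / 1.055) ** 2.4

def relative_luminance(r, g, b):
    """ITU-R BT.709 luminance weights applied to linearised RGB."""
    rl, gl, bl = (srgb_to_linear(x) for x in (r, g, b))
    return 0.2126 * rl + 0.7152 * gl + 0.0722 * bl

# Example: a mid-grey pixel sampled from a rendered image of a painting.
print(f"Y = {relative_luminance(0.5, 0.45, 0.4):.3f}")
```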
Providing Interactive Access to Cave Geology for All Students, Regardless of Physical Ability
NASA Astrophysics Data System (ADS)
Atchison, C. `; Stredney, D.; Hittle, B.; Irving, K.; Toomey, R. S., III; Lemon, N. N.; Price, A.; Kerwin, T.
2013-12-01
Based on an identified need to accommodate students with mobility impairments in field-based instructional experiences, this presentation will discuss current efforts to promote participation, broaden diversity, and impart a historical perspective in the geosciences through the use of an interactive virtual environment. Developed through the integration of emerging simulation technologies, this prototypical virtual environment is created from LIDAR data of the Historic Tour route of Mammoth Cave National Park. The educational objectives of the simulation focus on four primary locations within the tour route that provide evidence of the hydrologic impact on the cave and karst formation. The overall objective is to provide a rich experience of geological field-based learning for all students, regardless of their physical abilities. Employing a virtual environment that interchangeably uses two- and three-dimensional representations of geoscience content, this synthetic field-based cave and karst module will provide an opportunity to assess its effectiveness in engaging the student community and its efficacy in the curriculum when used as an alternative representation of a traditional field experience. The expected outcome is that, based on the level of interactivity, the simulated environment will provide adequate pedagogical representation for content transfer without the need for physical experience in the uncontrolled field environment. Additionally, creating such an environment will benefit able-bodied students as well, by providing supplemental resources that can both precede a traditional field experience and allow students to re-examine a field site long after the field experience, in both formal and informal educational settings.
Virtual endoscopy in neurosurgery: a review.
Neubauer, André; Wolfsberger, Stefan
2013-01-01
Virtual endoscopy is the computerized creation of images depicting the inside of patient anatomy reconstructed in a virtual reality environment. It permits interactive, noninvasive, 3-dimensional visual inspection of anatomical cavities or vessels. This can aid in diagnostics, potentially replacing an actual endoscopic procedure, and help in the preparation of a surgical intervention by bridging the gap between plain 2-dimensional radiologic images and the 3-dimensional depiction of anatomy during actual endoscopy. If not only the endoscopic vision but also endoscopic handling, including realistic haptic feedback, is simulated, virtual endoscopy can be an effective training tool for novice surgeons. In neurosurgery, the main fields of the application of virtual endoscopy are third ventriculostomy, endonasal surgery, and the evaluation of pathologies in cerebral blood vessels. Progress in this very active field of research is achieved through cooperation between the technical and the medical communities. While the technology advances and new methods for modeling, reconstruction, and simulation are being developed, clinicians evaluate existing simulators, steer the development of new ones, and explore new fields of application. This review introduces some of the most interesting virtual reality systems for endoscopic neurosurgery developed in recent years and presents clinical studies conducted either on areas of application or specific systems. In addition, benefits and limitations of single products and simulated neuroendoscopy in general are pointed out.
Noise and Vibration Risk Prevention Virtual Web for Ubiquitous Training
ERIC Educational Resources Information Center
Redel-Macías, María Dolores; Cubero-Atienza, Antonio J.; Martínez-Valle, José Miguel; Pedrós-Pérez, Gerardo; del Pilar Martínez-Jiménez, María
2015-01-01
This paper describes a new Web portal offering experimental labs for ubiquitous training of university engineering students in work-related risk prevention. The Web-accessible computer program simulates the noise and machine vibrations met in the work environment, in a series of virtual laboratories that mimic an actual laboratory and provide the…
Learning to Communicate in a Virtual World: The Case of a JFL Classroom
ERIC Educational Resources Information Center
Yamazaki, Kasumi
2015-01-01
The proliferation of online simulation games across the globe in many different languages offers Computer Assisted Language Learning (CALL) researchers an opportunity to examine how language learning occurs in such virtual environments. While there has recently been an increase in the number of exploratory studies involving learning experiences of…
Immersive Training Systems: Virtual Reality and Education and Training.
ERIC Educational Resources Information Center
Psotka, Joseph
1995-01-01
Describes virtual reality (VR) technology and VR research on education and training. Focuses on immersion as the key added value of VR, analyzes cognitive variables connected to immersion, how it is generated in synthetic environments and its benefits. Discusses value of tracked, immersive visual displays over nonimmersive simulations. Contains 78…
Relating Narrative, Inquiry, and Inscriptions: Supporting Consequential Play
ERIC Educational Resources Information Center
Barab, Sasha A.; Sadler, Troy D.; Heiselt, Conan; Hickey, Daniel; Zuiker, Steven
2007-01-01
In this paper we describe our research using a multi-user virtual environment, "Quest Atlantis," to embed fourth grade students in an aquatic habitat simulation. Specifically targeted towards engaging students in a rich inquiry investigation, we layered a socio-scientific narrative and an interactive rule set into a multi-user virtual environment…
Improving Students' Problem Solving in a Virtual Chemistry Simulation through Metacognitive Messages
ERIC Educational Resources Information Center
Beal, Carole R.; Stevens, Ronald H.
2011-01-01
Recent assessments indicate that American students do not score well on tests of scientific problem solving, relative to students in other nations. IMMEX is a web-based virtual environment that provides students with opportunities to solve science problems by viewing information resources through a suite of menu options, developing a hypothesis…
NASA Technical Reports Server (NTRS)
Montgomery, Kevin; Bruyns, Cynthia D.
2002-01-01
We present schemes for real-time generalized interactions such as probing, piercing, cauterizing and ablating virtual tissues. These methods have been implemented in a robust, real-time (haptic rate) surgical simulation environment allowing us to model procedures including animal dissection, microsurgery, hysteroscopy, and cleft lip repair.
Photorealistic virtual anatomy based on Chinese Visible Human data.
Heng, P A; Zhang, S X; Xie, Y M; Wong, T T; Chui, Y P; Cheng, C Y
2006-04-01
Virtual reality based learning of human anatomy is feasible when a database of 3D organ models is available for the learner to explore, visualize, and dissect in virtual space interactively. In this article, we present our latest work on photorealistic virtual anatomy applications based on the Chinese Visible Human (CVH) data. We have focused on the development of state-of-the-art virtual environments that feature interactive photo-realistic visualization and dissection of virtual anatomical models constructed from ultra-high resolution CVH datasets. We also outline our latest progress in applying these highly accurate virtual and functional organ models to generate realistic look and feel to advanced surgical simulators. (c) 2006 Wiley-Liss, Inc.
Reinforce Networking Theory with OPNET Simulation
ERIC Educational Resources Information Center
Guo, Jinhua; Xiang, Weidong; Wang, Shengquan
2007-01-01
As networking systems have become more complex and expensive, hands-on experiments based on networking simulation have become essential for teaching the key computer networking topics to students. The simulation approach is the most cost effective and highly useful because it provides a virtual environment for an assortment of desirable features…
Developing Simulations in Multi-User Virtual Environments to Enhance Healthcare Education
ERIC Educational Resources Information Center
Rogers, Luke
2011-01-01
Computer-based clinical simulations are a powerful teaching and learning tool because of their ability to expand healthcare students' clinical experience by providing practice-based learning. Despite the benefits of traditional computer-based clinical simulations, there are significant issues that arise when incorporating them into a flexible,…
Web-Based Testing Tools for Electrical Engineering Courses
2001-09-01
ideas of distance learning are based on forming “virtual teams” [2]. Each team is equipped with the same software packages and share information via...using virtual laboratories where they can simulate a laboratory experience in a web-based environment. They can also control laboratory devices over...possible to create a set of virtual laboratories that allow students to interact with the learning material at the same time that the student is
Kuric, Katelyn M; Harris, Bryan T; Morton, Dean; Azevedo, Bruno; Lin, Wei-Shao
2017-09-29
This clinical report describes a digital workflow using extraoral digital photographs and volumetric datasets from cone beam computed tomography (CBCT) imaging to create a 3-dimensional (3D), virtual patient with photorealistic appearance. In a patient with microstomia, hinge axis approximation, diagnostic casts simulating postextraction alveolar ridge profile, and facial simulation of prosthetic treatment outcome were completed in a 3D, virtual environment. The approach facilitated the diagnosis, communication, and patient acceptance of the treatment of maxillary and mandibular computer-aided design and computer-aided manufacturing (CAD-CAM) of immediate dentures at increased occlusal vertical dimension. Copyright © 2017 Editorial Council for the Journal of Prosthetic Dentistry. Published by Elsevier Inc. All rights reserved.
The Virtual Environment for Reactor Applications (VERA): Design and architecture
DOE Office of Scientific and Technical Information (OSTI.GOV)
Turner, John A.; Clarno, Kevin; Sieger, Matt
VERA, the Virtual Environment for Reactor Applications, is the system of physics capabilities being developed and deployed by the Consortium for Advanced Simulation of Light Water Reactors (CASL), the first DOE Hub, which was established in July 2010 for the modeling and simulation of commercial nuclear reactors. VERA consists of integrating and interfacing software together with a suite of physics components adapted and/or refactored to simulate relevant physical phenomena in a coupled manner. VERA also includes the software development environment and computational infrastructure needed for these components to be effectively used. We describe the architecture of VERA from both a software and a numerical perspective, along with the goals and constraints that drove the major design decisions and their implications. As a result, we explain why VERA is an environment rather than a framework or toolkit, why these distinctions are relevant (particularly for coupled physics applications), and provide an overview of results that demonstrate the application of VERA tools for a variety of challenging problems within the nuclear industry.
The Virtual Environment for Reactor Applications (VERA): Design and architecture
Turner, John A.; Clarno, Kevin; Sieger, Matt; ...
2016-09-08
VERA, the Virtual Environment for Reactor Applications, is the system of physics capabilities being developed and deployed by the Consortium for Advanced Simulation of Light Water Reactors (CASL), the first DOE Hub, which was established in July 2010 for the modeling and simulation of commercial nuclear reactors. VERA consists of integrating and interfacing software together with a suite of physics components adapted and/or refactored to simulate relevant physical phenomena in a coupled manner. VERA also includes the software development environment and computational infrastructure needed for these components to be effectively used. We describe the architecture of VERA from both a software and a numerical perspective, along with the goals and constraints that drove the major design decisions and their implications. As a result, we explain why VERA is an environment rather than a framework or toolkit, why these distinctions are relevant (particularly for coupled physics applications), and provide an overview of results that demonstrate the application of VERA tools for a variety of challenging problems within the nuclear industry.
Identifying Anxiety Through Tracked Head Movements in a Virtual Classroom.
Won, Andrea Stevenson; Perone, Brian; Friend, Michelle; Bailenson, Jeremy N
2016-06-01
Virtual reality allows the controlled simulation of complex social settings, such as classrooms, and thus provides an opportunity to test a range of theories in the social sciences in a way that is both naturalistic and controlled. Importantly, virtual environments also allow the body movements of participants in the virtual world to be tracked and recorded. In the following article, we discuss how tracked head movements were correlated with participants' reports of anxiety in a simulation of a classroom. Participants who reported a high sense of awareness of and concern about the other virtual people in the room showed different patterns of head movement (more lateral head movement, indicating scanning behavior) from those who reported a low level of concern. We discuss the implications of this research for understanding nonverbal behavior associated with anxiety and for the design of online educational systems.
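The abstract does not state how "more lateral head movement" was quantified; purely as an illustrative sketch, two simple scanning proxies can be computed from a tracked yaw time series (the sampling rate and data layout here are assumptions, not the study's method):

```python
import numpy as np

def lateral_head_movement(yaw_deg, sample_rate_hz=60.0):
    """Simple scanning metrics from a time series of head yaw angles (degrees).

    Returns the yaw standard deviation and the total yaw path length per second,
    two plausible proxies for lateral 'scanning' behavior.
    """
    yaw = np.unwrap(np.radians(yaw_deg))            # avoid 359 -> 0 degree jumps
    duration_s = len(yaw) / sample_rate_hz
    yaw_sd = np.degrees(np.std(yaw))
    path_deg_per_s = np.degrees(np.abs(np.diff(yaw)).sum()) / duration_s
    return yaw_sd, path_deg_per_s
```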
Telearch - Integrated visual simulation environment for collaborative virtual archaeology.
NASA Astrophysics Data System (ADS)
Kurillo, Gregorij; Forte, Maurizio
Archaeologists collect vast amounts of digital data around the world; however, they lack tools for integration and collaborative interaction to support the reconstruction and interpretation process. The TeleArch software aims to integrate different data sources and provide real-time interaction tools for remote collaboration among geographically distributed scholars inside a shared virtual environment. The framework also includes audio, 2D and 3D video streaming technology to facilitate the remote presence of users. In this paper, we present several experimental case studies to demonstrate integration and interaction with 3D models and geographical information system (GIS) data in this collaborative environment.
Relating Narrative, Inquiry, and Inscriptions: Supporting Consequential Play
NASA Astrophysics Data System (ADS)
Barab, Sasha A.; Sadler, Troy D.; Heiselt, Conan; Hickey, Daniel; Zuiker, Steven
2007-02-01
In this paper we describe our research using a multi-user virtual environment, Quest Atlantis, to embed fourth grade students in an aquatic habitat simulation. Specifically targeted towards engaging students in a rich inquiry investigation, we layered a socio-scientific narrative and an interactive rule set into a multi-user virtual environment gaming engine to establish a virtual world through which students learned about science inquiry, water quality concepts, and the challenges in balancing scientific and socio-economic factors. Overall, students were clearly engaged, participated in rich scientific discourse, submitted quality work, and learned science content. Further, through participation in this narrative, students developed a rich perceptual, conceptual, and ethical understanding of science. This study suggests that multi-user virtual worlds can be effectively leveraged to support academic content learning.
Erratum to: Relating Narrative, Inquiry, and Inscriptions: Supporting Consequential Play
NASA Astrophysics Data System (ADS)
Barab, Sasha A.; Sadler, Troy D.; Heiselt, Conan; Hickey, Daniel; Zuiker, Steven
2010-08-01
In this paper we describe our research using a multi-user virtual environment, Quest Atlantis, to embed fourth grade students in an aquatic habitat simulation. Specifically targeted towards engaging students in a rich inquiry investigation, we layered a socio-scientific narrative and an interactive rule set into a multi-user virtual environment gaming engine to establish a virtual world through which students learned about science inquiry, water quality concepts, and the challenges in balancing scientific and socio-economic factors. Overall, students were clearly engaged, participated in rich scientific discourse, submitted quality work, and learned science content. Further, through participation in this narrative, students developed a rich perceptual, conceptual, and ethical understanding of science. This study suggests that multi-user virtual worlds can be effectively leveraged to support academic content learning.
Virtual Labs and Virtual Worlds
NASA Astrophysics Data System (ADS)
Boehler, Ted
2006-12-01
Coastline Community College has under development several virtual lab simulations and activities that range from biology, to language labs, to virtual discussion environments. Imagine a virtual world that students enter online, by logging onto their computer from home or anywhere they have web access. Upon entering this world they select a personalized identity represented by a digitized character (avatar) that can freely move about, interact with the environment, and communicate with other characters. In these virtual worlds, buildings, gathering places, conference rooms, labs, science rooms, and a variety of other “real world” elements are evident. When characters move about and encounter other people (players) they may freely communicate. They can examine things, manipulate objects, read signs, watch video clips, hear sounds, and jump to other locations. Goals of critical thinking, social interaction, peer collaboration, group support, and enhanced learning can be achieved in surprising new ways with this innovative approach to peer-to-peer communication in a virtual discussion world. In this presentation, short demos will be given of several online learning environments including a virtual biology lab, a marine science module, a Spanish lab, and a virtual discussion world. Coastline College has been a leader in the development of distance learning and media-based education for nearly 30 years and currently offers courses through PDA, Internet, DVD, CD-ROM, TV, and Videoconferencing technologies. Its distance learning program serves over 20,000 students every year.
Molecular Rift: Virtual Reality for Drug Designers.
Norrby, Magnus; Grebner, Christoph; Eriksson, Joakim; Boström, Jonas
2015-11-23
Recent advances in interaction design have created new ways to use computers. One example is the ability to create enhanced 3D environments that simulate physical presence in the real world--a virtual reality. This is relevant to drug discovery since molecular models are frequently used to obtain deeper understandings of, say, ligand-protein complexes. We have developed a tool (Molecular Rift), which creates a virtual reality environment steered with hand movements. Oculus Rift, a head-mounted display, is used to create the virtual settings. The program is controlled by gesture-recognition, using the gaming sensor MS Kinect v2, eliminating the need for standard input devices. The Open Babel toolkit was integrated to provide access to powerful cheminformatics functions. Molecular Rift was developed with a focus on usability, including iterative test-group evaluations. We conclude with reflections on virtual reality's future capabilities in chemistry and education. Molecular Rift is open source and can be downloaded from GitHub.
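The abstract names Open Babel as the integrated cheminformatics toolkit; as a hedged illustration of the kind of call such an integration might wrap (not Molecular Rift's actual code), its Python bindings can read a structure and generate 3D coordinates for display:

```python
# Minimal sketch of the sort of Open Babel functionality a viewer like this
# might expose; this is not Molecular Rift's actual integration code.
from openbabel import pybel   # Open Babel 3.x Python bindings

mol = pybel.readstring("smi", "CC(=O)Oc1ccccc1C(=O)O")  # aspirin, as an example
mol.addh()                    # add explicit hydrogens
mol.make3D()                  # generate 3D coordinates with a force field
mol.write("pdb", "aspirin.pdb", overwrite=True)         # export for a 3D viewer
```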
Predicting Innovation Acceptance by Simulation in Virtual Environments (Theoretical Foundations)
NASA Astrophysics Data System (ADS)
León, Noel; Duran, Roberto; Aguayo, Humberto; Flores, Myrna
This paper extends the current development of a methodology for Computer Aided Innovation. It begins with a presentation of concepts related to the perceived capabilities of virtual environments in the Innovation Cycle. The main premise is that it is possible to predict the acceptance of a new product in a specific market by releasing an early prototype in a virtual scenario to quantify its general reception and to receive early feedback from potential customers. The paper then focuses this research on a synergistic extension of techniques that have their origins in the optimization and innovation disciplines: TRIZ (Theory of Inventive Problem Solving) is extended with the generation of variants by Evolutionary Algorithms (EA), ultimately presenting the designer and the intended customer with creative and innovative alternatives, all developed on a virtual software interface (a virtual world). The work continues with a general description of the project as a step toward improving the overall strategy.
Vaccaro, Christine M; Crisp, Catrina C; Fellner, Angela N; Jackson, Christopher; Kleeman, Steven D; Pavelka, James
2013-01-01
The objective of this study was to compare the effect of virtual reality simulation training plus robotic orientation versus robotic orientation alone on performance of surgical tasks using an inanimate model. Surgical resident physicians were enrolled in this assessor-blinded randomized controlled trial. Residents were randomized to receive either (1) robotic virtual reality simulation training plus standard robotic orientation or (2) standard robotic orientation alone. Performance of surgical tasks was assessed at baseline and after the intervention. Nine of 33 modules from the da Vinci Skills Simulator were chosen. Experts in robotic surgery evaluated each resident's videotaped performance of the inanimate model using the Global Rating Scale (GRS) and Objective Structured Assessment of Technical Skills-modified for robotic-assisted surgery (rOSATS). Nine resident physicians were enrolled in the simulation group and 9 in the control group. As a whole, participants improved their total time, time to incision, and suture time from baseline to repeat testing on the inanimate model (P = 0.001, 0.003, <0.001, respectively). Both groups improved their GRS and rOSATS scores significantly (both P < 0.001); however, the GRS overall pass rate was higher in the simulation group compared with the control group (89% vs 44%, P = 0.066). Standard robotic orientation and/or robotic virtual reality simulation improve surgical skills on an inanimate model, although this may be a function of the initial "practice" on the inanimate model and repeat testing of a known task. However, robotic virtual reality simulation training increases GRS pass rates consistent with improved robotic technical skills learned in a virtual reality environment.
Responses to a virtual reality grocery store in persons with and without vestibular dysfunction.
Whitney, Susan L; Sparto, Patrick J; Hodges, Larry F; Babu, Sabarish V; Furman, Joseph M; Redfern, Mark S
2006-04-01
People with vestibular dysfunction often complain of having difficulty walking in visually complex environments. Virtual reality (VR) may serve as a useful therapeutic tool for providing physical therapy to these people. The purpose of this pilot project was to explore the ability of people with and without vestibular dysfunction to use and tolerate virtual environments that can be used in physical therapy. We have chosen grocery store environments, which often elicit complaints from patients. Two patients and three control subjects were asked to stand and navigate in VR grocery stores while finding products. Perceived discomfort, simulator sickness symptoms, distance traveled, and speed of head movement were recorded. Symptoms and discomfort increased in one subject with vestibular dysfunction. The older subjects traveled a shorter distance and had greater speed of head movements compared with young subjects. Environments with a greater number of products resulted in more head movements and a shorter distance traveled.
THE VIRTUAL INSTRUMENT: SUPPORT FOR GRID-ENABLED MCELL SIMULATIONS
Casanova, Henri; Berman, Francine; Bartol, Thomas; Gokcay, Erhan; Sejnowski, Terry; Birnbaum, Adam; Dongarra, Jack; Miller, Michelle; Ellisman, Mark; Faerman, Marcio; Obertelli, Graziano; Wolski, Rich; Pomerantz, Stuart; Stiles, Joel
2010-01-01
Ensembles of widely distributed, heterogeneous resources, or Grids, have emerged as popular platforms for large-scale scientific applications. In this paper we present the Virtual Instrument project, which provides an integrated application execution environment that enables end-users to run and interact with running scientific simulations on Grids. This work is performed in the specific context of MCell, a computational biology application. While MCell provides the basis for running simulations, its capabilities are currently limited in terms of scale, ease-of-use, and interactivity. These limitations preclude usage scenarios that are critical for scientific advances. Our goal is to create a scientific “Virtual Instrument” from MCell by allowing its users to transparently access Grid resources while being able to steer running simulations. In this paper, we motivate the Virtual Instrument project and discuss a number of relevant issues and accomplishments in the area of Grid software development and application scheduling. We then describe our software design and report on the current implementation. We verify and evaluate our design via experiments with MCell on a real-world Grid testbed. PMID:20689618
Ambient clumsiness in virtual environments
NASA Astrophysics Data System (ADS)
Ruzanka, Silvia; Behar, Katherine
2010-01-01
A fundamental pursuit of Virtual Reality is the experience of a seamless connection between the user's body and actions within the simulation. Virtual worlds often mediate the relationship between the physical and virtual body through creating an idealized representation of the self in an idealized space. This paper argues that the very ubiquity of the medium of virtual environments, such as the massively popular Second Life, has now made them mundane, and that idealized representations are no longer appropriate. In our artwork we introduce the attribute of clumsiness to Second Life by creating and distributing scripts that cause users' avatars to exhibit unpredictable stumbling, tripping, and momentary poor coordination, thus subtly and unexpectedly intervening with, rather than amplifying, a user's intent. These behaviors are publicly distributed, and manifest only occasionally - rather than intentional, conscious actions, they are involuntary and ambient. We suggest that the physical human body is itself an imperfect interface, and that the continued blurring of distinctions between the physical body and virtual representations calls for the introduction of these mundane, clumsy elements.
Interaction Design and Usability of Learning Spaces in 3D Multi-user Virtual Worlds
NASA Astrophysics Data System (ADS)
Minocha, Shailey; Reeves, Ahmad John
Three-dimensional virtual worlds are multimedia, simulated environments, often managed over the Web, which users can 'inhabit' and interact via their own graphical self-representations known as 'avatars'. 3D virtual worlds are being used in many applications: education/training, gaming, social networking, marketing and commerce. Second Life is the most widely used 3D virtual world in education. However, problems associated with usability, navigation and wayfinding in 3D virtual worlds may impact on student learning and engagement. Based on empirical investigations of learning spaces in Second Life, this paper presents design guidelines to improve the usability and ease of navigation in 3D spaces. Methods of data collection include semi-structured interviews with Second Life students, educators and designers. The findings have revealed that design principles from the fields of urban planning, Human-Computer Interaction, Web usability, geography and psychology can influence the design of spaces in 3D multi-user virtual environments.
Creating virtual humans for simulation-based training and planning
DOE Office of Scientific and Technical Information (OSTI.GOV)
Stansfield, S.; Sobel, A.
1998-05-12
Sandia National Laboratories has developed a distributed, high fidelity simulation system for training and planning small team operations. The system provides an immersive environment populated by virtual objects and humans capable of displaying complex behaviors. The work has focused on developing the behaviors required to carry out complex tasks and decision making under stress. Central to this work are techniques for creating behaviors for virtual humans and for dynamically assigning behaviors to CGF to allow scenarios without fixed outcomes. Two prototype systems have been developed that illustrate these capabilities: MediSim, a trainer for battlefield medics, and VRaptor, a system for planning, rehearsing and training assault operations.
Designing a Virtual Social Space for Language Acquisition
ERIC Educational Resources Information Center
Woolson, Maria Alessandra
2012-01-01
Middleverse de Español (MdE) is an evolving platform for foreign language (FL) study, aligned to the goals of ACTFL's National Standards and 2007 MLA report. The project simulates an immersive environment in a virtual 3-D space for the acquisition of translingual and transcultural competence in Spanish meant to support content-based and…
ERIC Educational Resources Information Center
Blankenship, Rebecca J.
2010-01-01
The purpose of this study was to investigate the potential use of Second Life (Linden Labs, 2004) and Skype (Skype Limited, 2009) as simulated virtual professional development tools for pre-service teachers seeking endorsement in teaching English as a Second Official Language (ESOL). Second Life is an avatar-based Internet program that allows…
Designing 3 Dimensional Virtual Reality Using Panoramic Image
NASA Astrophysics Data System (ADS)
Wan Abd Arif, Wan Norazlinawati; Wan Ahmad, Wan Fatimah; Nordin, Shahrina Md.; Abdullah, Azrai; Sivapalan, Subarna
The demand to improve the quality of presentation in the knowledge-sharing field is driven by the need to keep pace with rapidly growing technology. The need to develop technology-based learning and training led to the idea of building an Oil and Gas Plant Virtual Environment (OGPVE) for the benefit of our future. A panoramic virtual reality learning environment can help educators overcome the limitations of traditional technical writing lessons. Virtual reality helps users understand better by simulating real-world and hard-to-reach environments with a high degree of realism and interactivity. To create courseware that achieves this objective, accurate images of the intended scenarios must be acquired. The panorama presents the OGPVE and helps users form ideas about what they have learnt. This paper discusses part of the development of the panoramic virtual reality. The important phases in developing a successful panoramic image are image acquisition and image stitching, or mosaicing. The combination of wide field-of-view (FOV) and close-up images used in this panoramic development is also discussed.
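The paper does not specify the stitching software; as an illustration only of the stitching/mosaicing phase it describes, a panorama can be assembled with OpenCV's high-level Stitcher API (the file names below are placeholders, not OGPVE assets):

```python
# Illustrative only: panorama stitching with OpenCV's high-level Stitcher API.
# The input file names are placeholders, not assets from the OGPVE project.
import cv2

paths = ["plant_view_1.jpg", "plant_view_2.jpg", "plant_view_3.jpg"]
images = [cv2.imread(p) for p in paths]

stitcher = cv2.Stitcher_create()          # default mode stitches panoramas
status, panorama = stitcher.stitch(images)

if status == cv2.Stitcher_OK:
    cv2.imwrite("plant_panorama.jpg", panorama)
else:
    print(f"Stitching failed with status code {status}")
```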
Virtual reality based surgery simulation for endoscopic gynaecology.
Székely, G; Bajka, M; Brechbühler, C; Dual, J; Enzler, R; Haller, U; Hug, J; Hutter, R; Ironmonger, N; Kauer, M; Meier, V; Niederer, P; Rhomberg, A; Schmid, P; Schweitzer, G; Thaler, M; Vuskovic, V; Tröster, G
1999-01-01
Virtual reality (VR) based surgical simulator systems offer very elegant possibilities to both enrich and enhance traditional education in endoscopic surgery. However, while a wide range of VR simulator systems have been proposed and realized in the past few years, most of these systems are far from able to provide a reasonably realistic surgical environment. We explore the basic approaches to the current limits of realism and ultimately seek to extend these based on our description and analysis of the most important components of a VR-based endoscopic simulator. The feasibility of the proposed techniques is demonstrated on a first modular prototype system implementing the basic algorithms for VR-training in gynaecologic laparoscopy.
Multi-degree of freedom joystick for virtual reality simulation.
Head, M J; Nelson, C A; Siu, K C
2013-11-01
A modular control interface and simulated virtual reality environment were designed and created in order to determine how the kinematic architecture of a control interface affects minimally invasive surgery training. A user is able to selectively determine the kinematic configuration of an input device (number, type and location of degrees of freedom) for a specific surgical simulation through the use of modular joints and constraint components. Furthermore, passive locking was designed and implemented through the use of inflated latex tubing around rotational joints in order to allow a user to step away from a simulation without unwanted tool motion. It is believed that these features will facilitate improved simulation of a variety of surgical procedures and, thus, improve surgical skills training.
ERIC Educational Resources Information Center
Voelkel, Robert H.; Johnson, Christie W.; Gilbert, Kristen A.
2016-01-01
The purpose of this article is to present how one university incorporates immersive simulations through platforms which employ avatars to enhance graduate student understanding and learning in educational leadership programs. While using simulations and immersive virtual environments continues to grow, the literature suggests limited evidence of…
Learning English with "The Sims": Exploiting Authentic Computer Simulation Games for L2 Learning
ERIC Educational Resources Information Center
Ranalli, Jim
2008-01-01
With their realistic animation, complex scenarios and impressive interactivity, computer simulation games might be able to provide context-rich, cognitively engaging virtual environments for language learning. However, simulation games designed for L2 learners are in short supply. As an alternative, could games designed for the mass-market be…
Validated robotic laparoscopic surgical training in a virtual-reality environment.
Katsavelis, Dimitrios; Siu, Ka-Chun; Brown-Clerk, Bernadette; Lee, Irene H; Lee, Yong Kwon; Oleynikov, Dmitry; Stergiou, Nick
2009-01-01
A robotic virtual-reality (VR) simulator has been developed to improve robot-assisted training for laparoscopic surgery and to enhance surgical performance in laparoscopic skills. The simulated VR training environment provides an effective approach to evaluate and improve surgical performance. This study presents our findings of the VR training environment for robotic laparoscopy. Eight volunteers performed two inanimate tasks in both the VR and the actual training environment. The tasks were bimanual carrying (BC) and needle passing (NP). For the BC task, the volunteers simultaneously transferred two plastic pieces in opposite directions five times consecutively. The same volunteers passed a surgical needle through six pairs of holes in the NP task. Both tasks require significant bimanual coordination that mimics actual laparoscopic skills. Data analysis included time to task completion, speed and distance traveled of the instrument tip, as well as range of motion of the subject's wrist and elbow of the right arm. Electromyography of the right wrist flexor and extensor were also analyzed. Paired t-tests and Pearson's r were used to explore the differences and correlations between the two environments. There were no significant differences between the actual and the simulated VR environment with respect to the BC task, while there were significant differences in almost all dependent parameters for the NP task. Moderate to high correlations for most dependent parameters were revealed for both tasks. Our data shows that the VR environment adequately simulated the BC task. The significant differences found for the NP task may be attributed to an oversimplification in the VR environment. However, they do point to the need for improvements in the complexity of our VR simulation. Further research work is needed to develop effective and reliable VR environments for robotic laparoscopic training.
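The abstract names the statistics but not the analysis software; as a generic illustration (with hypothetical completion times, not the study's data), paired comparisons and correlations of this kind can be computed with SciPy:

```python
# Generic illustration of the statistics named above (paired t-test, Pearson's r);
# the data are hypothetical task-completion times, not the study's measurements.
import numpy as np
from scipy import stats

real_env = np.array([42.1, 38.5, 55.0, 47.3, 40.8, 51.2, 44.6, 39.9])     # seconds
virtual_env = np.array([45.0, 37.2, 58.4, 49.1, 43.3, 50.7, 46.2, 41.5])

t_stat, p_value = stats.ttest_rel(real_env, virtual_env)   # paired t-test
r, r_p = stats.pearsonr(real_env, virtual_env)             # correlation across environments

print(f"paired t-test: t = {t_stat:.2f}, p = {p_value:.3f}")
print(f"Pearson's r = {r:.2f} (p = {r_p:.3f})")
```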
Rocinante, a virtual collaborative visualizer
DOE Office of Scientific and Technical Information (OSTI.GOV)
McDonald, M.J.; Ice, L.G.
1996-12-31
With the goal of improving the ability of people around the world to share the development and use of intelligent systems, Sandia National Laboratories' Intelligent Systems and Robotics Center is developing new Virtual Collaborative Engineering (VCE) and Virtual Collaborative Control (VCC) technologies. A key area of VCE and VCC research is in shared visualization of virtual environments. This paper describes a Virtual Collaborative Visualizer (VCV), named Rocinante, that Sandia developed for VCE and VCC applications. Rocinante allows multiple participants to simultaneously view dynamic geometrically-defined environments. Each viewer can exclude extraneous detail or include additional information in the scene as desired. Shared information can be saved and later replayed in a stand-alone mode. Rocinante automatically scales visualization requirements with computer system capabilities. Models with 30,000 polygons and 4 Megabytes of texture display at 12 to 15 frames per second (fps) on an SGI Onyx and at 3 to 8 fps (without texture) on Indigo 2 Extreme computers. In its networked mode, Rocinante synchronizes its local geometric model with remote simulators and sensory systems by monitoring data transmitted through UDP packets. Rocinante's scalability and performance make it an ideal VCC tool. Users throughout the country can monitor robot motions and the thinking behind their motion planners and simulators.
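Rocinante's wire format is not described in the abstract; the sketch below only illustrates the general pattern of synchronizing a local model from UDP pose updates, with the port number and payload layout (an object id plus seven doubles) chosen arbitrarily:

```python
# Generic sketch of UDP-based state synchronization; the port and the packet
# layout (uint32 object id + 7 doubles: x, y, z, qx, qy, qz, qw) are assumptions,
# not Rocinante's actual wire format.
import socket
import struct

POSE_FORMAT = "!I7d"                        # network byte order
POSE_SIZE = struct.calcsize(POSE_FORMAT)

def listen_for_poses(scene, port=9000):
    """Update `scene` (a dict of object id -> pose tuple) from incoming packets."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.bind(("", port))
    while True:
        data, _addr = sock.recvfrom(1024)
        if len(data) < POSE_SIZE:
            continue                        # ignore malformed packets
        obj_id, *pose = struct.unpack(POSE_FORMAT, data[:POSE_SIZE])
        scene[obj_id] = tuple(pose)         # position (x, y, z) + quaternion
```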
A Planetarium Inside Your Office: Virtual Reality in the Dome Production Pipeline
NASA Astrophysics Data System (ADS)
Summers, Frank
2018-01-01
Producing astronomy visualization sequences for a planetarium without ready access to a dome is a distorted geometric challenge. Fortunately, one can now use virtual reality (VR) to simulate a dome environment without ever leaving one's office chair. The VR dome experience has proven to be a more than suitable pre-visualization method that requires only modest amounts of processing beyond the standard production pipeline. It also provides a crucial testbed for identifying, testing, and fixing the visual constraints and artifacts that arise in a spherical presentation environment. Topics addressed here include rendering, geometric projection, movie encoding, software playback, and hardware setup for a virtual dome using VR headsets.
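As a hedged illustration of the geometric projection step mentioned above, and assuming a standard angular-fisheye (dome master) frame for a 180-degree dome with the zenith at the image center, a view direction maps to image coordinates as follows; this is a generic mapping, not the production pipeline described in the abstract:

```python
# Illustrative angular-fisheye ("dome master") mapping, assuming a 180-degree
# hemispherical dome with +z pointing at the zenith.
import math

def direction_to_domemaster(x, y, z):
    """Map a 3D view direction to (u, v) in [0, 1]^2 of a dome master frame.

    Returns None for directions below the dome's horizon (z < 0).
    """
    norm = math.sqrt(x * x + y * y + z * z)
    if norm == 0 or z < 0:
        return None
    theta = math.acos(z / norm)          # angle from the zenith, 0..pi/2
    phi = math.atan2(y, x)               # azimuth around the dome
    r = theta / (math.pi / 2)            # radial distance: 0 at center, 1 at horizon
    u = 0.5 + 0.5 * r * math.cos(phi)
    v = 0.5 + 0.5 * r * math.sin(phi)
    return u, v
```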
Augmenting breath regulation using a mobile driven virtual reality therapy framework.
Abushakra, Ahmad; Faezipour, Miad
2014-05-01
This paper presents a conceptual framework of a virtual reality therapy to assist individuals, especially lung cancer patients or those with breathing disorders to regulate their breath through real-time analysis of respiration movements using a smartphone. Virtual reality technology is an attractive means for medical simulations and treatment, particularly for patients with cancer. The theories, methodologies and approaches, and real-world dynamic contents for all the components of this virtual reality therapy (VRT) via a conceptual framework using the smartphone will be discussed. The architecture and technical aspects of the offshore platform of the virtual environment will also be presented.
NASA Technical Reports Server (NTRS)
Ishii, Masahiro; Sukanya, P.; Sato, Makoto
1994-01-01
This paper describes the construction of a virtual work space for tasks performed by two handed manipulation. We intend to provide a virtual environment that encourages users to accomplish tasks as they usually act in a real environment. Our approach uses a three dimensional spatial interface device that allows the user to handle virtual objects by hand and be able to feel some physical properties such as contact, weight, etc. We investigated suitable conditions for constructing our virtual work space by simulating some basic assembly work, a face and fit task. We then selected the conditions under which the subjects felt most comfortable in performing this task and set up our virtual work space. Finally, we verified the possibility of performing more complex tasks in this virtual work space by providing simple virtual models and then let the subjects create new models by assembling these components. The subjects can naturally perform assembly operations and accomplish the task. Our evaluation shows that this virtual work space has the potential to be used for performing tasks that require two-handed manipulation or cooperation between both hands in a natural manner.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Shawver, D.M.; Stansfield, S.
This overview presents current research at Sandia National Laboratories in the Virtual Reality and Intelligent Simulation Lab. Into an existing distributed VR environment which we have been developing, and which provides shared immersion for multiple users, we are adding virtual actor support. The virtual actor support we are adding to this environment is intended to provide semi-autonomous actors, with oversight and high-level guiding control by a director/user, and to allow the overall action to be driven by a scenario. We present an overview of the environment into which our virtual actors will be added in Section 3, and discuss the direction of the Virtual Actor research itself in Section 4. We will briefly review related work in Section 2. First, however, we need to place the research in the context of what motivates it. The motivation for our construction of this environment, and the line of research associated with it, is based on a long-term program of providing support, through simulation, for situational training, by which we mean a type of training in which students learn to handle multiple situations or scenarios. In these situations, the student may encounter events ranging from the routine occurrence to the rare emergency. Indeed, the appeal of such training systems is that they could allow the student to experience and develop effective responses for situations they would otherwise have no opportunity to practice until they happened to encounter an actual occurrence. Examples of the type of students for this kind of training would be security forces or emergency response forces. An example of the type of training scenario we would like to support is given in Section 4.2.
Virtual environment assessment for laser-based vision surface profiling
NASA Astrophysics Data System (ADS)
ElSoussi, Adnane; Al Alami, Abed ElRahman; Abu-Nabah, Bassam A.
2015-03-01
Oil and gas businesses have been raising the demand on original equipment manufacturers (OEMs) to implement reliable metrology methods for assessing the surface profiles of welds before and after grinding. This mandates a deviation from the commonly used surface measurement gauges, which are not only operator dependent but also limited to discrete measurements along the weld. Due to its potential accuracy and speed, the use of laser-based vision surface profiling systems has been progressively rising as part of manufacturing quality control. This effort presents a virtual environment that lends itself to developing and evaluating existing laser vision sensor (LVS) calibration and measurement techniques. A combination of two known calibration techniques is implemented to deliver a calibrated LVS system. System calibration is implemented virtually and experimentally to scan simulated and 3D-printed features of known profiles, respectively. The scanned data are inverted and compared with the input profiles to validate the virtual environment's capability for LVS surface profiling and to preliminarily assess the measurement technique for weld profiling applications. Moreover, this effort brings 3D scanning capability a step closer toward robust quality control applications in a manufacturing environment.
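The abstract does not say how scanned and reference profiles are compared; one plausible validation metric, shown here only as a sketch, is to resample the scan onto the reference abscissa and report the mean and RMS deviation:

```python
# Illustrative profile comparison (not the paper's validation procedure):
# resample the scanned profile onto the reference positions and report
# the mean and RMS deviation between the two height profiles.
import numpy as np

def profile_deviation(ref_x, ref_z, scan_x, scan_z):
    """ref/scan: positions along the weld (mm) and surface heights (mm).

    `scan_x` must be monotonically increasing for np.interp to be valid.
    """
    scan_on_ref = np.interp(ref_x, scan_x, scan_z)      # resample scan at reference x
    error = scan_on_ref - ref_z
    return error.mean(), np.sqrt((error ** 2).mean())   # bias, RMS deviation
```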
Using Five Stage Model to Design of Collaborative Learning Environments in Second Life
ERIC Educational Resources Information Center
Orhan, Sevil; Karaman, M. Kemal
2014-01-01
Specifically Second Life (SL) among virtual worlds draws attention of researchers to form collaborative learning environments (Sutcliffe & Alrayes, 2012) since it could be used as a rich platform to simulate a real environment containing many collaborative learning characteristics and interaction tools within itself. Five Stage Model (FSM)…
Tambone, V; Alessi, A; Macchi, I; Milighetti, S; Muzii, L
2009-01-01
The main difference between a virtual reality and a generic representation is that the user is directly involved in the action being performed. As a matter of fact, in the shift from the real to the virtual world, our biological physique does not change but is amplified and connected to the virtual world by technological interfaces. Training using a virtual reality simulator is an option to supplement (or replace) standard training. The first of the two main goals of our study is to test how familiar students enrolled in the Faculty of Medicine at the "University Campus Bio-Medico of Rome" are with synthetic worlds, how long they have been using them and what they would like their Avatar to look like. The second aim is to collect students' opinions about the use of virtual, interactive environments to enable learning and participation in dynamic, problem-based, clinical, virtual simulations. Simulations might be used to allow learners to make mistakes safely in lieu of real-life situations, learn from those mistakes and ultimately improve performance by subsequently avoiding those mistakes. The selected approach to the study is based on a semi-structured questionnaire of 14 questions administered to all the medical students. Most of the students appear not to be very confident with virtual worlds, mostly because of a lack of interest. However, a large majority of them are likely to use a virtual world for fun or to escape from reality. Students would select and customize their Avatar by giving her/him the same sexual identity, figure and social class as their own, but a different occupation. It is important to note that a wide majority of the students are interested in practicing in a virtual world in order to manage new experiences and be able to face them; they are willing to benefit from the ability to make mistakes in a safe environment and expect a positive impact on their understanding.
DOE Office of Scientific and Technical Information (OSTI.GOV)
McCorkle, D.; Yang, C.; Jordan, T.
2007-06-01
Modeling and simulation tools are becoming pervasive in the process engineering practice of designing advanced power generation facilities. These tools enable engineers to explore many what-if scenarios before cutting metal or constructing a pilot scale facility. While such tools enable investigation of crucial plant design aspects, typical commercial process simulation tools such as Aspen Plus®, gPROMS®, and HYSYS® still do not explore some plant design information, including computational fluid dynamics (CFD) models for complex thermal and fluid flow phenomena, economics models for policy decisions, operational data after the plant is constructed, and as-built information for use in as-designed models. Software tools must be created that allow disparate sources of information to be integrated if environments are to be constructed where process simulation information can be accessed. At the Department of Energy’s (DOE) National Energy Technology Laboratory (NETL), the Advanced Process Engineering Co-Simulator (APECS) has been developed as an integrated software suite that combines process simulation (e.g., Aspen Plus) and high-fidelity equipment simulation (e.g., Fluent® CFD), together with advanced analysis capabilities including case studies, sensitivity analysis, stochastic simulation for risk/uncertainty analysis, and multi-objective optimization. In this paper, we discuss the initial phases of integrating APECS with the immersive and interactive virtual engineering software, VE-Suite, developed at Iowa State University and Ames Laboratory. VE-Suite utilizes the ActiveX (OLE Automation) controls in Aspen Plus wrapped by the CASI library developed by Reaction Engineering International to run the process simulation and query for unit operation results. This integration permits any application that uses the VE-Open interface to integrate with APECS co-simulations, enabling construction of the comprehensive virtual engineering environment needed for the rapid engineering of advanced power generation facilities.
Physically Based Virtual Surgery Planning and Simulation Tools for Personal Health Care Systems
NASA Astrophysics Data System (ADS)
Dogan, Firat; Atilgan, Yasemin
Virtual surgery planning and simulation tools have gained a great deal of importance in the last decade as a consequence of increasing capacities in information technology. Modern hardware architectures, large-scale database systems, grid-based computer networks, agile development processes, better 3D visualization and the other strengths of information technology bring the necessary instruments to almost every desk. The special software and sophisticated supercomputer environments of the last decade now serve individual needs inside "tiny smart boxes" at reasonable prices. However, resistance to learning new computerized environments, insufficient training and other old habits prevent effective utilization of IT resources by specialists in the health sector. In this paper, aspects of former and current developments in surgery planning and simulation tools are presented, and future directions and expectations for better electronic health care systems are investigated.
WeaVR: a self-contained and wearable immersive virtual environment simulation system.
Hodgson, Eric; Bachmann, Eric R; Vincent, David; Zmuda, Michael; Waller, David; Calusdian, James
2015-03-01
We describe WeaVR, a computer simulation system that takes virtual reality technology beyond specialized laboratories and research sites and makes it available in any open space, such as a gymnasium or a public park. Novel hardware and software systems enable HMD-based immersive virtual reality simulations to be conducted in any arbitrary location, with no external infrastructure and little-to-no setup or site preparation. The ability of the WeaVR system to provide realistic motion-tracked navigation for users, to improve the study of large-scale navigation, and to generate usable behavioral data is shown in three demonstrations. First, participants navigated through a full-scale virtual grocery store while physically situated in an open grass field. Trajectory data are presented for both normal tracking and for tracking during the use of redirected walking that constrained users to a predefined area. Second, users followed a straight path within a virtual world for distances of up to 2 km while walking naturally and being redirected to stay within the field, demonstrating the ability of the system to study large-scale navigation by simulating virtual worlds that are potentially unlimited in extent. Finally, the portability and pedagogical implications of this system were demonstrated by taking it to a regional high school for live use by a computer science class on their own school campus.
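WeaVR's redirection algorithm is not given in the abstract; the sketch below illustrates only a generic redirected-walking idea (injecting a small curvature rotation while the user walks so their physical path bends back toward the tracking-area center); the gain value and steering rule are assumptions:

```python
# Generic redirected-walking sketch (not WeaVR's algorithm): while the user walks,
# add a small extra yaw to the rendered world so their physical path curves back
# toward the center of the tracked area. The curvature gain is an assumed value.
import math

CURVATURE_GAIN_RAD_PER_M = 0.13   # assumed; roughly a 7.5 m turning radius

def redirect_yaw_offset(step_length_m, heading_rad, user_pos, area_center):
    """Extra virtual yaw (radians) injected for one walking step.

    The sign is chosen so that the user's compensating physical turn bends
    their path toward `area_center`.
    """
    to_center = (area_center[0] - user_pos[0], area_center[1] - user_pos[1])
    heading = (math.cos(heading_rad), math.sin(heading_rad))
    # The sign of the cross product tells us whether the center lies to the
    # left or right of the current walking direction.
    side = heading[0] * to_center[1] - heading[1] * to_center[0]
    direction = 1.0 if side >= 0 else -1.0
    strength = min(1.0, math.hypot(*to_center) / 5.0)   # assumed 5 m falloff
    return direction * CURVATURE_GAIN_RAD_PER_M * step_length_m * strength
```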
The benefits of virtual reality simulator training for laparoscopic surgery.
Hart, Roger; Karthigasu, Krishnan
2007-08-01
Virtual reality is a computer-generated system that provides a representation of an environment. This review analyses the literature with regard to any benefit to be derived from training with virtual reality equipment and describes the current equipment available. Virtual reality systems do not currently provide a realistic representation of the live operating environment because they lack tactile sensation and do not represent a complete operation. The literature suggests that virtual reality training is a valuable learning tool for gynaecologists in training, particularly those in the early stages of their careers. Furthermore, it may be of benefit for the ongoing audit of surgical skills and for the early identification of a surgeon's deficiencies before operative incidents occur. It is only a matter of time before realistic virtual reality models of most complete gynaecological operations are available, with improved haptics as a result of improved computer technology. It is inevitable that in the modern climate of litigation virtual reality training will become an essential part of clinical training, as evidence for its effectiveness as a training tool exists, and in many countries training by operating on live animals is not possible.
Driving performance in a power wheelchair simulator.
Archambault, Philippe S; Tremblay, Stéphanie; Cachecho, Sarah; Routhier, François; Boissy, Patrick
2012-05-01
A power wheelchair simulator can allow users to safely experience various driving tasks. For such training to be efficient, it is important that driving performance be equivalent to that in a real wheelchair. This study aimed at comparing driving performance in a real and in a simulated environment. Two groups of healthy young adults performed different driving tasks, either in a real power wheelchair or in a simulator. Smoothness of joystick control as well as the time necessary to complete each task were recorded and compared between the two groups. Driving strategies were analysed from video recordings. The sense of presence, of really being in the virtual environment, was assessed through a questionnaire. Smoothness of joystick control was the same in the real and virtual groups. Task completion time was higher in the simulator for the more difficult tasks. Both groups showed similar strategies and difficulties. The simulator generated a good sense of presence, which is important for motivation. Performance was very similar for power wheelchair driving in the simulator or in real life. Thus, the simulator could potentially be used to complement training of individuals who require a power wheelchair and use a regular joystick.
Evaluation of a low-cost 3D sound system for immersive virtual reality training systems.
Doerr, Kai-Uwe; Rademacher, Holger; Huesgen, Silke; Kubbat, Wolfgang
2007-01-01
Since Head Mounted Displays (HMD), datagloves, tracking systems, and powerful computer graphics resources are nowadays in an affordable price range, the usage of PC-based "Virtual Training Systems" becomes very attractive. However, due to the limited field of view of HMD devices, additional modalities have to be provided to benefit from 3D environments. A 3D sound simulation can improve the capabilities of VR systems dramatically. Unfortunately, realistic 3D sound simulations are expensive and demand a tremendous amount of computational power to calculate reverberation, occlusion, and obstruction effects. To use 3D sound in a PC-based training system as a way to direct and guide trainees to observe specific events in 3D space, a cheaper alternative has to be provided, so that a broader range of applications can take advantage of this modality. To address this issue, we focus in this paper on the evaluation of a low-cost 3D sound simulation that is capable of providing traceable 3D sound events. We describe our experimental system setup using conventional stereo headsets in combination with a tracked HMD device and present our results with regard to precision, speed, and used signal types for localizing simulated sound events in a virtual training environment.
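The evaluated low-cost simulation is not specified in detail in the abstract; purely as an illustration of inexpensive headphone spatialization, constant-power panning plus Woodworth's spherical-head interaural time delay can be computed for a source azimuth (the head radius and the overall approach are assumptions, not the system under test):

```python
# Low-cost headphone spatialization sketch (not the system evaluated above):
# constant-power stereo panning plus Woodworth's spherical-head ITD estimate.
import math

SPEED_OF_SOUND_M_S = 343.0
HEAD_RADIUS_M = 0.0875          # assumed average head radius

def simple_spatial_cues(azimuth_rad):
    """Return (gain_left, gain_right, itd_seconds) for a source azimuth.

    Azimuth 0 is straight ahead; positive values are to the listener's right.
    A positive ITD means the right ear leads (the left channel is delayed).
    """
    az = max(-math.pi / 2, min(math.pi / 2, azimuth_rad))    # clamp to frontal arc
    pan = (az + math.pi / 2) / math.pi                       # 0 = hard left, 1 = hard right
    gain_left = math.cos(pan * math.pi / 2)                  # constant-power pan law
    gain_right = math.sin(pan * math.pi / 2)
    itd = (HEAD_RADIUS_M / SPEED_OF_SOUND_M_S) * (az + math.sin(az))  # Woodworth
    return gain_left, gain_right, itd
```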
[Simulation in health to improve the delivery of care].
Tesnière, Antoine; Fleury, Cynthia
2017-11-01
Simulation in health care is a very effective training tool. Using mannequins, 'standardised patients' or virtual care environments, it encourages participants to reflect on nursing practices while practising in a safe and controlled space. Copyright © 2017 Elsevier Masson SAS. All rights reserved.
Ali, Saad; Qandeel, Monther; Ramakrishna, Rishi; Yang, Carina W
2018-02-01
Fluoroscopy-guided lumbar puncture (FGLP) is a basic procedural component of radiology residency and neuroradiology fellowship training. Performance of the procedure with limited experience is associated with increased patient discomfort as well as increased radiation dose, puncture attempts, and complication rate. Simulation in health care is a developing field that has potential for enhancing procedural training. We demonstrate the design and utility of a virtual reality simulator for performing FGLP. An FGLP module was developed on an ImmersiveTouch platform, which digitally reproduces the procedural environment with a hologram-like projection. From computed tomography datasets of healthy adult spines, we constructed a 3-D model of the lumbar spine and overlying soft tissues. We assigned different physical characteristics to each tissue type, which the user can experience through haptic feedback while advancing a virtual spinal needle. Virtual fluoroscopy as well as 3-D images can be obtained for procedural planning and guidance. The number of puncture attempts, the distance to the target, the number of fluoroscopic shots, and the approximate radiation dose can be calculated. Preliminary data from users who participated in the simulation were obtained in a postsimulation survey. All users found the simulation to be a realistic replication of the anatomy and procedure and would recommend it to a colleague. On a scale of 1-5 (lowest to highest) rating the virtual simulator training overall, the mean score was 4.3 (range 3-5). We describe the design of a virtual reality simulator for performing FGLP and present the initial experience with this new technique. Copyright © 2018 The Association of University Radiologists. Published by Elsevier Inc. All rights reserved.
A Framework for Analyzing the Whole Body Surface Area from a Single View
Doretto, Gianfranco; Adjeroh, Donald
2017-01-01
We present a virtual reality (VR) framework for the analysis of whole human body surface area. Usual methods for determining the whole body surface area (WBSA) are based on well-known formulae, which are characterized by large errors when the subject is obese or belongs to certain subgroups. For these situations, we believe that a computer vision approach can overcome these problems and provide a better estimate of this important body indicator. Unfortunately, using machine learning techniques to design a computer vision system able to provide a new body indicator that goes beyond the use of only body weight and height entails a long and expensive data acquisition process. A more viable solution is to use a dataset composed of virtual subjects. Generating a virtual dataset allowed us to build a population with different characteristics (obese, underweight, age, gender). However, synthetic data might differ from a real scenario, typical of the physician's clinic. For this reason, we developed a new virtual environment to facilitate the analysis of human subjects in 3D. This framework can simulate the acquisition process of a real camera, making it easy to analyze and to create training data for machine learning algorithms. With this virtual environment, we can easily simulate the real setup of a clinic, where a subject is standing in front of a camera, or may assume a different pose with respect to the camera. We use this newly designed environment to analyze the whole body surface area (WBSA). In particular, we show that we can obtain accurate WBSA estimations with just one view, effectively enabling the use of inexpensive depth sensors (e.g., the Kinect) for large-scale quantification of the WBSA from a single-view 3D map. PMID:28045895
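For context, the "well known formulae" that the vision-based estimator above is meant to improve on are anthropometric regressions such as the Du Bois and Mosteller body-surface-area equations. The sketch below shows only those classical height-weight baselines, not the paper's 3D approach; it makes the limitation for atypical body shapes concrete, since two numbers cannot capture shape.

```python
def wbsa_du_bois(weight_kg, height_cm):
    """Classical Du Bois & Du Bois (1916) estimate of body surface area in m^2."""
    return 0.007184 * weight_kg ** 0.425 * height_cm ** 0.725

def wbsa_mosteller(weight_kg, height_cm):
    """Mosteller (1987) estimate of body surface area in m^2."""
    return ((weight_kg * height_cm) / 3600.0) ** 0.5

# Both formulae collapse body shape to two numbers, which is why they drift
# for obese or otherwise atypical subjects, the gap the 3D approach targets.
print(wbsa_du_bois(70, 175), wbsa_mosteller(70, 175))   # ~1.85 m^2 either way
```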
Menzel, Nancy; Willson, Laura Helen; Doolen, Jessica
2014-03-11
Social justice is a fundamental value of the nursing profession, challenging educators to instill this professional value when caring for the poor. This randomized controlled trial examined whether an interactive virtual poverty simulation created in Second Life® would improve nursing students' empathy with and attributions for people living in poverty, compared to a self-study module. We created a multi-user virtual environment populated with families and individual avatars that represented the demographics contributing to poverty and vulnerability. Participants (N = 51 baccalaureate nursing students) were randomly assigned to either Intervention or Control groups and completed the modified Attitudes toward Poverty Scale pre- and post-intervention. The 2.5-hour simulation was delivered three times over a 1-year period to students in successive community health nursing classes. The investigators conducted post-simulation debriefings following a script. While participants in the virtual poverty simulation developed significantly more favorable attitudes on five questions than the Control group, the total scores did not differ significantly. Whereas students readily learned how to navigate inside Second Life®, faculty facilitators required periodic coaching and guidance to be competent. While poverty simulations, whether virtual or face-to-face, have some ability to transform nursing student attitudes, faculty must incorporate social justice concepts throughout the curriculum to produce lasting change.
ERIC Educational Resources Information Center
Cela-Ranilla, Jose María; Esteve-Gonzalez, Vanessa; Esteve-Mon, Francesc; Gisbert-Cervera, Merce
2014-01-01
In this study we analyze how 57 Spanish university students of Education developed a learning process in a virtual world by conducting activities that involved the skill of self-management. The learning experience comprised a serious game designed in a 3D simulation environment. Descriptive statistics and non-parametric tests were used in the…
Three dimensional tracking with misalignment between display and control axes
NASA Technical Reports Server (NTRS)
Ellis, Stephen R.; Tyler, Mitchell; Kim, Won S.; Stark, Lawrence
1992-01-01
Human operators confronted with misaligned display and control frames of reference performed three dimensional, pursuit tracking in virtual environment and virtual space simulations. Analysis of the components of the tracking errors in the perspective displays presenting virtual space showed that components of the error due to visual motor misalignment may be linearly separated from those associated with the mismatch between display and control coordinate systems. Tracking performance improved with several hours practice despite previous reports that such improvement did not take place.
Development of excavator training simulator using leap motion controller
NASA Astrophysics Data System (ADS)
Fahmi, F.; Nainggolan, F.; Andayani, U.; Siregar, B.
2018-03-01
An excavator is a piece of heavy machinery that is used for many industrial purposes. Controlling an excavator is not easy; its operator has to be well trained in many skills to ensure that the excavator is used safely, effectively, and efficiently. In this research, we propose a virtual reality excavator simulator supported by a device called the Leap Motion Controller, which captures finger and hand motions as input. The prototype is then developed in a virtual reality environment to give the user a more realistic sense of operating the machine.
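The heart of such a simulator is the mapping from tracked hand data to excavator commands. The sketch below illustrates one plausible mapping; the frame dictionary stands in for whatever the tracking SDK returns, and the axis assignments, gains, and gesture threshold are assumptions for illustration, not the authors' control scheme.

```python
# Illustrative mapping from tracked hand data to excavator joint commands.
BOOM_GAIN = 0.004      # rad per mm of palm height change (assumed)
ARM_GAIN = 0.003       # rad per mm of palm depth change (assumed)
SWING_GAIN = 0.002     # rad per mm of palm lateral change (assumed)

def hand_to_excavator(frame, neutral):
    """Map palm displacement from a neutral pose to joint velocity commands."""
    x, y, z = frame["palm_position_mm"]
    nx, ny, nz = neutral
    return {
        "swing": SWING_GAIN * (x - nx),     # move hand sideways to slew the cab
        "boom": BOOM_GAIN * (y - ny),       # raise/lower hand to raise/lower boom
        "arm": ARM_GAIN * (z - nz),         # push/pull hand to extend/retract arm
        "bucket": -1.0 if frame["grab_strength"] > 0.8 else 0.0,  # fist = curl bucket
    }

neutral_pose = (0.0, 200.0, 0.0)
frame = {"palm_position_mm": (35.0, 260.0, -40.0), "grab_strength": 0.9}
print(hand_to_excavator(frame, neutral_pose))
```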
Shengqian Zhang; Yuan Zhang; Yu Sun; Thakor, Nitish; Bezerianos, Anastasios
2017-07-01
The research field of mental workload has attracted abundant researchers, as mental workload plays a crucial role in real-life performance and safety. While previous studies have examined the neural correlates of mental workload in 2D scenarios (i.e., presenting stimuli in a computer screen (CS) environment) using univariate methods (e.g., EEG channel power), it is still unclear what a multivariate, graph-theoretical approach would reveal, and what the effects of a 3D environment (i.e., presenting stimuli in virtual reality (VR)) would be. In this study, twenty subjects underwent flight simulation in both CS and VR environments, with three stages each. After preprocessing, a connectivity matrix based on the Phase Lag Index (PLI) was constructed from the electroencephalogram (EEG) signals. Graph theory analysis was then applied, based on global efficiency, local efficiency and nodal efficiency, in both the alpha and theta bands. For global efficiency and local efficiency, VR values were generally lower than CS values in both bands. For nodal efficiency, the regions that showed at least marginally significant decreases differed markedly between CS and VR. These findings suggest that 3D simulation elicits a higher mental workload than 2D simulation and that each involves different brain regions.
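The pipeline described above, a phase-lag-index (PLI) connectivity matrix followed by graph-efficiency metrics, can be sketched compactly. In the illustration below, the frequency band, channel count, and binarization threshold are assumptions rather than the study's parameters, and the toy data stand in for real EEG epochs.

```python
import numpy as np
import networkx as nx
from scipy.signal import butter, filtfilt, hilbert

def pli_matrix(eeg, fs, band=(8.0, 13.0)):
    """Phase Lag Index between all channel pairs of one (channels x samples) epoch."""
    b, a = butter(4, [band[0] / (fs / 2), band[1] / (fs / 2)], btype="band")
    phase = np.angle(hilbert(filtfilt(b, a, eeg, axis=1), axis=1))
    n = eeg.shape[0]
    pli = np.zeros((n, n))
    for i in range(n):
        for j in range(i + 1, n):
            diff = phase[i] - phase[j]
            pli[i, j] = pli[j, i] = abs(np.mean(np.sign(np.sin(diff))))
    return pli

def efficiencies(pli, threshold=0.2):
    """Binarize the PLI matrix (assumed threshold) and compute graph efficiencies."""
    adj = (pli > threshold).astype(int)
    np.fill_diagonal(adj, 0)
    graph = nx.from_numpy_array(adj)
    return nx.global_efficiency(graph), nx.local_efficiency(graph)

# Toy data: 8 channels, 4 s at 256 Hz (stand-in for the real EEG recordings).
rng = np.random.default_rng(0)
eeg = rng.standard_normal((8, 4 * 256))
print(efficiencies(pli_matrix(eeg, fs=256)))
```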
2011-12-01
[Fragmentary reference listing only; recoverable citations: "Task Based Approach to Planning," Paper 08F-SIW-033; Paper 06F-SIW-003; and "(MSDL)," Paper 10S-SIW-003, all in Proceedings of the Fall/Spring Simulation Interoperability Workshops, Simulation Interoperability Standards Organization.]
Novel 3D/VR interactive environment for MD simulations, visualization and analysis.
Doblack, Benjamin N; Allis, Tim; Dávila, Lilian P
2014-12-18
The increasing development of computing (hardware and software) in the last decades has impacted scientific research in many fields including materials science, biology, chemistry and physics among many others. A new computational system for the accurate and fast simulation and 3D/VR visualization of nanostructures is presented here, using the open-source molecular dynamics (MD) computer program LAMMPS. This alternative computational method uses modern graphics processors, NVIDIA CUDA technology and specialized scientific codes to overcome processing speed barriers common to traditional computing methods. In conjunction with a virtual reality system used to model materials, this enhancement allows the addition of accelerated MD simulation capability. The motivation is to provide a novel research environment which simultaneously allows visualization, simulation, modeling and analysis. The research goal is to investigate the structure and properties of inorganic nanostructures (e.g., silica glass nanosprings) under different conditions using this innovative computational system. The work presented outlines a description of the 3D/VR Visualization System and basic components, an overview of important considerations such as the physical environment, details on the setup and use of the novel system, a general procedure for the accelerated MD enhancement, technical information, and relevant remarks. The impact of this work is the creation of a unique computational system combining nanoscale materials simulation, visualization and interactivity in a virtual environment, which is both a research and teaching instrument at UC Merced.
NASA Technical Reports Server (NTRS)
Searcy, Brittani
2017-01-01
Using virtual environments to assess complex large scale human tasks provides timely and cost effective results to evaluate designs and to reduce operational risks during assembly and integration of the Space Launch System (SLS). NASA's Marshall Space Flight Center (MSFC) uses a suite of tools to conduct integrated virtual analysis during the design phase of the SLS Program. Siemens Jack is a simulation tool that allows engineers to analyze human interaction with CAD designs by placing a digital human model into the environment to test different scenarios and assess the design's compliance with human factors requirements. Engineers at MSFC are using Jack in conjunction with motion capture and virtual reality systems in MSFC's Virtual Environments Lab (VEL). The VEL provides additional capability beyond standalone Jack to record and analyze a person performing a planned task to assemble the SLS at Kennedy Space Center (KSC). The VEL integrates the Vicon Blade motion capture system, Siemens Jack, Oculus Rift, and other virtual tools to perform human factors assessments. By using motion capture and virtual reality, a more accurate breakdown and understanding of how an operator will perform a task can be gained. Through virtual analysis, engineers are able to determine whether a specific task can be safely performed by both a 5th-percentile (approx. 5 ft) female and a 95th-percentile (approx. 6 ft 1 in) male. In addition, the analysis will help identify any tools or other accommodations that may be needed to complete the task. These assessments are critical for the safety of ground support engineers and for keeping launch operations on schedule. Motion capture allows engineers to save and examine human movements on a frame-by-frame basis, while virtual reality gives the actor (the person performing a task in the VEL) an immersive view of the task environment. This presentation will discuss the need for human factors analysis for SLS and the benefits of analyzing tasks in NASA MSFC's VEL.
Human Motion Tracking and Glove-Based User Interfaces for Virtual Environments in ANVIL
NASA Technical Reports Server (NTRS)
Dumas, Joseph D., II
2002-01-01
The Army/NASA Virtual Innovations Laboratory (ANVIL) at Marshall Space Flight Center (MSFC) provides an environment where engineers and other personnel can investigate novel applications of computer simulation and Virtual Reality (VR) technologies. Among the many hardware and software resources in ANVIL are several high-performance Silicon Graphics computer systems and a number of commercial software packages, such as Division MockUp by Parametric Technology Corporation (PTC) and Jack by Unigraphics Solutions, Inc. These hardware and software platforms are used in conjunction with various VR peripheral I/O (input / output) devices, CAD (computer aided design) models, etc. to support the objectives of the MSFC Engineering Systems Department/Systems Engineering Support Group (ED42) by studying engineering designs, chiefly from the standpoint of human factors and ergonomics. One of the more time-consuming tasks facing ANVIL personnel involves the testing and evaluation of peripheral I/O devices and the integration of new devices with existing hardware and software platforms. Another important challenge is the development of innovative user interfaces to allow efficient, intuitive interaction between simulation users and the virtual environments they are investigating. As part of his Summer Faculty Fellowship, the author was tasked with verifying the operation of some recently acquired peripheral interface devices and developing new, easy-to-use interfaces that could be used with existing VR hardware and software to better support ANVIL projects.
State-of-the-Art of Virtual Reality Technologies for Children on the Autism Spectrum
ERIC Educational Resources Information Center
Parsons, Sarah; Cobb, Sue
2011-01-01
In the past decade there has been a rapid advance in the use of virtual reality (VR) technologies for leisure, training and education. VR is argued to offer particular benefits for children on the autism spectrum, chiefly because it can offer simulations of authentic real-world situations in a carefully controlled and safe environment. Given the…
ERIC Educational Resources Information Center
Aji, Chadia Affane; Khan, M. Javed
2015-01-01
Student engagement is an essential element for learning. Active learning has been consistently shown to increase student engagement and hence learning. Hands-on activities are one of the many active learning approaches. These activities vary from structured laboratory experiments on one end of the spectrum to virtual gaming environments and to for…
Stop Talking and Type: Comparing Virtual and Face-to-Face Mentoring in an Epistemic Game
ERIC Educational Resources Information Center
Bagley, E. A.; Shaffer, D. W.
2015-01-01
Research has shown that computer games and other virtual environments can support significant learning gains because they allow young people to explore complex concepts in simulated form. However, in complex problem-solving domains, complex thinking is learned not only by taking action, but also with the aid of mentors who provide guidance in the…
Virtual reality cerebral aneurysm clipping simulation with real-time haptic feedback.
Alaraj, Ali; Luciano, Cristian J; Bailey, Daniel P; Elsenousi, Abdussalam; Roitberg, Ben Z; Bernardo, Antonio; Banerjee, P Pat; Charbel, Fady T
2015-03-01
With the decrease in the number of cerebral aneurysms treated surgically and the increase of complexity of those treated surgically, there is a need for simulation-based tools to teach future neurosurgeons the operative techniques of aneurysm clipping. To develop and evaluate the usefulness of a new haptic-based virtual reality simulator in the training of neurosurgical residents. A real-time sensory haptic feedback virtual reality aneurysm clipping simulator was developed using the ImmersiveTouch platform. A prototype middle cerebral artery aneurysm simulation was created from a computed tomographic angiogram. Aneurysm and vessel volume deformation and haptic feedback are provided in a 3-dimensional immersive virtual reality environment. Intraoperative aneurysm rupture was also simulated. Seventeen neurosurgery residents from 3 residency programs tested the simulator and provided feedback on its usefulness and resemblance to real aneurysm clipping surgery. Residents thought that the simulation would be useful in preparing for real-life surgery. About two-thirds of the residents thought that the 3-dimensional immersive anatomic details provided a close resemblance to real operative anatomy and accurate guidance for deciding surgical approaches. They thought the simulation was useful for preoperative surgical rehearsal and neurosurgical training. A third of the residents thought that the technology in its current form provided realistic haptic feedback for aneurysm surgery. Neurosurgical residents thought that the novel immersive VR simulator is helpful in their training, especially because they do not get a chance to perform aneurysm clippings until late in their residency programs.
ERIC Educational Resources Information Center
Otamendi, Francisco Javier; Doncel, Luis Miguel
2013-01-01
Experimental teaching in general, and simulation in particular, have primarily been used in lecture rooms but in the future must also be adapted to e-learning. The integration of web simulators into virtual learning environments, coupled with specific supporting video documentation and the use of videoconference tools, results in robust…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Tikotekar, Anand A; Vallee, Geoffroy R; Naughton III, Thomas J
2008-01-01
The topic of system-level virtualization has recently begun to receive interest for high performance computing (HPC). This is in part due to the isolation and encapsulation offered by the virtual machine. These traits enable applications to customize their environments and maintain consistent software configurations in their virtual domains. Additionally, there are mechanisms that can be used for fault tolerance, like live virtual machine migration. Given these attractive benefits of virtualization, a fundamental question arises: how does this affect my scientific application? We use this as the premise for our paper and observe a real-world scientific code running on a Xen virtual machine. We studied the effects of running a radiative transfer simulation, Hydrolight, on a virtual machine. We discuss our methodology and report observations regarding the usage of virtualization with this application.
Identifying postural control and thresholds of instability utilizing a motion-based ATV simulator.
DOT National Transportation Integrated Search
2017-01-01
Our ATV simulator is currently the only one in existence that allows studies of human subjects engaged in active riding, a process that is necessary for ATV operators to perform in order to maintain vehicle control, in a virtual reality environ...
A Review of Simulators with Haptic Devices for Medical Training.
Escobar-Castillejos, David; Noguez, Julieta; Neri, Luis; Magana, Alejandra; Benes, Bedrich
2016-04-01
Medical procedures often involve the use of the tactile sense to manipulate organs or tissues by using special tools. Doctors require extensive preparation in order to perform them successfully; for example, research shows that a minimum of 750 operations are needed to acquire sufficient experience to perform medical procedures correctly. Haptic devices have become an important training alternative and they have been considered to improve medical training because they let users interact with virtual environments by adding the sense of touch to the simulation. Previous articles in the field state that haptic devices enhance the learning of surgeons compared to current training environments used in medical schools (corpses, animals, or synthetic skin and organs). Consequently, virtual environments use haptic devices to improve realism. The goal of this paper is to provide a state of the art review of recent medical simulators that use haptic devices. In particular we focus on stitching, palpation, dental procedures, endoscopy, laparoscopy, and orthopaedics. These simulators are reviewed and compared from the viewpoint of used technology, the number of degrees of freedom, degrees of force feedback, perceived realism, immersion, and feedback provided to the user. In the conclusion, several observations per area and suggestions for future work are provided.
A methodological, task-based approach to Procedure-Specific Simulations training.
Setty, Yaki; Salzman, Oren
2016-12-01
Procedure-Specific Simulations (PSS) are realistic 3D simulations that provide a platform to practice complete surgical procedures in a virtual-reality environment. While PSS have the potential to improve surgeons' proficiency, there are no existing standards or guidelines for developing PSS in a structured manner. We employed a unique platform inspired by game design to develop a three-dimensional virtual reality simulation of urethrovesical anastomosis during radical prostatectomy. 3D visualization is supported by stereo vision, providing a fully realistic view of the simulation. The software can be executed for any robotic surgery platform; specifically, we tested the simulation under a Windows environment on the RobotiX Mentor. Using the urethrovesical anastomosis simulation as a representative example, we present a task-based methodological approach to PSS training. The methodology provides tasks in increasing levels of difficulty, from a novice level of basic anatomy identification to an expert level that permits testing new surgical approaches. The modular methodology presented here can be easily extended to support more complex tasks. We foresee this methodology being used as a tool to integrate PSS as a complementary training process for surgical procedures.
Hogg, Melissa E; Tam, Vernissia; Zenati, Mazen; Novak, Stephanie; Miller, Jennifer; Zureikat, Amer H; Zeh, Herbert J
Hepatobiliary surgery is a highly complex, low-volume specialty with long learning curves necessary to achieve optimal outcomes. This creates significant challenges in both training and measuring surgical proficiency. We hypothesize that a virtual reality curriculum with mastery-based simulation is a valid tool to train fellows toward operative proficiency. This study evaluates the content and predictive validity of a robotic simulation curriculum as a first step toward developing a comprehensive, proficiency-based pathway. A mastery-based simulation curriculum was performed in a virtual reality environment. A pretest/posttest experimental design used both virtual reality and inanimate environments to evaluate improvement. Participants self-reported previous robotic experience and assessed the curriculum by rating modules based on difficulty and utility. This study was conducted at the University of Pittsburgh Medical Center (Pittsburgh, PA), a tertiary care academic teaching hospital. A total of 17 surgical oncology fellows enrolled in the curriculum, and 16 (94%) completed it. Of the 16 fellows who completed the curriculum, 4 fellows (25%) achieved mastery on all 24 modules; on average, fellows mastered 86% of the modules. Following curriculum completion, individual test scores improved (p < 0.0001). An average of 2.4 attempts was necessary to master each module (range: 1-17). Median time spent completing the curriculum was 4.2 hours (range: 1.1-6.6). In total, 8 (50%) fellows continued practicing modules beyond mastery. Survey results show that the "needle driving" and "endowrist 2" modules were perceived as the most difficult, although the "needle driving" modules were the most useful. Overall, 15 (94%) fellows perceived improvement in robotic skills after completing the curriculum. In a cohort of board-certified general surgeons who are novices in robotic surgery, a mastery-based simulation curriculum demonstrated internal validity with overall score improvement. Time to complete the curriculum was manageable. Published by Elsevier Inc.
Virtual reality in surgical skills training.
Palter, Vanessa N; Grantcharov, Teodor P
2010-06-01
With recent concerns regarding patient safety, and legislation regarding resident work hours, it is accepted that a certain amount of surgical skills training will transition to the surgical skills laboratory. Virtual reality offers enormous potential to enhance technical and non-technical skills training outside the operating room. Virtual-reality systems range from basic low-fidelity devices to highly complex virtual environments. These systems can act as training and assessment tools, with the learned skills effectively transferring to an analogous clinical situation. Recent developments include expanding the role of virtual reality to allow for holistic, multidisciplinary team training in simulated operating rooms, and focusing on the role of virtual reality in evidence-based surgical curriculum design. Copyright 2010 Elsevier Inc. All rights reserved.
Dolins, Francine L; Schweller, Kenneth; Milne, Scott
2017-02-01
Virtual simulated environments provide multiple ways of testing cognitive function and evaluating problem solving with humans (e.g., Woollett et al. 2009). The use of such interactive technology has increasingly become an essential part of modern life (e.g., autonomously driving vehicles, global positioning systems (GPS), and touchscreen computers; Chinn and Fairlie 2007; Brown 2011). While many nonhuman animals have their own forms of "technology", such as chimpanzees who create and use tools, in captive animal environments the opportunity to actively participate with interactive technology is not often made available. Exceptions can be found in some state-of-the-art zoos and laboratory facilities (e.g., Mallavarapu and Kuhar 2005). When interactive technology is available, captive animals often selectively choose to engage with it. This enhances the animal's sense of control over their immediate surroundings (e.g., Clay et al. 2011; Ackerman 2012). Such self-efficacy may help to fulfill basic requirements in a species' daily activities using problem solving that can involve foraging and other goal-oriented behaviors. It also assists in fulfilling the strong underlying motivation for contrafreeloading and exploration expressed behaviorally by many species in captivity (Young 1999). Moreover, being able to present nonhuman primates virtual reality environments under experimental conditions provides the opportunity to gain insight into their navigational abilities and spatial cognition. It allows for insight into the generation and application of internal mental representations of landmarks and environments under multiple conditions (e.g., small- and large-scale space) and subsequent spatial behavior. This paper reviews methods using virtual reality developed to investigate the spatial cognitive abilities of nonhuman primates, and great apes in particular, in comparison with that of humans of multiple age groups. We make recommendations about training, best practices, and also pitfalls to avoid.
Brown, Ross; Rasmussen, Rune; Baldwin, Ian; Wyeth, Peta
2012-08-01
Nursing training for an Intensive Care Unit (ICU) is a resource intensive process. High demands are made on staff, students and physical resources. Interactive, 3D computer simulations, known as virtual worlds, are increasingly being used to supplement training regimes in the health sciences; especially in areas such as complex hospital ward processes. Such worlds have been found to be very useful in maximising the utilisation of training resources. Our aim is to design and develop a novel virtual world application for teaching and training Intensive Care nurses in the approach and method for shift handover, to provide an independent, but rigorous approach to teaching these important skills. In this paper we present a virtual world simulator for students to practice key steps in handing over the 24/7 care requirements of intensive care patients during the commencing first hour of a shift. We describe the modelling process to provide a convincing interactive simulation of the handover steps involved. The virtual world provides a practice tool for students to test their analytical skills with scenarios previously provided by simple physical simulations, and live on the job training. Additional educational benefits include facilitation of remote learning, high flexibility in study hours and the automatic recording of a reviewable log from the session. To the best of our knowledge, we believe this is a novel and original application of virtual worlds to an ICU handover process. The major outcome of the work was a virtual world environment for training nurses in the shift handover process, designed and developed for use by postgraduate nurses in training. Copyright © 2012 Australian College of Critical Care Nurses Ltd. Published by Elsevier Ltd. All rights reserved.
A review of the use of simulation in dental education.
Perry, Suzanne; Bridges, Susan Margaret; Burrow, Michael Francis
2015-02-01
In line with the advances in technology and communication, medical simulations are being developed to support the acquisition of requisite psychomotor skills before real-life clinical applications. This review article aimed to give a general overview of simulation in a cognate field, clinical dental education. Simulations in dentistry are not a new phenomenon; however, recent developments in virtual-reality technology using computer-generated medical simulations of 3-dimensional images or environments are providing more optimal practice conditions to smooth the transition from the traditional model-based simulation laboratory to the clinic. Evidence of the positive aspects of virtual reality includes increased effectiveness in comparison with traditional simulation teaching techniques, more efficient learning, objective and reproducible feedback, unlimited training hours, and enhanced cost-effectiveness for teaching establishments. Negative aspects that have been indicated include initial setup costs, faculty training, and the limited variety of content in current educational simulation programs.
Influence of real and virtual heights on standing balance.
Cleworth, Taylor W; Horslen, Brian C; Carpenter, Mark G
2012-06-01
Fear and anxiety induced by threatening scenarios, such as standing on elevated surfaces, have been shown to influence postural control in young adults. There is also a need to understand how postural threat influences postural control in populations with balance deficits and risk of falls. However, safety and feasibility issues limit opportunities to place such populations in physically threatening scenarios. Virtual reality (VR) has successfully been used to simulate threatening environments, although it is unclear whether the same postural changes can be elicited by changes in virtual and real threat conditions. Therefore, the purpose of this study was to compare the effects of real and virtual heights on changes to standing postural control, electrodermal activity (EDA) and psycho-social state. Seventeen subjects stood at low and high heights in both real and virtual environments matched in scale and visual detail. A repeated measures ANOVA revealed increases with height, independent of visual environment, in EDA, anxiety, fear, and center of pressure (COP) frequency, and decreases with height in perceived stability, balance confidence and COP amplitude. Interaction effects were seen for fear and COP mean position, where real heights elicited larger changes than virtual heights. This study demonstrates the utility of VR, as simulated heights resulted in changes to postural, autonomic and psycho-social measures similar to those seen at real heights. As a result, VR may be a useful tool for studying threat related changes in postural control in populations at risk of falls, and to screen and rehabilitate balance deficits associated with fear and anxiety. Copyright © 2012 Elsevier B.V. All rights reserved.
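The balance outcomes reported above, COP amplitude and frequency, are standard posturography summaries. The sketch below shows one common way to compute them from a force-plate COP trace; the sampling rate and spectral settings are assumptions, not the study's exact processing.

```python
import numpy as np
from scipy.signal import welch

def cop_summary(cop_mm, fs=100.0):
    """Summarize an anterior-posterior COP trace: mean position, RMS amplitude,
    and mean power frequency. Parameters are illustrative, not the study's."""
    cop_mm = np.asarray(cop_mm, dtype=float)
    mean_pos = cop_mm.mean()
    detrended = cop_mm - mean_pos
    rms_amplitude = np.sqrt(np.mean(detrended ** 2))
    freqs, psd = welch(detrended, fs=fs, nperseg=min(1024, len(detrended)))
    mean_power_freq = np.sum(freqs * psd) / np.sum(psd)
    return {"mean_mm": mean_pos, "rms_mm": rms_amplitude, "mpf_hz": mean_power_freq}

# Toy 60 s trace: slow sway plus noise, standing in for force-plate output.
t = np.arange(0, 60, 1 / 100.0)
trace = 2.0 * np.sin(2 * np.pi * 0.3 * t) + 0.3 * np.random.default_rng(1).standard_normal(t.size)
print(cop_summary(trace))
```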
Efficacy of virtual reality in pedestrian safety research.
Deb, Shuchisnigdha; Carruth, Daniel W; Sween, Richard; Strawderman, Lesley; Garrison, Teena M
2017-11-01
Advances in virtual reality technology present new opportunities for human factors research in areas that are dangerous, difficult, or expensive to study in the real world. The authors developed a new pedestrian simulator using the HTC Vive head mounted display and Unity software. Pedestrian head position and orientation were tracked as participants attempted to safely cross a virtual signalized intersection (5.5 m). In 10% of 60 trials, a vehicle violated the traffic signal and in 10.84% of these trials, a collision between the vehicle and the pedestrian was observed. Approximately 11% of the participants experienced simulator sickness and withdrew from the study. Objective measures, including the average walking speed, indicate that participant behavior in VR matches published real world norms. Subjective responses indicate that the virtual environment was realistic and engaging. Overall, the study results confirm the effectiveness of the new virtual reality technology for research on full motion tasks. Copyright © 2017 Elsevier Ltd. All rights reserved.
Risk Reduction and Training using Simulation Based Tools - 12180
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hall, Irin P.
2012-07-01
Process Modeling and Simulation (M&S) has been used for many years in manufacturing and similar domains, as part of an industrial engineer's tool box. Traditionally, however, this technique has been employed in small, isolated projects where models were created from scratch, often making it time and cost prohibitive. Newport News Shipbuilding (NNS) has recognized the value of this predictive technique and what it offers in terms of risk reduction, cost avoidance and on-schedule performance of highly complex work. To facilitate implementation, NNS has been maturing a process and the software to rapidly deploy and reuse M&S-based decision support tools in a variety of environments. Some examples of successful applications by NNS of this technique in the nuclear domain are a reactor refueling simulation-based tool, a fuel handling facility simulation-based tool and a tool for dynamic radiation exposure tracking. The next generation of M&S applications includes expanding simulation-based tools into immersive and interactive training. The applications discussed here take a tool box approach to creating simulation-based decision support tools for maximum utility and return on investment. This approach involves creating a collection of simulation tools that can be used individually or integrated together for a larger application. The refueling simulation integrates with the fuel handling facility simulation to understand every aspect and dependency of the fuel handling evolutions. This approach translates nicely to other complex domains where real system experimentation is not feasible, such as nuclear fuel lifecycle and waste management. Similar concepts can also be applied to different types of simulation techniques. For example, a process simulation of liquid waste operations may be useful to streamline and plan operations, while a chemical model of the liquid waste composition is an important tool for making decisions with respect to waste disposition. Integrating these tools into a larger virtual system provides a tool for making larger strategic decisions. The key to integrating and creating these virtual environments is the software and the process used to build them. Although important steps in the direction of using simulation-based tools for the nuclear domain, the applications described here represent only a small cross section of possible benefits. The next generation of applications will likely focus on situational awareness and adaptive planning. Situational awareness refers to the ability to visualize in real time the state of operations. Some useful tools in this area are Geographic Information Systems (GIS), which help monitor and analyze geographically referenced information. Combined with such situational awareness capability, simulation tools can serve as the platform for adaptive planning tools. These are the tools that allow the decision maker to react to the changing environment in real time by synthesizing massive amounts of data into easily understood information. For the nuclear domains, this may mean the creation of Virtual Nuclear Systems, from Virtual Waste Processing Plants to Virtual Nuclear Reactors. (authors)
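At its core, this kind of process M&S is discrete-event simulation of contended resources. The toy SimPy sketch below shows that pattern for a fuel-handling-style workflow; it is not the NNS toolchain, and the single crane resource and task durations are invented for illustration.

```python
import random
import simpy

def fuel_move(env, name, crane, rng):
    """One fuel-assembly move competing for a single shared crane."""
    arrive = env.now
    with crane.request() as slot:
        yield slot                                   # wait for the crane
        yield env.timeout(rng.uniform(20, 40))       # minutes to complete the move
    print(f"{name}: in system {env.now - arrive:5.1f} min, finished at t={env.now:5.1f}")

rng = random.Random(42)
env = simpy.Environment()
crane = simpy.Resource(env, capacity=1)              # the contended resource
for i in range(5):
    env.process(fuel_move(env, f"assembly-{i}", crane, rng))
env.run()
```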
Chalil Madathil, Kapil; Greenstein, Joel S
2017-11-01
Collaborative virtual reality-based systems have integrated high fidelity voice-based communication, immersive audio and screen-sharing tools into virtual environments. Such three-dimensional collaborative virtual environments can mirror the collaboration among usability test participants and facilitators when they are physically collocated, potentially enabling moderated usability tests to be conducted effectively when the facilitator and participant are located in different places. We developed a virtual collaborative three-dimensional remote moderated usability testing laboratory and employed it in a controlled study to compare the effectiveness of moderated usability testing in a collaborative virtual reality-based environment with two other moderated usability testing methods: the traditional lab approach and Cisco WebEx, a web-based conferencing and screen sharing approach. Using a mixed methods experimental design, 36 test participants and 12 test facilitators were asked to complete representative tasks on a simulated online shopping website. The dependent variables included the time taken to complete the tasks; the usability defects identified and their severity; and the subjective ratings on the workload index, presence and satisfaction questionnaires. The remote moderated usability testing methodology using a collaborative virtual reality system performed similarly to the other two methodologies in terms of the total number of defects identified, the number of high-severity defects identified and the time taken to complete the tasks. The overall workload experienced by the test participants and facilitators was the least with the traditional lab condition. No significant differences were identified for the workload experienced with the virtual reality and the WebEx conditions. However, test participants experienced greater involvement and a more immersive experience in the virtual environment than in the WebEx condition. The ratings for the virtual environment condition were not significantly different from those for the traditional lab condition. The results of this study suggest that participants were productive and enjoyed the virtual lab condition, indicating the potential of a virtual world based approach as an alternative to conventional approaches for synchronous usability testing. Copyright © 2017 Elsevier Ltd. All rights reserved.
Virtual and remote robotic laboratory using EJS, MATLAB and LabVIEW.
Chaos, Dictino; Chacón, Jesús; Lopez-Orozco, Jose Antonio; Dormido, Sebastián
2013-02-21
This paper describes the design and implementation of a virtual and remote laboratory based on Easy Java Simulations (EJS) and LabVIEW. The main application of this laboratory is to improve the study of sensors in Mobile Robotics, dealing with the problems that arise on the real world experiments. This laboratory allows the user to work from their homes, tele-operating a real robot that takes measurements from its sensors in order to obtain a map of its environment. In addition, the application allows interacting with a robot simulation (virtual laboratory) or with a real robot (remote laboratory), with the same simple and intuitive graphical user interface in EJS. Thus, students can develop signal processing and control algorithms for the robot in simulation and then deploy them on the real robot for testing purposes. Practical examples of application of the laboratory on the inter-University Master of Systems Engineering and Automatic Control are presented.
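The workflow described above, develop and test an algorithm against the simulation and then run it on the real robot, relies on both back ends exposing the same interface. The generic Python sketch below shows that pattern; it is not the EJS/LabVIEW code, and the sensor and drive method names are assumptions.

```python
from abc import ABC, abstractmethod

class RobotBackend(ABC):
    """Common interface so the same algorithm runs in simulation or on hardware."""
    @abstractmethod
    def read_range_sensors(self) -> list[float]:
        ...
    @abstractmethod
    def drive(self, left_speed: float, right_speed: float) -> None:
        ...

class SimulatedRobot(RobotBackend):
    def read_range_sensors(self):
        return [1.2, 0.4, 2.0]          # canned left/front/right readings for the sketch
    def drive(self, left_speed, right_speed):
        print(f"[sim] drive L={left_speed:.2f} R={right_speed:.2f}")

class RemoteRobot(RobotBackend):
    """Would wrap the remote-lab network protocol; stubbed here."""
    def read_range_sensors(self):
        raise NotImplementedError("send a request to the remote lab server")
    def drive(self, left_speed, right_speed):
        raise NotImplementedError("send a command to the remote lab server")

def avoid_obstacles(robot: RobotBackend):
    """Toy reactive controller: slow the side closer to an obstacle."""
    left, front, right = robot.read_range_sensors()
    if front < 0.5:
        robot.drive(-0.2, 0.2)          # turn in place away from the wall
    else:
        robot.drive(min(1.0, right), min(1.0, left))

avoid_obstacles(SimulatedRobot())       # the same call would work with RemoteRobot()
```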
Workstations for people with disabilities: an example of a virtual reality approach
Budziszewski, Paweł; Grabowski, Andrzej; Milanowicz, Marcin; Jankowski, Jarosław
2016-01-01
This article describes a method of adapting workstations for workers with motion disability using computer simulation and virtual reality (VR) techniques. A workstation for grinding spring faces was used as an example. It was adjusted for two people with a disabled right upper extremity. The study had two stages. In the first, a computer human model with a visualization of maximal arm reach and preferred workspace was used to develop a preliminary modification of a virtual workstation. In the second stage, an immersive VR environment was used to assess the virtual workstation and to add further modifications. All modifications were assessed by measuring the efficiency of work and the number of movements involved. The results of the study showed that a computer simulation could be used to determine whether a worker with a disability could access all important areas of a workstation and to propose necessary modifications. PMID:26651540
Applied virtual reality at the Research Triangle Institute
NASA Technical Reports Server (NTRS)
Montoya, R. Jorge
1994-01-01
Virtual Reality (VR) is a way for humans to use computers in visualizing, manipulating and interacting with large geometric databases. This paper describes a VR infrastructure and its application to marketing, modeling, architectural walk-through, and training problems. VR integration techniques used in these applications are based on a uniform approach which promotes portability and reusability of developed modules. For each problem, a 3D object database is created using data captured by hand or electronically. The object's realism is enhanced through either procedural or photo textures. The virtual environment is created and populated with the database using software tools which also support interactions with and immersivity in the environment. These capabilities are augmented by other sensory channels such as voice recognition, 3D sound, and tracking. Four applications are presented: a virtual furniture showroom, virtual reality models of the North Carolina Global TransPark, a walk through the Dresden Frauenkirche, and the maintenance training simulator for the National Guard.
The effectiveness of virtual reality distraction for pain reduction: a systematic review.
Malloy, Kevin M; Milling, Leonard S
2010-12-01
Virtual reality technology enables people to become immersed in a computer-simulated, three-dimensional environment. This article provides a comprehensive review of controlled research on the effectiveness of virtual reality (VR) distraction for reducing pain. To be included in the review, studies were required to use a between-subjects or mixed model design in which VR distraction was compared with a control condition or an alternative intervention in relieving pain. An exhaustive search identified 11 studies satisfying these criteria. VR distraction was shown to be effective for reducing experimental pain, as well as the discomfort associated with burn injury care. Studies of needle-related pain provided less consistent findings. Use of more sophisticated virtual reality technology capable of fully immersing the individual in a virtual environment was associated with greater relief. Overall, controlled research suggests that VR distraction may be a useful tool for clinicians who work with a variety of pain problems. Copyright © 2010 Elsevier Ltd. All rights reserved.
Multiplexing Low and High QoS Workloads in Virtual Environments
NASA Astrophysics Data System (ADS)
Verboven, Sam; Vanmechelen, Kurt; Broeckhove, Jan
Virtualization technology has introduced new ways for managing IT infrastructure. The flexible deployment of applications through self-contained virtual machine images has removed the barriers for multiplexing, suspending and migrating applications with their entire execution environment, allowing for a more efficient use of the infrastructure. These developments have given rise to an important challenge regarding the optimal scheduling of virtual machine workloads. In this paper, we specifically address the VM scheduling problem in which workloads that require guaranteed levels of CPU performance are mixed with workloads that do not require such guarantees. We introduce a framework to analyze this scheduling problem and evaluate to what extent such mixed service delivery is beneficial for a provider of virtualized IT infrastructure. Traditionally providers offer IT resources under a guaranteed and fixed performance profile, which can lead to underutilization. The findings of our simulation study show that through proper tuning of a limited set of parameters, the proposed scheduling algorithm allows for a significant increase in utilization without sacrificing on performance dependability.
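A toy version of the scheduling idea studied above, honor the CPU reservations of guaranteed-QoS virtual machines first and divide the remaining capacity among best-effort ones, is sketched below. The even split of the slack and the example numbers are assumptions of the sketch, not the paper's algorithm.

```python
def allocate_cpu(capacity, guaranteed, best_effort):
    """Split CPU capacity between VMs with reservations and best-effort VMs.

    guaranteed: {vm_name: reserved_share}; best_effort: list of vm_names.
    Reservations are honored first; any slack is divided evenly among
    best-effort VMs (an assumption of this sketch).
    """
    reserved = sum(guaranteed.values())
    if reserved > capacity:
        raise ValueError("admission control should prevent over-reservation")
    allocation = dict(guaranteed)
    slack = capacity - reserved
    if best_effort:
        per_vm = slack / len(best_effort)
        allocation.update({vm: per_vm for vm in best_effort})
    return allocation

# Example: 8 cores, two QoS VMs reserving 3 and 2 cores, three best-effort VMs.
print(allocate_cpu(8.0, {"db": 3.0, "web": 2.0}, ["batch1", "batch2", "batch3"]))
# {'db': 3.0, 'web': 2.0, 'batch1': 1.0, 'batch2': 1.0, 'batch3': 1.0}
```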
Quail, Michelle; Brundage, Shelley B; Spitalnick, Josh; Allen, Peter J; Beilby, Janet
2016-02-27
Advanced communication skills are vital for allied health professionals, yet students often have limited opportunities in which to develop them. The option of increasing clinical placement hours is unsustainable in a climate of constrained budgets, limited placement availability and increasing student numbers. Consequently, many educators are considering the potential of alternative training methods, such as simulation. Simulations provide safe, repeatable and standardised learning environments in which students can practice a variety of clinical skills. This study investigated students' self-rated communication skill, knowledge, confidence and empathy across simulated and traditional learning environments. Undergraduate speech pathology students were randomly allocated to one of three communication partners with whom they engaged conversationally for up to 30 min: a patient in a nursing home (n = 21); an elderly trained patient actor (n = 22); or a virtual patient (n = 19). One week prior to, and again following the conversational interaction, participants completed measures of self-reported communication skill, knowledge and confidence (developed by the authors based on the Four Habit Coding Scheme), as well as the Jefferson Scale of Empathy - Health Professionals (student version). All three groups reported significantly higher communication knowledge, skills and confidence post-placement (Median d = .58), while the degree of change did not vary as a function of group membership (Median η² < .01). In addition, only students interacting with a nursing home resident reported higher empathy after the placement. Students reported that conversing with the virtual patient was more challenging than conversing with a nursing home patient or actor, yet students appeared to derive the same benefit from the experience. Participants self-reported higher communication skill, knowledge and confidence, though not empathy, following a brief placement in a virtual, standardised or traditional learning environment. The self-reported increases were consistent across the three placement types. It is proposed that the findings from this study provide support for the integration of more sustainable, standardised, virtual patient-based placement models into allied health training programs for the training of communication skills.
Assessing Pedagogical Balance in a Simulated Classroom Environment
ERIC Educational Resources Information Center
Knezek, Gerald; Hopper, Susan B.; Christensen, Rhonda; Tyler-Wood, Tandra; Gibson, David C.
2015-01-01
simSchool, an online simulator that has been used to enhance teacher preparation since 2003, models different types of students and provides virtual practice sessions for teachers to assign tasks and interact with students. In this article the authors (a) examine changes in preservice teacher perceptions of teaching confidence and teaching…
The development of a collaborative virtual environment for finite element simulation
NASA Astrophysics Data System (ADS)
Abdul-Jalil, Mohamad Kasim
Communication between geographically distributed designers has been a major hurdle in traditional engineering design. Conventional methods of communication, such as video conferencing, telephone, and email, are less efficient, especially when dealing with complex design models. Complex shapes, intricate features and hidden parts are often difficult to describe verbally or even using traditional 2-D or 3-D visual representations. Virtual Reality (VR) and Internet technologies have provided a substantial potential to bridge the present communication barrier. VR technology allows designers to immerse themselves in a virtual environment to view and manipulate a model just as in real life. Fast Internet connectivity has enabled fast data transfer between remote locations. Although various collaborative virtual environment (CVE) systems have been developed in the past decade, they are limited to high-end technology that is not accessible to typical designers. The objective of this dissertation is to discover and develop a new approach to increase the efficiency of the design process, particularly for large-scale applications wherein participants are geographically distributed. A multi-platform and easily accessible collaborative virtual environment (CVRoom) is developed to accomplish the stated research objective. Geographically dispersed designers can meet in a single shared virtual environment to discuss issues pertaining to the engineering design process and to make trade-off decisions more quickly than before, thereby speeding the entire process. This 'faster' design process is achieved through the development of capabilities that better enable multidisciplinary collaboration and the modeling of trade-off decisions that are so critical before launching into a formal detailed design. The features of the environment developed as a result of this research include the ability to view design models, to use voice interaction, and to link engineering analysis modules (such as the finite element analysis module demonstrated in this work). One of the major issues in developing a CVE system for engineering design purposes is obtaining pertinent simulation results in real time. This is critical so that the designers can make decisions based on these results quickly. For example, in a finite element analysis, if a design model is changed or perturbed, the analysis results must be obtained in real time or near real time to make the virtual meeting environment realistic. In this research, the finite difference-based Design Sensitivity Analysis (DSA) approach is employed to approximate structural responses (i.e., stress, displacement, etc.), so as to demonstrate the applicability of CVRoom for engineering design trade-offs. This DSA approach provides fast approximation and is well suited for the virtual meeting environment, where fast response time is required. The DSA-based approach is tested on several example problems to show its applicability and limitations. This dissertation demonstrates that an increase in efficiency and a reduction of the time required for a complex design process can be accomplished using the approach developed in this dissertation research. Several implementations of CVRoom by students working on common design tasks were investigated. All participants confirmed a preference for using the collaborative virtual environment developed in this dissertation work (CVRoom) over other modes of interaction.
It is proposed here that CVRoom is representative of the type of collaborative virtual environment that will be used by most designers in the future to reduce the time required in a design cycle and thereby reduce the associated cost.
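The finite-difference DSA scheme used above for near-real-time feedback can be stated compactly: run the full analysis once, perturb each design variable to estimate gradients, and extrapolate responses linearly when a designer nudges the model. The sketch below illustrates that first-order scheme on a stand-in response function; it is not tied to the dissertation's FEA code.

```python
import numpy as np

def finite_difference_sensitivities(response, x0, h=1e-3):
    """Forward-difference gradient of a (possibly expensive) response function."""
    x0 = np.asarray(x0, dtype=float)
    base = response(x0)
    grads = np.empty_like(x0)
    for i in range(x0.size):
        xp = x0.copy()
        xp[i] += h
        grads[i] = (response(xp) - base) / h
    return base, grads

def approximate_response(base, grads, x0, x):
    """First-order Taylor update used for interactive 'what-if' changes."""
    return base + grads @ (np.asarray(x, dtype=float) - np.asarray(x0, dtype=float))

# Stand-in for an FEA-computed stress as a function of two plate thicknesses.
stress = lambda x: 500.0 / (x[0] * x[1])            # illustrative only
x0 = np.array([2.0, 3.0])
base, grads = finite_difference_sensitivities(stress, x0)
print(approximate_response(base, grads, x0, [2.1, 3.0]), stress([2.1, 3.0]))
```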
Chang, Justues; Banaszek, Daniel C; Gambrel, Jason; Bardana, Davide
2016-04-01
Work-hour restrictions and fatigue management strategies in surgical training programs continue to evolve in an effort to improve the learning environment and promote safer patient care. In response, training programs must reevaluate how various teaching modalities such as simulation can augment the development of surgical competence in trainees. For surgical simulators to be most useful, it is important to determine whether surgical proficiency can be reliably differentiated using them. To our knowledge, performance on both virtual and benchtop arthroscopy simulators has not been concurrently assessed in the same subjects. (1) Do global rating scales and procedure time differentiate arthroscopic expertise in virtual and benchtop knee models? (2) Can commercially available built-in motion analysis metrics differentiate arthroscopic expertise? (3) How well are performance measures on virtual and benchtop simulators correlated? (4) Are these metrics sensitive enough to differentiate by year of training? In a cross-sectional study, 19 subjects (four medical students, 12 residents, and three staff) were recruited and divided into 11 novice arthroscopists (student to Postgraduate Year [PGY] 3) and eight proficient arthroscopists (PGY 4 to staff) who completed a diagnostic arthroscopy and loose-body retrieval in both virtual and benchtop knee models. Global rating scales (GRS), procedure times, and motion analysis metrics were used to evaluate performance. The proficient group scored higher on virtual (14 ± 6 [95% confidence interval {CI}, 10-18] versus 36 ± 5 [95% CI, 32-40], p < 0.001) and benchtop (16 ± 8 [95% CI, 11-21] versus 36 ± 5 [95% CI, 31-40], p < 0.001) GRS scales. The proficient subjects completed nearly all tasks faster than novice subjects, including the virtual scope (579 ± 169 [95% CI, 466-692] versus 358 ± 178 [95% CI, 210-507] seconds, p = 0.02) and benchtop knee scope + probe (480 ± 160 [95% CI, 373-588] versus 277 ± 64 [95% CI, 224-330] seconds, p = 0.002). The built-in motion analysis metrics also distinguished novices from proficient arthroscopists using the self-generated virtual loose body retrieval task scores (4 ± 1 [95% CI, 3-5] versus 6 ± 1 [95% CI, 5-7], p = 0.001). GRS scores between virtual and benchtop models were very strongly correlated (ρ = 0.93, p < 0.001). There was strong correlation between year of training and virtual GRS (ρ = 0.8, p < 0.001) and benchtop GRS (ρ = 0.87, p < 0.001) scores. To our knowledge, this is the first study to evaluate performance on both virtual and benchtop knee simulators. We have shown that subjective GRS scores and objective motion analysis metrics and procedure time are valid measures to distinguish arthroscopic skill on both virtual and benchtop modalities. Performance on both modalities is well correlated. We believe that training on artificial models allows acquisition of skills in a safe environment. Future work should compare different modalities in the efficiency of skill acquisition, retention, and transferability to the operating room.
Rule-based modeling with Virtual Cell
Schaff, James C.; Vasilescu, Dan; Moraru, Ion I.; Loew, Leslie M.; Blinov, Michael L.
2016-01-01
Summary: Rule-based modeling is invaluable when the number of possible species and reactions in a model becomes too large to allow convenient manual specification. The popular rule-based software tools BioNetGen and NFSim provide powerful modeling and simulation capabilities at the cost of learning a complex scripting language that is used to specify these models. Here, we introduce a modeling tool that combines new graphical rule-based model specification with existing simulation engines in a seamless way within the familiar Virtual Cell (VCell) modeling environment. A mathematical model can be built by integrating explicit reaction networks with reaction rules. In addition to offering a large choice of ODE and stochastic solvers, a model can be simulated using a network-free approach through the NFSim simulation engine. Availability and implementation: Available as VCell (versions 6.0 and later) at the Virtual Cell web site (http://vcell.org/). The application installs and runs on all major platforms and does not require registration for use on the user's computer. Tutorials are available at the Virtual Cell website and Help is provided within the software. Source code is available at Sourceforge. Contact: vcell_support@uchc.edu Supplementary information: Supplementary data are available at Bioinformatics online. PMID:27497444
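To make the appeal of rule-based specification concrete, the toy sketch below expands a single generic binding rule over a set of ligand variants into the species and reactions it implies; it is plain Python written only for illustration and has no relation to the actual VCell or BioNetGen implementation.

    # One rule, "any ligand variant + receptor <-> bound complex", expanded over a
    # list of variants. With many variants and binding sites, manual enumeration of
    # the resulting network quickly becomes impractical, which is the situation
    # rule-based modeling addresses. Names and rate labels are illustrative only.
    ligand_variants = ["L1", "L2", "L3"]
    receptor = "R"

    species = set(ligand_variants) | {receptor}
    reactions = []
    for lig in ligand_variants:
        bound = f"{lig}.{receptor}"
        species.add(bound)
        reactions.append((f"{lig} + {receptor} -> {bound}", "kf"))
        reactions.append((f"{bound} -> {lig} + {receptor}", "kr"))

    print(len(species), "species,", len(reactions), "reactions")
    for eqn, rate in reactions:
        print(f"{eqn}   rate constant: {rate}")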
NASA Technical Reports Server (NTRS)
Frank, Andreas O.; Twombly, I. Alexander; Barth, Timothy J.; Smith, Jeffrey D.; Dalton, Bonnie P. (Technical Monitor)
2001-01-01
We have applied the linear elastic finite element method to compute haptic force feedback and domain deformations of soft tissue models for use in virtual reality simulators. Our results show that, for virtual object models built from high-resolution 3D data (>10,000 nodes), haptic real-time computations (>500 Hz) are not currently possible using traditional methods. Current research efforts are focused on the following areas: 1) efficient implementation of fully adaptive multi-resolution methods and 2) multi-resolution methods with specialized basis functions to capture the singularity at the haptic interface (point loading). To achieve real-time computations, we propose parallel processing of a Jacobi-preconditioned conjugate gradient method applied to a reduced system of equations resulting from surface domain decomposition. This can be achieved effectively using reconfigurable computing systems such as field programmable gate arrays (FPGAs), thereby providing a flexible solution that allows for new FPGA implementations as improved algorithms become available. The resulting soft tissue simulation system would meet NASA Virtual Glovebox requirements and, at the same time, provide a generalized simulation engine for any immersive environment application, such as biomedical/surgical procedures or interactive scientific applications.
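The solver strategy named above, a Jacobi-preconditioned conjugate gradient iteration, can be sketched generically as follows; this is a plain NumPy version run on a small random symmetric positive-definite system, not the reduced FEM system or the FPGA-parallel implementation discussed in the abstract.

    import numpy as np

    def jacobi_pcg(A, b, tol=1e-8, max_iter=500):
        # Conjugate gradient with a Jacobi (diagonal) preconditioner for SPD A.
        M_inv = 1.0 / np.diag(A)
        x = np.zeros_like(b)
        r = b - A @ x
        z = M_inv * r
        p = z.copy()
        rz = r @ z
        for _ in range(max_iter):
            Ap = A @ p
            alpha = rz / (p @ Ap)
            x += alpha * p
            r -= alpha * Ap
            if np.linalg.norm(r) < tol:
                break
            z = M_inv * r
            rz_new = r @ z
            p = z + (rz_new / rz) * p
            rz = rz_new
        return x

    # Small SPD test system standing in for the reduced stiffness equations.
    rng = np.random.default_rng(0)
    B = rng.standard_normal((50, 50))
    A = B @ B.T + 50.0 * np.eye(50)
    b = rng.standard_normal(50)
    x = jacobi_pcg(A, b)
    print("residual norm:", np.linalg.norm(A @ x - b))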
Virtual commissioning of automated micro-optical assembly
NASA Astrophysics Data System (ADS)
Schlette, Christian; Losch, Daniel; Haag, Sebastian; Zontar, Daniel; Roßmann, Jürgen; Brecher, Christian
2015-02-01
In this contribution, we present a novel approach to enabling virtual commissioning for process developers in micro-optical assembly. Our approach aims to support micro-optics experts in effectively developing assisted or fully automated assembly solutions without detailed prior experience in programming, while at the same time enabling them to easily implement their own libraries of expert schemes and algorithms for handling optical components. Virtual commissioning is enabled by a 3D simulation and visualization system in which the functionalities and properties of automated systems are modeled, simulated, and controlled based on multi-agent systems. For process development, our approach supports event-, state- and time-based visual programming techniques for the agents and allows for their kinematic motion simulation in combination with looped-in simulation results for the optical components. First results have been achieved by simply switching the agents over to commanding the real hardware setup after successful process implementation and validation in the virtual environment. We evaluated and adapted our system to meet the requirements set by industrial partners, laser manufacturers as well as hardware suppliers of assembly platforms. The concept is applied to the automated assembly of optical components for optically pumped semiconductor lasers and to the positioning of optical components for beam shaping.
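The event- and state-based agent programming mentioned above might, in greatly simplified form, look like the sketch below; the agent class, state names, and events are invented for illustration and do not reflect the authors' framework.

    # Hypothetical state-based assembly agent: it reacts to events by moving
    # through named states. States, events, and transitions are all invented.
    class AssemblyAgent:
        TRANSITIONS = {
            ("idle",      "component_ready"): "aligning",
            ("aligning",  "aligned"):         "bonding",
            ("bonding",   "bond_cured"):      "measuring",
            ("measuring", "in_spec"):         "done",
            ("measuring", "out_of_spec"):     "aligning",   # retry alignment
        }

        def __init__(self):
            self.state = "idle"

        def handle(self, event):
            new_state = self.TRANSITIONS.get((self.state, event))
            if new_state is not None:
                print(f"{self.state} --{event}--> {new_state}")
                self.state = new_state

    agent = AssemblyAgent()
    for event in ["component_ready", "aligned", "bond_cured", "out_of_spec",
                  "aligned", "bond_cured", "in_spec"]:
        agent.handle(event)

The same agent logic could, in principle, be pointed at either the simulated setup or the real hardware, which is the switch-over the abstract describes.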
Banaszek, Daniel; You, Daniel; Chang, Justues; Pickell, Michael; Hesse, Daniel; Hopman, Wilma M; Borschneck, Daniel; Bardana, Davide
2017-04-05
Work-hour restrictions as set forth by the Accreditation Council for Graduate Medical Education (ACGME) and other governing bodies have forced training programs to seek out new learning tools to accelerate acquisition of both medical skills and knowledge. As a result, competency-based training has become an important part of residency training. The purpose of this study was to directly compare arthroscopic skill acquisition in both high-fidelity and low-fidelity simulator models and to assess skill transfer from either modality to a cadaveric specimen, simulating intraoperative conditions. Forty surgical novices (pre-clerkship-level medical students) voluntarily participated in this trial. Baseline demographic data, as well as data on arthroscopic knowledge and skill, were collected prior to training. Subjects were randomized to 5-week independent training sessions on a high-fidelity virtual reality arthroscopic simulator or on a bench-top arthroscopic setup, or to an untrained control group. Post-training, subjects were asked to perform a diagnostic arthroscopy on both simulators and in a simulated intraoperative environment on a cadaveric knee. A more difficult surprise task was also incorporated to evaluate skill transfer. Subjects were evaluated using the Global Rating Scale (GRS), the 14-point arthroscopic checklist, and a timer to determine procedural efficiency (time per task). Secondary outcomes focused on objective measures of virtual reality simulator motion analysis. Trainees on both simulators demonstrated a significant improvement (p < 0.05) in arthroscopic skills compared with baseline scores and untrained controls, both in and ex vivo. The virtual reality simulation group consistently outperformed the bench-top model group in the diagnostic arthroscopy crossover tests and in the simulated cadaveric setup. Furthermore, the virtual reality group demonstrated superior skill transfer in the surprise skill transfer task. Both high-fidelity and low-fidelity simulation trainings were effective in arthroscopic skill acquisition. High-fidelity virtual reality simulation was superior to bench-top simulation in the acquisition of arthroscopic skills, both in the laboratory and in vivo. Further clinical investigation is needed to interpret the importance of these results.
Immersive Environments - A Connectivist Approach
NASA Astrophysics Data System (ADS)
Loureiro, Ana; Bettencourt, Teresa
We are conducting a research project with the aim of achieving better and more efficient ways to facilitate teaching and learning in Higher Level Education. We have chosen virtual environments, with particular emphasis on the Second Life® platform augmented by web 2.0 tools, to develop the study. The Second Life® environment has some interesting characteristics that captured our attention: it is immersive; it is a real-world simulator; it is a social network; it allows real-time communication, cooperation, collaboration and interaction; and it is a safe and controlled environment. We specifically chose tools from web 2.0 that enable sharing and a collaborative way of learning. Through understanding the characteristics of this learning environment, we believe that immersive learning along with other virtual tools can be integrated into today's pedagogical practices.
Steering Control in a Low-Cost Driving Simulator: A Case for the Role of Virtual Vehicle Cab.
Mecheri, Sami; Lobjois, Régis
2018-04-01
The aim of this study was to investigate steering control in a low-cost driving simulator with and without a virtual vehicle cab. In low-cost simulators, the lack of a vehicle cab denies the driver access to vehicle width, which could affect steering control, insofar as locomotor adjustments are known to be based on action-scaled visual judgments of the environment. Two experiments were conducted in which steering control with and without a virtual vehicle cab was investigated in a within-subject design, using cornering and straight-lane-keeping tasks. Driving around curves without vehicle cab information made drivers deviate more from the lane center toward the inner edge in right curves (virtual cab = 4 ± 19 cm; no cab = 42 ± 28 cm at the apex of the curve, p < .001) but not in left curves. More lateral deviation from the lane center toward the edge line was also found when driving without the virtual cab on straight roads (virtual cab = 21 ± 28 cm; no cab = 36 ± 27 cm; p < .001), whereas driving stability and presence ratings were not affected. In both experiments, the greater lateral deviation in the no-cab condition led to significantly more time spent driving off the lane. The findings strongly suggest that without cab information, participants underestimate the distance to the right edge of the car (in contrast to the left edge) and thus the vehicle width. This produces considerable differences in the steering trajectory. Providing a virtual vehicle cab should therefore be encouraged for more effectively capturing drivers' steering control in low-cost simulators.
Tunable aqueous virtual micropore.
Park, Jae Hyun; Guan, Weihua; Reed, Mark A; Krstić, Predrag S
2012-03-26
A charged microparticle can be trapped in an aqueous environment by forming a narrow virtual pore, a cylindrical region of space in which the particle's motion in the radial direction is limited by forces emerging from dynamical interactions of the particle's charge and dipole moment with an external radiofrequency quadrupole electric field. If the particle satisfies the trap stability criteria, its mean motion is reduced exponentially with time due to the viscosity of the aqueous environment; thereafter, the long-time motion of the particle is subject only to random Brownian fluctuations, whose magnitude, influenced by electrophoretic and dielectrophoretic effects and added to the particle size, determines the radius of the virtual pore, as demonstrated by comparison of computer simulations and experiment. The measured size of the virtual nanopore could be utilized to estimate the charge of a trapped micro-object. Copyright © 2012 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Using EMG to anticipate head motion for virtual-environment applications
NASA Technical Reports Server (NTRS)
Barniv, Yair; Aguilar, Mario; Hasanbelliu, Erion
2005-01-01
In virtual environment (VE) applications, where virtual objects are presented in a see-through head-mounted display, virtual images must be continuously stabilized in space in response to the user's head motion. Time delays in head-motion compensation cause virtual objects to "swim" around instead of remaining stable in space, which results in misalignment errors when overlaying virtual and real objects. Visual update delays are a critical technical obstacle for implementing head-mounted displays in applications such as battlefield simulation/training, telerobotics, and telemedicine. Head motion is currently measurable by a head-mounted 6-degrees-of-freedom inertial measurement unit. However, even given this information, overall VE-system latencies cannot be reduced below about 25 ms. We present a novel approach to eliminating latencies, which is premised on the fact that myoelectric signals from a muscle precede its exertion of force, and thereby limb or head acceleration. We thus suggest utilizing neck muscles' myoelectric signals to anticipate head motion. We trained a neural network to map such signals onto equivalent time-advanced inertial outputs. The resulting network can achieve time advances of up to 70 ms.
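The core idea, mapping a window of myoelectric signal onto a time-advanced motion estimate, can be imitated on synthetic data with a simple linear regressor standing in for the authors' neural network; all signals, the 70 ms advance, and the feature window below are assumptions made for illustration.

    import numpy as np

    rng = np.random.default_rng(1)
    fs = 1000                        # 1 kHz sampling rate (illustrative)
    t = np.arange(0.0, 10.0, 1.0 / fs)
    advance = int(0.07 * fs)         # predict 70 ms into the future

    # Synthetic "head velocity" and a noisy "EMG envelope" that leads it by 30 ms,
    # mimicking muscle activity preceding head motion.
    head_vel = np.sin(2 * np.pi * 0.5 * t) * np.exp(-((t % 2.0) - 1.0) ** 2)
    emg = np.roll(head_vel, -30) + 0.2 * rng.standard_normal(t.size)

    # Features: the last 100 EMG samples; target: head velocity 70 ms ahead.
    win = 100
    X = np.stack([emg[i - win:i] for i in range(win, t.size - advance)])
    y = head_vel[win + advance:]

    # Ridge regression in closed form (a stand-in for the trained neural network).
    lam = 1e-2
    w = np.linalg.solve(X.T @ X + lam * np.eye(win), X.T @ y)
    pred = X @ w
    print("correlation with future head velocity:", round(np.corrcoef(pred, y)[0, 1], 3))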
Generating Contextual Descriptions of Virtual Reality (VR) Spaces
NASA Astrophysics Data System (ADS)
Olson, D. M.; Zaman, C. H.; Sutherland, A.
2017-12-01
Virtual reality holds great potential for science communication, education, and research. However, interfaces for manipulating data and environments in virtual worlds are limited and idiosyncratic. Furthermore, speech and vision are the primary modalities by which humans collect information about the world, but the linking of visual and natural language domains is a relatively new pursuit in computer vision. Machine learning techniques have been shown to be effective at image and speech classification, as well as at describing images with language (Karpathy 2016), but have not yet been used to describe potential actions. We propose a technique for creating a library of possible context-specific actions associated with 3D objects in immersive virtual worlds based on a novel dataset generated natively in virtual reality containing speech, image, gaze, and acceleration data. We will discuss the design and execution of a user study in virtual reality that enabled the collection and the development of this dataset. We will also discuss the development of a hybrid machine learning algorithm linking vision data with environmental affordances in natural language. Our findings demonstrate that it is possible to develop a model which can generate interpretable verbal descriptions of possible actions associated with recognized 3D objects within immersive VR environments. This suggests promising applications for more intuitive user interfaces through voice interaction within 3D environments. It also demonstrates the potential to apply vast bodies of embodied and semantic knowledge to enrich user interaction within VR environments. This technology would allow for applications such as expert knowledge annotation of 3D environments, complex verbal data querying and object manipulation in virtual spaces, and computer-generated, dynamic 3D object affordances and functionality during simulations.
Dr.LiTHO: a development and research lithography simulator
NASA Astrophysics Data System (ADS)
Fühner, Tim; Schnattinger, Thomas; Ardelean, Gheorghe; Erdmann, Andreas
2007-03-01
This paper introduces Dr.LiTHO, a research and development oriented lithography simulation environment developed at Fraunhofer IISB to flexibly integrate our simulation models into one coherent platform. We propose a light-weight approach to a lithography simulation environment: the use of a scripting (batch) language as an integration platform. Out of the great variety of different scripting languages, Python proved superior in many ways: it exhibits a gentle learning curve, it is efficient, it is available on virtually any platform, and it provides sophisticated integration mechanisms for existing programs. In this paper, we describe the steps required to provide Python bindings for existing programs and to finally generate an integrated simulation environment. In addition, we give a short introduction to selected software design demands associated with the development of such a framework, focusing especially on testing and (both technical and user-oriented) documentation issues. Dr.LiTHO Python files contain not only all simulation parameter settings but also the simulation flow, providing maximum flexibility. In addition to relatively simple batch jobs, repetitive tasks can be pooled in libraries. And since Python is a full-blown programming language, users can add virtually any functionality, which is especially useful for simulation studies or optimization tasks that often require large numbers of evaluations. Furthermore, we give a short overview of the numerous existing Python packages. Several examples demonstrate the feasibility and productiveness of integrating Python packages into custom Dr.LiTHO scripts.
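A script-driven simulation flow of the kind described might look roughly like the hypothetical sketch below; the functions are toy stand-ins defined inline so the example runs on its own, and none of the names, parameters, or formulas correspond to the actual Dr.LiTHO API.

    # Toy stand-ins for simulator calls; in a real scripting environment these
    # would be provided by the framework's Python bindings.
    def simulate_aerial_image(mask, wavelength_nm, na):
        # Crude "contrast" figure that degrades as the pitch shrinks (illustrative).
        return {"contrast": min(1.0, mask["pitch_nm"] * na / (2.0 * wavelength_nm))}

    def simulate_resist(image, dose, threshold=0.25):
        return {"printed": image["contrast"] * dose >= threshold}

    # The simulation flow itself is an ordinary Python script, so parameter sweeps,
    # loops, and post-processing can use the full language and its libraries.
    mask = {"pitch_nm": 90}
    for dose in (0.4, 0.6, 0.8, 1.0):
        image = simulate_aerial_image(mask, wavelength_nm=193, na=1.35)
        resist = simulate_resist(image, dose)
        print(f"dose={dose:.1f}  contrast={image['contrast']:.2f}  printed={resist['printed']}")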
Developing Simulated Cyber Attack Scenarios Against Virtualized Adversary Networks
2017-03-01
MAST is a custom software framework originally designed to facilitate the training of network administrators on live networks using SimWare. The MAST ... scenario development and testing in a virtual test environment. Commercial and custom software tools that provide the ability to conduct network ...
Simulation-based Testing of Control Software
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ozmen, Ozgur; Nutaro, James J.; Sanyal, Jibonananda
It is impossible to adequately test complex software by examining its operation in a physical prototype of the system monitored. Adequate test coverage can require millions of test cases, and the cost of equipment prototypes combined with the real-time constraints of testing with them makes it infeasible to sample more than a small number of these tests. Model-based testing seeks to avoid this problem by allowing for large numbers of relatively inexpensive virtual prototypes that operate in simulation time at a speed limited only by the available computing resources. In this report, we describe how a computer system emulator can be used as part of a model-based testing environment; specifically, we show that a complete software stack, including operating system and application software, can be deployed within a simulated environment, and that these simulations can proceed as fast as possible. To illustrate this approach to model-based testing, we describe how it is being used to test several building control systems that act to coordinate air conditioning loads for the purpose of reducing peak demand. These tests involve the use of ADEVS (A Discrete Event System Simulator) and QEMU (Quick Emulator) to host the operational software within the simulation, and a building model developed with the MODELICA programming language using the Buildings Library and packaged as an FMU (Functional Mock-up Unit) that serves as the virtual test environment.
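Stripped of the emulator and the FMU machinery, the basic lock-step test loop can be sketched as follows; a trivial thermostat stands in for the control software under test and a first-order thermal model stands in for the building, and every model and parameter here is invented for illustration.

    # Virtual test loop: each step, the "controller" (stand-in for software that
    # would run inside an emulator) reads the zone temperature and commands
    # cooling, then the "plant" (stand-in for the building model) advances.
    def controller(temp_c, cooling_on, setpoint=24.0, band=0.5):
        if temp_c > setpoint + band:
            return True
        if temp_c < setpoint - band:
            return False
        return cooling_on

    def building_step(temp_c, outdoor_c, cooling_on, dt_s):
        tau = 3600.0              # thermal time constant, s (illustrative)
        cool_rate = 5.0 / 3600.0  # cooling rate when AC runs, K/s (illustrative)
        dT = (outdoor_c - temp_c) / tau - (cool_rate if cooling_on else 0.0)
        return temp_c + dT * dt_s

    temp, cooling, dt = 28.0, False, 60.0
    duty = 0
    for step in range(240):                      # four simulated hours, 1-minute steps
        cooling = controller(temp, cooling)
        duty += cooling
        temp = building_step(temp, outdoor_c=32.0, cooling_on=cooling, dt_s=dt)

    print(f"final temperature: {temp:.2f} C, AC duty cycle: {100.0 * duty / 240:.0f}%")

Because nothing here runs in real time, such a loop can be repeated over many weather profiles and setpoint schedules far faster than testing against physical equipment.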
Evaluation of Pseudo-Haptic Interactions with Soft Objects in Virtual Environments.
Li, Min; Sareh, Sina; Xu, Guanghua; Ridzuan, Maisarah Binti; Luo, Shan; Xie, Jun; Wurdemann, Helge; Althoefer, Kaspar
2016-01-01
This paper proposes a pseudo-haptic feedback method conveying simulated soft surface stiffness information through a visual interface. The method exploits a combination of two feedback techniques, namely visual feedback of soft surface deformation and control of the indenter avatar speed, to convey stiffness information for a simulated surface of a soft object in virtual environments. The proposed method was effective in distinguishing different sizes of virtual hard nodules integrated into the simulated soft bodies. To further improve the interactive experience, the approach was extended to create a multi-point pseudo-haptic feedback system. A comparison against a tablet computer incorporating vibration feedback was conducted, using (a) nodule detection sensitivity and (b) elapsed time as performance indicators in hard nodule detection experiments. The multi-point pseudo-haptic interaction is shown to be more time-efficient than the single-point pseudo-haptic interaction, and multi-point pseudo-haptic feedback performs similarly well to the vibration-based feedback method on both performance measures, elapsed time and nodule detection sensitivity. This shows that the proposed method can be used to convey detailed, even subtle, haptic information for virtual environment tasks, using either a computer mouse or a pressure-sensitive device as an input device. The pseudo-haptic feedback method provides an opportunity for low-cost simulation of objects with soft surfaces and hard inclusions, as occur, for example, in increasingly realistic video games that emphasize interaction with the physical environment, and in minimally invasive surgery on soft tissue organs with embedded cancer nodules. Hence, the method can be used in many low-budget applications where haptic sensation is required, such as surgeon training or video games, on either desktop computers or portable devices, showing reasonably high fidelity in conveying stiffness perception to the user.
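The speed-control half of the method can be illustrated by scaling the avatar's displayed velocity with a control/display ratio that drops over stiffer material; the mapping and the constants below are assumptions for illustration, not the authors' implementation.

    # Pseudo-haptic speed modulation: the indenter avatar moves more slowly per
    # unit of input motion over stiffer virtual material, suggesting resistance.
    def avatar_speed(input_speed_px, stiffness_n_per_mm, k=0.05):
        cd_ratio = 1.0 / (1.0 + k * stiffness_n_per_mm)   # falls as stiffness rises
        return input_speed_px * cd_ratio

    for stiffness in (0.0, 5.0, 20.0, 80.0):   # soft tissue ... hard nodule (illustrative)
        print(f"stiffness {stiffness:5.1f} N/mm -> avatar speed "
              f"{avatar_speed(100.0, stiffness):6.1f} px/s")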
Gutiérrez, Fátima; Pierce, Jennifer; Vergara, Víctor M; Coulter, Robert; Saland, Linda; Caudell, Thomas P; Goldsmith, Timothy E; Alverson, Dale C
2007-01-01
Simulations are being used in education and training to enhance understanding, improve performance, and assess competence. However, it is important to measure the performance of these simulations as learning and training tools. This study examined and compared knowledge acquisition using a knowledge structure design. The subjects were first-year medical students at The University of New Mexico School of Medicine. One group used a fully immersive virtual reality (VR) environment with a head-mounted display (HMD), and another group used a partially immersive (computer screen) VR environment. The study aims were to determine whether there were significant differences between the two groups as measured by changes in knowledge structure before and after the VR simulation experience. The results showed that both groups benefited from the VR simulation training, as measured by the significantly increased similarity to the expert knowledge network after the training experience. However, the immersed group showed a significantly higher gain than the partially immersed group. This study demonstrated a positive effect of VR simulation on learning as reflected by improvements in knowledge structure, with an enhanced effect of full immersion using an HMD versus a screen-based VR system.
Challenges and solutions for realistic room simulation
NASA Astrophysics Data System (ADS)
Begault, Durand R.
2002-05-01
Virtual room acoustic simulation (auralization) techniques have traditionally focused on answering questions related to speech intelligibility or musical quality, typically in large volumetric spaces. More recently, auralization techniques have been found to be important for the externalization of headphone-reproduced virtual acoustic images. Although externalization can be accomplished using a minimal simulation, data indicate that realistic auralizations need to be responsive to head motion cues for accurate localization. Computational demands increase when providing for the simulation of coupled spaces, small rooms lacking meaningful reverberant decays, or reflective surfaces in outdoor environments. Auditory threshold data for both early reflections and late reverberant energy levels indicate that much of the information captured in acoustical measurements is inaudible, minimizing the intensive computational requirements of real-time auralization systems. Results are presented for early reflection thresholds as a function of azimuth angle, arrival time, and sound-source type, and for reverberation thresholds as a function of reverberation time and level within octave bands from 250 Hz to 2 kHz. Good agreement is found between data obtained in virtual room simulations and those obtained in real rooms, allowing a strategy for minimizing the computational requirements of real-time auralization systems.
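For reference, the early reflections whose audibility thresholds are discussed above can be enumerated for a rectangular room with the first-order image-source construction sketched below; the room dimensions, positions, and reflection coefficient are arbitrary example values.

    import numpy as np

    room = np.array([6.0, 4.0, 3.0])   # room dimensions, m (example)
    src  = np.array([1.5, 1.0, 1.2])   # source position
    rcv  = np.array([4.0, 3.0, 1.6])   # receiver position
    c = 343.0                          # speed of sound, m/s
    refl = 0.8                         # pressure reflection coefficient (example)

    direct = np.linalg.norm(src - rcv)
    print(f"direct sound: {1000.0 * direct / c:6.2f} ms, rel. level  0.0 dB")

    # First-order image sources: mirror the source across each of the six walls.
    for axis in range(3):
        for wall in (0.0, room[axis]):
            img = src.copy()
            img[axis] = 2.0 * wall - src[axis]
            d = np.linalg.norm(img - rcv)
            delay_ms = 1000.0 * d / c
            level_db = 20.0 * np.log10(refl * direct / d)   # relative to the direct sound
            print(f"reflection:   {delay_ms:6.2f} ms, rel. level {level_db:5.1f} dB")

Reflections whose delay and relative level fall below the measured audibility thresholds could then be dropped from a real-time auralization.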
Connors, Erin C; Yazzolino, Lindsay A; Sánchez, Jaime; Merabet, Lotfi B
2013-03-27
Audio-based Environment Simulator (AbES) is virtual environment software designed to improve real-world navigation skills in the blind. Using only audio-based cues and set within the context of a video game metaphor, users gather relevant spatial information regarding a building's layout. This allows the user to develop an accurate spatial cognitive map of a large-scale three-dimensional space that can be manipulated for the purposes of a real indoor navigation task. After game play, participants are assessed on their ability to navigate within the target physical building represented in the game. Preliminary results suggest that early blind users were able to acquire relevant information regarding the spatial layout of a previously unfamiliar building, as indexed by their performance on a series of navigation tasks. These tasks included path finding through the virtual and physical building, as well as a series of drop-off tasks. We find that the immersive and highly interactive nature of the AbES software appears to greatly engage the blind user to actively explore the virtual environment. Applications of this approach may extend to larger populations of visually impaired individuals.
Linking Immersive Virtual Field Trips with an Adaptive Learning Platform
NASA Astrophysics Data System (ADS)
Bruce, G.; Taylor, W.; Anbar, A. D.; Semken, S. C.; Buxner, S.; Mead, C.; El-Moujaber, E.; Summons, R. E.; Oliver, C.
2016-12-01
The use of virtual environments in science education has been constrained by the difficulty of guiding a learner's actions within those environments. In this work, we demonstrate how advances in education software technology allow educators to create interactive learning experiences that respond and adapt intelligently to learner input within the virtual environment. This innovative technology provides a far greater capacity for delivering authentic inquiry-driven educational experiences in unique settings from around the world. Our immersive virtual field trips (iVFT) bring students virtually to geologically significant but inaccessible environments, where they learn through authentic practices of scientific inquiry. In one recent example, students explore the fossil beds in Nilpena, South Australia, to learn about the Ediacaran fauna. Students interactively engage in 360° recreations of the environment, uncover the nature of the historical ecosystem by identifying fossils with a dichotomous key, explore actual fossil beds in high resolution imagery, and reconstruct what an ecosystem might have looked like millions of years ago in an interactive simulation. With the new capacity to connect actions within the iVFT to an intelligent tutoring system, these learning experiences can be tracked, guided, and tailored individually to the immediate actions of the student. This new capacity also has great potential for learning designers to take a data-driven approach to lesson improvement and for education researchers to study learning in virtual environments. Thus, we expect iVFT will be fertile ground for novel research. Such iVFT are currently in use in several introductory classes offered online at Arizona State University in anthropology, introductory biology, and astrobiology, reaching thousands of students to date. Drawing from these experiences, we are designing a curriculum for historical geology that will be built around iVFT-based exploration of Earth history.
Virtual Reality as a Distraction Technique in Chronic Pain Patients
Gao, Kenneth; Sulea, Camelia; Wiederhold, Mark D.
2014-01-01
We explored the use of virtual reality distraction techniques for use as adjunctive therapy to treat chronic pain. Virtual environments were specifically created to provide pleasant and engaging experiences where patients navigated on their own through rich and varied simulated worlds. Real-time physiological monitoring was used as a guide to determine the effectiveness and sustainability of this intervention. Human factors studies showed that virtual navigation is a safe and effective method for use with chronic pain patients. Chronic pain patients demonstrated significant relief in subjective ratings of pain that corresponded to objective measurements in peripheral, noninvasive physiological measures. PMID:24892196
Comparing two types of navigational interfaces for Virtual Reality.
Teixeira, Luís; Vilar, Elisângela; Duarte, Emília; Rebelo, Francisco; da Silva, Fernando Moreira
2012-01-01
Previous studies suggest significant differences between navigating virtual environments in a life-like walking manner (i.e., using treadmills or walk-in-place techniques) and virtual navigation (i.e., flying while really standing). The latter option, which usually involves hand-centric devices (e.g., joysticks), is the most common in Virtual Reality-based studies, mostly due to low costs and lower space and technology demands. Recently, however, new interaction devices originally conceived for videogames have become available, offering interesting potential for research. This study aimed to explore the potential of the Nintendo Wii Balance Board as a navigation interface in a Virtual Environment presented in an immersive Virtual Reality system. Comparing participants' performance while engaged in a simulated emergency egress allows the adequacy of such an alternative navigation interface to be determined on the basis of empirical results. Forty university students participated in this study. Results show that participants were more efficient when performing navigation tasks using the Joystick than with the Balance Board. However, there were no significant differences in behavioral compliance with exit signs. Therefore, this study suggests that, at least for tasks similar to those studied, the Balance Board has good potential to be used as a navigation interface for Virtual Reality systems.
Enhancing Navigation Skills through Audio Gaming.
Sánchez, Jaime; Sáenz, Mauricio; Pascual-Leone, Alvaro; Merabet, Lotfi
2010-01-01
We present the design, development and initial cognitive evaluation of an Audio-based Environment Simulator (AbES). This software allows a blind user to navigate through a virtual representation of a real space for the purposes of training orientation and mobility skills. Our findings indicate that users feel satisfied and self-confident when interacting with the audio-based interface, and the embedded sounds allow them to correctly orient themselves and navigate within the virtual world. Furthermore, users are able to transfer spatial information acquired through virtual interactions into real world navigation and problem solving tasks.
Simulation Platform: a cloud-based online simulation environment.
Yamazaki, Tadashi; Ikeno, Hidetoshi; Okumura, Yoshihiro; Satoh, Shunji; Kamiyama, Yoshimi; Hirata, Yutaka; Inagaki, Keiichiro; Ishihara, Akito; Kannon, Takayuki; Usui, Shiro
2011-09-01
For multi-scale and multi-modal neural modeling, it is necessary to handle multiple neural models described at different levels seamlessly. Database technology will become more important for these studies, specifically for downloading and handling the neural models seamlessly and effortlessly. To date, conventional neuroinformatics databases have solely been designed to archive model files, but such databases should also give users a chance to validate the models before downloading them. In this paper, we report our ongoing project to develop a cloud-based web service for online simulation called "Simulation Platform". Simulation Platform is a cloud of virtual machines running GNU/Linux. On a virtual machine, various software packages, including developer tools such as compilers and libraries, popular neural simulators such as GENESIS, NEURON and NEST, and scientific software such as Gnuplot, R and Octave, are pre-installed. When a user posts a request, a virtual machine is assigned to the user, and the simulation starts on that machine. The user remotely accesses the machine through a web browser and carries out the simulation, without the need to install any software but a web browser on the user's own computer. Therefore, Simulation Platform is expected to eliminate impediments to handling multiple neural models that require multiple software packages. Copyright © 2011 Elsevier Ltd. All rights reserved.
Virtual Reality Simulation of the International Space Welding Experiment
NASA Technical Reports Server (NTRS)
Phillips, James A.
1996-01-01
Virtual Reality (VR) is a set of breakthrough technologies that allow a human being to enter and fully experience a 3-dimensional, computer simulated environment. A true virtual reality experience meets three criteria: (1) it involves 3-dimensional computer graphics; (2) it includes real-time feedback and response to user actions; and (3) it must provide a sense of immersion. Good examples of virtual reality simulators are the flight simulators used by all branches of the military to train pilots for combat in high performance jet fighters. The fidelity of such simulators is extremely high, but so is the price tag, typically millions of dollars. Virtual reality teaching and training methods are manifestly effective, and we have therefore implemented a VR trainer for the International Space Welding Experiment. My role in the development of the ISWE trainer consisted of the following: (1) created texture-mapped models of the ISWE's rotating sample drum, technology block, tool stowage assembly, sliding foot restraint, and control panel; (2) developed C code for control panel button selection and rotation of the sample drum; (3) in collaboration with Tim Clark (Antares Virtual Reality Systems), developed a serial interface box for the PC and the SGI Indigo so that external control devices, similar to ones actually used on the ISWE, could be used to control virtual objects in the ISWE simulation; (4) in collaboration with Peter Wang (SFFP) and Mark Blasingame (Boeing), established the interference characteristics of the VIM 1000 head-mounted display and tested software filters to correct the problem; (5) in collaboration with Peter Wang and Mark Blasingame, established software and procedures for interfacing the VPL DataGlove and the Polhemus 6DOF position sensors to the SGI Indigo serial ports. The majority of the ISWE modeling effort was conducted on a PC-based VR workstation, described below.
Virtual Habitat -a Dynamic Simulation of Closed Life Support Systems -Overall Status and Outlook
NASA Astrophysics Data System (ADS)
Zhukov, Anton; Schnaitmann, Jonas; Mecsaci, Ahmad; Bickel, Thomas; Markus Czupalla, M. Sc.
In order to optimize Life Support Systems (LSS) at the system level, stability questions and the grade of closure must be investigated. To do so, the exploration group of the Technical University of Munich (TUM) is developing the "Virtual Habitat" (V-HAB) dynamic LSS simulation software. The main advantages of dynamic LSS simulation within V-HAB are the ability to compose different LSS configurations from LSS subsystems, to simulate them dynamically in order to test their stability in different mission scenarios including emergency events, and to define the grade of closure of the LSS. In addition, optimization of LSS based on different criteria will become possible. The Virtual Habitat simulation tool consists of four main modules: the Closed Environment Module (CEM), which monitors compounds in a closed environment; the Crew Module (CM), a dynamic human simulation; the P/C Systems Module (PCSM), covering dynamic P/C subsystems; and the Plant Module (PM), a dynamic plant simulation. Since the first idea and version, the V-HAB simulation has been updated significantly, increasing its capabilities and maturity. The updates introduced here concern all modules of V-HAB. In particular, significant progress has been made in the development of the human model: in addition to the existing human sub-models, three newly developed ones (thermal regulation, digestion, and a schedule controller) have been introduced and are presented. Regarding the Plant Module, a wheat plant model has been integrated into V-HAB and is being correlated against test data. Additionally, a first version of the algae bioreactor model has been developed and integrated. In terms of the P/C Systems Module, an innovative approach for P/C subsystem modelling has been developed and applied. The capabilities and features of the improved V-HAB models and the overall functionality of V-HAB are demonstrated in the form of meaningful test cases. In addition to the presentation of the results, the correlation strategy for the Virtual Habitat simulation is introduced, assessing the models' current confidence level and giving an outlook on the future correlation strategy.
NASA Astrophysics Data System (ADS)
Jaafar, Wan Ahmad; Nur, Sobihatun
This paper outlines the potential use of a virtual environment for persuasion through computer simulation. The main focus of the paper is to show how the virtual rehearsal principle can be designed into educational material, using a CD-ROM based multimedia application, to persuade as well as to reduce children's dental anxiety, particularly in the context of Malaysian children. The paper is divided into three parts. Firstly, we provide a conceptual background of the virtual rehearsal principle and how the principle has been applied in designing the information interfaces and presentation of a persuasive multimedia learning environment (PMLE). Secondly, a research design was administered to measure the effects of the PMLE in reducing children's dental anxiety; primary school children between seven and nine years old were selected as respondents. Thirdly, the results of the study reveal the children's feedback on the baseline test and the children's dental anxiety test. The results of presenting this PMLE to primary school children show that it was able to reduce children's dental anxiety and could help the children be mentally prepared for dental visits in the future.
Christiansen, C; Abreu, B; Ottenbacher, K; Huffman, K; Masel, B; Culpepper, R
1998-08-01
This report describes a reliability study using a prototype computer-simulated virtual environment to assess basic daily living skills in a sample of persons with traumatic brain injury (TBI). The benefits of using virtual reality in training for situations where safety is a factor have been established in defense and industry, but have not been demonstrated in rehabilitation. Participants were thirty subjects with TBI receiving comprehensive rehabilitation services at a residential facility. An immersive virtual kitchen was developed in which a meal preparation task involving multiple steps could be performed. The prototype was tested using subjects who completed the task twice within 7 days. The stability of performance was estimated using intraclass correlation coefficients (ICCs). The ICC value for total performance, based on all steps involved in the meal preparation task, was .73. When three items with low variance were removed, the ICC improved to .81. Little evidence of vestibular-optical side effects was noted in the subjects tested. Adequate initial reliability exists to continue development of the environment as an assessment and training prototype for persons with brain injury.
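The test-retest statistic used here is an intraclass correlation coefficient; a generic two-way ICC(2,1) computation from ANOVA mean squares is sketched below on made-up scores (the values are illustrative and not the study data).

    import numpy as np

    # Rows = subjects, columns = sessions (test, retest); values are invented.
    scores = np.array([
        [22, 24], [18, 17], [30, 28], [25, 27], [15, 14],
        [27, 25], [20, 22], [33, 31], [19, 21], [24, 23],
    ], dtype=float)

    n, k = scores.shape
    grand = scores.mean()
    row_means = scores.mean(axis=1)
    col_means = scores.mean(axis=0)

    ss_rows = k * np.sum((row_means - grand) ** 2)
    ss_cols = n * np.sum((col_means - grand) ** 2)
    ss_err = np.sum((scores - grand) ** 2) - ss_rows - ss_cols

    msr = ss_rows / (n - 1)              # between-subjects mean square
    msc = ss_cols / (k - 1)              # between-sessions mean square
    mse = ss_err / ((n - 1) * (k - 1))   # residual mean square

    # Two-way random effects, absolute agreement, single measure: ICC(2,1).
    icc = (msr - mse) / (msr + (k - 1) * mse + k * (msc - mse) / n)
    print(f"ICC(2,1) = {icc:.2f}")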
Real-time visual simulation of APT system based on RTW and Vega
NASA Astrophysics Data System (ADS)
Xiong, Shuai; Fu, Chengyu; Tang, Tao
2012-10-01
The Matlab/Simulink simulation model of an APT (acquisition, pointing and tracking) system is analyzed and established. The model's C code, which can be used for real-time simulation, is then generated by RTW (Real-Time Workshop). Practical experiments show that running the generated C code gives the same simulation results as running the Simulink model directly in the Matlab environment. MultiGen-Vega is a real-time 3D scene simulation software system. With it and OpenGL, the APT scene simulation platform is developed and used to render and display the virtual scenes of the APT system. To add necessary graphics effects to the virtual scenes in real time, GLSL (OpenGL Shading Language) shaders running on the programmable GPU are used. By calling the C code, the scene simulation platform can adjust the system parameters online and obtain the APT system's real-time simulation data to drive the scenes. Practical application shows that this visual simulation platform has high efficiency, low cost, and good simulation results.
Internet-based distributed collaborative environment for engineering education and design
NASA Astrophysics Data System (ADS)
Sun, Qiuli
2001-07-01
This research investigates the use of the Internet for engineering education, design, and analysis through the presentation of a Virtual City environment. The main focus of this research was to provide an infrastructure for engineering education, test the concept of distributed collaborative design and analysis, develop and implement the Virtual City environment, and assess the environment's effectiveness in the real world. A three-tier architecture was adopted in the development of the prototype, which contains an online database server, a Web server as well as multi-user servers, and client browsers. The environment is composed of five components: a 3D virtual world, multiple Internet-based multimedia modules, an online database, a collaborative geometric modeling module, and a collaborative analysis module. The environment was designed using multiple Internet-based technologies, such as Shockwave, Java, Java 3D, VRML, Perl, ASP, SQL, and a database. These various technologies together formed the basis of the environment and were programmed to communicate smoothly with each other. Three assessments were conducted over a period of three semesters. The Virtual City is open to the public at www.vcity.ou.edu. The online database was designed to manage the changeable data related to the environment. The virtual world was used to implement 3D visualization and tie the multimedia modules together. Students are allowed to build segments of the 3D virtual world upon completion of appropriate undergraduate courses in civil engineering. The end result is a complete virtual world that contains designs from all of their coursework and is viewable on the Internet. The environment is a content-rich educational system, which can be used to teach multiple engineering topics with the help of 3D visualization, animations, and simulations. The concept of collaborative design and analysis using the Internet was investigated and implemented. Geographically dispersed users can build the same geometric model simultaneously over the Internet and communicate with each other through a chat room. They can also conduct finite element analysis collaboratively on the same object over the Internet. They can mesh the same object, apply and edit the same boundary conditions and forces, obtain the same analysis results, and then discuss the results through the Internet.
Immersive Collaboration Simulations: Multi-User Virtual Environments and Augmented Realities
NASA Technical Reports Server (NTRS)
Dede, Chris
2008-01-01
Emerging information technologies are driving shifts in the knowledge and skills society values, the development of new methods of teaching and learning, and changes in the characteristics of learning.
Digital fabrication of multi-material biomedical objects.
Cheung, H H; Choi, S H
2009-12-01
This paper describes a multi-material virtual prototyping (MMVP) system for modelling and digital fabrication of discrete and functionally graded multi-material objects for biomedical applications. The MMVP system consists of a DMMVP module, an FGMVP module and a virtual reality (VR) simulation module. The DMMVP module is used to model discrete multi-material (DMM) objects, while the FGMVP module is for functionally graded multi-material (FGM) objects. The VR simulation module integrates these two modules to perform digital fabrication of multi-material objects, which can be subsequently visualized and analysed in a virtual environment to optimize MMLM processes for fabrication of product prototypes. Using the MMVP system, two biomedical objects, a DMM human spine and an FGM intervertebral disc spacer, are modelled and digitally fabricated for visualization and analysis in a VR environment. These studies show that the MMVP system is a practical tool for modelling, visualization, and subsequent fabrication of biomedical objects of discrete and functionally graded multi-materials for biomedical applications. The system may be adapted to control MMLM machines with appropriate hardware for physical fabrication of biomedical objects.
Ntasis, Efthymios; Maniatis, Theofanis A; Nikita, Konstantina S
2003-01-01
A secure framework is described for real-time tele-collaboration on the Virtual Simulation procedure of Radiation Treatment Planning. An integrated approach is followed, clustering the security issues faced by the system into organizational issues, security issues over the LAN, and security issues over the LAN-to-LAN connection. The design and the implementation of the security services are performed according to the identified security requirements, along with the need for real-time communication between the collaborating health care professionals. A detailed description of the implementation is given, presenting a solution that can be directly tailored to other tele-collaboration services in the field of health care. The pilot study of the proposed security components demonstrates the feasibility of the secure environment and its consistency with the high-performance demands of the application.
Using a Virtual Store As a Research Tool to Investigate Consumer In-store Behavior.
Ploydanai, Kunalai; van den Puttelaar, Jos; van Herpen, Erica; van Trijp, Hans
2017-07-24
People's responses to products and/or choice environments are crucial to understanding in-store consumer behaviors. Currently, there are various approaches (e.g., surveys or laboratory settings) to studying in-store behaviors, but their external validity is limited by how poorly they resemble realistic choice environments. In addition, building a real store to meet experimental conditions while controlling for undesirable effects is costly and highly difficult. A virtual store developed with virtual reality techniques potentially transcends these limitations by offering the simulation of a 3D virtual store environment in a realistic, flexible, and cost-efficient way. In particular, a virtual store allows consumers (participants) to experience and interact with objects in a tightly controlled yet realistic setting. This paper presents the key elements of using a desktop virtual store to study in-store consumer behavior. Descriptions of the protocol steps to 1) build the experimental store, 2) prepare the data management program, 3) run the virtual store experiment, and 4) organize and export data from the data management program are presented. The virtual store enables participants to navigate through the store, choose a product from alternatives, and select or return products. Moreover, consumer-related shopping behaviors (e.g., shopping time, walking speed, and number and type of products examined and bought) can also be collected. The protocol is illustrated with an example of a store layout experiment showing that shelf length and shelf orientation influence shopping- and movement-related behaviors. This demonstrates that the use of a virtual store facilitates the study of consumer responses. The virtual store can be especially helpful when examining factors that are costly or difficult to change in real life (e.g., overall store layout), products that are not presently available in the market, and routinized behaviors in familiar environments.
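The behavioural measures exported in step 4 can be reduced from an event log with only a few lines of code; the event format below is a hypothetical simplification, not the schema of the data management program used in the protocol.

    # Hypothetical event log from one shopping trip: (time_s, event, product).
    events = [
        (0.0,  "enter_store", None),
        (12.4, "examine",     "cereal_A"),
        (19.8, "return",      "cereal_A"),
        (25.1, "examine",     "cereal_B"),
        (31.0, "select",      "cereal_B"),
        (58.7, "examine",     "juice_C"),
        (66.2, "select",      "juice_C"),
        (80.5, "checkout",    None),
    ]

    shopping_time = events[-1][0] - events[0][0]
    examined = [p for _, e, p in events if e == "examine"]
    bought   = [p for _, e, p in events if e == "select"]

    print(f"shopping time: {shopping_time:.1f} s")
    print(f"products examined: {len(examined)} ({', '.join(examined)})")
    print(f"products bought:   {len(bought)} ({', '.join(bought)})")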
Vectors in Use in a 3D Juggling Game Simulation
ERIC Educational Resources Information Center
Kynigos, Chronis; Latsi, Maria
2006-01-01
The new representations enabled by the educational computer game the "Juggler" can place vectors in a central role both for controlling and measuring the behaviours of objects in a virtual environment simulating motion in three-dimensional spaces. The mathematical meanings constructed by 13 year-old students in relation to vectors as…
Immersive Learning Technologies: Realism and Online Authentic Learning
ERIC Educational Resources Information Center
Herrington, Jan; Reeves, Thomas C.; Oliver, Ron
2007-01-01
The development of immersive learning technologies in the form of virtual reality and advanced computer applications has meant that realistic creations of simulated environments are now possible. Such simulations have been used to great effect in training in the military, air force, and in medical training. But how realistic do problems need to be…
Student Perceptions of SocialSim for Simulation-Based Interprofessional Education in Healthcare
ERIC Educational Resources Information Center
Smith, Mary Kathryn
2016-01-01
This descriptive qualitative study investigates perceptions of students regarding the use of SocialSim, a tool designed to deliver simulation in a virtual environment using social media as a platform to facilitate interprofessional education. There have been exponential changes in the U.S. healthcare system in recent years, prompting the need for…
Andersen, Steven Arild Wuyts; Foghsgaard, Søren; Konge, Lars; Cayé-Thomasen, Per; Sørensen, Mads Sølvsten
2016-08-01
To establish the effect of self-directed virtual reality (VR) simulation training on cadaveric dissection training performance in mastoidectomy and the transferability of skills acquired in VR simulation training to the cadaveric dissection training setting. Prospective study. Two cohorts of 20 novice otorhinolaryngology residents received either self-directed VR simulation training before cadaveric dissection training or vice versa. Cadaveric and VR simulation performances were assessed using final-product analysis with three blinded expert raters. The group receiving VR simulation training before cadaveric dissection had a mean final-product score of 14.9 (95% confidence interval [CI], 12.9-16.9) compared with 9.8 (95% CI, 8.4-11.1) in the group not receiving VR simulation training before cadaveric dissection. This 52% increase in performance was statistically significant (P < 0.0001). A single dissection mastoidectomy did not increase VR simulation performance (P = 0.22). Two hours of self-directed VR simulation training was effective in increasing cadaveric dissection mastoidectomy performance, suggesting that mastoidectomy skills are transferable from VR simulation to the traditional dissection setting. Virtual reality simulation training can therefore be employed to optimize training, and can spare the use of donated material and instructional resources for more advanced training after basic competencies have been acquired in the VR simulation environment. NA. Laryngoscope, 126:1883-1888, 2016. © 2015 The American Laryngological, Rhinological and Otological Society, Inc.
NASA Technical Reports Server (NTRS)
Ellis, Stephen R.
2006-01-01
The visual requirements for augmented reality or virtual environment displays that might be used in real or virtual towers are reviewed with respect to similar displays already used in aircraft. As an example of the type of human performance studies needed to determine the useful specifications of augmented reality displays, an optical see-through display was used in an ATC Tower simulation. Three different binocular fields of view (14deg, 28deg, and 47deg) were examined to determine their effect on subjects' ability to detect aircraft maneuvering and landing. The results suggest that binocular fields of view much greater than 47deg are unlikely to dramatically improve search performance and that partial binocular overlap is a feasible display technique for augmented reality Tower applications.
Generalized interactions using virtual tools within the spring framework: cutting
NASA Technical Reports Server (NTRS)
Montgomery, Kevin; Bruyns, Cynthia D.
2002-01-01
We present schemes for real-time generalized mesh cutting. Starting with a basic example, we describe the details of implementing cutting on single and multiple surface objects, as well as on hybrid and volumetric meshes, using virtual tools with single and multiple cutting surfaces. These methods have been implemented in a robust surgical simulation environment, allowing us to model procedures ranging from animal dissection to cleft lip correction.
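One elementary building block of mesh cutting, classifying triangles against a cutting surface and separating the mesh, is sketched below with a plane as the cutting surface; real-time re-tessellation of intersected triangles, as required in the simulator described, is considerably more involved, and all geometry here is invented.

    import numpy as np

    # Toy triangle mesh: a few faces of a unit cube (illustrative only).
    vertices = np.array([
        [0, 0, 0], [1, 0, 0], [1, 1, 0], [0, 1, 0],
        [0, 0, 1], [1, 0, 1], [1, 1, 1], [0, 1, 1],
    ], dtype=float)
    triangles = np.array([
        [0, 1, 2], [0, 2, 3], [4, 5, 6], [4, 6, 7],
        [0, 1, 5], [0, 5, 4], [2, 3, 7], [2, 7, 6],
    ])

    # Cutting plane through point p0 with normal n (here the plane z = 0.5).
    p0 = np.array([0.0, 0.0, 0.5])
    n = np.array([0.0, 0.0, 1.0])

    # Assign each triangle to one side of the plane by the sign of its centroid's
    # signed distance; triangles straddling the plane would be re-tessellated in
    # a full cutting implementation.
    centroids = vertices[triangles].mean(axis=1)
    above = (centroids - p0) @ n > 0.0

    print(f"{int(above.sum())} triangles above the cut, {int((~above).sum())} below")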
Virtual reality in the assessment of selected cognitive function after brain injury.
Zhang, L; Abreu, B C; Masel, B; Scheibel, R S; Christiansen, C H; Huddleston, N; Ottenbacher, K J
2001-08-01
To assess selected cognitive functions of persons with traumatic brain injury using a computer-simulated virtual reality environment. A computer-simulated virtual kitchen was used to assess the ability of 30 patients with brain injury and 30 volunteers without brain injury to process and sequence information. The overall assessment score was based on the number of correct responses and the time needed to complete daily living tasks. Identical daily living tasks were tested and scored in participants with and without brain injury. Each subject was evaluated twice within 7 to 10 days. A total of 30 tasks were categorized as follows: information processing, problem solving, logical sequencing, and speed of responding. Persons with brain injuries consistently demonstrated a significant decrease in the ability to process information (P = 0.04-0.01), identify logical sequencing (P = 0.04-0.01), and complete the overall assessment (P < 0.01), compared with volunteers without brain injury. The time needed to process tasks, representing speed of cognitive responding, was also significantly different between the two groups (P < 0.01). A computer-generated virtual reality environment represents a reproducible tool to assess selected cognitive functions and can be used as a supplement to traditional rehabilitation assessment in persons with acquired brain injury.
Weiss, Patrice L.; Keshner, Emily A.
2015-01-01
The primary focus of rehabilitation for individuals with loss of upper limb movement as a result of acquired brain injury is the relearning of specific motor skills and daily tasks. This relearning is essential because the loss of upper limb movement often results in a reduced quality of life. Although rehabilitation strives to take advantage of neuroplastic processes during recovery, results of traditional approaches to upper limb rehabilitation have not entirely met this goal. In contrast, enriched training tasks, simulated with a wide range of low- to high-end virtual reality–based simulations, can be used to provide meaningful, repetitive practice together with salient feedback, thereby maximizing neuroplastic processes via motor learning and motor recovery. Such enriched virtual environments have the potential to optimize motor learning by manipulating practice conditions that explicitly engage motivational, cognitive, motor control, and sensory feedback–based learning mechanisms. The objectives of this article are to review motor control and motor learning principles, to discuss how they can be exploited by virtual reality training environments, and to provide evidence concerning current applications for upper limb motor recovery. The limitations of the current technologies with respect to their effectiveness and transfer of learning to daily life tasks also are discussed. PMID:25212522
Modeling mechanical cardiopulmonary interactions for virtual environments.
Kaye, J M
1997-01-01
We have developed a computer system for modeling mechanical cardiopulmonary behavior in an interactive, 3D virtual environment. The system consists of a compact, scalar description of cardiopulmonary mechanics, with an emphasis on respiratory mechanics, that drives deformable 3D anatomy to simulate mechanical behaviors of and interactions between physiological systems. Such an environment can be used to facilitate exploration of cardiopulmonary physiology, particularly in situations that are difficult to reproduce clinically. We integrate 3D deformable body dynamics with new, formal models of (scalar) cardiorespiratory physiology, associating the scalar physiological variables and parameters with corresponding 3D anatomy. Our approach is amenable to modeling patient-specific circumstances in two ways. First, using CT scan data, we apply semi-automatic methods for extracting and reconstructing the anatomy to use in our simulations. Second, our scalar models are defined in terms of clinically-measurable, patient-specific parameters. This paper describes our approach and presents a sample of results showing normal breathing and acute effects of pneumothoraces.
Virtual reality enhanced mannequin (VREM) that is well received by resuscitation experts.
Semeraro, Federico; Frisoli, Antonio; Bergamasco, Massimo; Cerchiari, Erga L
2009-04-01
The objective of this study was to test acceptance of, and interest in, a newly developed prototype of virtual reality enhanced mannequin (VREM) on a sample of congress attendees who volunteered to participate in the evaluation session and to respond to a specifically designed questionnaire. A commercial Laerdal HeartSim 4000 mannequin was developed to integrate virtual reality (VR) technologies with specially developed virtual reality software to increase the immersive perception of emergency scenarios. To evaluate the acceptance of a virtual reality enhanced mannequin (VREM), we presented it to a sample of 39 possible users. Each evaluation session involved one trainee and two instructors with a standardized procedure and scenario: the operator was invited by the instructor to wear the data-gloves and the head-mounted display and was briefly introduced to the scope of the simulation. The instructor helped the operator familiarize himself with the environment. After the patient's collapse, the operator was asked to check the patient's clinical conditions and start CPR. Finally, the patient started to recover signs of circulation and the evaluation session was concluded. Each participant was then asked to respond to a questionnaire designed to explore the trainee's perception in the areas of user-friendliness, realism, and interaction/immersion. Overall, the evaluation of the system was very positive, as was the feeling of immersion and realism of the environment and simulation. Overall, 84.6% of the participants judged the virtual reality experience as interesting and believed that its development could be very useful for healthcare training. The prototype of the virtual reality enhanced mannequin was well liked, without interference from the interaction devices, and deserves full technological development and validation in emergency medical training.
CCSDS Advanced Orbiting Systems Virtual Channel Access Service for QoS MACHETE Model
NASA Technical Reports Server (NTRS)
Jennings, Esther H.; Segui, John S.
2011-01-01
To support various communications requirements imposed by different missions, interplanetary communication protocols need to be designed, validated, and evaluated carefully. Multimission Advanced Communications Hybrid Environment for Test and Evaluation (MACHETE), described in "Simulator of Space Communication Networks" (NPO-41373), NASA Tech Briefs, Vol. 29, No. 8 (August 2005), p. 44, combines various tools for simulation and performance analysis of space networks. The MACHETE environment supports orbital analysis, link budget analysis, communications network simulations, and hardware-in-the-loop testing. By building abstract behavioral models of network protocols, one can validate performance after identifying the appropriate metrics of interest. The innovators have extended the MACHETE model library to include a generic link-layer Virtual Channel (VC) model supporting quality-of-service (QoS) controls based on IP streams. The main purpose of this generic Virtual Channel model addition was to interface fine-grain flow-based QoS between the network and MAC layers of the QualNet simulator, a commercial component of MACHETE. This software model adds the capability of mapping IP streams, based on header fields, to virtual channel numbers, allowing extended QoS handling at the link layer. This feature further refines the QoS already existing at the network layer. QoS at the network layer (e.g. diffserv) supports few QoS classes, so data from one class will be aggregated together; differentiating between flows internal to a class/priority is not supported. By adding QoS classification capability between network and MAC layers through VC, one maps multiple VCs onto the same physical link. Users then specify different VC weights, and different queuing and scheduling policies at the link layer. This VC model supports system performance analysis of various virtual channel link-layer QoS queuing schemes independent of the network-layer QoS systems.
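For illustration, a minimal Python sketch of the idea described above: classifying IP packets onto link-layer virtual channels by header fields and draining the VC queues with a weighted scheduler. The field names, channel numbers, weights, and the round-robin policy are assumptions for the sketch, not the MACHETE/QualNet implementation.

```python
# Illustrative sketch: map IP header fields to virtual channel numbers, then
# serve the per-VC queues with a simple weighted round-robin scheduler.
from collections import deque

# Hypothetical mapping of (dscp, dst_port) -> virtual channel number.
VC_MAP = {
    ("EF", 5001): 0,     # e.g. a telemetry stream
    ("AF11", 8080): 1,   # e.g. a file-transfer stream
}
DEFAULT_VC = 2           # best-effort channel for unmatched flows

VC_WEIGHTS = {0: 4, 1: 2, 2: 1}                 # relative link share per channel
vc_queues = {vc: deque() for vc in VC_WEIGHTS}

def classify(packet):
    """Map a packet (dict of header fields) to a VC number."""
    return VC_MAP.get((packet["dscp"], packet["dst_port"]), DEFAULT_VC)

def enqueue(packet):
    vc_queues[classify(packet)].append(packet)

def weighted_round_robin():
    """Yield packets, serving each VC up to its weight per scheduling round."""
    while any(vc_queues.values()):
        for vc, weight in sorted(VC_WEIGHTS.items()):
            for _ in range(weight):
                if vc_queues[vc]:
                    yield vc, vc_queues[vc].popleft()

if __name__ == "__main__":
    enqueue({"dscp": "EF", "dst_port": 5001, "payload": "cmd"})
    enqueue({"dscp": "AF11", "dst_port": 8080, "payload": "chunk-1"})
    enqueue({"dscp": "CS0", "dst_port": 9999, "payload": "bulk"})
    for vc, pkt in weighted_round_robin():
        print(f"VC {vc} -> {pkt['payload']}")
```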
Developing Performance Measures for Army Aviation Collective Training
2011-05-01
simulation-based training, such as ATX, is determined by performance improvement of participants within the virtual-training environment (Bell & Waag ... of the collective behavior (Bell & Waag, 1998). In ATX, system-based (i.e., simulator) data can be used to extract measures such as timing of events ... to CABs. References: Bell, H. H., & Waag, W. L. (1998). Evaluating the effectiveness of flight simulators for training combat
NAVO MSRC Navigator. Fall 2001
2001-01-01
of the CAVE. A view from the VR Juggler simulator. The particles indicate snow (white) & ice (blue). Rainfall is shown on the terrain, and clouds as ... the Cover: Virtual environment built by the NAVO MSRC Visualization Center for the Concurrent Computing Laboratory for Materials Simulation at ... Louisiana State University. This application allows the researchers to visualize a million-atom simulation of an indentor puncturing a block of gallium
Reality check: the role of realism in stress reduction using media technology.
de Kort, Y A W; Ijsselsteijn, W A
2006-04-01
There is a growing interest in the use of virtual and other mediated environments for therapeutic purposes. However, in the domain of restorative environments, virtual reality (VR) technology has hardly been used. Here the tendency has been to use mediated real environments, striving for maximum visual realism. This use of photographic material is mainly based on research on aesthetic judgments that has demonstrated the validity of this type of simulation as a representation of real environments. Thus, restoration therapy is developing under the untested assumption that photorealistic images have the optimal level of realism, while in therapeutic applications 'experiential realism' seems to be the key rather than visual realism. The present paper discusses this contrast and briefly describes data from three studies aimed at exploring the importance and meaning of realism in the context of restorative environments.
Wayfinding in Aging and Alzheimer’s Disease within a Virtual Senior Residence: Study Protocol
DAVIS, Rebecca; OHMAN, Jennifer
2017-01-01
Aim: To report a study protocol that examines the impact of adding salient cues within a virtual reality simulation of a senior residential building on wayfinding for older adults with and without Alzheimer’s disease. Background: An early symptom of Alzheimer’s disease is the inability to find one’s way (wayfinding). Senior residential environments are especially difficult for wayfinding. Salient cues may be able to help persons with Alzheimer’s disease find their way more effectively so they can maintain independence. Design: A repeated measures, within- and between-subjects design. Methods: This study was funded by the National Institutes of Health (August 2012). Older adults (n=40) with normal cognition and older adults with early stage Alzheimer’s disease/mild cognitive impairment (n=40) will try to find their way to a location repeatedly within a virtual reality simulation of a senior residence. There are two environments: standard (no cues) and salient (multiple cues). Outcome measures include how often and how quickly participants find the target location in each cue condition. Discussion: The results of this study have the potential to provide evidence for ways to make the environment more supportive for wayfinding for older adults with Alzheimer’s disease. This study is registered at Trialmatch.alz.org (Identifier 260425-5). PMID:26915997
NASA Technical Reports Server (NTRS)
Duncan, K. M.; Harm, D. L.; Crosier, W. G.; Worthington, J. W.
1993-01-01
A unique training device is being developed at the Johnson Space Center Neurosciences Laboratory to help reduce or eliminate Space Motion Sickness (SMS) and spatial orientation disturbances that occur during spaceflight. The Device for Orientation and Motion Environments Preflight Adaptation Trainer (DOME PAT) uses virtual reality technology to simulate some sensory rearrangements experienced by astronauts in microgravity. By exposing a crew member to this novel environment preflight, it is expected that he/she will become partially adapted, and thereby suffer fewer symptoms inflight. The DOME PAT is a 3.7 m spherical dome, within which a 170 by 100 deg field of view computer-generated visual database is projected. The visual database currently in use depicts the interior of a Shuttle spacelab. The trainee uses a six degree-of-freedom, isometric force hand controller to navigate through the virtual environment. Alternatively, the trainee can be 'moved' about within the virtual environment by the instructor, or can look about within the environment by wearing a restraint that controls scene motion in response to head movements. The computer system is comprised of four personal computers that provide the real time control and user interface, and two Silicon Graphics computers that generate the graphical images. The image generator computers use custom algorithms to compensate for spherical image distortion, while maintaining a video update rate of 30 Hz. The DOME PAT is the first such system known to employ virtual reality technology to reduce the untoward effects of the sensory rearrangement associated with exposure to microgravity, and it does so in a very cost-effective manner.
Estimating Distance in Real and Virtual Environments: Does Order Make a Difference?
Ziemer, Christine J.; Plumert, Jodie M.; Cremer, James F.; Kearney, Joseph K.
2010-01-01
This investigation examined how the order in which people experience real and virtual environments influences their distance estimates. Participants made two sets of distance estimates in one of the following conditions: 1) real environment first, virtual environment second; 2) virtual environment first, real environment second; 3) real environment first, real environment second; or 4) virtual environment first, virtual environment second. In Experiment 1, participants imagined how long it would take to walk to targets in real and virtual environments. Participants’ first estimates were significantly more accurate in the real than in the virtual environment. When the second environment was the same as the first environment (real-real and virtual-virtual), participants’ second estimates were also more accurate in the real than in the virtual environment. When the second environment differed from the first environment (real-virtual and virtual-real), however, participants’ second estimates did not differ significantly across the two environments. A second experiment in which participants walked blindfolded to targets in the real environment and imagined how long it would take to walk to targets in the virtual environment replicated these results. These subtle, yet persistent order effects suggest that memory can play an important role in distance perception. PMID:19525540
2000-01-01
for flight test data, and both generic and specialized tools for data filtering, data calibration, modeling, system identification, and simulation ... Contents include: A Grammatical Model and Parser for Air Traffic Controller's Commands; A Speech-Controlled Interactive Virtual Environment for Ship Familiarization; Modeling and Simulation in the 21st Century; New COTS Hardware and Software Reduce the Cost and Effort in Replacing Aging Flight Simulators Subsystems
Using Virtualization to Integrate Weather, Climate, and Coastal Science Education
NASA Astrophysics Data System (ADS)
Davis, J. R.; Paramygin, V. A.; Figueiredo, R.; Sheng, Y.
2012-12-01
To better understand and communicate the important roles of weather and climate in the coastal environment, a unique publicly available tool is being developed to support research, education, and outreach activities. This tool uses virtualization technologies to facilitate an interactive, hands-on environment in which students, researchers, and the general public can perform their own numerical modeling experiments. While prior efforts have focused solely on the study of the coastal and estuary environments, this effort incorporates the community supported weather and climate model (WRF-ARW) into the Coastal Science Educational Virtual Appliance (CSEVA), an education tool used to assist in the learning of coastal transport processes; storm surge and inundation; and evacuation modeling. The Weather Research and Forecasting (WRF) Model is a next-generation, community developed and supported, mesoscale numerical weather prediction system designed to be used internationally for research, operations, and teaching. It includes two dynamical solvers (ARW - Advanced Research WRF and NMM - Nonhydrostatic Mesoscale Model) as well as a data assimilation system. WRF-ARW is the ARW dynamics solver combined with other components of the WRF system which was developed primarily at NCAR, with community support provided by the Mesoscale and Microscale Meteorology (MMM) division of the National Center for Atmospheric Research (NCAR). Included with WRF is the WRF Pre-processing System (WPS), which is a set of programs to prepare input for real-data simulations. The CSEVA is based on the Grid Appliance (GA) framework and is built using virtual machine (VM) and virtual networking technologies. Virtualization supports integration of an operating system, libraries (e.g. Fortran, C, Perl, NetCDF, etc. necessary to build WRF), web server, numerical models/grids/inputs, pre-/post-processing tools (e.g. WPS / RIP4 or UPS), graphical user interfaces, "Cloud"-computing infrastructure and other tools into a single ready-to-use package. Thus, the previously onerous task of setting up and compiling these tools becomes obsolete and the researcher, educator, or student can focus on using the tools to study the interactions between weather, climate and the coastal environment. The incorporation of WRF into the CSEVA has been designed to be synergistic with the extensive online tutorials and biannual tutorials hosted by NCAR. Included are working examples of the idealized test simulations provided with WRF (2D sea breeze and squalls, a large eddy simulation, a Held and Suarez simulation, etc.). To demonstrate the integration of weather, climate, and coastal science education, example applications are being developed to demonstrate how the system can be used to couple a coastal and estuarine circulation, transport and storm surge model with downscaled reanalysis weather and future climate predictions. Documentation, tutorials and the enhanced CSEVA itself can be found on the web at: http://cseva.coastal.ufl.edu.
Virtual Environment Training on Mobile Devices
2013-09-01
Contents (fragment): Nonfunctional Requirements; Product Features; Software Production; Limitations ... on Android and iOS tablets. Product Features: 1. The final product shall include interactive 3D graphics with simulated representation of actual
Software as a service approach to sensor simulation software deployment
NASA Astrophysics Data System (ADS)
Webster, Steven; Miller, Gordon; Mayott, Gregory
2012-05-01
Traditionally, military simulation has been problem domain specific. Executing an exercise currently requires multiple simulation software providers to specialize, deploy, and configure their respective implementations, integrate the collection of software to achieve a specific system behavior, and then execute for the purpose at hand. This approach leads to rigid system integrations which require simulation expertise for each deployment due to changes in location, hardware, and software. Our alternative is Software as a Service (SaaS) predicated on the virtualization of Night Vision Electronic Sensors (NVESD) sensor simulations as an exemplary case. Management middleware elements layer self-provisioning, configuration, and integration services onto the virtualized sensors to present a system of services at run time. Given an Infrastructure as a Service (IaaS) environment, the enabled and managed system of simulations yields a durable SaaS delivery without requiring user simulation expertise. Persistent SaaS simulations would provide on-demand availability to connected users, decrease integration costs and timelines, and benefit the domain community through immediate deployment of lessons learned.
Riesen, Eleanor; Morley, Michelle; Clendinneng, Debra; Ogilvie, Susan; Ann Murray, Mary
2012-07-01
Interprofessional simulation interventions, especially when face-to-face, involve considerable resources and require that all participants convene in a single location at a specific time. Scheduling multiple people across different programs is an important barrier to implementing interprofessional education interventions. This study explored a novel way to overcome the challenges associated with scheduling interprofessional learning experiences through the use of simulations in a virtual environment (Web.Alive™) where learners interact as avatars. In this study, 60 recent graduates from nursing, paramedic, police, and child and youth service programs participated in a 2-day workshop designed to improve interprofessional competencies through a blend of learning environments that included virtual face-to-face experiences, traditional face-to-face experiences and online experiences. Changes in learners' interprofessional competence were assessed through three outcomes: change in interprofessional attitudes pre- to post-workshop, self-perceived changes in interprofessional competence and observer ratings of performance across three clinical simulations. Results from the study indicate that from baseline to post-intervention, there was significant improvement in learners' interprofessional competence across all outcomes, and that the blended learning environment provided an acceptable way to develop these competencies.
Virtual reality simulation of fuzzy-logic control during underwater dynamic positioning
NASA Astrophysics Data System (ADS)
Thekkedan, Midhin Das; Chin, Cheng Siong; Woo, Wai Lok
2015-03-01
In this paper, graphical-user-interface (GUI) software for simulation and fuzzy-logic control of a remotely operated vehicle (ROV), built with the MATLAB™ GUI Designing Environment, is proposed. The proposed ROV GUI platform allows a fuzzy-logic control system design to be compared systematically and interactively with other controllers such as proportional-integral-derivative (PID) and sliding-mode control (SMC). External disturbances such as sea currents can be added to better model the actual underwater environment. The simulation results showed that the position responses of the fuzzy-logic controller exhibit reasonable performance under sea current disturbance.
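As a rough illustration of the comparison described above, the Python sketch below runs a 1-D depth-hold loop for an ROV under a constant current-induced force, once with a PID controller and once with a tiny rule-based fuzzy controller. The plant model, membership functions, rule table, and gains are all invented for the sketch and are not the paper's MATLAB implementation.

```python
# Minimal sketch: PID vs. a small fuzzy controller for ROV depth hold
# under a constant current disturbance (all parameters are illustrative).
import numpy as np

def tri(x, a, b, c):
    """Triangular membership function peaking at b."""
    return max(min((x - a) / (b - a), (c - x) / (c - b)), 0.0)

def up(x, a, b):
    """Shoulder that rises from 0 at a to 1 at b and saturates."""
    return min(max((x - a) / (b - a), 0.0), 1.0)

def fuzzy_thrust(err, derr):
    # Memberships: negative / zero / positive for error and error-rate.
    e = {"N": 1 - up(err, -2, 0), "Z": tri(err, -2, 0, 2), "P": up(err, 0, 2)}
    d = {"N": 1 - up(derr, -1, 0), "Z": tri(derr, -1, 0, 1), "P": up(derr, 0, 1)}
    # Rule table: thrust singleton for each (error, error-rate) pair.
    out = {("N", "N"): -8, ("N", "Z"): -6, ("N", "P"): -2,
           ("Z", "N"): -3, ("Z", "Z"):  0, ("Z", "P"):  3,
           ("P", "N"):  2, ("P", "Z"):  6, ("P", "P"):  8}
    num = sum(min(e[i], d[j]) * out[(i, j)] for i in e for j in d)
    den = sum(min(e[i], d[j]) for i in e for j in d) + 1e-9
    return num / den                              # weighted-average defuzzification

def simulate(controller, t_end=120.0, dt=0.05, target=5.0, current_force=1.5):
    z = v = integ = 0.0
    prev_err = target
    mass, drag = 20.0, 4.0
    for _ in np.arange(0.0, t_end, dt):
        err = target - z
        integ += err * dt
        derr = (err - prev_err) / dt
        u = controller(err, derr, integ)          # commanded vertical thrust
        a = (u + current_force - drag * v) / mass # thrust + disturbance - linear drag
        v += a * dt
        z += v * dt
        prev_err = err
    return z

pid = lambda e, de, i: 6.0 * e + 0.4 * i + 8.0 * de   # hand-tuned PID gains
fuz = lambda e, de, i: fuzzy_thrust(e, de)

print("final depth, PID  :", round(simulate(pid), 2))
print("final depth, fuzzy:", round(simulate(fuz), 2))
```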
Novel Virtual User Models of Mild Cognitive Impairment for Simulating Dementia
Segkouli, Sofia; Tzovaras, Dimitrios; Tsakiris, Thanos; Tsolaki, Magda; Karagiannidis, Charalampos
2015-01-01
Virtual user modeling research has attempted to address critical issues of human-computer interaction (HCI) such as usability and utility through a large number of analytic, usability-oriented approaches as cognitive models in order to provide users with experiences fitting to their specific needs. However, there is demand for more specific modules embodied in cognitive architecture that will detect abnormal cognitive decline across new synthetic task environments. Also, accessibility evaluation of graphical user interfaces (GUIs) requires considerable effort for enhancing ICT products accessibility for older adults. The main aim of this study is to develop and test virtual user models (VUM) simulating mild cognitive impairment (MCI) through novel specific modules, embodied at cognitive models and defined by estimations of cognitive parameters. Well-established MCI detection tests assessed users' cognition, elaborated their ability to perform multitasks, and monitored the performance of infotainment related tasks to provide more accurate simulation results on existing conceptual frameworks and enhanced predictive validity in interfaces' design supported by increased tasks' complexity to capture a more detailed profile of users' capabilities and limitations. The final outcome is a more robust cognitive prediction model, accurately fitted to human data to be used for more reliable interfaces' evaluation through simulation on the basis of virtual models of MCI users. PMID:26339282
The virtual terrorism response academy: training for high-risk, low-frequency threats.
Henderson, Joseph V
2005-01-01
The Virtual Terrorism Response Academy is a reusable virtual learning environment to prepare emergency responders to deal with high-risk, low-frequency events in general, terrorist attacks in particular. The principal learning strategy is a traditional one: apprenticeship. Trainees enter the Academy and travel through its halls, selecting different learning experiences under the guidance of instructors who are simultaneously master practitioners and master trainers. The mentors are real individuals who have been videotaped according to courseware designs; they are subsequently available at any time or location via broadband Internet or CD-ROM. The Academy features a Simulation Area where trainees are briefed on a given scenario, select appropriate resources (e.g., protective equipment and hazmat instruments), then enter a 3-dimensional space where they must deal with various situations. Simulations are done under the guidance of a master trainer who functions as a coach, asking questions, pointing out things, explaining his reasoning at various points in the simulation. This is followed by a debriefing and discussion of lessons that could be learned from the simulation and the trainee's decisions.
NASA Technical Reports Server (NTRS)
Roberts, Aaron
2005-01-01
New tools for data access and visualization promise to make the analysis of space plasma data both more efficient and more powerful, especially for answering questions about the global structure and dynamics of the Sun-Earth system. We will show how new existing tools (particularly the Virtual Space Physics Observatory-VSPO-and the Visual System for Browsing, Analysis and Retrieval of Data-ViSBARD; look for the acronyms in Google) already provide rapid access to such information as spacecraft orbits, browse plots, and detailed data, as well as visualizations that can quickly unite our view of multispacecraft observations. We will show movies illustrating multispacecraft observations of the solar wind and magnetosphere during a magnetic storm, and of simulations of 30-spacecraft observations derived from MHD simulations of the magnetosphere sampled along likely trajectories of the spacecraft for the MagCon mission. An important issue remaining to be solved is how best to integrate simulation data and services into the Virtual Observatory environment, and this talk will hopefully stimulate further discussion along these lines.
Vermeeren, G; Gosselin, M C; Kühn, S; Kellerman, V; Hadjem, A; Gati, A; Joseph, W; Wiart, J; Meyer, F; Kuster, N; Martens, L
2010-09-21
The environment is an important parameter when evaluating the exposure to radio-frequency electromagnetic fields. This study investigates numerically the variation in the whole-body and peak spatially averaged specific absorption rate (SAR) in the heterogeneous virtual family male placed in front of a base station antenna in a reflective environment. The SAR values in a reflective environment are also compared to the values obtained when no environment is present (free space). The virtual family male has been placed at four distances (30 cm, 1 m, 3 m and 10 m) in front of six base station antennas (operating at 300 MHz, 450 MHz, 900 MHz, 2.1 GHz, 3.5 GHz and 5.0 GHz, respectively) and in three reflective environments (a perfectly conducting wall, a perfectly conducting ground and a perfectly conducting ground + wall). A total of 72 configurations are examined. The absorption in the heterogeneous body model is determined using the 3D electromagnetic (EM) finite-difference time-domain (FDTD) solver Semcad-X. For the larger simulations, requirements in terms of computer resources are reduced by using a generalized Huygens' box approach. It has been observed that the ratio of the SAR in the virtual family male in a reflective environment and the SAR in the virtual family male in the free-space environment ranged from -8.7 dB up to 8.0 dB. A worst-case reflective environment could not be determined. ICNIRP reference levels were not always shown to be compliant with the basic restrictions.
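The reported range of -8.7 dB to 8.0 dB is a ratio of two SAR values expressed in decibels. A tiny Python sketch of that conversion, with made-up SAR values rather than the paper's data:

```python
# Sketch of the dB ratio quoted above: SAR in a reflective environment
# relative to SAR in free space. The input values are invented examples.
import math

def sar_ratio_db(sar_reflective, sar_free_space):
    return 10.0 * math.log10(sar_reflective / sar_free_space)

print(round(sar_ratio_db(0.135, 0.100), 1))  # e.g. +1.3 dB (higher in the reflective case)
print(round(sar_ratio_db(0.060, 0.100), 1))  # e.g. -2.2 dB (lower in the reflective case)
```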
eduCRATE--a Virtual Hospital architecture.
Stoicu-Tivadar, Lăcrimioara; Stoicu-Tivadar, Vasile; Berian, Dorin; Drăgan, Simona; Serban, Alexandru; Serban, Corina
2014-01-01
eduCRATE is a complex project proposal which aims to develop a virtual learning environment offering interactive digital content through original and integrated solutions using cloud computing, complex multimedia systems in virtual space and personalized design with avatars. Compared with existing similar products, the project brings the novelty of using languages for medical guides in order to ensure maximum flexibility. The Virtual Hospital simulations will create interactive clinical scenarios for which students will find solutions for positive diagnosis and therapeutic management. The solution based on cloud computing and immersive multimedia is an attractive option in education because it is economical and matches the current working style of the young generation it addresses.
Classification of EMG signals using artificial neural networks for virtual hand prosthesis control.
Mattioli, Fernando E R; Lamounier, Edgard A; Cardoso, Alexandre; Soares, Alcimar B; Andrade, Adriano O
2011-01-01
Computer-based training systems have been widely studied in the field of human rehabilitation. In health applications, Virtual Reality presents itself as an appropriate tool to simulate training environments without exposing the patients to risks. In particular, virtual prosthetic devices have been used to reduce the great mental effort needed by patients fitted with myoelectric prosthesis, during the training stage. In this paper, the application of Virtual Reality in a hand prosthesis training system is presented. To achieve this, the possibility of exploring Neural Networks in a real-time classification system is discussed. The classification technique used in this work resulted in a 95% success rate when discriminating 4 different hand movements.
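For orientation only, the Python sketch below trains a small neural network to separate four hand-movement classes from simple time-domain EMG features. The synthetic data generator, the feature choices (mean absolute value and RMS per channel), and the network size are assumptions; this is not the authors' classifier or dataset.

```python
# Illustrative sketch: 4-class hand-movement classification from simple
# time-domain EMG features using a small multilayer perceptron.
import numpy as np
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
N_CHANNELS, N_PER_CLASS, CLASSES = 4, 200, 4

def synthetic_features(label):
    """Mean absolute value + RMS per channel from a fake EMG window."""
    amplitude = 0.2 + 0.15 * label                     # class-dependent activity level
    raw = rng.normal(0.0, amplitude, size=(N_CHANNELS, 256))
    mav = np.mean(np.abs(raw), axis=1)
    rms = np.sqrt(np.mean(raw ** 2, axis=1))
    return np.concatenate([mav, rms])

X = np.array([synthetic_features(c) for c in range(CLASSES) for _ in range(N_PER_CLASS)])
y = np.array([c for c in range(CLASSES) for _ in range(N_PER_CLASS)])

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)
clf = MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000, random_state=0)
clf.fit(X_tr, y_tr)
print("held-out accuracy:", round(clf.score(X_te, y_te), 3))
```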
Detecting navigational deficits in cognitive aging and Alzheimer disease using virtual reality.
Cushman, Laura A; Stein, Karen; Duffy, Charles J
2008-09-16
Older adults get lost, in many cases because of recognized or incipient Alzheimer disease (AD). In either case, getting lost can be a threat to individual and public safety, as well as to personal autonomy and quality of life. Here we compare our previously described real-world navigation test with a virtual reality (VR) version simulating the same navigational environment. Quantifying real-world navigational performance is difficult and time-consuming. VR testing is a promising alternative, but it has not been compared with closely corresponding real-world testing in aging and AD. We have studied navigation using both real-world and virtual environments in the same subjects: young normal controls (YNCs, n = 35), older normal controls (ONCs, n = 26), patients with mild cognitive impairment (MCI, n = 12), and patients with early AD (EAD, n = 14). We found close correlations between real-world and virtual navigational deficits that increased across groups from YNC to ONC, to MCI, and to EAD. Analyses of subtest performance showed similar profiles of impairment in real-world and virtual testing in all four subject groups. The ONC, MCI, and EAD subjects all showed greatest difficulty in self-orientation and scene localization tests. MCI and EAD patients also showed impaired verbal recall about both test environments. Virtual environment testing provides a valid assessment of navigational skills. Aging and Alzheimer disease (AD) share the same patterns of difficulty in associating visual scenes and locations, which is complicated in AD by the accompanying loss of verbally mediated navigational capacities. We conclude that virtual navigation testing reveals deficits in aging and AD that are associated with potentially grave risks to our patients and the community.
Shared virtual environments for telerehabilitation.
Popescu, George V; Burdea, Grigore; Boian, Rares
2002-01-01
Current VR telerehabilitation systems use offline remote monitoring from the clinic and patient-therapist videoconferencing. Such "store and forward" and video-based systems cannot implement medical services involving direct patient-therapist interaction. Real-time telerehabilitation applications (including remote therapy) can be developed using a shared Virtual Environment (VE) architecture. We developed a two-user shared VE for hand telerehabilitation. Each site has a telerehabilitation workstation with a videocamera and a Rutgers Master II (RMII) force feedback glove. Each user can control a virtual hand and interact haptically with virtual objects. Simulated physical interactions between therapist and patient are implemented using hand force feedback. The therapist's graphic interface contains several virtual panels, which allow control over the rehabilitation process. These controls start a videoconferencing session, collect patient data, or apply therapy. Several experimental telerehabilitation scenarios were successfully tested on a LAN. A Web-based approach to "real-time" patient telemonitoring--the monitoring portal for hand telerehabilitation--was also developed. The therapist interface is implemented as a Java3D applet that monitors patient hand movement. The monitoring portal gives real-time performance on off-the-shelf desktop workstations.
NASA Astrophysics Data System (ADS)
Rankin, Adam; Moore, John; Bainbridge, Daniel; Peters, Terry
2016-03-01
In the past ten years, numerous new surgical and interventional techniques have been developed for treating heart valve disease without the need for cardiopulmonary bypass. Heart valve repair is now being performed in a blood-filled environment, reinforcing the need for accurate and intuitive imaging techniques. Previous work has demonstrated how augmenting ultrasound with virtual representations of specific anatomical landmarks can greatly simplify interventional navigation challenges and increase patient safety. These techniques often complicate interventions by requiring additional steps taken to manually define and initialize virtual models. Furthermore, overlaying virtual elements into real-time image data can also obstruct the view of salient image information. To address these limitations, a system was developed that uses real-time volumetric ultrasound alongside magnetically tracked tools presented in an augmented virtuality environment to provide a streamlined navigation guidance platform. In phantom studies simulating a beating-heart navigation task, procedure duration and tool path metrics have achieved comparable performance to previous work in augmented virtuality techniques, and considerable improvement over standard of care ultrasound guidance.
Klapan, Ivica; Vranjes, Zeljko; Prgomet, Drago; Lukinović, Juraj
2008-03-01
The real-time requirement means that the simulation should be able to follow the actions of the user, who may be moving in the virtual environment. The computer system should also store in its memory a three-dimensional (3D) model of the virtual environment. In that case a real-time virtual reality system will update the 3D graphic visualization as the user moves, so that up-to-date visualization is always shown on the computer screen. Upon completion of the tele-operation, the surgeon compares the preoperative and postoperative images and models of the operative field, and studies video records of the procedure itself. Using intraoperative records, animated images of the real tele-procedure performed can be designed. Virtual surgery offers the possibility of preoperative planning in rhinology. The intraoperative use of a computer in real time requires development of appropriate hardware and software to connect medical instrumentarium with the computer and to operate the computer through the connected instrumentarium and sophisticated multimedia interfaces.
A numerical tool for reproducing driver behaviour: experiments and predictive simulations.
Casucci, M; Marchitto, M; Cacciabue, P C
2010-03-01
This paper presents the simulation tool called SDDRIVE (Simple Simulation of Driver performance), which is the numerical computerised implementation of the theoretical architecture describing Driver-Vehicle-Environment (DVE) interactions, contained in Cacciabue and Carsten [Cacciabue, P.C., Carsten, O. A simple model of driver behaviour to sustain design and safety assessment of automated systems in automotive environments, 2010]. Following a brief description of the basic algorithms that simulate the performance of drivers, the paper presents and discusses a set of experiments carried out in a Virtual Reality full scale simulator for validating the simulation. Then the predictive potentiality of the tool is shown by discussing two case studies of DVE interactions, performed in the presence of different driver attitudes in similar traffic conditions.
Virtual goods recommendations in virtual worlds.
Chen, Kuan-Yu; Liao, Hsiu-Yu; Chen, Jyun-Hung; Liu, Duen-Ren
2015-01-01
Virtual worlds (VWs) are computer-simulated environments which allow users to create their own virtual character as an avatar. With the rapidly growing user volume in VWs, platform providers launch virtual goods in haste and stampede users to increase sales revenue. However, the rapidity of development incurs virtual unrelated items which will be difficult to remarket. It not only wastes virtual global companies' intelligence resources, but also makes it difficult for users to find suitable virtual goods fit for their virtual home in daily virtual life. In the VWs, users decorate their houses, visit others' homes, create families, host parties, and so forth. Users establish their social life circles through these activities. This research proposes a novel virtual goods recommendation method based on these social interactions. The contact strength and contact influence result from interactions with social neighbors and influence users' buying intention. Our research highlights the importance of social interactions in virtual goods recommendation. The experiment's data were retrieved from an online VW platform, and the results show that the proposed method, considering social interactions and social life circle, has better performance than existing recommendation methods.
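A hedged sketch of the scoring idea described above: candidate goods are scored for a user from neighbors' purchases, weighted by contact strength (interaction frequency) and contact influence. The data, normalisation, and combination rule here are illustrative assumptions, not the authors' exact model.

```python
# Illustrative sketch: social-interaction-weighted recommendation of virtual goods.
from collections import defaultdict

# interactions[(user, neighbor)] = number of social interactions (visits, parties, ...)
interactions = {("alice", "bob"): 12, ("alice", "carol"): 3, ("alice", "dave"): 7}
influence = {"bob": 0.9, "carol": 0.4, "dave": 0.6}            # per-neighbor influence score
purchases = {"bob": {"sofa", "lamp"}, "carol": {"lamp"}, "dave": {"rug", "sofa"}}

def recommend(user, owned, top_k=3):
    max_inter = max(v for (u, _), v in interactions.items() if u == user)
    scores = defaultdict(float)
    for (u, neighbor), count in interactions.items():
        if u != user:
            continue
        strength = count / max_inter                           # normalised contact strength
        for item in purchases.get(neighbor, ()):
            if item not in owned:
                scores[item] += strength * influence[neighbor]
    return sorted(scores.items(), key=lambda kv: kv[1], reverse=True)[:top_k]

print(recommend("alice", owned={"lamp"}))   # e.g. [('sofa', ...), ('rug', ...)]
```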
Hands-on Learning in the Virtual World
ERIC Educational Resources Information Center
Branson, John; Thomson, Diane
2013-01-01
The U.S. military has long understood the value of immersive simulations in education. Before the Navy entrusts a ship to a crew, crew members must first practice and demonstrate their competency in a fully immersive, simulated environment. Why not teach students in the same way? K-12 educators in Pennsylvania, USA, recently did just that when…
ERIC Educational Resources Information Center
Dunleavy, Matt; Dede, Chris; Mitchell, Rebecca
2009-01-01
The purpose of this study was to document how teachers and students describe and comprehend the ways in which participating in an augmented reality (AR) simulation aids or hinders teaching and learning. Like the multi-user virtual environment (MUVE) interface that underlies Internet games, AR is a good medium for immersive collaborative…
Immersive virtual reality platform for medical training: a "killer-application".
2000-01-01
The Medical Readiness Trainer (MRT) integrates fully immersive Virtual Reality (VR), highly advanced medical simulation technologies, and medical data to enable unprecedented medical education and training. The flexibility offered by the MRT environment serves as a practical teaching tool today and in the near future will serve as an ideal vehicle for facilitating the transition to the next level of medical practice, i.e., telepresence and next-generation Internet-based collaborative learning.
An Interactive Logistics Centre Information Integration System Using Virtual Reality
NASA Astrophysics Data System (ADS)
Hong, S.; Mao, B.
2018-04-01
The logistics industry plays a very important role in the operation of modern cities. Meanwhile, the development of the logistics industry has given rise to various problems that urgently need to be solved, such as the safety of logistics products. This paper combines the study of logistics traceability and logistics centre safety supervision with virtual reality technology to create an interactive logistics centre information integration system. The proposed system exploits the immersive character of virtual reality to simulate a real logistics centre scene distinctly, allowing operations staff to conduct safety supervision training at any time and without regional restrictions. On the one hand, large volumes of sensor data can be used to simulate a variety of disaster and emergency situations; on the other hand, personnel operation data can be collected and analysed for improper operation, which can greatly improve training efficiency.
Virtual DRI dataset development
NASA Astrophysics Data System (ADS)
Hixson, Jonathan G.; Teaney, Brian P.; May, Christopher; Maurer, Tana; Nelson, Michael B.; Pham, Justin R.
2017-05-01
The U.S. Army RDECOM CERDEC NVESD MSD's target acquisition models have been used for many years by the military analysis community for sensor design, trade studies, and field performance prediction. This paper analyzes the results of perception tests performed to compare the results of a field DRI (Detection, Recognition, and Identification Test) performed in 2009 to current Soldier performance viewing the same imagery in a laboratory environment and simulated imagery of the same data set. The purpose of the experiment is to build a robust data set for use in the virtual prototyping of infrared sensors. This data set will provide a strong foundation relating, model predictions, field DRI results and simulated imagery.
Augmented Reality for Close Quarters Combat
None
2018-01-16
Sandia National Laboratories has developed a state-of-the-art augmented reality training system for close-quarters combat (CQB). This system uses a wearable augmented reality system to place the user in a real environment while engaging enemy combatants in virtual space (Boston Dynamics DI-Guy). Umbra modeling and simulation environment is used to integrate and control the AR system.
The charged particle accelerators subsystems modeling
NASA Astrophysics Data System (ADS)
Averyanov, G. P.; Kobylyatskiy, A. V.
2017-01-01
We present a web-based resource for information support of engineering, science and education in electrophysics, containing web-based tools for simulating charged particle accelerator subsystems. The motivation for developing a web environment for virtual electrophysical laboratories is formulated. Trends in the design of dynamic web environments supporting scientific research and e-learning are analyzed within the framework of the Open Education concept.
ERIC Educational Resources Information Center
Alvarez, Nahum; Sanchez-Ruiz, Antonio; Cavazza, Marc; Shigematsu, Mika; Prendinger, Helmut
2015-01-01
The use of three-dimensional virtual environments in training applications supports the simulation of complex scenarios and realistic object behaviour. While these environments have the potential to provide an advanced training experience to students, it is difficult to design and manage a training session in real time due to the number of…
NASA Astrophysics Data System (ADS)
Ren, Lei; Zhang, Lin; Tao, Fei; (Luke) Zhang, Xiaolong; Luo, Yongliang; Zhang, Yabin
2012-08-01
Multidisciplinary design of complex products leads to an increasing demand for high performance simulation (HPS) platforms. One great challenge is how to achieve highly efficient utilisation of large-scale simulation resources in distributed and heterogeneous environments. This article reports a virtualisation-based methodology to realise an HPS platform. This research is driven by the issues concerning large-scale simulation resource deployment and complex simulation environment construction, efficient and transparent utilisation of fine-grained simulation resources, and highly reliable simulation with fault tolerance. A framework of virtualisation-based simulation platform (VSIM) is first proposed. Then the article investigates and discusses key approaches in VSIM, including simulation resources modelling, a method for automatically deploying simulation resources for dynamic construction of the system environment, and a live migration mechanism in case of faults in run-time simulation. Furthermore, the proposed methodology is applied to a multidisciplinary design system for aircraft virtual prototyping and some experiments are conducted. The experimental results show that the proposed methodology can (1) significantly improve the utilisation of fine-grained simulation resources, (2) result in a great reduction in deployment time and an increased flexibility for simulation environment construction, and (3) achieve fault-tolerant simulation.
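A generic Python illustration of the deployment and fault-handling ideas mentioned above: virtualised simulation resources are placed on hosts by available capacity, and simulations on a failed host are re-placed elsewhere (a crude stand-in for live migration). The host names, capacities, and first-fit policy are assumptions for the sketch, not the VSIM implementation.

```python
# Generic sketch: capacity-based placement of virtualised simulation resources
# with naive re-placement when a host fails.
hosts = {"host-a": 16, "host-b": 8, "host-c": 8}          # free CPU cores per host
placements = {}                                           # simulation -> (host, cores)

def deploy(sim, cores_needed):
    """Place a simulation on the host with the most free cores that fits it."""
    for host, free in sorted(hosts.items(), key=lambda kv: -kv[1]):
        if free >= cores_needed:
            hosts[host] -= cores_needed
            placements[sim] = (host, cores_needed)
            return host
    raise RuntimeError(f"no host can fit {sim}")

def fail_host(bad_host):
    """Re-deploy every simulation that was running on the failed host."""
    hosts.pop(bad_host)
    for sim, (host, cores) in list(placements.items()):
        if host == bad_host:
            del placements[sim]
            deploy(sim, cores)

deploy("aero-cfd", 8)
deploy("structures-fem", 4)
print(placements)
fail_host("host-a")
print(placements)        # simulations from host-a now run elsewhere
```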
NASA Technical Reports Server (NTRS)
Rabelo, Luis C.
2002-01-01
This is a report of my activities as a NASA Fellow during the summer of 2002 at the NASA Kennedy Space Center (KSC). The core of these activities is the assigned project: the Virtual Test Bed (VTB) from the Spaceport Engineering and Technology Directorate. The VTB Project has its foundations in the NASA Ames Research Center (ARC) Intelligent Launch & Range Operations program. The objective of the VTB project is to develop a new and unique collaborative computing environment where simulation models can be hosted and integrated in a seamless fashion. This collaborative computing environment will be used to build a Virtual Range as well as a Virtual Spaceport. This project will work as a technology pipeline to research, develop, test and validate R&D efforts against real-time operations without interfering with the actual operations or consuming the operational personnel's time. This report will also focus on the systems issues required to conceptualize and provide form to a systems architecture capable of handling the different demands.
Interface for Physics Simulation Engines
NASA Technical Reports Server (NTRS)
Damer, Bruce
2007-01-01
DSS-Prototyper is an open-source, real-time 3D virtual environment software that supports design simulation for the new Vision for Space Exploration (VSE). This is a simulation of NASA's proposed Robotic Lunar Exploration Program, second mission (RLEP2). It simulates the Lunar Surface Access Module (LSAM), which is designed to carry up to four astronauts to the lunar surface for durations of a week or longer. This simulation shows the virtual vehicle making approaches and landings on a variety of lunar terrains. The physics of the descent engine thrust vector, production of dust, and the dynamics of the suspension are all modeled in this set of simulations. The RLEP2 simulations include drivable (by keyboard or joystick) virtual rovers with controls for speed and motor torque, which can be articulated into higher or lower centers of gravity (depending on driving hazards) to enable drill placement. Gravity also can be set to lunar, terrestrial, or zero-g. This software has been used to support NASA's Marshall Space Flight Center in simulations of proposed vehicles for robotically exploring the lunar surface for water ice, and could be used to model all other aspects of the VSE from the Ares launch vehicles and Crew Exploration Vehicle (CEV) to the International Space Station (ISS). This simulator may be installed and operated on any Windows PC with an installed 3D graphics card.
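A toy Python sketch of the kind of physics loop described above, with selectable gravity: a vertical lander descent under lunar, terrestrial, or zero gravity with a constant braking thrust. All numbers are illustrative and this is not the DSS-Prototyper code.

```python
# Toy sketch: vertical descent with configurable gravity and constant thrust.
GRAVITY = {"lunar": 1.62, "terrestrial": 9.81, "zero-g": 0.0}   # m/s^2

def descend(environment, altitude=500.0, velocity=-20.0, mass=2000.0,
            thrust=4000.0, dt=0.1):
    """Integrate a 1-D descent until touchdown; returns (time, touchdown velocity)."""
    g = GRAVITY[environment]
    t = 0.0
    while altitude > 0.0:
        accel = thrust / mass - g          # thrust acts upward, gravity downward
        velocity += accel * dt
        altitude += velocity * dt
        t += dt
        if t > 600.0:                      # safety cut-off for the sketch
            break
    return t, velocity

for env in ("lunar", "terrestrial"):
    t, v = descend(env)
    print(f"{env}: touchdown after ~{t:.1f} s at {v:.1f} m/s")
```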
Virtual Reality Training System for Anytime/Anywhere Acquisition of Surgical Skills: A Pilot Study.
Zahiri, Mohsen; Booton, Ryan; Nelson, Carl A; Oleynikov, Dmitry; Siu, Ka-Chun
2018-03-01
This article presents a hardware/software simulation environment suitable for anytime/anywhere surgical skills training. It blends the advantages of physical hardware and task analogs with the flexibility of virtual environments. This is further enhanced by a web-based implementation of training feedback accessible to both trainees and trainers. Our training system provides a self-paced and interactive means to attain proficiency in basic tasks that could potentially be applied across a spectrum of trainees from first responder field medical personnel to physicians. This results in a powerful training tool for surgical skills acquisition relevant to helping injured warfighters.
Yim, Wen-Wai; Chien, Shu; Kusumoto, Yasuyuki; Date, Susumu; Haga, Jason
2010-01-01
Large-scale in-silico screening is a necessary part of drug discovery and Grid computing is one answer to this demand. A disadvantage of using Grid computing is the heterogeneous computational environments characteristic of a Grid. In our study, we have found that for the molecular docking simulation program DOCK, different clusters within a Grid organization can yield inconsistent results. Because DOCK in-silico virtual screening (VS) is currently used to help select chemical compounds to test with in-vitro experiments, such differences have little effect on the validity of using virtual screening before subsequent steps in the drug discovery process. However, it is difficult to predict whether the accumulation of these discrepancies over sequentially repeated VS experiments will significantly alter the results if VS is used as the primary means for identifying potential drugs. Moreover, such discrepancies may be unacceptable for other applications requiring more stringent thresholds. This highlights the need for establishing a more complete solution to provide the best scientific accuracy when executing an application across Grids. One possible solution to platform heterogeneity in DOCK performance explored in our study involved the use of virtual machines as a layer of abstraction. This study investigated the feasibility and practicality of using virtual machine and recent cloud computing technologies in a biological research application. We examined the differences and variations of DOCK VS variables, across a Grid environment composed of different clusters, with and without virtualization. The uniform computer environment provided by virtual machines eliminated inconsistent DOCK VS results caused by heterogeneous clusters, however, the execution time for the DOCK VS increased. In our particular experiments, overhead costs were found to be an average of 41% and 2% in execution time for two different clusters, while the actual magnitudes of the execution time costs were minimal. Despite the increase in overhead, virtual clusters are an ideal solution for Grid heterogeneity. With greater development of virtual cluster technology in Grid environments, the problem of platform heterogeneity may be eliminated through virtualization, allowing greater usage of VS, and will benefit all Grid applications in general.
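The overhead figures quoted above are relative increases in execution time when running inside virtual machines. A back-of-the-envelope Python sketch of that calculation, with invented placeholder timings rather than the paper's data:

```python
# Sketch of the virtualization overhead metric: relative increase in execution time.
def overhead_percent(t_virtualised, t_native):
    return 100.0 * (t_virtualised - t_native) / t_native

print(round(overhead_percent(141.0, 100.0)))   # -> 41 (cluster with large overhead)
print(round(overhead_percent(102.0, 100.0)))   # -> 2  (cluster with small overhead)
```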
Virtual reality for health care: a survey.
Moline, J
1997-01-01
This report surveys the state of the art in applications of virtual environments and related technologies for health care. Applications of these technologies are being developed for health care in the following areas: surgical procedures (remote surgery or telepresence, augmented or enhanced surgery, and planning and simulation of procedures before surgery); medical therapy; preventive medicine and patient education; medical education and training; visualization of massive medical databases; skill enhancement and rehabilitation; and architectural design for health-care facilities. To date, such applications have improved the quality of health care, and in the future they will result in substantial cost savings. Tools that respond to the needs of present virtual environment systems are being refined or developed. However, additional large-scale research is necessary in the following areas: user studies, use of robots for telepresence procedures, enhanced system reality, and improved system functionality.
Fusion interfaces for tactical environments: An application of virtual reality technology
NASA Technical Reports Server (NTRS)
Haas, Michael W.
1994-01-01
The term Fusion Interface is defined as a class of interface which integrally incorporates both virtual and nonvirtual concepts and devices across the visual, auditory, and haptic sensory modalities. A fusion interface is a multisensory virtually-augmented synthetic environment. A new facility has been developed within the Human Engineering Division of the Armstrong Laboratory dedicated to exploratory development of fusion interface concepts. This new facility, the Fusion Interfaces for Tactical Environments (FITE) Facility, is a specialized flight simulator enabling efficient concept development through rapid prototyping and direct experience of new fusion concepts. The FITE Facility also supports evaluation of fusion concepts by operational fighter pilots in an air combat environment. The facility is utilized by a multidisciplinary design team composed of human factors engineers, electronics engineers, computer scientists, experimental psychologists, and operational pilots. The FITE computational architecture is composed of twenty-five 80486-based microcomputers operating in real-time. The microcomputers generate out-the-window visuals, in-cockpit and head-mounted visuals, localized auditory presentations, haptic displays on the stick and rudder pedals, as well as executing weapons models, aerodynamic models, and threat models.